{"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nsets the order of the elements in the order of the elements in the SplitPiece a and b.", "response": "def set_order_by_clip(self, a, b):\n '''\n Determine which SplitPiece is the leftmost based\n on the side of the longest clipping operation\n '''\n if self.is_left_clip(a.cigar):\n self.query_left = b\n self.query_right = a\n else:\n self.query_left = a\n self.query_right = b"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef is_left_clip(self, cigar):\n '''\n whether the left side of the read (w/ respect to reference) is clipped.\n Clipping side is determined as the side with the longest clip.\n Adjacent clipping operations are not considered\n '''\n left_tuple = cigar[0]\n right_tuple = cigar[-1]\n left_clipped = self.is_clip_op(left_tuple[0])\n right_clipped = self.is_clip_op(right_tuple[0])\n\n return (left_clipped and not right_clipped) or (left_clipped and right_clipped and left_tuple[1] > right_tuple[1])", "response": "Return True if the left side of the read is clipped."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncheck request is streaming request or not", "response": "def _is_streaming_request(self):\n \"\"\"check request is stream request or not\"\"\"\n arg2 = self.argstreams[1]\n arg3 = self.argstreams[2]\n return not (isinstance(arg2, InMemStream) and\n isinstance(arg3, InMemStream) and\n ((arg2.auto_close and arg3.auto_close) or (\n arg2.state == StreamState.completed and\n arg3.state == StreamState.completed)))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning True if we should retry on an error", "response": "def should_retry_on_error(self, error):\n \"\"\"rules for retry\n\n :param error:\n ProtocolException that returns from Server\n \"\"\"\n\n if self.is_streaming_request:\n # not retry for streaming request\n return False\n\n retry_flag = 
self.headers.get('re', retry.DEFAULT)\n\n if retry_flag == retry.NEVER:\n return False\n\n if isinstance(error, StreamClosedError):\n return True\n\n if error.code in [ErrorCode.bad_request, ErrorCode.cancelled,\n ErrorCode.unhealthy]:\n return False\n elif error.code in [ErrorCode.busy, ErrorCode.declined]:\n return True\n elif error.code is ErrorCode.timeout:\n return retry_flag is not retry.CONNECTION_ERROR\n elif error.code in [ErrorCode.network_error,\n ErrorCode.fatal,\n ErrorCode.unexpected]:\n return retry_flag is not retry.TIMEOUT\n else:\n return False"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef client_for(service, service_module, thrift_service_name=None):\n assert service_module, 'service_module is required'\n\n service = service or '' # may be blank for non-hyperbahn use cases\n\n if not thrift_service_name:\n thrift_service_name = service_module.__name__.rsplit('.', 1)[-1]\n\n method_names = get_service_methods(service_module.Iface)\n\n def init(\n self,\n tchannel,\n hostport=None,\n trace=False,\n protocol_headers=None,\n ):\n self.async_thrift = self.__async_client_class__(\n tchannel=tchannel,\n hostport=hostport,\n trace=trace,\n protocol_headers=protocol_headers,\n )\n self.threadloop = tchannel._threadloop\n\n init.__name__ = '__init__'\n\n methods = {\n '__init__': init,\n '__async_client_class__': async_client_for(\n service=service,\n service_module=service_module,\n thrift_service_name=thrift_service_name,\n )\n }\n\n methods.update({\n method_name: generate_method(method_name)\n for method_name in method_names\n })\n\n return type(thrift_service_name + 'Client', (object,), methods)", "response": "Build a synchronous client class for the given Thrift service."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef generate_method(method_name):\n\n def call(self, *args, **kwargs):\n \"\"\"Forward RPC call to 
TChannelSyncClient\n\n :return concurrent.futures.Future:\n \"\"\"\n if not self.threadloop.is_ready():\n self.threadloop.start()\n\n return self.threadloop.submit(\n getattr(self.async_thrift, method_name), *args, **kwargs\n )\n\n return call", "response": "Generates a method that invokes the given method using the provided Thrift service."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef read_full(stream):\n assert stream, \"stream is required\"\n\n chunks = []\n chunk = yield stream.read()\n\n while chunk:\n chunks.append(chunk)\n chunk = yield stream.read()\n\n raise tornado.gen.Return(b''.join(chunks))", "response": "Read the full contents of the given stream into memory."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef maybe_stream(s):\n if isinstance(s, Stream):\n return s\n\n if s is None:\n stream = InMemStream()\n stream.close() # we don't intend to write anything\n return stream\n\n if isinstance(s, unicode):\n s = s.encode('utf-8')\n if isinstance(s, bytearray):\n s = bytes(s)\n\n if isinstance(s, bytes):\n stream = InMemStream(s)\n stream.close() # we don't intend to write anything\n return stream\n\n # s may still conform to the Stream interface. 
Yay duck typing.\n return s", "response": "Ensure that the given argument is a stream."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef build_raw_error_message(protocol_exception):\n message = ErrorMessage(\n id=protocol_exception.id,\n code=protocol_exception.code,\n tracing=protocol_exception.tracing,\n description=protocol_exception.description,\n )\n\n return message", "response": "build a raw error message based on an error object"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nbuilding protocol level message based on request and args.", "response": "def build_raw_request_message(self, request, args, is_completed=False):\n \"\"\"build protocol level message based on request and args.\n\n request object contains meta information about outgoing request.\n args are the currently chunk data from argstreams\n is_completed tells the flags of the message\n\n :param request: Request\n :param args: array of arg streams\n :param is_completed: message flags\n :return: CallRequestMessage/CallRequestContinueMessage\n \"\"\"\n request.flags = FlagsType.none if is_completed else FlagsType.fragment\n\n # TODO decide what need to pass from request\n if request.state == StreamState.init:\n message = CallRequestMessage(\n flags=request.flags,\n ttl=request.ttl * 1000,\n tracing=request.tracing,\n service=request.service,\n headers=request.headers,\n checksum=request.checksum,\n args=args\n )\n request.state = (StreamState.completed if is_completed\n else StreamState.streaming)\n elif request.state == StreamState.streaming:\n message = CallRequestContinueMessage(\n flags=request.flags,\n checksum=request.checksum,\n args=args\n )\n request.state = (StreamState.completed if is_completed\n else StreamState.streaming)\n\n message.id = request.id\n return message"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nbuilding protocol level message based on 
response and args.", "response": "def build_raw_response_message(self, response, args, is_completed=False):\n \"\"\"build protocol level message based on response and args.\n\n response object contains meta information about outgoing response.\n args are the currently chunk data from argstreams\n is_completed tells the flags of the message\n\n :param response: Response\n :param args: array of arg streams\n :param is_completed: message flags\n :return: CallResponseMessage/CallResponseContinueMessage\n \"\"\"\n response.flags = FlagsType.none if is_completed else FlagsType.fragment\n\n # TODO decide what need to pass from request\n if response.state == StreamState.init:\n message = CallResponseMessage(\n flags=response.flags,\n code=response.code,\n tracing=response.tracing,\n headers=response.headers,\n checksum=response.checksum,\n args=args\n )\n response.state = (StreamState.completed if is_completed\n else StreamState.streaming)\n elif response.state == StreamState.streaming:\n message = CallResponseContinueMessage(\n flags=response.flags,\n checksum=response.checksum,\n args=args\n )\n response.state = (StreamState.completed if is_completed\n else StreamState.streaming)\n\n message.id = response.id\n return message"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nbuilding inbound request object from protocol level message info.", "response": "def build_request(self, message):\n \"\"\"Build inbound request object from protocol level message info.\n\n It is allowed to take incompleted CallRequestMessage. 
Therefore the\n created request may not contain all three arguments.\n\n :param message: CallRequestMessage\n :return: request object\n \"\"\"\n\n args = self.prepare_args(message)\n\n # TODO decide what to pass to Request from message\n req = Request(\n flags=message.flags,\n ttl=message.ttl / 1000.0,\n tracing=message.tracing,\n service=message.service,\n headers=message.headers,\n checksum=message.checksum,\n argstreams=args,\n id=message.id,\n )\n return req"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nbuilding a Response object from protocol level message info.", "response": "def build_response(self, message):\n \"\"\"Build response object from protocol level message info\n\n It is allowed to take an incomplete CallResponseMessage. Therefore the\n created response may not contain all three arguments.\n\n :param message: CallResponseMessage\n :return: response object\n \"\"\"\n\n args = self.prepare_args(message)\n\n # TODO decide what to pass to Response from message\n res = Response(\n flags=message.flags,\n code=message.code,\n headers=message.headers,\n checksum=message.checksum,\n argstreams=args,\n id=message.id,\n )\n return res"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nbuilds the next complete message based on the incoming message.", "response": "def build(self, message):\n \"\"\"buffer all the streaming messages based on the\n message id. 
Reconstruct all fragments together.\n\n :param message:\n incoming message\n :return: next complete message or None if streaming\n is not done\n \"\"\"\n context = None\n\n if message.message_type in [Types.CALL_REQ,\n Types.CALL_RES]:\n self.verify_message(message)\n\n context = self.build_context(message)\n # streaming message\n if message.flags == common.FlagsType.fragment:\n self.message_buffer[message.id] = context\n\n # find the incompleted stream\n num = 0\n for i, arg in enumerate(context.argstreams):\n if arg.state != StreamState.completed:\n num = i\n break\n\n self.close_argstream(context, num)\n return context\n\n elif message.message_type in [Types.CALL_REQ_CONTINUE,\n Types.CALL_RES_CONTINUE]:\n context = self.message_buffer.get(message.id)\n if context is None:\n # missing call msg before continue msg\n raise FatalProtocolError(\n \"missing call message after receiving continue message\",\n message.id,\n )\n\n # find the incompleted stream\n dst = 0\n for i, arg in enumerate(context.argstreams):\n if arg.state != StreamState.completed:\n dst = i\n break\n\n try:\n self.verify_message(message)\n except InvalidChecksumError as e:\n context.argstreams[dst].set_exception(e)\n raise\n\n src = 0\n while src < len(message.args):\n context.argstreams[dst].write(message.args[src])\n dst += 1\n src += 1\n\n if message.flags != FlagsType.fragment:\n # get last fragment. 
mark it as completed\n assert (len(context.argstreams) ==\n CallContinueMessage.max_args_num)\n self.message_buffer.pop(message.id, None)\n context.flags = FlagsType.none\n\n self.close_argstream(context, dst - 1)\n return None\n elif message.message_type == Types.ERROR:\n context = self.message_buffer.pop(message.id, None)\n if context is None:\n log.info('Unconsumed error %s', message)\n return None\n else:\n error = TChannelError.from_code(\n message.code,\n description=message.description,\n tracing=context.tracing,\n )\n\n context.set_exception(error)\n return error\n else:\n return message"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nfragment a message based on max payload size", "response": "def fragment(self, message):\n \"\"\"Fragment message based on max payload size\n\n note: if the message doesn't need to fragment,\n it will return a list which only contains original\n message itself.\n\n :param message: raw message\n :return: list of messages whose sizes <= max\n payload size\n \"\"\"\n if message.message_type in [Types.CALL_RES,\n Types.CALL_REQ,\n Types.CALL_REQ_CONTINUE,\n Types.CALL_RES_CONTINUE]:\n rw = RW[message.message_type]\n payload_space = (common.MAX_PAYLOAD_SIZE -\n rw.length_no_args(message))\n # split a call/request message into an array\n # with a call/request message and {0~n} continue\n # message\n fragment_msg = message.fragment(payload_space)\n self.generate_checksum(message)\n\n yield message\n while fragment_msg is not None:\n message = fragment_msg\n rw = RW[message.message_type]\n payload_space = (common.MAX_PAYLOAD_SIZE -\n rw.length_no_args(message))\n fragment_msg = message.fragment(payload_space)\n self.generate_checksum(message)\n yield message\n else:\n yield message"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef verify_message(self, message):\n if verify_checksum(\n message,\n 
self.in_checksum.get(message.id, 0),\n ):\n self.in_checksum[message.id] = message.checksum[1]\n\n if message.flags == FlagsType.none:\n self.in_checksum.pop(message.id)\n else:\n self.in_checksum.pop(message.id, None)\n raise InvalidChecksumError(\n description=\"Checksum does not match!\",\n id=message.id,\n )", "response": "Verify the checksum of the message."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nbuild a ReadWriter from a list of ReadWriters in - order.", "response": "def chain(*rws):\n \"\"\"Build a ReadWriter from the given list of ReadWriters.\n\n .. code-block:: python\n\n chain(\n number(1),\n number(8),\n len_prefixed_string(number(2)),\n ) # == n1:1 n2:8 s~2\n\n Reads/writes from the given ReadWriters in-order. Returns lists of values\n in the same order as the ReadWriters.\n\n :param rws:\n One or more ReadWriters\n \"\"\"\n assert rws is not None\n if len(rws) == 1 and isinstance(rws[0], list):\n # In case someone does chain([l0, l1, ...])\n rws = rws[0]\n return ChainReadWriter(rws)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nread the given number of bytes from the given stream.", "response": "def take(self, stream, num):\n \"\"\"Read the given number of bytes from the stream.\n\n :param stream:\n stream to read from\n :param num:\n number of bytes to read\n :raises ReadError:\n if the stream did not yield the exact number of bytes expected\n \"\"\"\n s = stream.read(num)\n slen = len(s)\n if slen != num:\n raise ReadError(\n \"Expected %d bytes but got %d bytes.\" % (num, slen)\n )\n return s"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_service_methods(iface):\n methods = inspect.getmembers(iface, predicate=inspect.ismethod)\n\n return set(\n name for (name, method) in methods if not name.startswith('__')\n )", "response": "Get a list of methods defined in the Thrift - generated interface for a Thrift 
service."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nwarn every time a fn is called.", "response": "def deprecated(message):\n \"\"\"Warn every time a fn is called.\"\"\"\n def decorator(fn):\n @functools.wraps(fn)\n def new_fn(*args, **kwargs):\n deprecate(message)\n return fn(*args, **kwargs)\n return new_fn\n return decorator"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef load(path, service=None, hostport=None, module_name=None):\n # TODO replace with more specific exceptions\n # assert service, 'service is required'\n # assert path, 'path is required'\n\n # Backwards compatibility for callers passing in service name as first arg.\n if not path.endswith('.thrift'):\n service, path = path, service\n\n module = thriftrw.load(path=path, name=module_name)\n return TChannelThriftModule(service, module, hostport)", "response": "Loads the Thrift file at the specified path and returns a Python module containing the result of the load."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef register(dispatcher, service, handler=None, method=None):\n\n def decorator(method, handler):\n if not method:\n method = handler.__name__\n\n function = getattr(service, method, None)\n assert function, (\n 'Service \"%s\" does not define method \"%s\"' % (service.name, method)\n )\n assert not function.oneway\n\n dispatcher.register(\n function.endpoint,\n build_handler(function, handler),\n ThriftRWSerializer(service._module, function._request_cls),\n ThriftRWSerializer(service._module, function._response_cls),\n )\n return handler\n\n if handler is None:\n return partial(decorator, method)\n else:\n return decorator(method, handler)", "response": "Decorator for registering a Thrift function."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef interface_ip(interface):\n sock = socket.socket(socket.AF_INET, 
socket.SOCK_DGRAM)\n return socket.inet_ntoa(\n fcntl.ioctl(\n sock.fileno(), 0x8915, struct.pack('256s', interface[:15])\n )[20:24]\n )", "response": "Determine the IP assigned to us by the given network interface."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ngetting the local network IP of this machine", "response": "def local_ip():\n \"\"\"Get the local network IP of this machine\"\"\"\n ip = socket.gethostbyname(socket.gethostname())\n if ip.startswith('127.'):\n # Check eth0, eth1, eth2, en0, ...\n interfaces = [\n i + str(n) for i in (\"eth\", \"en\", \"wlan\") for n in xrange(3)\n ] # :(\n for interface in interfaces:\n try:\n ip = interface_ip(interface)\n break\n except IOError:\n pass\n return ip"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nbuilds a client class for the given Thrift service.", "response": "def client_for(service, service_module, thrift_service_name=None):\n \"\"\"Build a client class for the given Thrift service.\n\n The generated class accepts a TChannel and an optional hostport as\n initialization arguments.\n\n Given ``CommentService`` defined in ``comment.thrift`` and registered with\n Hyperbahn under the name \"comment\", here's how this may be used:\n\n .. code-block:: python\n\n from comment import CommentService\n\n CommentServiceClient = client_for(\"comment\", CommentService)\n\n @gen.coroutine\n def post_comment(articleId, msg, hostport=None):\n client = CommentServiceClient(tchannel, hostport)\n yield client.postComment(articleId, CommentService.Comment(msg))\n\n :param service:\n Name of the Hyperbahn service being called. This is the name with\n which the service registered with Hyperbahn.\n :param service_module:\n The Thrift-generated module for that service. 
This usually has the\n same name as defined for the service in the IDL.\n :param thrift_service_name:\n If the Thrift service has a different name than its module, use this\n parameter to specify it.\n :returns:\n An object with the same interface as the service that uses the given\n TChannel to call the service.\n \"\"\"\n assert service_module, 'service_module is required'\n service = service or '' # may be blank for non-hyperbahn use cases\n if not thrift_service_name:\n thrift_service_name = service_module.__name__.rsplit('.', 1)[-1]\n\n method_names = get_service_methods(service_module.Iface)\n\n def new(cls, tchannel, hostport=None, trace=False, protocol_headers=None):\n \"\"\"\n :param tchannel:\n TChannel through which the requests will be sent.\n :param hostport:\n Address of the machine to which the requests will be sent, or None\n if the TChannel will do peer selection on a per-request basis.\n :param trace:\n Whether tracing is enabled.\n :param protocol_headers:\n Protocol-level headers to send with the request.\n \"\"\"\n protocol_headers = protocol_headers or {}\n protocol_headers['as'] = 'thrift'\n return _ClientBase.__new__(\n cls, tchannel, hostport, service, trace, protocol_headers\n )\n\n new.__name__ = '__new__'\n methods = {'__new__': new}\n\n for method_name in method_names:\n methods[method_name] = generate_method(\n service_module, thrift_service_name, method_name\n )\n\n return type(thrift_service_name + 'Client', (_ClientBase,), methods)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ngenerates a method for the given Thrift service.", "response": "def generate_method(service_module, service_name, method_name):\n \"\"\"Generate a method for the given Thrift service.\n\n :param service_module:\n Thrift-generated service module\n :param service_name:\n Name of the Thrift service\n :param method_name:\n Method being called\n \"\"\"\n assert service_module\n assert service_name\n assert 
method_name\n\n args_type = getattr(service_module, method_name + '_args')\n result_type = getattr(service_module, method_name + '_result', None)\n serializer = ThriftSerializer(result_type)\n # oneway not currently supported\n # TODO - write test for this\n if result_type is None:\n def not_supported(self, *args, **kwargs):\n raise OneWayNotSupportedError(\n 'TChannel+Thrift does not currently support oneway procedures'\n )\n return not_supported\n\n result_spec = result_type.thrift_spec\n # result_spec is a tuple of tuples in the form:\n #\n # (fieldId, fieldType, fieldName, ...)\n #\n # Where \"...\" is other information we don't care about right now.\n #\n # result_spec will be empty if there is no return value or exception for\n # the method.\n #\n # Its first element, with field ID 0, contains the spec for the return\n # value. It is None if the result type is void but the method may still\n # throw exceptions.\n #\n # Elements after the first one are specs for the exceptions.\n\n endpoint = '%s::%s' % (service_name, method_name)\n\n @gen.coroutine\n def send(self, *args, **kwargs):\n params = inspect.getcallargs(\n getattr(service_module.Iface, method_name), self, *args, **kwargs\n )\n params.pop('self') # self is already known\n\n # $methodName_args is the implicit struct containing the various\n # method parameters.\n call_args = args_type()\n for name, value in params.items():\n setattr(call_args, name, value)\n\n tracer = tracing.ClientTracer(channel=self.tchannel)\n span, headers = tracer.start_span(\n service=service_name, endpoint=method_name, headers={}\n )\n\n body = serializer.serialize_body(call_args)\n header = serializer.serialize_header(headers)\n\n # Glue for old API.\n if hasattr(self.tchannel, 'request'):\n tracing.apply_trace_flag(span, self.trace, True)\n with span:\n response = yield self.tchannel.request(\n hostport=self.hostport, service=self.service\n ).send(\n arg1=endpoint,\n arg2=header,\n arg3=body, # body\n 
headers=self.protocol_headers,\n )\n body = yield response.get_body()\n else:\n with span:\n response = yield self.tchannel.call(\n scheme=schemes.THRIFT,\n service=self.service,\n arg1=endpoint,\n arg2=header,\n arg3=body,\n hostport=self.hostport,\n trace=self.trace,\n tracing_span=span\n # TODO: Need to handle these!\n # headers=self.protocol_headers,\n )\n body = response.body\n\n call_result = serializer.deserialize_body(body)\n\n if not result_spec:\n # void return type and no exceptions allowed\n raise gen.Return(None)\n\n for exc_spec in result_spec[1:]:\n # May have failed with an exception\n exc = getattr(call_result, exc_spec[2])\n if exc is not None:\n raise exc\n\n if result_spec[0]:\n # Non-void return type. Return the result.\n success = getattr(call_result, result_spec[0][2])\n if success is not None:\n raise gen.Return(success)\n else:\n # No return type specified and no exceptions raised.\n raise gen.Return(None)\n\n # Expected a result but nothing was present in the object. Something\n # went wrong.\n from thrift import Thrift\n raise Thrift.TApplicationException(\n Thrift.TApplicationException.MISSING_RESULT,\n '%s failed: did not receive a result as expected' % method_name\n )\n # TODO: We should probably throw a custom exception instead.\n\n send.__name__ = method_name\n return send"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nconnecting to this peer.", "response": "def connect(self):\n \"\"\"Get a connection to this peer.\n\n If a connection to the peer already exists (either incoming or\n outgoing), that's returned. 
Otherwise, a new outgoing connection to\n this peer is created.\n\n :return:\n A future containing a connection to this host.\n \"\"\"\n # Prefer incoming connections over outgoing connections.\n if self.connections:\n # First value is an incoming connection\n future = gen.Future()\n future.set_result(self.connections[0])\n return future\n\n if self._connecting:\n # If we're in the process of connecting to the peer, just wait\n # and re-use that connection.\n return self._connecting\n\n conn_future = self._connecting = self.connection_class.outgoing(\n hostport=self.hostport,\n process_name=self.tchannel.process_name,\n serve_hostport=self.tchannel.hostport,\n handler=self.tchannel.receive_call,\n tchannel=self.tchannel,\n )\n\n def on_connect(_):\n if not conn_future.exception():\n # We don't actually need to handle the exception. That's on\n # the caller.\n connection = conn_future.result()\n self.register_outgoing_conn(connection)\n self._connecting = None\n\n conn_future.add_done_callback(on_connect)\n return conn_future"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef register_outgoing_conn(self, conn):\n assert conn, \"conn is required\"\n conn.set_outbound_pending_change_callback(self._on_conn_change)\n self.connections.append(conn)\n self._set_on_close_cb(conn)\n self._on_conn_change()", "response": "Add an outgoing connection into the heap."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef register_incoming_conn(self, conn):\n assert conn, \"conn is required\"\n conn.set_outbound_pending_change_callback(self._on_conn_change)\n self.connections.appendleft(conn)\n self._set_on_close_cb(conn)\n self._on_conn_change()", "response": "Add incoming connection into the heap."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef outgoing_connections(self):\n\n # Outgoing connections 
are on the right\n return list(\n dropwhile(lambda c: c.direction != OUTGOING, self.connections)\n )", "response": "Returns a list of all outgoing connections for this peer."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreturn a list of all incoming connections for this peer.", "response": "def incoming_connections(self):\n \"\"\"Returns a list of all incoming connections for this peer.\"\"\"\n\n # Incoming connections are on the left.\n return list(\n takewhile(lambda c: c.direction == INCOMING, self.connections)\n )"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nfinds a peer and connect to it.", "response": "def _get_peer_connection(self, blacklist=None):\n \"\"\"Find a peer and connect to it.\n\n Returns a ``(peer, connection)`` tuple.\n\n Raises ``NoAvailablePeerError`` if no healthy peers are found.\n\n :param blacklist:\n If given, a set of hostports for peers that we must not try.\n \"\"\"\n\n blacklist = blacklist or set()\n\n peer = None\n connection = None\n\n while connection is None:\n peer = self._choose(blacklist)\n\n if not peer:\n raise NoAvailablePeerError(\n \"Can't find an available peer for '%s'\" % self.service\n )\n\n try:\n connection = yield peer.connect()\n except NetworkError as e:\n log.info(\n 'Failed to connect to %s. Trying a different host.',\n peer.hostport,\n exc_info=e,\n )\n connection = None\n blacklist.add(peer.hostport)\n\n raise gen.Return((peer, connection))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nsend a message to the peer.", "response": "def send(\n self, arg1, arg2, arg3,\n headers=None,\n retry_limit=None,\n ttl=None,\n ):\n \"\"\"Make a request to the Peer.\n\n :param arg1:\n String or Stream containing the contents of arg1. If None, an empty\n stream is used.\n :param arg2:\n String or Stream containing the contents of arg2. 
If None, an empty\n stream is used.\n :param arg3:\n String or Stream containing the contents of arg3. If None, an empty\n stream is used.\n :param headers:\n Headers will be put in the message as protocol header.\n :param retry_limit:\n Maximum number of retries will perform on the message. If the number\n is 0, it means no retry.\n :param ttl:\n Timeout for each request (second).\n :return:\n Future that contains the response from the peer.\n \"\"\"\n\n # find a peer connection\n # If we can't find available peer at the first time, we throw\n # NoAvailablePeerError. Later during retry, if we can't find available\n # peer, we throw exceptions from retry not NoAvailablePeerError.\n peer, connection = yield self._get_peer_connection()\n\n arg1, arg2, arg3 = (\n maybe_stream(arg1), maybe_stream(arg2), maybe_stream(arg3)\n )\n\n if retry_limit is None:\n retry_limit = DEFAULT_RETRY_LIMIT\n\n ttl = ttl or DEFAULT_TIMEOUT\n # hack to get endpoint from arg_1 for trace name\n arg1.close()\n endpoint = yield read_full(arg1)\n\n # set default transport headers\n headers = headers or {}\n for k, v in self.headers.iteritems():\n headers.setdefault(k, v)\n\n if self.tracing_span is None:\n tracer = ClientTracer(channel=self.tchannel)\n self.tracing_span, _ = tracer.start_span(\n service=self.service, endpoint=endpoint,\n hostport=self._hostport, encoding=self.headers.get('as')\n )\n\n request = Request(\n service=self.service,\n argstreams=[InMemStream(endpoint), arg2, arg3],\n id=connection.writer.next_message_id(),\n headers=headers,\n endpoint=endpoint,\n ttl=ttl,\n tracing=tracing.span_to_tracing_field(self.tracing_span)\n )\n\n # only retry on non-stream request\n if request.is_streaming_request or self._hostport:\n retry_limit = 0\n\n if request.is_streaming_request:\n request.ttl = 0\n\n try:\n with self.tracing_span: # to ensure span is finished\n response = yield self.send_with_retry(\n request, peer, retry_limit, connection\n )\n except Exception as e:\n # event: 
on_exception\n exc_info = sys.exc_info()\n yield self.tchannel.event_emitter.fire(\n EventType.on_exception, request, e,\n )\n six.reraise(*exc_info)\n\n log.debug(\"Got response %s\", response)\n\n raise gen.Return(response)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef clear(self):\n try:\n for peer in self._peers.values():\n peer.close()\n finally:\n self._peers = {}\n self._resetting = False", "response": "Reset this PeerGroup.\n\n This closes all connections to all known peers and forgets about\n these peers.\n\n :returns:\n A Future that resolves with a value of None when the operation\n has finished"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef remove(self, hostport):\n assert hostport, \"hostport is required\"\n peer = self._peers.pop(hostport, None)\n peer_in_heap = peer and peer.index != -1\n if peer_in_heap:\n self.peer_heap.remove_peer(peer)\n return peer", "response": "Removes the Peer for the given host port. Does nothing if a matching Peer does not exist. Returns None if no matching Peer is found."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ngets a Peer for the given destination.", "response": "def get(self, hostport):\n \"\"\"Get a Peer for the given destination.\n\n A new Peer is added to the peer heap and returned if one does\n not already exist for the given host-port. 
Otherwise, the\n existing Peer is returned.\n \"\"\"\n assert hostport, \"hostport is required\"\n assert isinstance(hostport, basestring), \"hostport must be a string\"\n\n if hostport not in self._peers:\n self._add(hostport)\n\n return self._peers[hostport]"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _add(self, hostport):\n peer = self.peer_class(\n tchannel=self.tchannel,\n hostport=hostport,\n on_conn_change=self._update_heap,\n )\n peer.rank = self.rank_calculator.get_rank(peer)\n self._peers[peer.hostport] = peer\n\n self.peer_heap.add_and_shuffle(peer)", "response": "Adds a new peer to the peer heap"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nget a Peer for the given destination for a request.", "response": "def _get_isolated(self, hostport):\n \"\"\"Get a Peer for the given destination for a request.\n\n A new Peer is added and returned if one does not already exist for the\n given host-port. Otherwise, the existing Peer is returned.\n\n **NOTE** new peers will not be added to the peer heap.\n \"\"\"\n assert hostport, \"hostport is required\"\n if hostport not in self._peers:\n # Add a peer directly from a hostport, do NOT add it to the peer\n # heap\n peer = self.peer_class(\n tchannel=self.tchannel,\n hostport=hostport,\n )\n self._peers[peer.hostport] = peer\n\n return self._peers[hostport]"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ninitiates a new request through this PeerGroup.", "response": "def request(self, service, hostport=None, **kwargs):\n \"\"\"Initiate a new request through this PeerGroup.\n\n :param hostport:\n If specified, requests will be sent to the specific host.\n Otherwise, a known peer will be picked at random.\n :param service:\n Name of the service being called. 
Defaults to an empty string.\n \"\"\"\n return PeerClientOperation(\n peer_group=self,\n service=service,\n hostport=hostport,\n **kwargs)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn a Peer that matches the given criteria.", "response": "def choose(self, hostport=None, blacklist=None):\n \"\"\"Choose a Peer that matches the given criteria.\n\n :param hostport:\n Specifies that the returned Peer must be for the given host-port.\n Without this, all peers managed by this PeerGroup are\n candidates.\n :param blacklist:\n Peers on the blacklist won't be chosen.\n :returns:\n A Peer that matches all the requested criteria or None if no such\n Peer was found.\n \"\"\"\n\n blacklist = blacklist or set()\n if hostport:\n return self._get_isolated(hostport)\n\n return self.peer_heap.smallest_peer(\n (lambda p: p.hostport not in blacklist and not p.is_ephemeral),\n )"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_arg(context, index):\n if index < len(context.argstreams):\n arg = \"\"\n chunk = yield context.argstreams[index].read()\n while chunk:\n arg += chunk\n chunk = yield context.argstreams[index].read()\n\n raise tornado.gen.Return(arg)\n else:\n raise TChannelError()", "response": "get value from arg stream in async way"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nputting an item into the queue. 
Returns a Future that resolves to None when the value has been successfully put into the queue.", "response": "def put(self, value):\n \"\"\"Puts an item into the queue.\n\n Returns a Future that resolves to None once the value has been\n accepted by the queue.\n \"\"\"\n io_loop = IOLoop.current()\n new_hole = Future()\n\n new_put = Future()\n new_put.set_result(new_hole)\n\n with self._lock:\n self._put, put = new_put, self._put\n\n answer = Future()\n\n def _on_put(future):\n if future.exception(): # pragma: no cover (never happens)\n return answer.set_exc_info(future.exc_info())\n\n old_hole = put.result()\n old_hole.set_result(Node(value, new_hole))\n answer.set_result(None)\n\n io_loop.add_future(put, _on_put)\n return answer"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn a value from the queue without waiting. Raises QueueEmpty if no values are available right now.", "response": "def get_nowait(self):\n \"\"\"Returns a value from the queue without waiting.\n\n Raises ``QueueEmpty`` if no values are available right now.\n \"\"\"\n new_get = Future()\n\n with self._lock:\n if not self._get.done():\n raise QueueEmpty\n get, self._get = self._get, new_get\n\n hole = get.result()\n if not hole.done():\n # Restore the unfinished hole.\n new_get.set_result(hole)\n raise QueueEmpty\n\n node = hole.result()\n value = node.value\n new_hole, node.next = node.next, None\n new_get.set_result(new_hole)\n return value"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get(self):\n io_loop = IOLoop.current()\n new_get = Future()\n\n with self._lock:\n get, self._get = self._get, new_get\n\n answer = Future()\n\n def _on_node(future):\n if future.exception(): # pragma: no cover (never happens)\n return answer.set_exc_info(future.exc_info())\n\n node = future.result()\n value = node.value\n new_hole, node.next = node.next, None\n new_get.set_result(new_hole)\n answer.set_result(value)\n\n def 
_on_get(future):\n if future.exception(): # pragma: no cover (never happens)\n return answer.set_exc_info(future.exc_info())\n\n hole = future.result()\n io_loop.add_future(hole, _on_node)\n\n io_loop.add_future(get, _on_get)\n return answer", "response": "Gets the next item from the queue."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nstream Message got fragmented based on payload size. All the data within space_left will be kept. All the rest will be shifted to next fragment message. :param space_left: space left for current frame :param fragment_msg: the type is either CallRequestMessage or CallResponseMessage :return: None if there is space left or next fragment message", "response": "def fragment(self, space_left, fragment_msg):\n \"\"\"Streaming Message got fragmented based on\n payload size. All the data within space_left\n will be kept. All the rest will be shifted to\n next fragment message.\n\n :param space_left:\n space left for current frame\n :param fragment_msg:\n the type is either CallRequestMessage or\n CallResponseMessage\n :return: None if there is space left\n or next fragment message\n \"\"\"\n new_args = []\n key_length = 2 # 2bytes for size\n for i, arg in enumerate(self.args):\n if space_left >= key_length:\n space_left -= key_length\n\n if arg is not None:\n arg_length = len(arg)\n if space_left < arg_length:\n fragment_msg.args.append(arg[space_left:])\n new_args.append(arg[:space_left])\n space_left = 0\n else:\n new_args.append(arg)\n space_left -= arg_length\n if space_left <= key_length:\n # boundary for arg\n fragment_msg.args.append(\"\")\n else:\n new_args.append(\"\")\n else:\n for l in range(i, len(self.args)):\n fragment_msg.args.append(self.args[l])\n break\n\n self.args = new_args\n if space_left >= 0 and len(fragment_msg.args) == 0:\n # don't need to fragment any more\n return None\n else:\n self.flags = FlagsType.fragment\n fragment_msg.id = self.id\n return fragment_msg"} {"SOURCE": 
"codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef response_from_mixed(mixed):\n\n # if none then give empty Response\n if mixed is None:\n return Response()\n\n # if not Response, then treat like body\n if not isinstance(mixed, Response):\n return Response(mixed)\n\n # it's already a Response\n return mixed", "response": "Create Response from mixed input."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nregistering a hook to be called when an event is fired.", "response": "def register_hook(self, hook, event_type=None):\n \"\"\"\n If ``event_type`` is provided, then ``hook`` will be called whenever\n that event is fired.\n\n If no ``event_type`` is specified, but ``hook`` implements any methods\n with names matching an event hook, then those will be registered with\n their corresponding events. This allows for more stateful, class-based\n event handlers.\n \"\"\"\n if event_type is not None:\n assert type(event_type) is int, \"register hooks with int values\"\n return self.hooks[event_type].append(hook)\n\n for event_type in EventType._fields:\n func = getattr(hook, event_type, None)\n if callable(func):\n event_value = getattr(EventType, event_type)\n self.register_hook(func, event_value)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ninitializing existing object into the heap.", "response": "def init(h):\n \"\"\"Initialize existing object into the heap.\"\"\"\n # heapify\n n = h.size()\n for i in six.moves.range(int(math.floor(n/2)) - 1, -1, -1):\n down(h, i, n)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\npushes a new value into heap.", "response": "def push(h, x):\n \"\"\"Push a new value into heap.\"\"\"\n h.push(x)\n up(h, h.size()-1)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\npop the value from the heap.", "response": "def pop(h):\n \"\"\"Pop the heap value from the 
heap.\"\"\"\n n = h.size() - 1\n h.swap(0, n)\n down(h, 0, n)\n return h.pop()"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nremoves the item at position i from the heap.", "response": "def remove(h, i):\n \"\"\"Remove the item at position i of the heap.\"\"\"\n n = h.size() - 1\n if n != i:\n h.swap(i, n)\n down(h, i, n)\n up(h, i)\n\n return h.pop()"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef fix(h, i):\n down(h, i, h.size())\n up(h, i)", "response": "Rearrange the heap after the item at position i got updated."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nfinds the index of the smallest item in the given heap that matches the given predicate.", "response": "def smallest(heap, predicate):\n \"\"\"Finds the index of the smallest item in the heap that matches the given\n predicate.\n\n :param heap:\n Heap on which this search is being performed.\n :param predicate:\n Function that accepts an item from the heap and returns true or false.\n :returns:\n Index of the first item for which ``predicate`` returned true.\n :raises NoMatchError:\n If no matching items were found.\n \"\"\"\n n = heap.size()\n\n # items contains indexes of items yet to be checked.\n items = deque([0])\n while items:\n current = items.popleft()\n if current >= n:\n continue\n\n if predicate(heap.peek(current)):\n return current\n\n child1 = 2 * current + 1\n child2 = child1 + 1\n\n if child1 < n and child2 < n and heap.lt(child2, child1):\n # make sure we check the smaller child first.\n child1, child2 = child2, child1\n\n if child1 < n:\n items.append(child1)\n\n if child2 < n:\n items.append(child2)\n\n raise NoMatchError()"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ninject the span into the Trace field", "response": "def span_to_tracing_field(span):\n \"\"\"\n Inject the span into Trace field, if 
Zipkin format is supported\n :param span: OpenTracing Span\n \"\"\"\n if span is None:\n return common.random_tracing()\n # noinspection PyBroadException\n try:\n carrier = {}\n span.tracer.inject(span, ZIPKIN_SPAN_FORMAT, carrier)\n tracing = Tracing(span_id=carrier['span_id'],\n trace_id=carrier['trace_id'],\n parent_id=carrier['parent_id'] or 0L,\n traceflags=carrier['traceflags'])\n return tracing\n except opentracing.UnsupportedFormatException:\n pass # tracer might not support Zipkin format\n except:\n log.exception('Failed to inject tracing span into headers')\n return common.random_tracing()"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef apply_trace_flag(span, trace, default_trace):\n if trace is None:\n trace = default_trace\n trace = trace() if callable(trace) else trace\n if trace is False and span:\n span.set_tag(tags.SAMPLING_PRIORITY, 0)", "response": "Apply the tracing flag to a span."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef start_basic_span(self, request):\n # noinspection PyBroadException\n try:\n # Currently Java does not populate Tracing field, so do not\n # mistake it for a real trace ID.\n if request.tracing.trace_id:\n context = self.tracer.extract(\n format=ZIPKIN_SPAN_FORMAT,\n carrier=request.tracing)\n self.span = self.tracer.start_span(\n operation_name=request.endpoint,\n child_of=context,\n tags={tags.SPAN_KIND: tags.SPAN_KIND_RPC_SERVER},\n )\n except opentracing.UnsupportedFormatException:\n pass # tracer might not support Zipkin format\n except:\n log.exception('Cannot extract tracing span from Trace field')", "response": "Start tracing span from the protocol's tracing fields."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef start_span(self, request, headers, peer_host, peer_port):\n parent_context = None\n # noinspection PyBroadException\n try:\n if headers and 
hasattr(headers, 'iteritems'):\n tracing_headers = {\n k[len(TRACING_KEY_PREFIX):]: v\n for k, v in headers.iteritems()\n if k.startswith(TRACING_KEY_PREFIX)\n }\n parent_context = self.tracer.extract(\n format=opentracing.Format.TEXT_MAP,\n carrier=tracing_headers\n )\n if self.span and parent_context:\n # we already started a span from Tracing fields,\n # so only copy baggage from the headers.\n for k, v in parent_context.baggage.iteritems():\n self.span.set_baggage_item(k, v)\n except:\n log.exception('Cannot extract tracing span from headers')\n if self.span is None:\n self.span = self.tracer.start_span(\n operation_name=request.endpoint,\n child_of=parent_context,\n tags={tags.SPAN_KIND: tags.SPAN_KIND_RPC_SERVER},\n )\n if 'cn' in request.headers:\n self.span.set_tag(tags.PEER_SERVICE, request.headers['cn'])\n if peer_host:\n self.span.set_tag(tags.PEER_HOST_IPV4, peer_host)\n if peer_port:\n self.span.set_tag(tags.PEER_PORT, peer_port)\n if 'as' in request.headers:\n self.span.set_tag('as', request.headers['as'])\n return self.span", "response": "Start a new server - side span."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nconstructs a TChannelError instance from an error code.", "response": "def from_code(cls, code, **kw):\n \"\"\"Construct a ``TChannelError`` instance from an error code.\n\n This will return the appropriate class type for the given code.\n \"\"\"\n return {\n TIMEOUT: TimeoutError,\n CANCELED: CanceledError,\n BUSY: BusyError,\n DECLINED: DeclinedError,\n UNEXPECTED_ERROR: UnexpectedError,\n BAD_REQUEST: BadRequestError,\n NETWORK_ERROR: NetworkError,\n UNHEALTHY: UnhealthyError,\n FATAL: FatalProtocolError,\n }[code](**kw)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef register(dispatcher, service_module, handler, method=None, service=None):\n if not service:\n service = service_module.__name__.rsplit('.', 1)[-1]\n if 
not method:\n method = handler.__name__\n assert service, 'A service name could not be determined'\n assert method, 'A method name could not be determined'\n\n assert hasattr(service_module.Iface, method), (\n \"Service %s doesn't define method %s\" % (service, method)\n )\n assert hasattr(service_module, method + '_result'), (\n \"oneway methods are not yet supported\"\n )\n\n endpoint = '%s::%s' % (service, method)\n args_type = getattr(service_module, method + '_args')\n result_type = getattr(service_module, method + '_result')\n\n # if the dispatcher is set to deal with handlers that\n # return responses, then use new api, else use deprecated\n if dispatcher._handler_returns_response:\n new_handler = build_handler(result_type, handler)\n else:\n new_handler = deprecated_build_handler(result_type, handler)\n\n dispatcher.register(\n endpoint,\n new_handler,\n ThriftSerializer(args_type),\n ThriftSerializer(result_type)\n )\n return handler", "response": "Registers a Thrift service method with the given RequestDispatcher."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nsends back the result of this call. 
Only one of this and write_exc_info may be called.", "response": "def write_result(self, result):\n \"\"\"Send back the result of this call.\n\n Only one of this and `write_exc_info` may be called.\n\n :param result:\n Return value of the call\n \"\"\"\n assert not self.finished, \"Already sent a response\"\n\n if not self.result.thrift_spec:\n self.finished = True\n return\n\n spec = self.result.thrift_spec[0]\n if result is not None:\n assert spec, \"Tried to return a result for a void method.\"\n setattr(self.result, spec[2], result)\n\n self.finished = True"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nwriting exception information to the response.", "response": "def write_exc_info(self, exc_info=None):\n \"\"\"Write exception information to the response.\n\n Only one of this and ``write_result`` may be called.\n\n :param exc_info:\n 3-tuple of exception information. If omitted, the last exception\n will be retrieved using ``sys.exc_info()``.\n \"\"\"\n exc_info = exc_info or sys.exc_info()\n exc = exc_info[1]\n self.code = StatusCode.error\n for spec in self.result.thrift_spec[1:]:\n if spec and isinstance(exc, spec[3][0]):\n assert not self.finished, \"Already sent a response\"\n\n setattr(self.result, spec[2], exc)\n self.finished = True\n return\n\n # Re-raise the exception (with the same traceback) if it didn't match.\n raise exc_info[0], exc_info[1], exc_info[2]"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef call(\n self,\n scheme,\n service,\n arg1,\n arg2=None,\n arg3=None,\n timeout=None,\n retry_on=None,\n retry_limit=None,\n routing_delegate=None,\n hostport=None,\n shard_key=None,\n tracing_span=None,\n trace=None, # to trace or not, defaults to self._dep_tchannel.trace\n caller_name=None,\n ):\n \"\"\"Make low-level requests to TChannel services.\n\n **Note:** Usually you would interact with a higher-level arg scheme\n like 
:py:class:`tchannel.schemes.JsonArgScheme` or\n :py:class:`tchannel.schemes.ThriftArgScheme`.\n \"\"\"\n\n # TODO - don't use asserts for public API\n assert format, \"format is required\"\n assert service, \"service is required\"\n assert arg1, \"arg1 is required\"\n\n # default args\n if arg2 is None:\n arg2 = \"\"\n if arg3 is None:\n arg3 = \"\"\n if timeout is None:\n timeout = DEFAULT_TIMEOUT\n if retry_on is None:\n retry_on = retry.DEFAULT\n if retry_limit is None:\n retry_limit = retry.DEFAULT_RETRY_LIMIT\n\n # TODO - allow filters/steps for serialization, tracing, etc...\n\n tracing.apply_trace_flag(tracing_span, trace, self._dep_tchannel.trace)\n\n # calls tchannel.tornado.peer.PeerClientOperation.__init__\n operation = self._dep_tchannel.request(\n service=service,\n hostport=hostport,\n arg_scheme=scheme,\n retry=retry_on,\n tracing_span=tracing_span\n )\n\n # fire operation\n transport_headers = {\n transport.SCHEME: scheme,\n transport.CALLER_NAME: caller_name or self.name,\n }\n if shard_key:\n transport_headers[transport.SHARD_KEY] = shard_key\n if routing_delegate:\n transport_headers[transport.ROUTING_DELEGATE] = routing_delegate\n\n response = yield operation.send(\n arg1=arg1,\n arg2=arg2,\n arg3=arg3,\n headers=transport_headers,\n retry_limit=retry_limit,\n ttl=timeout,\n )\n\n # unwrap response\n body = yield response.get_body()\n headers = yield response.get_header()\n t = TransportHeaders.from_dict(response.headers)\n result = Response(\n body=body,\n headers=headers,\n transport=t,\n status=response.code,\n )\n\n raise gen.Return(result)", "response": "Make low - level requests to TChannel services."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef advertise(self, routers=None, name=None, timeout=None,\n router_file=None, jitter=None):\n \"\"\"Advertise with Hyperbahn.\n\n After a successful advertisement, Hyperbahn will establish long-lived\n connections with your application. 
These connections are used to load\n balance inbound and outbound requests to other applications on the\n Hyperbahn network.\n\n Re-advertisement happens periodically after calling this method (every\n minute). Hyperbahn will eject us from the network if it doesn't get a\n re-advertise from us after 5 minutes.\n\n This function may be called multiple times if it fails. If it\n succeeds, all consecutive calls are ignored.\n\n :param list routers:\n A seed list of known Hyperbahn addresses to attempt contact with.\n Entries should be of the form ``\"host:port\"``.\n\n :param string name:\n The name your application identifies itself as. This is usually\n unneeded because in the common case it will match the ``name`` you\n initialized the ``TChannel`` instance with. This is the identifier\n other services will use to make contact with you.\n\n :param timeout:\n The timeout (in sec) for the initial advertise attempt.\n Defaults to 30 seconds.\n\n :param jitter:\n Variance allowed in the interval per request. Defaults to 5\n seconds. The jitter applies to the initial advertise request as\n well.\n\n :param router_file:\n The host file that contains the routers information. The file\n should contain a JSON stringified format of the routers parameter.\n Either routers or router_file should be provided. 
If both provided,\n a ValueError will be raised.\n\n :returns:\n A future that resolves to the remote server's response after the\n first advertise finishes.\n\n :raises TimeoutError:\n When unable to make our first advertise request to Hyperbahn.\n Subsequent requests may fail but will be ignored.\n \"\"\"\n if routers is not None and router_file is not None:\n raise ValueError(\n 'Only one of routers and router_file can be provided.')\n\n if routers is None and router_file is not None:\n # should just let the exceptions fly\n try:\n with open(router_file, 'r') as json_data:\n routers = json.load(json_data)\n except (IOError, OSError, ValueError):\n log.exception('Failed to read seed routers list.')\n raise\n\n @gen.coroutine\n def _advertise():\n result = yield self._dep_tchannel.advertise(\n routers=routers,\n name=name,\n timeout=timeout,\n )\n body = yield result.get_body()\n headers = yield result.get_header()\n response = Response(json.loads(body), headers or {})\n raise gen.Return(response)\n\n def _on_advertise(future):\n if not future.exception():\n return\n\n # If the request failed, clear the response so that we can try\n # again.\n with self._advertise_lock:\n # `is` comparison to ensure we're not deleting another Future.\n if self._advertise_response is future:\n self._advertise_response = None\n\n with self._advertise_lock:\n if self._advertise_response is not None:\n return self._advertise_response\n future = self._advertise_response = _advertise()\n\n # We call add_done_callback here rather than when we call _advertise()\n # because if the future has already resolved by the time we call\n # add_done_callback, the callback will immediately be executed. 
The\n # callback will try to acquire the advertise_lock which we already\n # hold and end up in a deadlock.\n future.add_done_callback(_on_advertise)\n return future", "response": "Advertise a new TChannel instance to Hyperbahn."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef thrift_request_builder(service, thrift_module, hostport=None,\n thrift_class_name=None):\n \"\"\"Provide TChannel compatibility with Thrift-generated modules.\n\n The service this creates is meant to be used with TChannel like so:\n\n .. code-block:: python\n\n from tchannel import TChannel, thrift_request_builder\n from some_other_service_thrift import some_other_service\n\n tchannel = TChannel('my-service')\n\n some_service = thrift_request_builder(\n service='some-other-service',\n thrift_module=some_other_service\n )\n\n resp = tchannel.thrift(\n some_service.fetchPotatoes()\n )\n\n .. deprecated:: 0.18.0\n\n Please switch to :py:func:`tchannel.thrift.load`.\n\n .. warning::\n\n This API is deprecated and will be removed in a future version.\n\n :param string service:\n Name of Thrift service to call. 
This is used internally for\n grouping and stats, but also to route requests over Hyperbahn.\n\n :param thrift_module:\n The top-level module of the Apache Thrift generated code for\n the service that will be called.\n\n :param string hostport:\n When calling the Thrift service directly, and not over Hyperbahn,\n this 'host:port' value should be provided.\n\n :param string thrift_class_name:\n When the Apache Thrift generated Iface class name does not match\n thrift_module, then this should be provided.\n \"\"\"\n\n # start with a request maker instance\n maker = ThriftRequestMaker(\n service=service,\n thrift_module=thrift_module,\n hostport=hostport,\n thrift_class_name=thrift_class_name\n )\n\n # create methods that mirror thrift client\n # and each return ThriftRequest\n methods = _create_methods(thrift_module)\n\n # then attach to instance\n for name, method in methods.iteritems():\n method = types.MethodType(method, maker, ThriftRequestMaker)\n setattr(maker, name, method)\n\n return maker", "response": "This function creates a Thrift - compatible client for a given service."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef read_body(self, body):\n result_spec = self.result_type.thrift_spec\n\n # raise application exception, if present\n for exc_spec in result_spec[1:]:\n exc = getattr(body, exc_spec[2])\n if exc is not None:\n raise exc\n\n # success - non-void\n if len(result_spec) >= 1 and result_spec[0] is not None:\n\n # value expected, but got none\n # TODO - server side should use this same logic\n if body.success is None:\n raise ValueExpectedError(\n 'Expected a value to be returned for %s, '\n 'but received None - only void procedures can '\n 'return None.' % self.endpoint\n )\n\n return body.success\n\n # success - void\n else:\n return None", "response": "Handles the response body for this request. 
Returns the result unwrapped\n from the response union."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get_rank(self, peer):\n if not peer.connections:\n return self.TIERS[0]\n\n if not peer.has_incoming_connections:\n return self.TIERS[1] + peer.total_outbound_pendings\n\n return self.TIERS[2] + peer.total_outbound_pendings", "response": "Calculate the rank of the peer based on the connections."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef prepare(cls, *args, **kwargs):\n cls.args = args\n cls.kwargs = kwargs\n cls.prepared = True", "response": "Set the arguments to be used when instantiating a TChannel instance."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nundoing call to prepare useful for testing.", "response": "def reset(cls, *args, **kwargs):\n \"\"\"Undo call to prepare, useful for testing.\"\"\"\n cls.local.tchannel = None\n cls.args = None\n cls.kwargs = None\n cls.prepared = False"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get_instance(cls):\n if not cls.prepared:\n raise SingletonNotPreparedError(\n \"prepare must be called before get_instance\"\n )\n\n if hasattr(cls.local, 'tchannel') and cls.local.tchannel is not None:\n return cls.local.tchannel\n\n cls.local.tchannel = cls.tchannel_cls(*cls.args, **cls.kwargs)\n\n return cls.local.tchannel", "response": "Get a configured thread - safe singleton TChannel instance."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nhandling incoming request message including CallRequestMessage and CallRequestContinueMessage.", "response": "def handle_pre_call(self, message, connection):\n \"\"\"Handle incoming request message including CallRequestMessage and\n CallRequestContinueMessage\n\n This method will build the User friendly request object based on the\n incoming 
messages.\n\n It passes all the messages into the message_factory to build the init\n request object. Only when it get a CallRequestMessage and a completed\n arg_1=argstream[0], the message_factory will return a request object.\n Then it will trigger the async handle_call method.\n\n :param message: CallRequestMessage or CallRequestContinueMessage\n :param connection: tornado connection\n \"\"\"\n req = None\n try:\n req = connection.request_message_factory.build(message)\n # message_factory will create Request only when it receives\n # CallRequestMessage. It will return None, if it receives\n # CallRequestContinueMessage.\n if req:\n self.handle_call(req, connection)\n\n except TChannelError as e:\n log.warn('Received a bad request.', exc_info=True)\n if req:\n e.tracing = req.tracing\n connection.send_error(e)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef register(\n self,\n rule,\n handler,\n req_serializer=None,\n resp_serializer=None\n ):\n \"\"\"Register a new endpoint with the given name.\n\n .. code-block:: python\n\n @dispatcher.register('is_healthy')\n def check_health(request, response):\n # ...\n\n :param rule:\n Name of the endpoint. Incoming Call Requests must have this as\n ``arg1`` to dispatch to this handler.\n\n If ``RequestHandler.FALLBACK`` is specified as a rule, the given\n handler will be used as the 'fallback' handler when requests don't\n match any registered rules.\n\n :param handler:\n A function that gets called with ``Request`` and ``Response``.\n\n :param req_serializer:\n Arg scheme serializer of this endpoint. It should be\n ``RawSerializer``, ``JsonSerializer``, and ``ThriftSerializer``.\n\n :param resp_serializer:\n Arg scheme serializer of this endpoint. 
It should be\n ``RawSerializer``, ``JsonSerializer``, and ``ThriftSerializer``.\n \"\"\"\n\n assert handler, \"handler must not be None\"\n req_serializer = req_serializer or RawSerializer()\n resp_serializer = resp_serializer or RawSerializer()\n self.handlers[rule] = Handler(handler, req_serializer, resp_serializer)", "response": "Register a new endpoint with the given name."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncreate a new Tracing with random IDs.", "response": "def random_tracing():\n \"\"\"\n Create new Tracing() tuple with random IDs.\n \"\"\"\n new_id = _uniq_id()\n return Tracing(\n span_id=new_id,\n parent_id=0,\n trace_id=new_id,\n traceflags=0)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ngenerates checksum for messages with the given message.", "response": "def generate_checksum(message, previous_csum=0):\n \"\"\"Generate checksum for messages with\n CALL_REQ, CALL_REQ_CONTINUE,\n CALL_RES,CALL_RES_CONTINUE types.\n\n :param message: outgoing message\n :param previous_csum: accumulated checksum value\n \"\"\"\n if message.message_type in CHECKSUM_MSG_TYPES:\n csum = compute_checksum(\n message.checksum[0],\n message.args,\n previous_csum,\n )\n\n message.checksum = (message.checksum[0], csum)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nverify checksum for incoming message.", "response": "def verify_checksum(message, previous_csum=0):\n \"\"\"Verify checksum for incoming message.\n\n :param message: incoming message\n :param previous_csum: accumulated checksum value\n\n :return return True if message checksum type is None\n or checksum is correct\n \"\"\"\n if message.message_type in CHECKSUM_MSG_TYPES:\n csum = compute_checksum(\n message.checksum[0],\n message.args,\n previous_csum,\n )\n\n if csum == message.checksum[1]:\n return True\n else:\n return False\n else:\n return True"} {"SOURCE": 
"codesearchnet", "instruction": "Write a Python 3 function for\nadvertising with Hyperbahn. See :py:class:`tchannel.TChannel.advertise`.", "response": "def advertise(tchannel, service, routers=None, timeout=None, router_file=None,\n jitter=None):\n \"\"\"Advertise with Hyperbahn.\n\n See :py:class:`tchannel.TChannel.advertise`.\n \"\"\"\n timeout = timeout or FIRST_ADVERTISE_TIME\n if routers is not None and router_file is not None:\n raise ValueError(\n 'Only one of routers and router_file can be provided.')\n\n if routers is None and router_file is not None:\n # should just let the exceptions fly\n try:\n with open(router_file, 'r') as json_data:\n routers = json.load(json_data)\n except (IOError, OSError, ValueError):\n log.exception('Failed to read seed routers list.')\n raise\n\n for router in routers:\n # We use .get here instead of .add because we don't want to fail if a\n # TChannel already knows about some of the routers.\n tchannel.peers.get(router)\n\n adv = Advertiser(service, tchannel, ttl_secs=timeout,\n interval_max_jitter_secs=jitter)\n return adv.start()"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nstarting the advertise loop.", "response": "def start(self):\n \"\"\"Starts the advertise loop.\n\n Returns the result of the first ad request.\n \"\"\"\n if self.running:\n raise Exception('Advertiser is already running')\n if self.io_loop is None:\n self.io_loop = tornado.ioloop.IOLoop.current()\n\n self.running = True\n answer = tornado.gen.Future()\n self._schedule_ad(0, answer)\n return answer"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _schedule_ad(self, delay=None, response_future=None):\n if not self.running:\n return\n\n if delay is None:\n delay = self.interval_secs\n\n delay += random.uniform(0, self.interval_max_jitter_secs)\n self._next_ad = self.io_loop.call_later(delay, self._ad,\n response_future)", "response": "Schedules an ad request."} {"SOURCE": 
"codesearchnet", "instruction": "Can you create a Python 3 function that\nadds a new request to the Cemetery that is known to have timed out.", "response": "def add(self, id, ttl_secs):\n \"\"\"Adds a new request to the Cemetery that is known to have timed out.\n\n The request will be forgotten after ``ttl_secs + ttl_offset_secs``\n seconds.\n\n :param id:\n ID of the request\n :param ttl_secs:\n TTL of the request (in seconds)\n \"\"\"\n ttl_secs = min(ttl_secs + self.ttl_offset_secs, self.max_ttl_secs)\n self._tombstones[id] = IOLoop.current().call_later(\n ttl_secs, self.forget, id,\n )"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef clear(self):\n io_loop = IOLoop.current()\n while self._tombstones:\n _, req_timeout = self._tombstones.popitem()\n io_loop.remove_timeout(req_timeout)", "response": "Forget about all requests."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nget the header value from the response.", "response": "def get_header(self):\n \"\"\"Get the header value from the response.\n\n :return: a future contains the deserialized value of header\n \"\"\"\n raw_header = yield get_arg(self, 1)\n if not self.serializer:\n raise tornado.gen.Return(raw_header)\n else:\n header = self.serializer.deserialize_header(raw_header)\n raise tornado.gen.Return(header)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ngets the body value from the response.", "response": "def get_body(self):\n \"\"\"Get the body value from the response.\n\n :return: a future contains the deserialized value of body\n \"\"\"\n\n raw_body = yield get_arg(self, 2)\n if not self.serializer:\n raise tornado.gen.Return(raw_body)\n else:\n body = self.serializer.deserialize_body(raw_body)\n raise tornado.gen.Return(body)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nsets the custom body stream for the 
current message.", "response": "def set_body_s(self, stream):\n \"\"\"Set customized body stream.\n\n Note: the body stream can only be changed before the stream\n is consumed.\n\n :param stream: InMemStream/PipeStream for body\n\n :except TChannelError:\n Raise TChannelError if the stream is being sent when you try\n to change the stream.\n \"\"\"\n if self.argstreams[2].state == StreamState.init:\n self.argstreams[2] = stream\n else:\n raise TChannelError(\n \"Unable to change the body since the streaming has started\")"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef set_header_s(self, stream):\n\n if self.argstreams[1].state == StreamState.init:\n self.argstreams[1] = stream\n else:\n raise TChannelError(\n \"Unable to change the header since the streaming has started\")", "response": "Set the custom header stream for the current entry."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nwrites the header to the header stream.", "response": "def write_header(self, chunk):\n \"\"\"Write to header.\n\n Note: the header stream is only available to write before write body.\n\n :param chunk: content to write to header\n\n :except TChannelError:\n Raise TChannelError if the response's flush() has been called\n \"\"\"\n\n if self.serializer:\n header = self.serializer.serialize_header(chunk)\n else:\n header = chunk\n\n if self.flushed:\n raise TChannelError(\"write operation invalid after flush call\")\n\n if (self.argstreams[0].state != StreamState.completed and\n self.argstreams[0].auto_close):\n self.argstreams[0].close()\n\n return self.argstreams[1].write(header)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef write_body(self, chunk):\n\n if self.serializer:\n body = self.serializer.serialize_body(chunk)\n else:\n body = chunk\n\n if self.flushed:\n raise TChannelError(\"write 
operation invalid after flush call\")\n\n if (self.argstreams[0].state != StreamState.completed and\n self.argstreams[0].auto_close):\n self.argstreams[0].close()\n if (self.argstreams[1].state != StreamState.completed and\n self.argstreams[1].auto_close):\n self.argstreams[1].close()\n\n return self.argstreams[2].write(body)", "response": "Write the content of the message to the response."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef read_message(stream):\n answer = tornado.gen.Future()\n io_loop = IOLoop.current()\n\n def on_error(future):\n log.info('Failed to read data: %s', future.exception())\n return answer.set_exc_info(future.exc_info())\n\n @fail_to(answer)\n def on_body(size, future):\n if future.exception():\n return on_error(future)\n\n body = future.result()\n f = frame.frame_rw.read(BytesIO(body), size=size)\n message_type = f.header.message_type\n message_rw = messages.RW.get(message_type)\n if not message_rw:\n exc = errors.FatalProtocolError(\n 'Unknown message type %s', str(message_type)\n )\n return answer.set_exception(exc)\n\n message = message_rw.read(BytesIO(f.payload))\n message.id = f.header.message_id\n answer.set_result(message)\n\n @fail_to(answer)\n def on_read_size(future):\n if future.exception():\n return answer.set_exc_info(future.exc_info())\n\n size_bytes = future.result()\n size = frame.frame_rw.size_rw.read(BytesIO(size_bytes))\n io_loop.add_future(\n stream.read_bytes(size - FRAME_SIZE_WIDTH),\n lambda f: on_body(size, f)\n )\n\n try:\n # read_bytes may fail if the stream has already been closed\n read_size_future = stream.read_bytes(FRAME_SIZE_WIDTH)\n except Exception:\n answer.set_exc_info(sys.exc_info())\n else:\n read_size_future.add_done_callback(on_read_size)\n return answer", "response": "Reads a message from the given IOStream."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nspecify a function to be 
called when this connection is closed.", "response": "def set_close_callback(self, cb):\n \"\"\"Specify a function to be called when this connection is closed.\n\n :param cb:\n A callable that takes no arguments. This callable will be called\n when this connection is closed.\n \"\"\"\n assert self._close_cb is None, (\n 'A close_callback has already been set for this connection.'\n )\n self._close_cb = stack_context.wrap(cb)\n if self.closed:\n self._close_cb()"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nsend a message up the wire.", "response": "def send(self, message):\n \"\"\"Send the given message up the wire.\n Use this for messages which have a response message.\n\n :param message:\n Message to send\n :returns:\n A Future containing the response for the message\n \"\"\"\n assert self._handshake_performed, \"Perform a handshake first.\"\n assert message.message_type in self.CALL_REQ_TYPES, (\n \"Message '%s' can't use send\" % repr(message)\n )\n\n message.id = message.id or self.writer.next_message_id()\n assert message.id not in self._outbound_pending_call, (\n \"Message ID '%d' already being used\" % message.id\n )\n\n future = tornado.gen.Future()\n self._outbound_pending_call[message.id] = future\n self.write(message)\n return future"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nwrite the given message up the wire.", "response": "def write(self, message):\n \"\"\"Writes the given message up the wire.\n\n Does not expect a response back for the message.\n\n :param message:\n Message to write.\n \"\"\"\n message.id = message.id or self.writer.next_message_id()\n\n if message.message_type in self.CALL_REQ_TYPES:\n message_factory = self.request_message_factory\n else:\n message_factory = self.response_message_factory\n\n fragments = message_factory.fragment(message)\n return self._write_fragments(fragments)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 
3 code\ndef _write_fragments(self, fragments):\n answer = tornado.gen.Future()\n\n if not fragments:\n answer.set_result(None)\n return answer\n\n io_loop = IOLoop.current()\n\n def _write_fragment(future):\n if future and future.exception():\n return answer.set_exc_info(future.exc_info())\n\n try:\n fragment = fragments.next()\n except StopIteration:\n return answer.set_result(None)\n\n io_loop.add_future(self.writer.put(fragment), _write_fragment)\n\n _write_fragment(None)\n return answer", "response": "Writes the given generator of messages to the cache."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef initiate_handshake(self, headers, timeout=None):\n io_loop = IOLoop.current()\n timeout = timeout or DEFAULT_INIT_TIMEOUT_SECS\n\n self.writer.put(messages.InitRequestMessage(\n version=PROTOCOL_VERSION,\n headers=headers\n ))\n\n init_res_future = self.reader.get()\n timeout_handle = io_loop.call_later(timeout, (\n lambda: init_res_future.set_exception(errors.TimeoutError(\n 'Handshake with %s:%d timed out. 
Did not receive an INIT_RES '\n 'after %s seconds' % (\n self.remote_host, self.remote_host_port, str(timeout)\n )\n ))\n ))\n io_loop.add_future(\n init_res_future,\n (lambda _: io_loop.remove_timeout(timeout_handle)),\n )\n\n init_res = yield init_res_future\n if init_res.message_type != Types.INIT_RES:\n raise errors.UnexpectedError(\n \"Expected handshake response, got %s\" % repr(init_res)\n )\n self._extract_handshake_headers(init_res)\n self._handshake_performed = True\n\n # The receive loop is started only after the handshake has been\n # completed.\n self._loop()\n\n raise tornado.gen.Return(init_res)", "response": "Initiate a handshake with the remote host."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nexpect a handshake from the remote host.", "response": "def expect_handshake(self, headers):\n \"\"\"Expect a handshake from the remote host.\n\n :param headers:\n Headers to respond with\n :returns:\n A future that resolves (with a value of None) when the handshake\n is complete.\n \"\"\"\n init_req = yield self.reader.get()\n if init_req.message_type != Types.INIT_REQ:\n raise errors.UnexpectedError(\n \"You need to shake my hand first. 
Got %s\" % repr(init_req)\n )\n self._extract_handshake_headers(init_req)\n self._handshake_performed = True\n\n self.writer.put(\n messages.InitResponseMessage(\n PROTOCOL_VERSION, headers, init_req.id),\n )\n\n # The receive loop is started only after the handshake has been\n # completed.\n self._loop()\n\n raise tornado.gen.Return(init_req)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef outgoing(cls, hostport, process_name=None, serve_hostport=None,\n handler=None, tchannel=None):\n \"\"\"Initiate a new connection to the given host.\n\n :param hostport:\n String in the form ``$host:$port`` specifying the target host\n :param process_name:\n Process name of the entity making the connection.\n :param serve_hostport:\n String in the form ``$host:$port`` specifying an address at which\n the caller can be reached. If omitted, ``0.0.0.0:0`` is used.\n :param handler:\n If given, any calls received from this connection will be sent to\n this RequestHandler.\n \"\"\"\n host, port = hostport.rsplit(\":\", 1)\n process_name = process_name or \"%s[%s]\" % (sys.argv[0], os.getpid())\n serve_hostport = serve_hostport or \"0.0.0.0:0\"\n\n # TODO: change this to tornado.tcpclient.TCPClient to do async DNS\n # lookups.\n stream = tornado.iostream.IOStream(\n socket.socket(socket.AF_INET, socket.SOCK_STREAM)\n )\n\n log.debug(\"Connecting to %s\", hostport)\n try:\n yield stream.connect((host, int(port)))\n\n connection = cls(stream, tchannel, direction=OUTGOING)\n\n log.debug(\"Performing handshake with %s\", hostport)\n\n yield connection.initiate_handshake(headers={\n 'host_port': serve_hostport,\n 'process_name': process_name,\n 'tchannel_language': TCHANNEL_LANGUAGE,\n 'tchannel_language_version': TCHANNEL_LANGUAGE_VERSION,\n 'tchannel_version': TCHANNEL_VERSION,\n })\n except (StreamClosedError, socket.error, errors.TimeoutError) as e:\n log.warn(\"Couldn't connect to %s\", hostport)\n raise NetworkError(\n \"Couldn't 
connect to %s\" % hostport, e\n )\n\n if handler:\n connection.serve(handler)\n\n raise tornado.gen.Return(connection)", "response": "Initiate a new connection to the given host."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nserve calls over this connection using the given RequestHandler.", "response": "def serve(self, handler):\n \"\"\"Serve calls over this connection using the given RequestHandler.\n\n :param handler:\n RequestHandler to process the requests through\n :return:\n A Future that resolves (to None) once the loop is done running --\n which happens once this connection is closed.\n \"\"\"\n assert handler, \"handler is required\"\n\n while not self.closed:\n message = yield self.await()\n\n try:\n handler(message, self)\n except Exception:\n # TODO Send error frame back\n log.exception(\"Failed to process %s\", repr(message))"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef send_error(self, error):\n\n error_message = build_raw_error_message(error)\n write_future = self.writer.put(error_message)\n write_future.add_done_callback(\n lambda f: IOLoop.current().add_callback(\n self.tchannel.event_emitter.fire(\n EventType.after_send_error,\n error,\n )\n )\n )\n return write_future", "response": "Convenience method for writing an error message up the wire."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ntransform request or response into frames", "response": "def _stream(self, context, message_factory):\n \"\"\"write request/response into frames\n\n Transform request/response into protocol level message objects based on\n types and argstreams.\n\n Assumption: the chunk data read from stream can fit into memory.\n\n If arg stream is at init or streaming state, build the message based on\n current chunk. 
If arg stream is at completed state, put current chunk\n into args[] array, and continue to read next arg stream in order to\n compose a larger message instead of sending multiple small messages.\n\n Note: the message built at this stage is not guaranteed the size is\n less than 64KB.\n\n Possible messages created sequence:\n\n Take request as an example::\n CallRequestMessage(flags=fragment)\n --> CallRequestContinueMessage(flags=fragment)\n ....\n --> CallRequestContinueMessage(flags=fragment)\n --> CallRequestMessage(flags=none)\n\n :param context: Request or Response object\n \"\"\"\n args = []\n try:\n for argstream in context.argstreams:\n chunk = yield argstream.read()\n args.append(chunk)\n chunk = yield argstream.read()\n while chunk:\n message = (message_factory.\n build_raw_message(context, args))\n yield self.write(message)\n args = [chunk]\n chunk = yield argstream.read()\n\n # last piece of request/response.\n message = (message_factory.\n build_raw_message(context, args, is_completed=True))\n yield self.write(message)\n context.state = StreamState.completed\n # Stop streaming immediately if exception occurs on the handler side\n except TChannelError:\n # raised by tchannel intentionally\n log.info(\"Stopped outgoing streams because of an error\",\n exc_info=sys.exc_info())"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef stream_request(self, request, out_future):\n request.close_argstreams()\n\n def on_done(future):\n if future.exception() and out_future.running():\n out_future.set_exc_info(future.exc_info())\n request.close_argstreams(force=True)\n\n stream_future = self._stream(request, self.request_message_factory)\n stream_future.add_done_callback(on_done)\n return stream_future", "response": "send the given request and response is not required"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nsending the given request and return a future that will be 
returned.", "response": "def send_request(self, request):\n \"\"\"Send the given request; a response is required.\n\n Use this for messages which have a response message.\n\n :param request:\n request to send\n :returns:\n A Future containing the response for the request\n \"\"\"\n assert self._handshake_performed, \"Perform a handshake first.\"\n\n assert request.id not in self._outbound_pending_call, (\n \"Message ID '%d' already being used\" % request.id\n )\n\n future = tornado.gen.Future()\n self._outbound_pending_call[request.id] = future\n self.add_pending_outbound()\n self.stream_request(request, future).add_done_callback(\n lambda f: self.remove_pending_outbound()\n )\n\n if request.ttl:\n self._add_timeout(request, future)\n\n # the actual future that caller will yield\n response_future = tornado.gen.Future()\n # TODO: fire before_receive_response\n\n IOLoop.current().add_future(\n future,\n lambda f: self.adapt_result(f, request, response_future),\n )\n return response_future"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nadd a timeout for the given request to the given future.", "response": "def _add_timeout(self, request, future):\n \"\"\"Adds a timeout for the given request to the given future.\"\"\"\n io_loop = IOLoop.current()\n t = io_loop.call_later(\n request.ttl,\n self._request_timed_out,\n request.id,\n request.service,\n request.ttl,\n future,\n )\n io_loop.add_future(future, lambda f: io_loop.remove_timeout(t))"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get(self):\n if not self.filling:\n self.fill()\n\n answer = tornado.gen.Future()\n\n def _on_result(future):\n if future.exception():\n return answer.set_exc_info(future.exc_info())\n answer.set_result(future.result())\n\n def _on_item(future):\n if future.exception():\n return answer.set_exc_info(future.exc_info())\n future.result().add_done_callback(_on_result)\n\n 
self.queue.get().add_done_callback(_on_item)\n return answer", "response": "Receive the next message off the wire."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef put(self, message):\n if self.draining is False:\n self.drain()\n\n return self._enqueue(message)", "response": "Enqueues the given message for writing to the wire."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nmaking a service available on the Hyperbahn routing mesh. This will make contact with a Hyperbahn host from a list of known Hyperbahn routers. Additional Hyperbahn connections will be established once contact has been made with the network. :param router: A seed list of addresses of Hyperbahn routers, e.g., ``[\"127.0.0.1:23000\"]``. :param name: The identity of this service on the Hyperbahn. This is usually unnecessary, as it defaults to the name given when initializing the :py:class:`TChannel` (which is used as your identity as a caller). :returns: A future that resolves to the remote server's response after the first advertise finishes. Advertisement will continue to happen periodically.", "response": "def advertise(\n self,\n routers=None,\n name=None,\n timeout=None,\n router_file=None,\n jitter=None,\n ):\n \"\"\"Make a service available on the Hyperbahn routing mesh.\n\n This will make contact with a Hyperbahn host from a list of known\n Hyperbahn routers. 
Additional Hyperbahn connections will be established\n once contact has been made with the network.\n\n :param router:\n A seed list of addresses of Hyperbahn routers, e.g.,\n ``[\"127.0.0.1:23000\"]``.\n\n :param name:\n The identity of this service on the Hyperbahn.\n\n This is usually unnecessary, as it defaults to the name given when\n initializing the :py:class:`TChannel` (which is used as your\n identity as a caller).\n\n :returns:\n A future that resolves to the remote server's response after\n the first advertise finishes.\n\n Advertisement will continue to happen periodically.\n \"\"\"\n name = name or self.name\n\n if not self.is_listening():\n self.listen()\n\n return hyperbahn.advertise(\n self,\n name,\n routers,\n timeout,\n router_file,\n jitter,\n )"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef request(self,\n hostport=None,\n service=None,\n arg_scheme=None,\n retry=None,\n **kwargs):\n \"\"\"Initiate a new request through this TChannel.\n\n :param hostport:\n Host to which the request will be made. If unspecified, a random\n known peer will be picked. This is not necessary if using\n Hyperbahn.\n\n :param service:\n The name of a service available on Hyperbahn. Defaults to an empty\n string.\n\n :param arg_scheme:\n Determines the serialization scheme for the request. One of 'raw',\n 'json', or 'thrift'. 
Defaults to 'raw'.\n\n :param retry:\n One of 'n' (never retry), 'c' (retry on connection errors), 't'\n (retry on timeout), 'ct' (retry on connection errors and timeouts).\n\n Defaults to 'c'.\n \"\"\"\n # TODO disallow certain parameters or don't propagate them backwards.\n # For example, blacklist and rank threshold aren't really\n # user-configurable right now.\n return self.peers.request(hostport=hostport,\n service=service,\n arg_scheme=arg_scheme,\n retry=retry,\n **kwargs)", "response": "Initiate a new request through this TChannel."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nstarts listening for incoming connections.", "response": "def listen(self, port=None):\n \"\"\"Start listening for incoming connections.\n\n A request handler must have already been specified with\n ``TChannel.host``.\n\n :param port:\n An explicit port to listen on. This is unnecessary when advertising\n on Hyperbahn.\n\n :returns:\n Returns immediately.\n\n :raises AlreadyListeningError:\n If listen was already called.\n \"\"\"\n\n if self.is_listening():\n raise AlreadyListeningError(\n \"listen has already been called\"\n )\n\n if port:\n assert not self._port, \"Port has already been set.\"\n self._port = int(port)\n\n assert self._handler, \"Call .host with a RequestHandler first\"\n server = TChannelServer(self)\n\n bind_sockets_kwargs = {\n 'port': self._port,\n # ipv6 causes random address already in use (socket.error w errno\n # == 98) when getaddrinfo() returns multiple values\n # @see https://github.com/uber/tchannel-python/issues/256\n 'family': socket.AF_INET,\n }\n if self._reuse_port is True:\n # allow multiple processes to share the same port,\n # this is really useful in a world where services launch N\n # processes per container/os-space, where N is\n # the amount of cpus for example\n bind_sockets_kwargs['reuse_port'] = True\n\n sockets = bind_sockets(**bind_sockets_kwargs)\n assert sockets, \"No sockets bound for port 
%d\" % self._port\n\n # If port was 0, the OS probably assigned something better.\n self._port = sockets[0].getsockname()[1]\n\n server.add_sockets(sockets)\n\n # assign server so we don't listen twice\n self._server = server"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nregister a simple endpoint with this TChannel.", "response": "def _register_simple(self, endpoint, scheme, f):\n \"\"\"Register a simple endpoint with this TChannel.\n\n :param endpoint:\n Name of the endpoint being registered.\n :param scheme:\n Name of the arg scheme under which the endpoint will be\n registered.\n :param f:\n Callable handler for the endpoint.\n \"\"\"\n assert scheme in DEFAULT_NAMES, (\"Unsupported arg scheme %s\" % scheme)\n if scheme == JSON:\n req_serializer = JsonSerializer()\n resp_serializer = JsonSerializer()\n else:\n req_serializer = RawSerializer()\n resp_serializer = RawSerializer()\n self._handler.register(endpoint, f, req_serializer, resp_serializer)\n return f"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nregistering a Thrift endpoint on this TChannel.", "response": "def _register_thrift(self, service_module, handler, **kwargs):\n \"\"\"Register a Thrift endpoint on this TChannel.\n\n :param service_module:\n Reference to the Thrift-generated module for the service being\n registered.\n :param handler:\n Handler for the endpoint\n :param method:\n Name of the Thrift method being registered. If omitted, ``f``'s\n name is assumed to be the method name.\n :param service:\n Name of the Thrift service. By default this is determined\n automatically from the module name.\n \"\"\"\n import tchannel.thrift as thrift\n # Imported inside the function so that we don't have a hard dependency\n # on the Thrift library. 
This function is usable only if the Thrift\n # library is installed.\n thrift.register(self._handler, service_module, handler, **kwargs)\n return handler"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef register(self, endpoint, scheme=None, handler=None, **kwargs):\n assert endpoint is not None, \"endpoint is required\"\n\n if endpoint is TChannel.FALLBACK:\n decorator = partial(self._handler.register, TChannel.FALLBACK)\n if handler is not None:\n return decorator(handler)\n else:\n return decorator\n\n if not scheme:\n # scheme defaults to raw, unless the endpoint is a service module.\n if inspect.ismodule(endpoint):\n scheme = \"thrift\"\n else:\n scheme = \"raw\"\n scheme = scheme.lower()\n if scheme == 'thrift':\n decorator = partial(self._register_thrift, endpoint, **kwargs)\n else:\n decorator = partial(\n self._register_simple, endpoint, scheme, **kwargs\n )\n\n if handler is not None:\n return decorator(handler)\n else:\n return decorator", "response": "Register a handler with this TChannel."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef lt(self, i, j):\n if self.peers[i].rank == self.peers[j].rank:\n return self.peers[i].order < self.peers[j].order\n\n return self.peers[i].rank < self.peers[j].rank", "response": "Compare the priority of two peers."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\npush a new peer into the heap", "response": "def push_peer(self, peer):\n \"\"\"Push a new peer into the heap\"\"\"\n\n self.order += 1\n peer.order = self.order + random.randint(0, self.size())\n heap.push(self, peer)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef add_and_shuffle(self, peer):\n self.push_peer(peer)\n\n r = random.randint(0, self.size() - 1)\n self.swap_order(peer.index, r)", "response": "Push a new peer into the heap and shuffle 
the heap"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nremoving the peer from the heap.", "response": "def remove_peer(self, peer):\n \"\"\"Remove the peer from the heap.\n\n Return: removed peer if peer exists. If peer's index is out of range,\n raise IndexError.\n \"\"\"\n if peer.index < 0 or peer.index >= self.size():\n raise IndexError('Peer index is out of range')\n\n assert peer is self.peers[peer.index], \"peer is not in the heap\"\n\n return heap.remove(self, peer.index)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef token_request(self, authorization_code):\n if not self._client.token_endpoint:\n return None\n\n request = {\n 'grant_type': 'authorization_code',\n 'code': authorization_code,\n 'redirect_uri': self._redirect_uri\n }\n\n logger.debug('making token request: %s', request)\n client_auth_method = self._client.registration_response.get('token_endpoint_auth_method', 'client_secret_basic')\n auth_header = _ClientAuthentication(self._client.client_id, self._client.client_secret)(client_auth_method,\n request)\n resp = self._provider_configuration.requests_session \\\n .post(self._client.token_endpoint,\n data=request,\n headers=auth_header) \\\n .json()\n logger.debug('received token response: %s', json.dumps(resp))\n\n if 'error' in resp:\n token_resp = TokenErrorResponse(**resp)\n else:\n token_resp = AccessTokenResponse(**resp)\n token_resp.verify(keyjar=self._client.keyjar)\n if 'id_token' in resp:\n token_resp['id_token_jwt'] = resp['id_token']\n\n return token_resp", "response": "Makes a token request."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef userinfo_request(self, access_token):\n http_method = self._provider_configuration.userinfo_endpoint_method\n if http_method is None or not self._client.userinfo_endpoint:\n return None\n\n logger.debug('making userinfo request')\n userinfo_response = 
self._client.do_user_info_request(method=http_method, token=access_token)\n logger.debug('received userinfo response: %s', userinfo_response.to_json())\n\n return userinfo_response", "response": "Makes a userinfo request to the user info endpoint."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef update(self, access_token=None, id_token=None, id_token_jwt=None, userinfo=None):\n\n def set_if_defined(session_key, value):\n if value:\n self._session_storage[session_key] = value\n\n auth_time = int(time.time())\n if id_token:\n auth_time = id_token.get('auth_time', auth_time)\n\n self._session_storage['last_authenticated'] = auth_time\n set_if_defined('access_token', access_token)\n set_if_defined('id_token', id_token)\n set_if_defined('id_token_jwt', id_token_jwt)\n set_if_defined('userinfo', userinfo)", "response": "Updates the internal state of the session."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef log_background_messages(self, name=None):\n with self.lock:\n if name:\n self._log_messages_by_thread(name)\n else:\n self._log_all_messages()", "response": "Forwards all messages logged on background to Robot Framework log."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ndecorates a function using a caller.", "response": "def decorate(func, caller):\n \"\"\"\n decorate(func, caller) decorates a function using a caller.\n \"\"\"\n evaldict = func.__globals__.copy()\n evaldict['_call_'] = caller\n evaldict['_func_'] = func\n fun = FunctionMaker.create(\n func, \"return _call_(_func_, %(shortsignature)s)\",\n evaldict, __wrapped__=func)\n if hasattr(func, '__qualname__'):\n fun.__qualname__ = func.__qualname__\n return fun"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nsets a handler function for the type of message template currently loaded.", "response": 
"def set_client_handler(self, handler_func, name=None, header_filter=None, interval=0.5):\n \"\"\"Sets an automatic handler for the type of message template currently loaded.\n\n This feature allows users to set a python handler function which is called\n automatically by the Rammbock message queue when message matches the expected\n template. The optional name argument defines the client node to which the\n handler will be bound. Otherwise the default client will be used.\n\n The header_filter defines which header field will be used to identify the\n message defined in template. (Otherwise all incoming messages will match!)\n\n The interval defines the interval in seconds on which the handler will\n be called on background. By default the incoming messages are checked\n every 0.5 seconds.\n\n The handler function will be called with two arguments: the rammbock library\n instance and the received message.\n\n Example:\n | Load template | SomeMessage |\n | Set client handler | my_module.respond_to_sample |\n\n my_module.py:\n | def respond_to_sample(rammbock, msg):\n | rammbock.save_template(\"__backup_template\", unlocked=True)\n | try:\n | rammbock.load_template(\"sample response\")\n | rammbock.client_sends_message()\n | finally:\n | rammbock.load_template(\"__backup_template\")\n \"\"\"\n msg_template = self._get_message_template()\n client, client_name = self._clients.get_with_name(name)\n client.set_handler(msg_template, handler_func, header_filter=header_filter, interval=interval)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nsetting a handler function for the type of message template currently loaded.", "response": "def set_server_handler(self, handler_func, name=None, header_filter=None, alias=None, interval=0.5):\n \"\"\"Sets an automatic handler for the type of message template currently loaded.\n\n This feature allows users to set a python handler function which is called\n automatically by the Rammbock message queue when 
message matches the expected\n template. The optional name argument defines the server node to which the\n handler will be bound. Otherwise the default server will be used.\n\n The header_filter defines which header field will be used to identify the\n message defined in template. (Otherwise all incoming messages will match!)\n\n The interval defines the interval in seconds on which the handler will\n be called on background. By default the incoming messages are checked\n every 0.5 seconds.\n\n The alias is the alias for the connection. By default the current active\n connection will be used.\n\n The handler function will be called with two arguments: the rammbock library\n instance and the received message.\n\n Example:\n | Load template | SomeMessage |\n | Set server handler | my_module.respond_to_sample | messageType |\n\n my_module.py:\n | def respond_to_sample(rammbock, msg):\n | rammbock.save_template(\"__backup_template\", unlocked=True)\n | try:\n | rammbock.load_template(\"sample response\")\n | rammbock.server_sends_message()\n | finally:\n | rammbock.load_template(\"__backup_template\")\n \"\"\"\n msg_template = self._get_message_template()\n server, server_name = self._servers.get_with_name(name)\n server.set_handler(msg_template, handler_func, header_filter=header_filter, alias=alias, interval=interval)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef reset_rammbock(self):\n for client in self._clients:\n client.close()\n for server in self._servers:\n server.close()\n self._init_caches()", "response": "Closes all connections deletes all servers clients and protocols and sets the internal cache to empty."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreset streams and sockets of incoming messages.", "response": "def clear_message_streams(self):\n \"\"\" Resets streams and sockets of incoming messages.\n\n You can use this method to 
reuse the same connections for several\n consecutive test cases.\n \"\"\"\n for client in self._clients:\n client.empty()\n for server in self._servers:\n server.empty()"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nstarting defining a new protocol template.", "response": "def new_protocol(self, protocol_name):\n \"\"\"Start defining a new protocol template.\n\n All messages sent and received from a connection that uses a protocol\n have to conform to this protocol template.\n \"\"\"\n if self._protocol_in_progress:\n raise Exception('Can not start a new protocol definition in middle of old.')\n if protocol_name in self._protocols:\n raise Exception('Protocol %s already defined' % protocol_name)\n self._init_new_message_stack(Protocol(protocol_name, library=self))\n self._protocol_in_progress = True"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef start_udp_server(self, ip, port, name=None, timeout=None, protocol=None, family='ipv4'):\n self._start_server(UDPServer, ip, port, name, timeout, protocol, family)", "response": "Starts a new UDP server to given ip and port."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nstarting a new TCP server to given ip and port.", "response": "def start_tcp_server(self, ip, port, name=None, timeout=None, protocol=None, family='ipv4'):\n \"\"\"Starts a new TCP server to given `ip` and `port`.\n\n Server can be given a `name`, default `timeout` and a `protocol`.\n `family` can be either ipv4 (default) or ipv6. 
Notice that you have to\n use `Accept Connection` keyword for server to receive connections.\n\n Examples:\n | Start TCP server | 10.10.10.2 | 53 |\n | Start TCP server | 10.10.10.2 | 53 | Server1 |\n | Start TCP server | 10.10.10.2 | 53 | name=Server1 | protocol=GTPV2 |\n | Start TCP server | 10.10.10.2 | 53 | timeout=5 |\n | Start TCP server | 0:0:0:0:0:0:0:1 | 53 | family=ipv6 |\n \"\"\"\n self._start_server(TCPServer, ip, port, name, timeout, protocol, family)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nstart a new SCTP server to given ip and port.", "response": "def start_sctp_server(self, ip, port, name=None, timeout=None, protocol=None, family='ipv4'):\n \"\"\"Starts a new SCTP server to given `ip` and `port`.\n\n `family` can be either ipv4 (default) or ipv6.\n\n pysctp (https://github.com/philpraxis/pysctp) needs to be installed on your system.\n Server can be given a `name`, default `timeout` and a `protocol`.\n Notice that you have to use `Accept Connection` keyword for server to\n receive connections.\n\n Examples:\n | Start SCTP server | 10.10.10.2 | 53 |\n | Start SCTP server | 10.10.10.2 | 53 | Server1 |\n | Start SCTP server | 10.10.10.2 | 53 | name=Server1 | protocol=GTPV2 |\n | Start SCTP server | 10.10.10.2 | 53 | timeout=5 |\n \"\"\"\n self._start_server(SCTPServer, ip, port, name, timeout, protocol, family)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef start_udp_client(self, ip=None, port=None, name=None, timeout=None, protocol=None, family='ipv4'):\n self._start_client(UDPClient, ip, port, name, timeout, protocol, family)", "response": "Starts a new UDP client."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nstart a new TCP client.", "response": "def start_tcp_client(self, ip=None, port=None, name=None, timeout=None, protocol=None, family='ipv4'):\n \"\"\"Starts a new TCP client.\n\n Client can be optionally given `ip` 
and `port` to bind to, as well as\n `name`, default `timeout` and a `protocol`. `family` can be either\n ipv4 (default) or ipv6.\n\n You should use `Connect` keyword to connect client to a host.\n\n Examples:\n | Start TCP client |\n | Start TCP client | name=Client1 | protocol=GTPV2 |\n | Start TCP client | 10.10.10.2 | 53 | name=Server1 | protocol=GTPV2 |\n | Start TCP client | timeout=5 |\n | Start TCP client | 0:0:0:0:0:0:0:1 | 53 | family=ipv6 |\n \"\"\"\n self._start_client(TCPClient, ip, port, name, timeout, protocol, family)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nstarting a new SCTP client.", "response": "def start_sctp_client(self, ip=None, port=None, name=None, timeout=None, protocol=None, family='ipv4'):\n \"\"\"Starts a new SCTP client.\n\n Client can be optionally given `ip` and `port` to bind to, as well as\n `name`, default `timeout` and a `protocol`. `family` can be either\n ipv4 (default) or ipv6.\n\n You should use `Connect` keyword to connect client to a host.\n\n Examples:\n | Start SCTP client |\n | Start SCTP client | name=Client1 | protocol=GTPV2 |\n | Start SCTP client | 10.10.10.2 | 53 | name=Server1 | protocol=GTPV2 |\n | Start SCTP client | timeout=5 |\n \"\"\"\n self._start_client(SCTPClient, ip, port, name, timeout, protocol, family)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\naccept a connection to server identified by name or the latest server.", "response": "def accept_connection(self, name=None, alias=None, timeout=0):\n \"\"\"Accepts a connection to server identified by `name` or the latest\n server if `name` is empty.\n\n If given an `alias`, the connection is named and can be later referenced\n with that name.\n\n If `timeout` is > 0, the connection times out after the time specified.\n `timeout` defaults to 0 which will wait indefinitely.\n Empty value or None will use socket default timeout.\n\n Examples:\n | Accept connection |\n | Accept connection | Server1 | 
my_connection |\n | Accept connection | Server1 | my_connection | timeout=5 |\n \"\"\"\n server = self._servers.get(name)\n server.accept_connection(alias, timeout)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nconnects a client to given host and port.", "response": "def connect(self, host, port, name=None):\n \"\"\"Connects a client to given `host` and `port`. If client `name` is not\n given then connects the latest client.\n\n Examples:\n | Connect | 127.0.0.1 | 8080 |\n | Connect | 127.0.0.1 | 8080 | Client1 |\n \"\"\"\n client = self._clients.get(name)\n client.connect_to(host, port)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef client_sends_binary(self, message, name=None, label=None):\n client, name = self._clients.get_with_name(name)\n client.send(message)\n self._register_send(client, label, name)", "response": "Send raw binary message."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nsend raw binary message to server.", "response": "def server_sends_binary(self, message, name=None, connection=None, label=None):\n \"\"\"Send raw binary `message`.\n\n If server `name` is not given, uses the latest server. Optional message\n `label` is shown on logs.\n\n Examples:\n | Server sends binary | Hello! |\n | Server sends binary | ${some binary} | Server1 | label=DebugMessage |\n | Server sends binary | ${some binary} | connection=my_connection |\n \"\"\"\n server, name = self._servers.get_with_name(name)\n server.send(message, alias=connection)\n self._register_send(server, label, name, connection=connection)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreceive raw binary message.", "response": "def client_receives_binary(self, name=None, timeout=None, label=None):\n \"\"\"Receive raw binary message.\n\n If client `name` is not given, uses the latest client. 
Optional message\n `label` is shown on logs.\n\n Examples:\n | ${binary} = | Client receives binary |\n | ${binary} = | Client receives binary | Client1 | timeout=5 |\n \"\"\"\n client, name = self._clients.get_with_name(name)\n msg = client.receive(timeout=timeout)\n self._register_receive(client, label, name)\n return msg"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreceives raw binary message.", "response": "def server_receives_binary(self, name=None, timeout=None, connection=None, label=None):\n \"\"\"Receive raw binary message.\n\n If server `name` is not given, uses the latest server. Optional message\n `label` is shown on logs.\n\n Examples:\n | ${binary} = | Server receives binary |\n | ${binary} = | Server receives binary | Server1 | connection=my_connection | timeout=5 |\n \"\"\"\n return self.server_receives_binary_from(name, timeout, connection=connection, label=label)[0]"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreceiving raw binary message from server.", "response": "def server_receives_binary_from(self, name=None, timeout=None, connection=None, label=None):\n \"\"\"Receive raw binary message. Returns message, ip, and port.\n\n If server `name` is not given, uses the latest server. 
Optional message\n `label` is shown on logs.\n\n Examples:\n | ${binary} | ${ip} | ${port} = | Server receives binary from |\n | ${binary} | ${ip} | ${port} = | Server receives binary from | Server1 | connection=my_connection | timeout=5 |\n \"\"\"\n server, name = self._servers.get_with_name(name)\n msg, ip, port = server.receive_from(timeout=timeout, alias=connection)\n self._register_receive(server, label, name, connection=connection)\n return msg, ip, port"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ndefining a new message template with message_name.", "response": "def new_message(self, message_name, protocol=None, *parameters):\n \"\"\"Define a new message template with `message_name`.\n\n `protocol` has to be defined earlier with `Start Protocol Description`.\n Optional parameters are default values for message header separated with\n colon.\n\n Examples:\n | New message | MyMessage | MyProtocol | header_field:value |\n \"\"\"\n proto = self._get_protocol(protocol)\n if not proto:\n raise Exception(\"Protocol not defined! Please define a protocol before creating a message!\")\n if self._protocol_in_progress:\n raise Exception(\"Protocol definition in progress. 
Please finish it before starting to define a message.\")\n configs, fields, header_fields = self._parse_parameters(parameters)\n self._raise_error_if_configs_or_fields(configs, fields, 'New message')\n self._init_new_message_stack(MessageTemplate(message_name, proto, header_fields))"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef save_template(self, name, unlocked=False):\n if isinstance(unlocked, basestring):\n unlocked = unlocked.lower() != 'false'\n template = self._get_message_template()\n if not unlocked:\n template.set_as_saved()\n self._message_templates[name] = (template, self._field_values)", "response": "Save a message template for later use with Load template."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nloading a message template saved with Save template.", "response": "def load_template(self, name, *parameters):\n \"\"\"Load a message template saved with `Save template`.\n Optional parameters are default values for message header separated with\n colon.\n\n Examples:\n | Load Template | MyMessage | header_field:value |\n \"\"\"\n template, fields, header_fields = self._set_templates_fields_and_header_fields(name, parameters)\n self._init_new_message_stack(template, fields, header_fields)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef load_copy_of_template(self, name, *parameters):\n template, fields, header_fields = self._set_templates_fields_and_header_fields(name, parameters)\n copy_of_template = copy.deepcopy(template)\n copy_of_fields = copy.deepcopy(fields)\n self._init_new_message_stack(copy_of_template, copy_of_fields, header_fields)", "response": "Load a copy of message template saved with Save template when originally saved values need to be preserved\n from test to test."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef 
get_message(self, *parameters):\n _, message_fields, header_fields = self._get_parameters_with_defaults(parameters)\n return self._encode_message(message_fields, header_fields)", "response": "Get encoded message.\n\n * Send Message -keywords are convenience methods, that will call this to\n get the message object and then send it. Optional parameters are message\n field values separated with colon.\n\n Examples:\n | ${msg} = | Get message |\n | ${msg} = | Get message | field_name:value |"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreceive a message from the buffer and validate it.", "response": "def client_receives_message(self, *parameters):\n \"\"\"Receive a message with template defined using `New Message` and\n validate field values.\n\n Message template has to be defined with `New Message` before calling\n this.\n\n Optional parameters:\n - `name` the client name (default is the latest used) example: `name=Client 1`\n - `timeout` for receiving message. example: `timeout=0.1`\n - `latest` if set to True, get latest message from buffer instead of first. Default is False. Example: `latest=True`\n - message field values for validation separated with colon. 
example: `some_field:0xaf05`\n\n Examples:\n | ${msg} = | Client receives message |\n | ${msg} = | Client receives message | name=Client1 | timeout=5 |\n | ${msg} = | Client receives message | message_field:(0|1) |\n \"\"\"\n with self._receive(self._clients, *parameters) as (msg, message_fields, header_fields):\n self._validate_message(msg, message_fields, header_fields)\n return msg"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreceive a message with template defined using New Message.", "response": "def client_receives_without_validation(self, *parameters):\n \"\"\"Receive a message with template defined using `New Message`.\n\n Message template has to be defined with `New Message` before calling\n this.\n\n Optional parameters:\n - `name` the client name (default is the latest used) example: `name=Client 1`\n - `timeout` for receiving message. example: `timeout=0.1`\n - `latest` if set to True, get latest message from buffer instead of first. Default is False. Example: `latest=True`\n\n Examples:\n | ${msg} = | Client receives without validation |\n | ${msg} = | Client receives without validation | name=Client1 | timeout=5 |\n \"\"\"\n with self._receive(self._clients, *parameters) as (msg, _, _):\n return msg"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreceiving a message from the server and validating it.", "response": "def server_receives_message(self, *parameters):\n \"\"\"Receive a message with template defined using `New Message` and\n validate field values.\n\n Message template has to be defined with `New Message` before calling\n this.\n\n Optional parameters:\n - `name` the server name (default is the latest used) example: `name=Server 1`\n - `connection` alias. example: `connection=connection 1`\n - `timeout` for receiving message. example: `timeout=0.1`\n - `latest` if set to True, get latest message from buffer instead of first. Default is False. 
Example: `latest=True`\n - message field values for validation separated with colon. example: `some_field:0xaf05`\n\n Optional parameters are server `name`, `connection` alias and\n possible `timeout` separated with equals and message field values for\n validation separated with colon.\n\n Examples:\n | ${msg} = | Server receives message |\n | ${msg} = | Server receives message | name=Server1 | alias=my_connection | timeout=5 |\n | ${msg} = | Server receives message | message_field:(0|1) |\n \"\"\"\n with self._receive(self._servers, *parameters) as (msg, message_fields, header_fields):\n self._validate_message(msg, message_fields, header_fields)\n return msg"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef server_receives_without_validation(self, *parameters):\n with self._receive(self._servers, *parameters) as (msg, _, _):\n return msg", "response": "Receive a message with template defined using New Message."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef validate_message(self, msg, *parameters):\n _, message_fields, header_fields = self._get_parameters_with_defaults(parameters)\n self._validate_message(msg, message_fields, header_fields)", "response": "Validates given message using template defined with New Message and\n field values given as optional arguments."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nadd an unsigned integer to template.", "response": "def uint(self, length, name, value=None, align=None):\n \"\"\"Add an unsigned integer to template.\n\n `length` is given in bytes and `value` is optional. 
`align` can be used\n to align the field to longer byte length.\n\n Examples:\n | uint | 2 | foo |\n | uint | 2 | foo | 42 |\n | uint | 2 | fourByteFoo | 42 | align=4 |\n \"\"\"\n self._add_field(UInt(length, name, value, align=align))"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef int(self, length, name, value=None, align=None):\n self._add_field(Int(length, name, value, align=align))", "response": "Add an integer field to template."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef chars(self, length, name, value=None, terminator=None):\n\n self._add_field(Char(length, name, value, terminator))", "response": "Add a char array field to template."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ndefining a new struct to template.", "response": "def new_struct(self, type, name, *parameters):\n \"\"\"Defines a new struct to template.\n\n You must call `End Struct` to end struct definition. `type` is the name\n for generic type and `name` is the field name in containing structure.\n Possible parameters are values for struct fields separated with colon\n and optional struct length defined with `length=`. Length can be used in\n receiving to validate that struct matches predefined length. 
When\n sending, the struct length can refer to other message field which will\n then be set dynamically.\n\n Examples:\n | New struct | Pair | myPair |\n | u8 | first |\n | u8 | second |\n | End Struct |\n \"\"\"\n configs, parameters, _ = self._get_parameters_with_defaults(parameters)\n self._add_struct_name_to_params(name, parameters)\n self._message_stack.append(StructTemplate(type, name, self._current_container, parameters, length=configs.get('length'), align=configs.get('align')))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ndefining a new list to template of size and with name.", "response": "def _new_list(self, size, name):\n \"\"\"Defines a new list to template of `size` and with `name`.\n\n List type must be given after this keyword by defining one field. Then\n the list definition has to be closed using `End List`.\n\n Special value '*' in size means that list will decode values as long as\n data is available. This free length value is not supported on encoding.\n\n Examples:\n | New list | 5 | myIntList |\n | u16 |\n | End List |\n\n | u8 | listLength |\n | New list | listLength | myIntList |\n | u16 |\n | End List |\n\n | New list | * | myIntList |\n | u16 |\n | End List |\n \"\"\"\n self._message_stack.append(ListTemplate(size, name, self._current_container))"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ndefines a new binary container to template.", "response": "def new_binary_container(self, name):\n \"\"\"Defines a new binary container to template.\n\n Binary container can only contain binary fields defined with `Bin`\n keyword.\n\n Examples:\n | New binary container | flags |\n | bin | 2 | foo |\n | bin | 6 | bar |\n | End binary container |\n \"\"\"\n self._message_stack.append(BinaryContainerTemplate(name, self._current_container))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nending a binary container.", "response": "def 
end_binary_container(self):\n \"\"\"End binary container. See `New Binary Container`.\n \"\"\"\n binary_container = self._message_stack.pop()\n binary_container.verify()\n self._add_field(binary_container)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nadds a new binary field to template. This keyword has to be called within a binary container.", "response": "def bin(self, size, name, value=None):\n \"\"\"Add new binary field to template.\n\n This keyword has to be called within a binary container. See `New Binary\n Container`.\n \"\"\"\n self._add_field(Binary(size, name, value))"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ndefine a new union to template of type and name.", "response": "def new_union(self, type, name):\n \"\"\"Defines a new union to template of `type` and `name`.\n\n Fields inside the union are alternatives and the length of the union is\n the length of its longest field.\n\n Example:\n | New union | IntOrAddress | foo |\n | Chars | 16 | ipAddress |\n | u32 | int |\n | End union |\n \"\"\"\n self._message_stack.append(UnionTemplate(type, name, self._current_container))"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef start_bag(self, name):\n self._message_stack.append(BagTemplate(name, self._current_container))", "response": "Starts bag templates for the given name."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef value(self, name, value):\n if isinstance(value, _StructuredElement):\n self._struct_fields_as_values(name, value)\n elif name.startswith('header:'):\n self._header_values[name.partition(':')[-1]] = value\n else:\n self._field_values[name] = value", "response": "Defines a default value for a template field identified by name."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ndefines a conditional element of name 
exists if condition is true.", "response": "def conditional(self, condition, name):\n \"\"\"Defines a 'condition' when conditional element of 'name' exists if `condition` is true.\n\n `condition` can contain multiple conditions combined together using Logical Expressions(&&,||).\n\n Example:\n | Conditional | mycondition == 1 | foo |\n | u8 | myelement | 42 |\n | End conditional |\n\n | Conditional | condition1 == 1 && condition2 != 2 | bar |\n | u8 | myelement | 8 |\n | End conditional |\n \"\"\"\n self._message_stack.append(ConditionalTemplate(condition, name, self._current_container))"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_client_unread_messages_count(self, client_name=None):\n client = self._clients.get_with_name(client_name)[0]\n return client.get_messages_count_in_buffer()", "response": "Gets the count of unread messages from client"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get_server_unread_messages_count(self, server_name=None):\n server = self._servers.get(server_name)\n return server.get_messages_count_in_buffer()", "response": "Gets the count of unread messages from server"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncompute the 2's complement of int value val", "response": "def to_twos_comp(val, bits):\n \"\"\"compute the 2's complement of int value val\"\"\"\n if not val.startswith('-'):\n return to_int(val)\n value = _invert(to_bin_str_from_int_string(bits, bin(to_int(val[1:]))))\n return int(value, 2) + 1"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nadds an unsigned 1 byte integer field to template. 
This is a convenience method that simply calls uint with a predefined length.", "response": "def u8(self, name, value=None, align=None):\n \"\"\"Add an unsigned 1 byte integer field to template.\n\n This is a convenience method that simply calls `Uint` keyword with predefined length.\"\"\"\n self.uint(1, name, value, align)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef u16(self, name, value=None, align=None):\n self.uint(2, name, value, align)", "response": "Add an unsigned 2 byte integer field to template."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef u24(self, name, value=None, align=None):\n self.uint(3, name, value, align)", "response": "Add an unsigned 3 byte integer field to template.\n This is a convenience method that simply calls uint with a predefined length."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nadds an unsigned 4 byte integer field to template.", "response": "def u32(self, name, value=None, align=None):\n \"\"\"Add an unsigned 4 byte integer field to template.\n\n This is a convenience method that simply calls `Uint` keyword with predefined length.\"\"\"\n self.uint(4, name, value, align)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nadds an unsigned 5 byte integer field to template. This is a convenience method that simply calls uint with a predefined length.", "response": "def u40(self, name, value=None, align=None):\n \"\"\"Add an unsigned 5 byte integer field to template.\n\n This is a convenience method that simply calls `Uint` keyword with predefined length.\"\"\"\n self.uint(5, name, value, align)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nadd an unsigned 8 byte integer field to template. 
This is a convenience method that simply calls uint with a predefined length.", "response": "def u64(self, name, value=None, align=None):\n \"\"\"Add an unsigned 8 byte integer field to template.\n\n This is a convenience method that simply calls `Uint` keyword with predefined length.\"\"\"\n self.uint(8, name, value, align)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef u128(self, name, value=None, align=None):\n self.uint(16, name, value, align)", "response": "Add an unsigned 16 byte integer field to template."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nadd a 1 byte integer field to template. This is a convenience method that simply calls Int.", "response": "def i8(self, name, value=None, align=None):\n \"\"\"Add a 1 byte integer field to template.\n\n This is a convenience method that simply calls `Int` keyword with predefined length.\"\"\"\n self.int(1, name, value, align)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef i32(self, name, value=None, align=None):\n self.int(4, name, value, align)", "response": "Add a 4 byte integer field to template.\n This is a convenience method that simply calls Int."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ndefines a new array of given size containing fields of type type, where name is the name of the array element and optional extra parameters are passed to the keyword that defines the contents.", "response": "def array(self, size, type, name, *parameters):\n \"\"\"Define a new array of given `size` and containing fields of type `type`.\n\n `name` is the name of this array element. 
The `type` is the name of keyword that is executed as the contents of\n the array and optional extra parameters are passed as arguments to this keyword.\n\n Examples:\n | Array | 8 | u16 | myArray |\n\n | u32 | length |\n | Array | length | someStruct | myArray | |\n \"\"\"\n self._new_list(size, name)\n BuiltIn().run_keyword(type, '', *parameters)\n self._end_list()"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef container(self, name, length, type, *parameters):\n self.new_struct('Container', name, 'length=%s' % length)\n BuiltIn().run_keyword(type, *parameters)\n self.end_struct()", "response": "Define a container with given length."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef embed_seqdiag_sequence(self):\n test_name = BuiltIn().replace_variables('${TEST NAME}')\n outputdir = BuiltIn().replace_variables('${OUTPUTDIR}')\n path = os.path.join(outputdir, test_name + '.seqdiag')\n SeqdiagGenerator().compile(path, self._message_sequence)", "response": "Create a message sequence diagram png file to output folder and embed the image to log file"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef local_error(self, originalValue, calculatedValue):\n originalValue = originalValue[0]\n calculatedValue = calculatedValue[0]\n\n # error is zero\n if not originalValue and not calculatedValue:\n return 0.0\n\n return abs(calculatedValue - originalValue)/ ((abs(originalValue) + abs(calculatedValue))/2) * 100", "response": "Calculates the error between two values."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef tostring(self, inject):\n injected_parts = ''\n for part in self.parts:\n injected = part.tostring(inject)\n tei_tag = next(\n (attribute for attribute in part.attributes if attribute.key == 
"tei-tag\"), None)\n if tei_tag and tei_tag.text == \"w\" and injected_parts:\n # make sure words can be tokenized correctly\n if injected_parts and injected_parts[-1] != ' ':\n injected_parts += ' '\n injected_parts += injected.strip() + ' '\n else:\n injected_parts += injected\n\n return inject(self, injected_parts)", "response": "Convert an element to a single string, allowing the passed inject method to place content around each element."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ncreates a parser hooking up the command methods below to be run when chosen.", "response": "def _create_parser(self):\n \"\"\"\n Create a parser hooking up the command methods below to be run when chosen.\n :return: CommandParser parser with commands attached.\n \"\"\"\n parser = CommandParser(get_internal_version_str())\n parser.register_list_command(self._setup_run_command(ListCommand))\n parser.register_upload_command(self._setup_run_command(UploadCommand))\n parser.register_add_user_command(self._setup_run_command(AddUserCommand))\n parser.register_remove_user_command(self._setup_run_command(RemoveUserCommand))\n parser.register_download_command(self._setup_run_command(DownloadCommand))\n parser.register_share_command(self._setup_run_command(ShareCommand))\n parser.register_deliver_command(self._setup_run_command(DeliverCommand))\n parser.register_delete_command(self._setup_run_command(DeleteCommand))\n parser.register_list_auth_roles_command(self._setup_run_command(ListAuthRolesCommand))\n return parser"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ncheck the version on PyPI and print an error to stderr and pause.", "response": "def _check_pypi_version(self):\n \"\"\"\n When the version is out of date or we have trouble retrieving it print an error to stderr and pause.\n \"\"\"\n try:\n check_version()\n except VersionException as err:\n print(str(err), file=sys.stderr)\n 
time.sleep(TWO_SECONDS)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nrun command_constructor and call run on the resulting object", "response": "def _run_command(self, command_constructor, args):\n \"\"\"\n Run command_constructor and call run(args) on the resulting object\n :param command_constructor: class of an object that implements run(args)\n :param args: object arguments for specific command created by CommandParser\n \"\"\"\n verify_terminal_encoding(sys.stdout.encoding)\n self._check_pypi_version()\n config = create_config(allow_insecure_config_file=args.allow_insecure_config_file)\n self.show_error_stack_trace = config.debug_mode\n command = command_constructor(config)\n command.run(args)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef make_user_list(self, emails, usernames):\n to_users = []\n remaining_emails = [] if not emails else list(emails)\n remaining_usernames = [] if not usernames else list(usernames)\n for user in self.remote_store.fetch_users():\n if user.email in remaining_emails:\n to_users.append(user)\n remaining_emails.remove(user.email)\n elif user.username in remaining_usernames:\n to_users.append(user)\n remaining_usernames.remove(user.username)\n if remaining_emails or remaining_usernames:\n unable_to_find_users = ','.join(remaining_emails + remaining_usernames)\n msg = \"Unable to find users for the following email/usernames: {}\".format(unable_to_find_users)\n raise ValueError(msg)\n return to_users", "response": "Given a list of emails and usernames fetch DukeDS user info."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nuploading contents of folders to a project with project_name on remote store.", "response": "def run(self, args):\n \"\"\"\n Upload contents of folders to a project with project_name on remote store.\n If follow_symlinks we will traverse symlinked directories.\n If content is already on remote 
site it will not be sent.\n :param args: Namespace arguments parsed from the command line.\n \"\"\"\n project_name_or_id = self.create_project_name_or_id_from_args(args)\n folders = args.folders # list of local files/folders to upload into the project\n follow_symlinks = args.follow_symlinks # should we follow symlinks when traversing folders\n dry_run = args.dry_run # do not upload anything, instead print out what you would upload\n\n project_upload = ProjectUpload(self.config, project_name_or_id, folders, follow_symlinks=follow_symlinks)\n if dry_run:\n print(project_upload.dry_run_report())\n else:\n print(project_upload.get_differences_summary())\n if project_upload.needs_to_upload():\n project_upload.run()\n print('\\n')\n print(project_upload.get_upload_report())\n print('\\n')\n print(project_upload.get_url_msg())"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef run(self, args):\n project_name_or_id = self.create_project_name_or_id_from_args(args)\n folder = args.folder # path to a folder to download data into\n # Default to project name with spaces replaced with '_' if not specified\n if not folder:\n folder = replace_invalid_path_chars(project_name_or_id.value.replace(' ', '_'))\n destination_path = format_destination_path(folder)\n path_filter = PathFilter(args.include_paths, args.exclude_paths)\n project = self.fetch_project(args, must_exist=True)\n project_download = ProjectDownload(self.remote_store, project, destination_path, path_filter)\n project_download.run()", "response": "Download a project based on passed in args."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ngive the user with user_full_name the auth_role permissions on the remote project with project_name.", "response": "def run(self, args):\n \"\"\"\n Give the user with user_full_name the auth_role permissions on the remote project with project_name.\n :param args Namespace arguments parsed 
from the command line\n \"\"\"\n email = args.email # email of person to give permissions, will be None if username is specified\n username = args.username # username of person to give permissions, will be None if email is specified\n auth_role = args.auth_role # type of permission (project_admin)\n project = self.fetch_project(args, must_exist=True, include_children=False)\n user = self.remote_store.lookup_or_register_user_by_email_or_username(email, username)\n self.remote_store.set_user_project_permission(project, user, auth_role)\n print(u'Gave user {} {} permissions for project {}.'.format(user.full_name, auth_role, project.name))"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef run(self, args):\n email = args.email # email of person to remove permissions from (None if username specified)\n username = args.username # username of person to remove permissions from (None if email is specified)\n project = self.fetch_project(args, must_exist=True, include_children=False)\n user = self.remote_store.lookup_or_register_user_by_email_or_username(email, username)\n self.remote_store.revoke_user_project_permission(project, user)\n print(u'Removed permissions from user {} for project {}.'.format(user.full_name, project.name))", "response": "Remove permissions from the user with user_full_name or email on the remote project with project_name."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef run(self, args):\n email = args.email # email of person to send email to\n username = args.username # username of person to send email to, will be None if email is specified\n force_send = args.resend # is this a resend so we should force sending\n auth_role = args.auth_role # authorization role(project permissions) to give to the user\n msg_file = args.msg_file # message file whose contents will be sent with the share\n message = 
read_argument_file_contents(msg_file)\n print(\"Sharing project.\")\n to_user = self.remote_store.lookup_or_register_user_by_email_or_username(email, username)\n try:\n project = self.fetch_project(args, must_exist=True, include_children=False)\n dest_email = self.service.share(project, to_user, force_send, auth_role, message)\n print(\"Share email message sent to \" + dest_email)\n except D4S2Error as ex:\n if ex.warning:\n print(ex.message)\n else:\n raise", "response": "Share a project in the remote store with another user."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nbegins the process that will transfer the project to another user. Send delivery message to D4S2 service specifying a project and a user. When user accepts delivery they receive access and we lose admin privileges. :param args Namespace arguments parsed from the command line", "response": "def run(self, args):\n \"\"\"\n Begins the process that will transfer the project to another user.\n Send delivery message to D4S2 service specifying a project and a user.\n When user accepts delivery they receive access and we lose admin privileges.\n :param args Namespace arguments parsed from the command line\n \"\"\"\n email = args.email # email of person to deliver to, will be None if username is specified\n username = args.username # username of person to deliver to, will be None if email is specified\n copy_project = args.copy_project # should we deliver a copy of the project\n force_send = args.resend # is this a resend so we should force sending\n msg_file = args.msg_file # message file whose contents will be sent with the delivery\n share_usernames = args.share_usernames # usernames who will have this project shared once it is accepted\n share_emails = args.share_emails # emails of users who will have this project shared once it is accepted\n message = read_argument_file_contents(msg_file)\n project = self.fetch_project(args, must_exist=True, include_children=False)\n 
share_users = self.make_user_list(share_emails, share_usernames)\n print(\"Delivering project.\")\n new_project_name = None\n if copy_project:\n new_project_name = self.get_new_project_name(project.name)\n to_user = self.remote_store.lookup_or_register_user_by_email_or_username(email, username)\n try:\n path_filter = PathFilter(args.include_paths, args.exclude_paths)\n dest_email = self.service.deliver(project, new_project_name, to_user, share_users,\n force_send, path_filter, message)\n print(\"Delivery email message sent to \" + dest_email)\n except D4S2Error as ex:\n if ex.warning:\n print(ex.message)\n else:\n raise"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns a unique project name for the copy.", "response": "def get_new_project_name(self, project_name):\n \"\"\"\n Return a unique project name for the copy.\n :param project_name: str: name of project we will copy\n :return: str\n \"\"\"\n timestamp_str = datetime.datetime.utcnow().strftime('%Y-%m-%d %H:%M')\n return \"{} {}\".format(project_name, timestamp_str)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nlists project names. 
:param args Namespace arguments parsed from the command line", "response": "def run(self, args):\n \"\"\"\n Lists project names.\n :param args Namespace arguments parsed from the command line\n \"\"\"\n long_format = args.long_format\n # project_name and auth_role args are mutually exclusive\n if args.project_name or args.project_id:\n project = self.fetch_project(args, must_exist=True, include_children=True)\n self.print_project_details(project, long_format)\n else:\n self.print_project_list_details(args.auth_role, long_format)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef print_project_list_details(self, filter_auth_role, long_format):\n if filter_auth_role:\n projects_details = self.remote_store.get_projects_with_auth_role(auth_role=filter_auth_role)\n else:\n projects_details = self.remote_store.get_projects_details()\n if projects_details:\n for projects_detail in projects_details:\n print(self.get_project_info_line(projects_detail, long_format))\n else:\n print(NO_PROJECTS_FOUND_MESSAGE)", "response": "Prints the list of projects and their details to stdout."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef run(self, args):\n project = self.fetch_project(args, must_exist=True, include_children=False)\n if not args.force:\n delete_prompt = \"Are you sure you wish to delete {} (y/n)?\".format(project.name)\n if not boolean_input_prompt(delete_prompt):\n return\n self.remote_store.delete_project(self.create_project_name_or_id_from_args(args))", "response": "Delete a single project specified by project_name or id in args."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nprint out non deprecated project - type auth roles.", "response": "def run(self, args):\n \"\"\"\n Prints out non deprecated project-type auth roles.\n :param args Namespace arguments parsed from the command line\n \"\"\"\n auth_roles = 
self.remote_store.get_active_auth_roles(RemoteAuthRole.PROJECT_CONTEXT)\n if auth_roles:\n for auth_role in auth_roles:\n print(auth_role.id, \"-\", auth_role.description)\n else:\n print(\"No authorization roles found.\")"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nretrieves a list of CA certificates at either the DEFAULT_CERTS_PATH or the TXAWS_CERTS_PATH environment override.", "response": "def get_ca_certs(environ=os.environ):\n \"\"\"\n Retrieve a list of CAs at either the DEFAULT_CERTS_PATH or the env\n override, TXAWS_CERTS_PATH.\n\n In order to find .pem files, this function checks first for presence of the\n TXAWS_CERTS_PATH environment variable that should point to a directory\n containing cert files. In the absence of this variable, the module-level\n DEFAULT_CERTS_PATH will be used instead.\n\n Note that both of these variables may have multiple paths in them, just\n like the familiar PATH environment variable (separated by colons).\n \"\"\"\n cert_paths = environ.get(\"TXAWS_CERTS_PATH\", DEFAULT_CERTS_PATH).split(\":\")\n certificate_authority_map = {}\n for path in cert_paths:\n if not path:\n continue\n for cert_file_name in glob(os.path.join(path, \"*.pem\")):\n # There might be some dead symlinks in there, so let's make sure\n # it's real.\n if not os.path.exists(cert_file_name):\n continue\n cert_file = open(cert_file_name)\n data = cert_file.read()\n cert_file.close()\n x509 = load_certificate(FILETYPE_PEM, data)\n digest = x509.digest(\"sha1\")\n # Now, de-duplicate in case the same cert has multiple names.\n certificate_authority_map[digest] = x509\n values = certificate_authority_map.values()\n if len(values) == 0:\n raise exception.CertsNotFoundError(\"Could not find any .pem files.\")\n return values"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _calculate(self, startingPercentage, endPercentage, startDate, endDate):\n # 
get the defined subset of error values\n errorValues = self._get_error_values(startingPercentage, endPercentage, startDate, endDate)\n # drop missing values (filter() with `is None` kept only the Nones and\n # cannot be indexed or len()'d in Python 3)\n errorValues = [errorValue for errorValue in errorValues if errorValue is not None]\n\n if not errorValues:\n return 1.0\n\n share = 1.0 / float(len(errorValues))\n product = 1.0\n\n for errorValue in errorValues:\n # never multiply with zero!\n if 0 == errorValue:\n continue\n\n product *= errorValue**share\n\n return product", "response": "Calculates the geometric mean of the error values in the given interval."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef iso8601time(time_tuple):\n if time_tuple:\n return time.strftime(\"%Y-%m-%dT%H:%M:%SZ\", time_tuple)\n else:\n return time.strftime(\"%Y-%m-%dT%H:%M:%SZ\", time.gmtime())", "response": "Format time_tuple as an ISO8601 time string."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef parse(url, defaultPort=True):\n url = url.strip()\n parsed = urlparse(url)\n scheme = parsed[0]\n path = urlunparse((\"\", \"\") + parsed[2:])\n host = parsed[1]\n\n if \":\" in host:\n host, port = host.split(\":\")\n try:\n port = int(port)\n except ValueError:\n # A non-numeric port was given, it will be replaced with\n # an appropriate default value if defaultPort is True\n port = None\n else:\n port = None\n\n if port is None and defaultPort:\n if scheme == \"https\":\n port = 443\n else:\n port = 80\n\n if path == \"\":\n path = \"/\"\n return (str(scheme), str(host), port, str(path))", "response": "Parses the given URL into scheme, host, port and path."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ngetting the entire text content as str", "response": "def tostring(self, inject):\n \"\"\"Get the entire text content as str\"\"\"\n return inject(self, '\\n'.join(document.tostring(inject) for document in self.documents))"} {"SOURCE": "codesearchnet", 
"instruction": "Make a summary of the following Python 3 code\ndef retry_when_service_down(func):\n def retry_function(*args, **kwds):\n showed_status_msg = False\n status_watcher = args[0]\n while True:\n try:\n result = func(*args, **kwds)\n if showed_status_msg:\n status_watcher.set_status_message('')\n return result\n except DataServiceError as dse:\n if dse.status_code == 503:\n if not showed_status_msg:\n message = SERVICE_DOWN_MESSAGE.format(datetime.datetime.utcnow())\n status_watcher.set_status_message(message)\n showed_status_msg = True\n time.sleep(SERVICE_DOWN_RETRY_SECONDS)\n else:\n raise\n return retry_function", "response": "Decorator that will retry a function while it fails with status 503."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nrunning func until the resource is consistent.", "response": "def retry_until_resource_is_consistent(func, monitor):\n \"\"\"\n Runs func, if func raises DSResourceNotConsistentError will retry func indefinitely.\n Notifies monitor if we have to wait (only happens if DukeDS API raises DSResourceNotConsistentError).\n :param func: func(): function to run\n :param monitor: object: has start_waiting() and done_waiting() methods when waiting for non-consistent resource\n :return: whatever func returns\n \"\"\"\n waiting = False\n while True:\n try:\n resp = func()\n if waiting and monitor:\n monitor.done_waiting()\n return resp\n except DSResourceNotConsistentError:\n if not waiting and monitor:\n monitor.start_waiting()\n waiting = True\n time.sleep(RESOURCE_NOT_CONSISTENT_RETRY_SECONDS)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_auth(self):\n if self.legacy_auth():\n return self._auth\n if not self.auth_expired():\n return self._auth\n self.claim_new_token()\n return self._auth", "response": "Gets an active authentication token refreshing it if necessary."} {"SOURCE": "codesearchnet", "instruction": "Can you create a 
Python 3 function that\nupdates internal state to have a new token using a no authorization data service.", "response": "def claim_new_token(self):\n \"\"\"\n Update internal state to have a new token using a no authorization data service.\n \"\"\"\n # Intentionally doing this manually so we don't have a chicken and egg problem with DataServiceApi.\n headers = {\n 'Content-Type': ContentType.json,\n 'User-Agent': self.user_agent_str,\n }\n data = {\n \"agent_key\": self.config.agent_key,\n \"user_key\": self.config.user_key,\n }\n url_suffix = \"/software_agents/api_token\"\n url = self.config.url + url_suffix\n response = requests.post(url, headers=headers, data=json.dumps(data))\n if response.status_code == 404:\n if not self.config.agent_key:\n raise MissingInitialSetupError()\n else:\n raise SoftwareAgentNotFoundError()\n elif response.status_code == 503:\n raise DataServiceError(response, url_suffix, data)\n elif response.status_code != 201:\n raise AuthTokenCreationError(response)\n resp_json = response.json()\n self._auth = resp_json['api_token']\n self._expires = resp_json['expires_on']"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef auth_expired(self):\n if self._auth and self._expires:\n now_with_skew = time.time() + AUTH_TOKEN_CLOCK_SKEW_MAX\n return now_with_skew > self._expires\n return True", "response": "Returns True if the current token has expired."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _url_parts(self, url_suffix, data, content_type):\n url = self.base_url + url_suffix\n send_data = data\n if content_type == ContentType.json:\n send_data = json.dumps(data)\n headers = {\n 'Content-Type': content_type,\n 'User-Agent': self.user_agent_str,\n }\n if self.auth:\n headers['Authorization'] = self.auth.get_auth()\n return url, send_data, headers", "response": "Format the url data based on config_type."} {"SOURCE": "codesearchnet", 
"instruction": "Can you tell what is the following Python 3 function doing\ndef _post(self, url_suffix, data, content_type=ContentType.json):\n (url, data_str, headers) = self._url_parts(url_suffix, data, content_type=content_type)\n resp = self.http.post(url, data_str, headers=headers)\n return self._check_err(resp, url_suffix, data, allow_pagination=False)", "response": "Send a POST request to API at url_suffix with post_data. Raises error if x-total-pages is contained in the response."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nsending PUT request to API at url_suffix with post_data.", "response": "def _put(self, url_suffix, data, content_type=ContentType.json):\n \"\"\"\n Send PUT request to API at url_suffix with post_data.\n Raises error if x-total-pages is contained in the response.\n :param url_suffix: str URL path we are sending a PUT to\n :param data: object data we are sending\n :param content_type: str from ContentType that determines how we format the data\n :return: requests.Response containing the result\n \"\"\"\n (url, data_str, headers) = self._url_parts(url_suffix, data, content_type=content_type)\n resp = self.http.put(url, data_str, headers=headers)\n return self._check_err(resp, url_suffix, data, allow_pagination=False)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _get_single_item(self, url_suffix, data, content_type=ContentType.json):\n (url, data_str, headers) = self._url_parts(url_suffix, data, content_type=content_type)\n resp = self.http.get(url, headers=headers, params=data_str)\n return self._check_err(resp, url_suffix, data, allow_pagination=False)", "response": "Send a GET request to API at url_suffix with post_data."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _get_single_page(self, url_suffix, data, page_num):\n data_with_per_page = dict(data)\n data_with_per_page['page'] 
= page_num\n data_with_per_page['per_page'] = self._get_page_size()\n (url, data_str, headers) = self._url_parts(url_suffix, data_with_per_page,\n content_type=ContentType.form)\n resp = self.http.get(url, headers=headers, params=data_str)\n return self._check_err(resp, url_suffix, data, allow_pagination=True)", "response": "Send a GET request to API at url_suffix with post_data adding page and per_page parameters to retrieve a single page."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _get_collection(self, url_suffix, data):\n response = self._get_single_page(url_suffix, data, page_num=1)\n total_pages_str = response.headers.get('x-total-pages')\n if total_pages_str:\n total_pages = int(total_pages_str)\n if total_pages > 1:\n multi_response = MultiJSONResponse(base_response=response, merge_array_field_name=\"results\")\n for page in range(2, total_pages + 1):\n additional_response = self._get_single_page(url_suffix, data, page_num=page)\n multi_response.add_response(additional_response)\n return multi_response\n return response", "response": "Performs a GET for all pages based on x-total-pages in the first response headers."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _check_err(resp, url_suffix, data, allow_pagination):\n total_pages = resp.headers.get('x-total-pages')\n if not allow_pagination and total_pages:\n raise UnexpectedPagingReceivedError()\n if 200 <= resp.status_code < 300:\n return resp\n if resp.status_code == 404:\n if resp.json().get(\"code\") == \"resource_not_consistent\":\n raise DSResourceNotConsistentError(resp, url_suffix, data)\n raise DataServiceError(resp, url_suffix, data)", "response": "Raise DataServiceError if the response was not successful."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncreate a new project with the specified name and description.", "response": "def 
create_project(self, project_name, desc):\n \"\"\"\n Send POST to /projects creating a new project with the specified name and desc.\n Raises DataServiceError on error.\n :param project_name: str name of the project\n :param desc: str description of the project\n :return: requests.Response containing the successful result\n \"\"\"\n data = {\n \"name\": project_name,\n \"description\": desc\n }\n return self._post(\"/projects\", data)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef create_folder(self, folder_name, parent_kind_str, parent_uuid):\n data = {\n 'name': folder_name,\n 'parent': {\n 'kind': parent_kind_str,\n 'id': parent_uuid\n }\n }\n return self._post(\"/folders\", data)", "response": "Create a new folder with specified name and parent."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_project_children(self, project_id, name_contains, exclude_response_fields=None):\n return self._get_children('projects', project_id, name_contains, exclude_response_fields)", "response": "Get the children of a project"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _get_children(self, parent_name, parent_id, name_contains, exclude_response_fields=None):\n data = {}\n if name_contains is not None:\n data['name_contains'] = name_contains\n if exclude_response_fields:\n data['exclude_response_fields'] = ' '.join(exclude_response_fields)\n url_prefix = \"/{}/{}/children\".format(parent_name, parent_id)\n return self._get_collection(url_prefix, data)", "response": "Send GET request to the /{parent_name}/{parent_id}/children endpoint (projects or folders) to fetch info about children"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef create_upload(self, project_id, filename, content_type, size,\n hash_value, hash_alg, storage_provider_id=None, chunked=True):\n \"\"\"\n Post to 
/projects/{project_id}/uploads to create a uuid for uploading chunks.\n NOTE: The optional hash_value and hash_alg parameters are being removed from the DukeDS API.\n :param project_id: str uuid of the project we are uploading data for.\n :param filename: str name of the file we want to upload\n :param content_type: str mime type of the file\n :param size: int size of the file in bytes\n :param hash_value: str hash value of the entire file\n :param hash_alg: str algorithm used to create hash_value\n :param storage_provider_id: str optional storage provider id\n :param chunked: is the uploaded file made up of multiple chunks. When False a single upload url is returned.\n For more see https://github.com/Duke-Translational-Bioinformatics/duke-data-service/blob/develop/api_design/DDS-1182-nonchucked_upload_api_design.md\n :return: requests.Response containing the successful result\n \"\"\"\n data = {\n \"name\": filename,\n \"content_type\": content_type,\n \"size\": size,\n \"hash\": {\n \"value\": hash_value,\n \"algorithm\": hash_alg\n },\n \"chunked\": chunked,\n }\n if storage_provider_id:\n data['storage_provider'] = {'id': storage_provider_id}\n return self._post(\"/projects/\" + project_id + \"/uploads\", data)", "response": "Create a new upload for a specific file in the project."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef create_upload_url(self, upload_id, number, size, hash_value, hash_alg):\n if number < 1:\n raise ValueError(\"Chunk number must be > 0\")\n data = {\n \"number\": number,\n \"size\": size,\n \"hash\": {\n \"value\": hash_value,\n \"algorithm\": hash_alg\n }\n }\n return self._put(\"/uploads/\" + upload_id + \"/chunks\", data)", "response": "Given an upload created by create_upload retrieve a url where we can upload a chunk."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nmarking the upload we created in create_upload complete.", "response": 
"def complete_upload(self, upload_id, hash_value, hash_alg):\n \"\"\"\n Mark the upload we created in create_upload complete.\n :param upload_id: str uuid of the upload to complete.\n :param hash_value: str hash value of chunk\n :param hash_alg: str algorithm used to create hash\n :return: requests.Response containing the successful result\n \"\"\"\n data = {\n \"hash[value]\": hash_value,\n \"hash[algorithm]\": hash_alg\n }\n return self._put(\"/uploads/\" + upload_id + \"/complete\", data, content_type=ContentType.form)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ncreating a new file after completing an upload.", "response": "def create_file(self, parent_kind, parent_id, upload_id):\n \"\"\"\n Create a new file after completing an upload.\n :param parent_kind: str kind of parent(dds-folder,dds-project)\n :param parent_id: str uuid of parent\n :param upload_id: str uuid of complete upload\n :return: requests.Response containing the successful result\n \"\"\"\n data = {\n \"parent\": {\n \"kind\": parent_kind,\n \"id\": parent_id\n },\n \"upload\": {\n \"id\": upload_id\n }\n }\n return self._post(\"/files/\", data)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef update_file(self, file_id, upload_id):\n put_data = {\n \"upload[id]\": upload_id,\n }\n return self._put(\"/files/\" + file_id, put_data, content_type=ContentType.form)", "response": "Send PUT request to the files endpoint to update the contents of the file to upload_id and sets a label."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef send_external(self, http_verb, host, url, http_headers, chunk):\n if http_verb == 'PUT':\n return self.http.put(host + url, data=chunk, headers=http_headers)\n elif http_verb == 'POST':\n return self.http.post(host + url, data=chunk, headers=http_headers)\n else:\n raise ValueError(\"Unsupported http_verb:\" + http_verb)", 
"response": "Send a chunk to an external object store."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nretrieves a streaming request for a file.", "response": "def receive_external(self, http_verb, host, url, http_headers):\n \"\"\"\n Retrieve a streaming request for a file.\n :param http_verb: str GET is only supported right now\n :param host: str host we are requesting the file from\n :param url: str url to ask the host for\n :param http_headers: object headers to send with the request\n :return: requests.Response containing the successful result\n \"\"\"\n if http_verb == 'GET':\n return self.http.get(host + url, headers=http_headers, stream=True)\n else:\n raise ValueError(\"Unsupported http_verb:\" + http_verb)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nsending GET request to / users with optional full_name email and username filtering.", "response": "def get_users(self, full_name=None, email=None, username=None):\n \"\"\"\n Send GET request to /users for users with optional full_name, email, and/or username filtering.\n :param full_name: str name of the user we are searching for\n :param email: str: optional email to filter by\n :param username: str: optional username to filter by\n :return: requests.Response containing the successful result\n \"\"\"\n data = {}\n if full_name:\n data['full_name_contains'] = full_name\n if email:\n data['email'] = email\n if username:\n data['username'] = username\n return self._get_collection('/users', data)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef set_user_project_permission(self, project_id, user_id, auth_role):\n put_data = {\n \"auth_role[id]\": auth_role\n }\n return self._put(\"/projects/\" + project_id + \"/permissions/\" + user_id, put_data,\n content_type=ContentType.form)", "response": "Set the permission for a user in a project."} {"SOURCE": "codesearchnet", "instruction": "Make a 
summary of the following Python 3 code\ndef get_api_token(self, agent_key, user_key):\n data = {\n \"agent_key\": agent_key,\n \"user_key\": user_key,\n }\n return self._post(\"/software_agents/api_token\", data)", "response": "Send POST request to get an auth token."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nsends POST request to initiate transfer of a project to the specified user ids", "response": "def create_project_transfer(self, project_id, to_user_ids):\n \"\"\"\n Send POST request to initiate transfer of a project to the specified user ids\n :param project_id: str uuid of the project\n :param to_user_ids: list of user uuids to receive the project\n :return: requests.Response containing the successful result\n \"\"\"\n data = {\n \"to_users[][id]\": to_user_ids,\n }\n return self._post(\"/projects/\" + project_id + \"/transfers\", data,\n content_type=ContentType.form)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nsend PUT request to one of the project transfer endpoints", "response": "def _process_project_transfer(self, action, transfer_id, status_comment):\n \"\"\"\n Send PUT request to one of the project transfer action endpoints\n :param action: str name of the action (reject/accept/cancel)\n :param transfer_id: str uuid of the project_transfer\n :param status_comment: str comment about the action, optional\n :return: requests.Response containing the successful result\n \"\"\"\n data = {}\n if status_comment:\n data[\"status_comment\"] = status_comment\n path = \"/project_transfers/{}/{}\".format(transfer_id, action)\n return self._put(path, data, content_type=ContentType.form)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ncreates a new activity with the specified name and description.", "response": "def create_activity(self, activity_name, desc=None, started_on=None, ended_on=None):\n \"\"\"\n Send POST to /activities creating a new
activity with the specified name and desc.\n Raises DataServiceError on error.\n :param activity_name: str name of the activity\n :param desc: str description of the activity (optional)\n :param started_on: str datetime when the activity started (optional)\n :param ended_on: str datetime when the activity ended (optional)\n :return: requests.Response containing the successful result\n \"\"\"\n data = {\n \"name\": activity_name,\n \"description\": desc,\n \"started_on\": started_on,\n \"ended_on\": ended_on\n }\n return self._post(\"/activities\", data)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef update_activity(self, activity_id, activity_name=None, desc=None,\n started_on=None, ended_on=None):\n \"\"\"\n Send PUT request to /activities/{activity_id} to update the activity metadata.\n Raises ValueError unless at least one field is updated.\n :param activity_id: str uuid of activity\n :param activity_name: str new name of the activity (optional)\n :param desc: str description of the activity (optional)\n :param started_on: str date the updated activity began on (optional)\n :param ended_on: str date the updated activity ended on (optional)\n :return: requests.Response containing the successful result\n \"\"\"\n put_data = {\n \"name\": activity_name,\n \"description\": desc,\n \"started_on\": started_on,\n \"ended_on\": ended_on\n }\n return self._put(\"/activities/\" + activity_id, put_data)", "response": "Update the metadata of an activity."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ncreate a was used by relationship between an activity and an entity.", "response": "def create_used_relation(self, activity_id, entity_kind, entity_id):\n \"\"\"\n Create a was used by relationship between an activity and an entity (file).\n :param activity_id: str: uuid of the activity\n :param entity_kind: str: kind of entity ('dds-file')\n :param entity_id: str: uuid of the entity\n :return:
requests.Response containing the successful result\n \"\"\"\n return self._create_activity_relation(activity_id, entity_kind, entity_id, ActivityRelationTypes.USED)"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ncreates a was generated by relationship between an activity and an entity.", "response": "def create_was_generated_by_relation(self, activity_id, entity_kind, entity_id):\n \"\"\"\n Create a was generated by relationship between an activity and an entity (file).\n :param activity_id: str: uuid of the activity\n :param entity_kind: str: kind of entity ('dds-file')\n :param entity_id: str: uuid of the entity\n :return: requests.Response containing the successful result\n \"\"\"\n return self._create_activity_relation(activity_id, entity_kind, entity_id, ActivityRelationTypes.WAS_GENERATED_BY)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef create_was_invalidated_by_relation(self, activity_id, entity_kind, entity_id):\n return self._create_activity_relation(activity_id, entity_kind, entity_id, ActivityRelationTypes.WAS_INVALIDATED_BY)", "response": "Create a was invalidated by relationship between an activity and an entity."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ncreating a was derived from relation.", "response": "def create_was_derived_from_relation(self, used_entity_id, used_entity_kind,\n generated_entity_id, generated_entity_kind):\n \"\"\"\n Create a was derived from relation.\n :param used_entity_id: str: uuid of the used entity (file_version_id)\n :param used_entity_kind: str: kind of entity ('dds-file')\n :param generated_entity_id: str: uuid of the generated entity (file_version_id)\n :param generated_entity_kind: str: kind of entity ('dds-file')\n :return: requests.Response containing the successful result\n \"\"\"\n data = {\n \"used_entity\": {\n \"id\": used_entity_id,\n \"kind\": used_entity_kind\n },\n \"generated_entity\":
{\n \"id\": generated_entity_id,\n \"kind\": generated_entity_kind\n }\n }\n return self._post(\"/relations/was_derived_from\", data)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_auth_provider_affiliates(self, auth_provider_id, full_name_contains=None, email=None, username=None):\n data = {}\n if full_name_contains:\n data['full_name_contains'] = full_name_contains\n if email:\n data['email'] = email\n if username:\n data['username'] = username\n return self._get_collection(\"/auth_providers/{}/affiliates/\".format(auth_provider_id), data)", "response": "Get the affiliates for a specific auth provider."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef auth_provider_add_user(self, auth_provider_id, username):\n url = \"/auth_providers/{}/affiliates/{}/dds_user/\".format(auth_provider_id, username)\n return self._post(url, {})", "response": "Add a user to the set of user identities for the given auth provider."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef add_response(self, response):\n key = self.merge_array_field_name\n response_json = response.json()\n value = self.combined_json[key]\n self.combined_json[self.merge_array_field_name] = value + response_json[key]", "response": "Add data from json to data returned by json()"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_method(self, call, *args, **kwargs):\n method_class = self.registry.get(call.action, call.version)\n method = method_class(*args, **kwargs)\n if not method.is_available():\n raise APIError(400, \"InvalidAction\", \"The action %s is not \"\n \"valid for this web service.\" % call.action)\n else:\n return method", "response": "Returns the method instance to invoke for the given L { Call."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following 
Python 3 function does\ndef handle(self, request):\n request.id = str(uuid4())\n deferred = maybeDeferred(self._validate, request)\n deferred.addCallback(self.execute)\n\n def write_response(response):\n request.setHeader(\"Content-Length\", str(len(response)))\n request.setHeader(\"Content-Type\", self.content_type)\n # Prevent browsers from trying to guess a different content type.\n request.setHeader(\"X-Content-Type-Options\", \"nosniff\")\n request.write(response)\n request.finish()\n return response\n\n def write_error(failure):\n if failure.check(APIError):\n status = failure.value.status\n\n # Don't log the stack traces for 4xx responses.\n if status < 400 or status >= 500:\n log.err(failure)\n else:\n log.msg(\"status: %s message: %s\" % (\n status, safe_str(failure.value)))\n\n body = failure.value.response\n if body is None:\n body = self.dump_error(failure.value, request)\n else:\n # If the error is a generic one (not an APIError), log the\n # message, but don't send it back to the client, as it could\n # contain sensitive information.
Send a generic server error\n # message instead.\n log.err(failure)\n body = \"Server error\"\n status = 500\n request.setResponseCode(status)\n write_response(body)\n\n deferred.addCallback(write_response)\n deferred.addErrback(write_error)\n return deferred", "response": "Handle an HTTP request for executing an API call."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nexecuting an API L{Call}.", "response": "def execute(self, call):\n \"\"\"Execute an API L{Call}.\n\n At this point the request has been authenticated and C{call.principal}\n is set with the L{Principal} for the L{User} requesting the call.\n\n @return: The response to write in the request for the given L{Call}.\n @raises: An L{APIError} in case the execution fails, sporting an error\n message and the HTTP status code to return.\n \"\"\"\n method = self.get_method(call)\n deferred = maybeDeferred(self.authorize, method, call)\n deferred.addCallback(lambda _: method.invoke(call))\n return deferred.addCallback(self.dump_result)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef get_call_arguments(self, request):\n params = dict((k, v[-1]) for k, v in request.args.iteritems())\n args, rest = self.schema.extract(params)\n # Get rid of Signature so it doesn't mess with signature verification\n params.pop(\"Signature\")\n result = {\n \"transport_args\": {\n \"action\": args.Action,\n \"access_key_id\": args.AWSAccessKeyId,\n \"timestamp\": args.Timestamp,\n \"expires\": args.Expires,\n \"version\": args.Version,\n \"signature_method\": args.SignatureMethod,\n \"signature\": args.Signature,\n \"signature_version\": args.SignatureVersion},\n \"handler_args\": rest,\n \"raw_args\": params\n }\n return result", "response": "Get the arguments for the specified request."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _validate(self, request):\n call_arguments =
self.get_call_arguments(request)\n args = call_arguments[\"transport_args\"]\n rest = call_arguments[\"handler_args\"]\n params = call_arguments[\"raw_args\"]\n\n self._validate_generic_parameters(args)\n\n def create_call(principal):\n self._validate_principal(principal, args)\n self._validate_signature(request, principal, args, params)\n return Call(raw_params=rest,\n principal=principal,\n action=args[\"action\"],\n version=args[\"version\"],\n id=request.id)\n\n deferred = maybeDeferred(self.get_principal, args[\"access_key_id\"])\n deferred.addCallback(create_call)\n return deferred", "response": "Validate an L{HTTPRequest} before executing it."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nvalidate the generic request parameters.", "response": "def _validate_generic_parameters(self, args):\n \"\"\"Validate the generic request parameters.\n\n @param args: Parsed schema arguments.\n @raises APIError: In the following cases:\n - Action is not included in C{self.actions}\n - SignatureVersion is not included in C{self.signature_versions}\n - Expires and Timestamp are present\n - Expires is before the current time\n - Timestamp is older than 15 minutes.\n \"\"\"\n utc_now = self.get_utc_time()\n\n if getattr(self, \"actions\", None) is not None:\n # Check the deprecated 'actions' attribute\n if not args[\"action\"] in self.actions:\n raise APIError(400, \"InvalidAction\", \"The action %s is not \"\n \"valid for this web service.\" % args[\"action\"])\n else:\n self.registry.check(args[\"action\"], args[\"version\"])\n\n if not args[\"signature_version\"] in self.signature_versions:\n raise APIError(403, \"InvalidSignature\", \"SignatureVersion '%s' \"\n \"not supported\" % args[\"signature_version\"])\n\n if args[\"expires\"] and args[\"timestamp\"]:\n raise APIError(400, \"InvalidParameterCombination\",\n \"The parameter Timestamp cannot be used with \"\n \"the parameter Expires\")\n if args[\"expires\"] and args[\"expires\"] <
utc_now:\n raise APIError(400,\n \"RequestExpired\",\n \"Request has expired. Expires date is %s\" % (\n args[\"expires\"].strftime(self.time_format)))\n if (args[\"timestamp\"]\n and args[\"timestamp\"] + timedelta(minutes=15) < utc_now):\n raise APIError(400,\n \"RequestExpired\",\n \"Request has expired. Timestamp date is %s\" % (\n args[\"timestamp\"].strftime(self.time_format)))"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _validate_signature(self, request, principal, args, params):\n creds = AWSCredentials(principal.access_key, principal.secret_key)\n endpoint = AWSServiceEndpoint()\n endpoint.set_method(request.method)\n endpoint.set_canonical_host(request.getHeader(\"Host\"))\n path = request.path\n if self.path is not None:\n path = \"%s/%s\" % (self.path.rstrip(\"/\"), path.lstrip(\"/\"))\n endpoint.set_path(path)\n signature = Signature(creds, endpoint, params,\n signature_method=args[\"signature_method\"],\n signature_version=args[\"signature_version\"]\n )\n if signature.compute() != args[\"signature\"]:\n raise APIError(403, \"SignatureDoesNotMatch\",\n \"The request signature we calculated does not \"\n \"match the signature you provided. 
Check your \"\n \"key and signing method.\")", "response": "Validate the signature of the request."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef render_GET(self, request):\n if not request.args:\n request.setHeader(\"Content-Type\", \"text/plain\")\n return self.get_status_text()\n else:\n self.handle(request)\n return NOT_DONE_YET", "response": "Handle a GET request."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef getSignatureKey(key, dateStamp, regionName, serviceName):\n kDate = sign((b'AWS4' + key), dateStamp)\n kRegion = sign(kDate, regionName)\n kService = sign(kRegion, serviceName)\n kSigning = sign(kService, b'aws4_request')\n return kSigning", "response": "Generate the signing key for AWS V4 requests."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _make_canonical_uri(parsed):\n path = urllib.quote(parsed.path)\n canonical_parsed = parsed._replace(path=path,\n params='', query='', fragment='')\n return urlparse.urlunparse(canonical_parsed)", "response": "Returns the canonical URI for a parsed URL."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _make_canonical_query_string(parsed):\n query_params = urlparse.parse_qs(parsed.query, keep_blank_values=True)\n sorted_query_params = sorted((k, v)\n for k, vs in query_params.items()\n for v in vs)\n return urllib.urlencode(sorted_query_params)", "response": "Returns the canonical query string for a parsed URL."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _make_canonical_headers(headers, headers_to_sign):\n pairs = []\n for name in headers_to_sign:\n if name not in headers:\n continue\n values = headers[name]\n if not isinstance(values, (list, tuple)):\n values = [values]\n comma_values = 
b','.join(' '.join(line.strip().split())\n for value in values\n for line in value.splitlines())\n pairs.append((name.lower(), comma_values))\n\n sorted_pairs = sorted(b'%s:%s' % (name, value)\n for name, value in sorted(pairs))\n return b'\\n'.join(sorted_pairs) + b'\\n'", "response": "Return the canonicalized headers."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _make_signed_headers(headers, headers_to_sign):\n return b\";\".join(header.lower() for header in sorted(headers_to_sign)\n if header in headers)", "response": "Return a semicolon - delimited list of headers to sign."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _make_authorization_header(region,\n service,\n canonical_request,\n credentials,\n instant):\n \"\"\"\n Construct an AWS version 4 authorization value for use in an\n C{Authorization} header.\n\n @param region: The AWS region name (e.g., C{'us-east-1'}).\n @type region: L{str}\n\n @param service: The AWS service's name (e.g., C{'s3'}).\n @type service: L{str}\n\n @param canonical_request: The canonical form of the request.\n @type canonical_request: L{_CanonicalRequest} (use\n L{_CanonicalRequest.from_payload_and_headers})\n\n @param credentials: The AWS credentials.\n @type credentials: L{txaws.credentials.AWSCredentials}\n\n @param instant: The current UTC date and time\n @type instant: A naive local L{datetime.datetime} (as returned by\n L{datetime.datetime.utcnow})\n\n @return: A value suitable for use in an C{Authorization} header\n @rtype: L{bytes}\n \"\"\"\n date_stamp = makeDateStamp(instant)\n amz_date = makeAMZDate(instant)\n\n scope = _CredentialScope(\n date_stamp=date_stamp,\n region=region,\n service=service\n )\n\n signable = _SignableAWS4HMAC256Token(\n amz_date,\n scope,\n canonical_request,\n )\n\n signature = signable.signature(\n getSignatureKey(credentials.secret_key,\n date_stamp,\n 
region,\n service)\n )\n\n v4credential = _Credential(\n access_key=credentials.access_key,\n credential_scope=scope,\n )\n\n return (\n b\"%s \" % (_SignableAWS4HMAC256Token.ALGORITHM,) +\n b\", \".join([\n b\"Credential=%s\" % (v4credential.serialize(),),\n b\"SignedHeaders=%s\" % (canonical_request.signed_headers,),\n b\"Signature=%s\" % (signature,),\n ]))", "response": "Construct an AWS version 4 authorization header."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ninterpolates all missing values using linear interpolation.", "response": "def linear_interpolation(first, last, steps):\n \"\"\"Interpolates all missing values using linear interpolation.\n\n :param numeric first: Start value for the interpolation.\n :param numeric last: End Value for the interpolation\n :param integer steps: Number of missing values that have to be calculated.\n\n :return: Returns a list of floats containing only the missing values.\n :rtype: list\n\n :todo: Define a more general interface!\n \"\"\"\n result = []\n\n for step in xrange(0, steps):\n fpart = (steps - step) * first\n lpart = (step + 1) * last\n value = (fpart + lpart) / float(steps + 1)\n result.append(value)\n\n return result"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nfunctions run by CreateProjectCommand to create the project.", "response": "def upload_project_run(upload_context):\n \"\"\"\n Function run by CreateProjectCommand to create the project.\n Runs in a background process.\n :param upload_context: UploadContext: contains data service setup and project name to create.\n \"\"\"\n data_service = upload_context.make_data_service()\n project_name = upload_context.project_name_or_id.get_name_or_raise()\n result = data_service.create_project(project_name, project_name)\n return result.json()['id']"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef 
upload_folder_run(upload_context):\n data_service = upload_context.make_data_service()\n folder_name, parent_kind, parent_remote_id = upload_context.params\n result = data_service.create_folder(folder_name, parent_kind, parent_remote_id)\n return result.json()['id']", "response": "Function run by CreateFolderCommand to create the folder."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef create_small_file(upload_context):\n data_service = upload_context.make_data_service()\n parent_data, path_data, remote_file_id = upload_context.params\n\n # The small file will fit into one chunk so read into memory and hash it.\n chunk = path_data.read_whole_file()\n hash_data = path_data.get_hash()\n\n # Talk to data service uploading chunk and creating the file.\n upload_operations = FileUploadOperations(data_service, upload_context)\n upload_id, url_info = upload_operations.create_upload_and_chunk_url(\n upload_context.project_id, path_data, hash_data, storage_provider_id=upload_context.config.storage_provider_id)\n upload_operations.send_file_external(url_info, chunk)\n return upload_operations.finish_upload(upload_id, hash_data, parent_data, remote_file_id)", "response": "Function run by CreateSmallFileCommand to create a small file."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef rebuild_data_service(config, data_service_auth_data):\n auth = DataServiceAuth(config)\n auth.set_auth_data(data_service_auth_data)\n return DataServiceApi(auth, config.url)", "response": "Deserialize value into DataServiceApi object."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef run(self, local_project):\n # Walk project adding small items to runner saving large items to large_items\n ProjectWalker.walk_project(local_project, self)\n # Run small items in parallel\n self.runner.run()\n # Run parts of each 
large item in parallel\n self.upload_large_items()", "response": "Uploads a project by uploading folders and small files then uploading the large files."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nadd the file to the large or small task list.", "response": "def visit_file(self, item, parent):\n \"\"\"\n If the file is large, add it to the large items to be processed after the small task list.\n Otherwise add it to the small task list.\n \"\"\"\n if self.is_large_file(item):\n self.large_items.append((item, parent))\n else:\n self.small_item_task_builder.visit_file(item, parent)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef upload_large_items(self):\n for local_file, parent in self.large_items:\n if local_file.need_to_send:\n self.process_large_file(local_file, parent)", "response": "Upload files that were too large."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nuploading a single file using multiple processes to upload multiple chunks at the same time.", "response": "def process_large_file(self, local_file, parent):\n \"\"\"\n Upload a single file using multiple processes to upload multiple chunks at the same time.\n Updates local_file with its remote_id when done.\n :param local_file: LocalFile: file we are uploading\n :param parent: LocalFolder/LocalProject: parent of the file\n \"\"\"\n file_content_sender = FileUploader(self.settings.config, self.settings.data_service, local_file,\n self.settings.watcher, self.settings.file_upload_post_processor)\n remote_id = file_content_sender.upload(self.settings.project_id, parent.kind, parent.remote_id)\n local_file.set_remote_id_after_send(remote_id)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nadds create project command to task runner if project doesn't already exist.", "response": "def visit_project(self, item):\n
\"\"\"\n Adds create project command to task runner if project doesn't already exist.\n \"\"\"\n if not item.remote_id:\n command = CreateProjectCommand(self.settings, item)\n self.task_runner_add(None, item, command)\n else:\n self.settings.project_id = item.remote_id"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef visit_folder(self, item, parent):\n if not item.remote_id:\n command = CreateFolderCommand(self.settings, item, parent)\n self.task_runner_add(parent, item, command)", "response": "Adds create folder command to task runner if folder doesn t already exist."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef visit_file(self, item, parent):\n if item.need_to_send:\n if item.size > self.settings.config.upload_bytes_per_chunk:\n msg = \"Programmer Error: Trying to upload large file as small item size:{} name:{}\"\n raise ValueError(msg.format(item.size, item.name))\n else:\n command = CreateSmallFileCommand(self.settings, item, parent,\n self.settings.file_upload_post_processor)\n self.task_runner_add(parent, item, command)", "response": "Visit a file and create a command to upload it."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef task_runner_add(self, parent, item, command):\n parent_task_id = self.item_to_id.get(parent)\n task_id = self.task_runner.add(parent_task_id, command)\n self.item_to_id[item] = task_id", "response": "Add command to task runner with parent s task id creating a task id for item."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef after_run(self, result_id):\n self.local_project.set_remote_id_after_send(result_id)\n self.settings.project_id = result_id", "response": "Save uuid associated with project we just created."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in 
Python 3, explain what it does\ndef create_context(self, message_queue, task_id):\n params = (self.remote_folder.name, self.parent.kind, self.parent.remote_id)\n return UploadContext(self.settings, params, message_queue, task_id)", "response": "Create values to be used by upload_folder_run function."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef create_context(self, message_queue, task_id):\n parent_data = ParentData(self.parent.kind, self.parent.remote_id)\n path_data = self.local_file.get_path_data()\n params = parent_data, path_data, self.local_file.remote_id\n return UploadContext(self.settings, params, message_queue, task_id)", "response": "Create values to be used by create_small_file function."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef after_run(self, remote_file_data):\n if self.file_upload_post_processor:\n self.file_upload_post_processor.run(self.settings.data_service, remote_file_data)\n remote_file_id = remote_file_data['id']\n self.settings.watcher.transferring_item(self.local_file)\n self.local_file.set_remote_id_after_send(remote_file_id)", "response": "Called after the file is uploaded, saving the remote file id on the local file."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncall when a message is received from create_small_file method.", "response": "def on_message(self, started_waiting):\n \"\"\"\n Receives started_waiting boolean from create_small_file method and notifies project_status_monitor in settings.\n :param started_waiting: boolean: True when we start waiting, False when done\n \"\"\"\n watcher = self.settings.watcher\n if started_waiting:\n watcher.start_waiting()\n else:\n watcher.done_waiting()"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _visit_recur(self, item):\n if item.kind == KindType.file_str:\n if item.need_to_send:\n
self.add_upload_item(item.path)\n else:\n if item.kind == KindType.project_str:\n pass\n else:\n if not item.remote_id:\n self.add_upload_item(item.path)\n for child in item.children:\n self._visit_recur(child)", "response": "Recursively visits the item and adds it to upload_items if needed."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nadds project_name argument to an ArgumentParser parser.", "response": "def add_project_name_arg(arg_parser, required, help_text):\n \"\"\"\n Adds project_name parameter to a parser.\n :param arg_parser: ArgumentParser parser to add this argument to.\n :param required: bool is this argument required\n :param help_text: str label displayed in usage\n \"\"\"\n arg_parser.add_argument(\"-p\", '--project-name',\n metavar='ProjectName',\n type=to_unicode,\n dest='project_name',\n help=help_text,\n required=required)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef add_project_name_or_id_arg(arg_parser, required=True, help_text_suffix=\"manage\"):\n project_name_or_id = arg_parser.add_mutually_exclusive_group(required=required)\n name_help_text = \"Name of the project to {}.\".format(help_text_suffix)\n add_project_name_arg(project_name_or_id, required=False, help_text=name_help_text)\n id_help_text = \"ID of the project to {}.\".format(help_text_suffix)\n add_project_id_arg(project_name_or_id, required=False, help_text=id_help_text)", "response": "Adds project name or project id argument.
These two are mutually exclusive."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nraising error if path doesn't exist.", "response": "def _paths_must_exists(path):\n \"\"\"\n Raises error if path doesn't exist.\n :param path: str path to check\n :return: str same path passed in\n \"\"\"\n path = to_unicode(path)\n if not os.path.exists(path):\n raise argparse.ArgumentTypeError(\"{} is not a valid file/folder.\".format(path))\n return path"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nvalidating path for invalid characters.", "response": "def _path_has_ok_chars(path):\n \"\"\"\n Validate path for invalid characters.\n :param path: str possible filesystem path\n :return: path if it was ok otherwise raises error\n \"\"\"\n basename = os.path.basename(path)\n if any([bad_char in basename for bad_char in INVALID_PATH_CHARS]):\n raise argparse.ArgumentTypeError(\"{} contains invalid characters for a directory.\".format(path))\n return path"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nadding optional auth_role argument to an ArgumentParser parser.", "response": "def _add_auth_role_arg(arg_parser, default_permissions):\n \"\"\"\n Adds optional auth_role parameter to a parser.\n :param arg_parser: ArgumentParser parser to add this argument to.\n :param default_permissions: default value to use for this argument\n \"\"\"\n help_text = \"Specifies which project permissions to give to the user. Example: 'project_admin'.
\"\n help_text += \"See command list_auth_roles for AuthRole values.\"\n arg_parser.add_argument(\"--auth-role\",\n metavar='AuthRole',\n type=to_unicode,\n dest='auth_role',\n help=help_text,\n default=default_permissions)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nadd optional auth_role argument to an ArgumentParser parser.", "response": "def _add_project_filter_auth_role_arg(arg_parser):\n \"\"\"\n Adds optional auth_role filtering parameter to a parser.\n :param arg_parser: ArgumentParser parser to add this argument to.\n \"\"\"\n help_text = \"Filters project listing to just those projects with the specified role. \"\n help_text += \"See command list_auth_roles for AuthRole values.\"\n arg_parser.add_argument(\"--auth-role\",\n metavar='AuthRole',\n type=to_unicode,\n dest='auth_role',\n help=help_text,\n default=None)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nadding resend parameter to an ArgumentParser parser.", "response": "def _add_resend_arg(arg_parser, resend_help):\n \"\"\"\n Adds resend parameter to a parser.\n :param arg_parser: ArgumentParser parser to add this argument to.\n :param type_str\n \"\"\"\n arg_parser.add_argument(\"--resend\",\n action='store_true',\n default=False,\n dest='resend',\n help=resend_help)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nadds optional repeatable include parameter to an ArgumentParser parser.", "response": "def _add_include_arg(arg_parser):\n \"\"\"\n Adds optional repeatable include parameter to a parser.\n :param arg_parser: ArgumentParser parser to add this argument to.\n \"\"\"\n arg_parser.add_argument(\"--include\",\n metavar='Path',\n action='append',\n type=to_unicode,\n dest='include_paths',\n help=\"Specifies a single path to include. 
This argument can be repeated.\",\n default=[])"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nadding a message file argument to the argument parser.", "response": "def _add_message_file(arg_parser, help_text):\n \"\"\"\n Add message file argument with help_text to arg_parser.\n :param arg_parser: ArgumentParser parser to add this argument to.\n :param help_text: str: help text for this argument\n \"\"\"\n arg_parser.add_argument('--msg-file',\n type=argparse.FileType('r'),\n help=help_text)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nregister the command line for the upload command", "response": "def register_upload_command(self, upload_func):\n \"\"\"\n Add the upload command to the parser and call upload_func(project_name, folders, follow_symlinks) when chosen.\n :param upload_func: func Called when this option is chosen: upload_func(project_name, folders, follow_symlinks).\n \"\"\"\n description = \"Uploads local files and folders to a remote host.\"\n upload_parser = self.subparsers.add_parser('upload', description=description)\n _add_dry_run(upload_parser, help_text=\"Instead of uploading displays a list of folders/files that \"\n \"need to be uploaded.\")\n add_project_name_or_id_arg(upload_parser, help_text_suffix=\"upload files/folders to.\")\n _add_folders_positional_arg(upload_parser)\n _add_follow_symlinks_arg(upload_parser)\n upload_parser.set_defaults(func=upload_func)"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nregisters the add - user command to the parser and call add_user_func when chosen.", "response": "def register_add_user_command(self, add_user_func):\n \"\"\"\n Add the add-user command to the parser and call add_user_func(project_name, user_full_name, auth_role)\n when chosen.\n :param add_user_func: func Called when this option is chosen: add_user_func(project_name, user_full_name, auth_role).\n \"\"\"\n description = \"Gives user 
permission to access a remote project.\"\n add_user_parser = self.subparsers.add_parser('add-user', description=description)\n add_project_name_or_id_arg(add_user_parser, help_text_suffix=\"add a user to\")\n user_or_email = add_user_parser.add_mutually_exclusive_group(required=True)\n add_user_arg(user_or_email)\n add_email_arg(user_or_email)\n _add_auth_role_arg(add_user_parser, default_permissions='project_admin')\n add_user_parser.set_defaults(func=add_user_func)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nregister a remove - user command to the parser and call remove_user_func when chosen.", "response": "def register_remove_user_command(self, remove_user_func):\n \"\"\"\n Add the remove-user command to the parser and call remove_user_func(project_name, user_full_name) when chosen.\n :param remove_user_func: func Called when this option is chosen: remove_user_func(project_name, user_full_name).\n \"\"\"\n description = \"Removes user permission to access a remote project.\"\n remove_user_parser = self.subparsers.add_parser('remove-user', description=description)\n add_project_name_or_id_arg(remove_user_parser, help_text_suffix=\"remove a user from\")\n user_or_email = remove_user_parser.add_mutually_exclusive_group(required=True)\n add_user_arg(user_or_email)\n add_email_arg(user_or_email)\n remove_user_parser.set_defaults(func=remove_user_func)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef register_download_command(self, download_func):\n description = \"Download the contents of a remote project to a local folder.\"\n download_parser = self.subparsers.add_parser('download', description=description)\n add_project_name_or_id_arg(download_parser, help_text_suffix=\"download\")\n _add_folder_positional_arg(download_parser)\n include_or_exclude = download_parser.add_mutually_exclusive_group(required=False)\n _add_include_arg(include_or_exclude)\n 
_add_exclude_arg(include_or_exclude)\n download_parser.set_defaults(func=download_func)", "response": "Register a command to download a remote project to a local folder."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef register_share_command(self, share_func):\n description = \"Share a project with another user with specified permissions. \" \\\n \"Sends the other user an email message via D4S2 service. \" \\\n \"If not specified this command gives user download permissions.\"\n share_parser = self.subparsers.add_parser('share', description=description)\n add_project_name_or_id_arg(share_parser)\n user_or_email = share_parser.add_mutually_exclusive_group(required=True)\n add_user_arg(user_or_email)\n add_email_arg(user_or_email)\n _add_auth_role_arg(share_parser, default_permissions='file_downloader')\n _add_resend_arg(share_parser, \"Resend share\")\n _add_message_file(share_parser, \"Filename containing a message to be sent with the share. \"\n \"Pass - to read from stdin.\")\n share_parser.set_defaults(func=share_func)", "response": "Register a command to add view-only project permissions and send email via another service."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nregister the deliver command to the parser", "response": "def register_deliver_command(self, deliver_func):\n \"\"\"\n Add 'deliver' command for transferring a project to another user.\n :param deliver_func: function to run when user chooses this option\n \"\"\"\n description = \"Initiate delivery of a project to another user. Removes other user's current permissions. 
\" \\\n \"Send message to D4S2 service to send email and allow access to the project once user \" \\\n \"acknowledges receiving the data.\"\n deliver_parser = self.subparsers.add_parser('deliver', description=description)\n add_project_name_or_id_arg(deliver_parser)\n user_or_email = deliver_parser.add_mutually_exclusive_group(required=True)\n add_user_arg(user_or_email)\n add_email_arg(user_or_email)\n add_share_usernames_arg(deliver_parser)\n add_share_emails_arg(deliver_parser)\n _add_copy_project_arg(deliver_parser)\n _add_resend_arg(deliver_parser, \"Resend delivery\")\n include_or_exclude = deliver_parser.add_mutually_exclusive_group(required=False)\n _add_include_arg(include_or_exclude)\n _add_exclude_arg(include_or_exclude)\n _add_message_file(deliver_parser, \"Filename containing a message to be sent with the delivery. \"\n \"Pass - to read from stdin.\")\n deliver_parser.set_defaults(func=deliver_func)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef register_list_command(self, list_func):\n description = \"Show a list of project names or folders/files of a single project.\"\n list_parser = self.subparsers.add_parser('list', description=description)\n project_name_or_auth_role = list_parser.add_mutually_exclusive_group(required=False)\n _add_project_filter_auth_role_arg(project_name_or_auth_role)\n add_project_name_or_id_arg(project_name_or_auth_role, required=False,\n help_text_suffix=\"show details for\")\n _add_long_format_option(list_parser, 'Display long format.')\n list_parser.set_defaults(func=list_func)", "response": "Register a command to get a list of projects or folders or files of a single project."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nregisters a delete command to the remote store.", "response": "def register_delete_command(self, delete_func):\n \"\"\"\n Add 'delete' command delete a project from the remote store.\n :param delete_func: function: 
run when user chooses this option.\n \"\"\"\n description = \"Permanently delete a project.\"\n delete_parser = self.subparsers.add_parser('delete', description=description)\n add_project_name_or_id_arg(delete_parser, help_text_suffix=\"delete\")\n _add_force_arg(delete_parser, \"Do not prompt before deleting.\")\n delete_parser.set_defaults(func=delete_func)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nregister a command to list project authorization roles that can be used with add_user command.", "response": "def register_list_auth_roles_command(self, list_auth_roles_func):\n \"\"\"\n Add 'list_auth_roles' command to list project authorization roles that can be used with add_user.\n :param list_auth_roles_func: function: run when user chooses this option.\n \"\"\"\n description = \"List authorization roles for use with add_user command.\"\n list_auth_roles_parser = self.subparsers.add_parser('list-auth-roles', description=description)\n list_auth_roles_parser.set_defaults(func=list_auth_roles_func)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef run_command(self, args):\n parsed_args = self.parser.parse_args(args)\n if hasattr(parsed_args, 'func'):\n parsed_args.func(parsed_args)\n else:\n self.parser.print_help()", "response": "Parse command line arguments and run function registered for the appropriate command."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef filter_signal(self, data_frame, ts='mag_sum_acc'):\n b, a = signal.butter(self.filter_order, 2*self.cutoff_frequency/self.sampling_frequency,'high', analog=False)\n filtered_signal = signal.lfilter(b, a, data_frame[ts].values)\n data_frame['filtered_signal'] = filtered_signal\n\n logging.debug(\"filter signal\")\n return data_frame", "response": "This method filters a data frame by a given time series name."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following 
Python 3 code does\ndef amplitude_by_welch(self, data_frame):\n frq, Pxx_den = signal.welch(data_frame.filtered_signal.values, self.sampling_frequency, nperseg=self.window)\n freq = frq[Pxx_den.argmax(axis=0)]\n ampl = sum(Pxx_den[(frq > self.lower_frequency) & (frq < self.upper_frequency)])\n\n logging.debug(\"tremor amplitude by welch calculated\")\n\n return ampl, freq", "response": "This method calculates the power spectral density of the current object in the specified data frame using the Welch method."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef approximate_entropy(self, x, m=None, r=None):\n if m is None or r is None:\n m = 2\n r = 0.3\n entropy = feature_calculators.approximate_entropy(x, m, r)\n logging.debug(\"approximate entropy by tsfresh calculated\")\n return entropy", "response": "Calculates the approximate entropy of a time series."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef autocorrelation(self, x, lag):\n # This is important: If a series is passed, the product below is calculated\n # based on the index, which corresponds to squaring the series.\n if lag is None:\n lag = 0\n _autoc = feature_calculators.autocorrelation(x, lag)\n logging.debug(\"autocorrelation by tsfresh calculated\")\n return _autoc", "response": "Calculates the autocorrelation of the specified lag of the time series x."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef partial_autocorrelation(self, x, param=None):\n if param is None:\n param = [{'lag': 3}, {'lag': 5}, {'lag': 6}]\n _partialc = feature_calculators.partial_autocorrelation(x, param)\n logging.debug(\"partial autocorrelation by tsfresh calculated\")\n return _partialc", "response": "This method calculates the partial autocorrelation function at the given lag."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the 
following Python 3 function doing\ndef ratio_value_number_to_time_series_length(self, x):\n ratio = feature_calculators.ratio_value_number_to_time_series_length(x)\n logging.debug(\"ratio value number to time series length by tsfresh calculated\")\n return ratio", "response": "ratio_value_number_to_time_series_length - Calculates the ratio of the value of the feature of the given time series"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncalculate the change quantiles of the time series x.", "response": "def change_quantiles(self, x, ql=None, qh=None, isabs=None, f_agg=None):\n \"\"\"\n As in tsfresh `change_quantiles `_\n\n First fixes a corridor given by the quantiles ql and qh of the distribution of x. Then calculates the \\\n average, absolute value of consecutive changes of the series x inside this corridor. Think about selecting \\\n a corridor on the y-Axis and only calculating the mean of the absolute change of the time series inside \\\n this corridor.\n \n \n :param x: the time series to calculate the feature of\n :type x: pandas.Series\n :param ql: the lower quantile of the corridor\n :type ql: float\n :param qh: the higher quantile of the corridor\n :type qh: float\n :param isabs: should the absolute differences be taken?\n :type isabs: bool\n :param f_agg: the aggregator function that is applied to the differences in the bin\n :type f_agg: str, name of a numpy function (e.g. 
mean, var, std, median)\n :return: the value of this feature\n :rtype: float\n \"\"\"\n if ql is None or qh is None or isabs is None or f_agg is None:\n f_agg = 'mean'\n isabs = True\n qh = 0.2\n ql = 0.0\n quantile = feature_calculators.change_quantiles(x, ql, qh, isabs, f_agg)\n logging.debug(\"change_quantiles by tsfresh calculated\")\n return quantile"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ncalculate the number of peaks of at least support n in the time series x.", "response": "def number_peaks(self, x, n=None):\n \"\"\"\n As in tsfresh `number_peaks `_\n \n Calculates the number of peaks of at least support n in the time series x. A peak of support n is defined \\\n as a subsequence of x where a value occurs, which is bigger than its n neighbours to the left and to the right.\n\n Hence in the sequence\n\n >>> x = [3, 0, 0, 4, 0, 0, 13]\n\n 4 is a peak of support 1 and 2 because in the subsequences\n\n >>> [0, 4, 0]\n >>> [0, 0, 4, 0, 0]\n\n 4 is still the highest value. 
Here, 4 is not a peak of support 3 because 13 is the 3rd neighbour to the \\\n right of 4 and it's bigger than 4.\n\n :param x: the time series to calculate the feature of\n :type x: pandas.Series\n :param n: the support of the peak\n :type n: int\n :return: the value of this feature\n :rtype: float\n \"\"\"\n if n is None:\n n = 5\n peaks = feature_calculators.number_peaks(x, n)\n logging.debug(\"number peaks by tsfresh calculated\")\n return peaks"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef percentage_of_reoccurring_datapoints_to_all_datapoints(self, x):\n _perc = feature_calculators.percentage_of_reoccurring_datapoints_to_all_datapoints(x)\n logging.debug(\"percentage of reoccurring datapoints to all datapoints by tsfresh calculated\")\n return _perc", "response": "Calculates the percentage of reoccurring datapoints to all datapoints."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ncalculating the absolute energy of the time series x", "response": "def abs_energy(self, x):\n \"\"\"\n As in tsfresh `abs_energy `_ \\\n Returns the absolute energy of the time series which is the sum over the squared values\\\n \n\n .. 
math:: \n \n E=\\\\sum_{i=1,\\ldots, n}x_i^2\n \n \n :param x: the time series to calculate the feature of\n :type x: pandas.Series\n :return: the value of this feature\n :rtype: float\n \"\"\"\n _energy = feature_calculators.abs_energy(x)\n logging.debug(\"abs energy by tsfresh calculated\")\n return _energy"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef fft_coefficient(self, x, param=None):\n if param is None:\n param = [{'attr': 'abs', 'coeff': 44}, {'attr': 'abs', 'coeff': 63}, {'attr': 'abs', 'coeff': 0},\n {'attr': 'real', 'coeff': 0}, {'attr': 'real', 'coeff': 23}]\n _fft_coef = feature_calculators.fft_coefficient(x, param)\n logging.debug(\"fft coefficient by tsfresh calculated\")\n return list(_fft_coef)", "response": "Calculates the fourier coefficients of the real input by fast Fourier Transform."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef dc_remove_signal(self, data_frame):\n\n mean_signal = np.mean(data_frame.mag_sum_acc)\n data_frame['dc_mag_sum_acc'] = data_frame.mag_sum_acc - mean_signal\n logging.debug(\"dc remove signal\")\n return data_frame", "response": "Removes the dc component of the signal as per Kassavetis2015"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef main(argv, reactor=None):\n if reactor is None:\n from twisted.internet import gtk2reactor\n gtk2reactor.install()\n from twisted.internet import reactor\n try:\n AWSStatusIndicator(reactor)\n gobject.set_application_name(\"aws-status\")\n reactor.run()\n except ValueError:\n # In this case, the user cancelled, and the exception bubbled to here.\n pass", "response": "Run the client GUI."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nparses command line arguments and returns a dictionary of key - value pairs.", "response": "def 
parse_options(arguments):\n \"\"\"Parse command line arguments.\n\n The parsing logic is fairly simple. It can only parse long-style\n parameters of the form::\n\n --key value\n\n Several parameters can be defined in the environment and will be used\n unless explicitly overridden with command-line arguments. The access key,\n secret and endpoint values will be loaded from C{AWS_ACCESS_KEY_ID},\n C{AWS_SECRET_ACCESS_KEY} and C{AWS_ENDPOINT} environment variables.\n\n @param arguments: A list of command-line arguments. The first item is\n expected to be the name of the program being run.\n @raises OptionError: Raised if incorrectly formed command-line arguments\n are specified, or if required command-line arguments are not present.\n @raises UsageError: Raised if C{--help} is present in command-line\n arguments.\n @return: A C{dict} with key/value pairs extracted from the argument list.\n \"\"\"\n arguments = arguments[1:]\n options = {}\n while arguments:\n key = arguments.pop(0)\n if key in (\"-h\", \"--help\"):\n raise UsageError(\"Help requested.\")\n if key.startswith(\"--\"):\n key = key[2:]\n try:\n value = arguments.pop(0)\n except IndexError:\n raise OptionError(\"'--%s' is missing a value.\" % key)\n options[key] = value\n else:\n raise OptionError(\"Encountered unexpected value '%s'.\" % key)\n\n default_key = os.environ.get(\"AWS_ACCESS_KEY_ID\")\n if \"key\" not in options and default_key:\n options[\"key\"] = default_key\n default_secret = os.environ.get(\"AWS_SECRET_ACCESS_KEY\")\n if \"secret\" not in options and default_secret:\n options[\"secret\"] = default_secret\n default_endpoint = os.environ.get(\"AWS_ENDPOINT\")\n if \"endpoint\" not in options and default_endpoint:\n options[\"endpoint\"] = default_endpoint\n for name in (\"key\", \"secret\", \"endpoint\", \"action\"):\n if name not in options:\n raise OptionError(\n \"The '--%s' command-line argument is required.\" % name)\n\n return options"} {"SOURCE": "codesearchnet", "instruction": 
"Write a Python 3 script for\nparsing the command - line arguments and configure a Command instance.", "response": "def get_command(arguments, output=None):\n \"\"\"Parse C{arguments} and configure a L{Command} instance.\n\n An access key, secret key, endpoint and action are required. Additional\n parameters included with the request are passed as parameters to the\n method call. For example, the following command will create a L{Command}\n object that can invoke the C{DescribeRegions} method with the optional\n C{RegionName.0} parameter included in the request::\n\n txaws-discover --key KEY --secret SECRET --endpoint URL \\\n --action DescribeRegions --RegionName.0 us-west-1\n\n @param arguments: The command-line arguments to parse.\n @raises OptionError: Raised if C{arguments} can't be used to create a\n L{Command} object.\n @return: A L{Command} instance configured to make an EC2 API method call.\n \"\"\"\n options = parse_options(arguments)\n key = options.pop(\"key\")\n secret = options.pop(\"secret\")\n endpoint = options.pop(\"endpoint\")\n action = options.pop(\"action\")\n return Command(key, secret, endpoint, action, options, output)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef execute(self, timeSeries):\n # determine the number of values to forecast, if necessary\n self._calculate_values_to_forecast(timeSeries)\n\n # extract the required parameters, performance improvement\n alpha = self._parameters[\"smoothingFactor\"]\n valuesToForecast = self._parameters[\"valuesToForecast\"]\n\n # initialize some variables\n resultList = []\n estimator = None\n lastT = None\n\n # \"It's always about performance!\"\n append = resultList.append\n\n # smooth the existing TimeSeries data\n for idx in xrange(len(timeSeries)):\n # get the current to increase performance\n t = timeSeries[idx]\n\n # get the initial estimate\n if estimator is None:\n estimator = t[1]\n continue\n\n # add the first 
value to the resultList without any correction\n if 0 == len(resultList):\n append([t[0], estimator])\n lastT = t\n continue\n\n # calculate the error made during the last estimation\n error = lastT[1] - estimator\n\n # calculate the new estimator, based on the last occured value, the error and the smoothingFactor\n estimator = estimator + alpha * error\n\n # save the current value for the next iteration\n lastT = t\n\n # add an entry to the result\n append([t[0], estimator])\n\n # forecast additional values if requested\n if valuesToForecast > 0:\n currentTime = resultList[-1][0]\n normalizedTimeDiff = currentTime - resultList[-2][0]\n\n for idx in xrange(valuesToForecast):\n currentTime += normalizedTimeDiff\n\n # reuse everything\n error = lastT[1] - estimator\n estimator = estimator + alpha * error\n\n # add a forecasted value\n append([currentTime, estimator])\n\n # set variables for next iteration\n lastT = resultList[-1]\n\n # return a TimeSeries, containing the result\n return TimeSeries.from_twodim_list(resultList)", "response": "Executes the forecasting process."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _get_parameter_intervals(self):\n parameterIntervals = {}\n\n parameterIntervals[\"smoothingFactor\"] = [0.0, 1.0, False, False]\n parameterIntervals[\"trendSmoothingFactor\"] = [0.0, 1.0, False, False]\n\n return parameterIntervals", "response": "Returns the intervals for the methods parameter."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef execute(self, timeSeries):\n # determine the number of values to forecast, if necessary\n self._calculate_values_to_forecast(timeSeries)\n\n # extract the required parameters, performance improvement\n alpha = self._parameters[\"smoothingFactor\"]\n beta = self._parameters[\"trendSmoothingFactor\"]\n\n # initialize some variables\n resultList = []\n estimator = None\n trend = None\n lastT = 
None\n\n # \"It's always about performance!\"\n append = resultList.append\n\n # smooth the existing TimeSeries data\n for idx in xrange(len(timeSeries)):\n # get the current to increase performance\n t = timeSeries[idx]\n\n # get the initial estimate\n if estimator is None:\n estimator = t[1]\n lastT = t\n continue\n\n # add the first value to the resultList without any correction\n if 0 == len(resultList):\n append([t[0], estimator])\n trend = t[1] - lastT[1]\n\n # store current values for next iteration\n lastT = t\n lastEstimator = estimator\n continue\n\n # calculate the new estimator and trend, based on the last occured value, the error and the smoothingFactor\n estimator = alpha * t[1] + (1 - alpha) * (estimator + trend)\n trend = beta * (estimator - lastEstimator) + (1 - beta) * trend\n\n # add an entry to the result\n append([t[0], estimator])\n\n # store current values for next iteration\n lastT = t\n lastEstimator = estimator\n\n # forecast additional values if requested\n if self._parameters[\"valuesToForecast\"] > 0:\n currentTime = resultList[-1][0]\n normalizedTimeDiff = currentTime - resultList[-2][0]\n\n for idx in xrange(1, self._parameters[\"valuesToForecast\"] + 1):\n currentTime += normalizedTimeDiff\n\n # reuse everything\n forecast = estimator + idx * trend\n\n # add a forecasted value\n append([currentTime, forecast])\n\n # return a TimeSeries, containing the result\n return TimeSeries.from_twodim_list(resultList)", "response": "Executes the forecasting process."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef execute(self, timeSeries):\n # determine the number of values to forecast, if necessary\n self._calculate_values_to_forecast(timeSeries)\n\n seasonLength = self.get_parameter(\"seasonLength\")\n if len(timeSeries) < seasonLength:\n raise ValueError(\"The time series must contain at least one full season.\")\n\n alpha = self.get_parameter(\"smoothingFactor\")\n beta = 
self.get_parameter(\"trendSmoothingFactor\")\n gamma = self.get_parameter(\"seasonSmoothingFactor\")\n\n\n seasonValues = self.initSeasonFactors(timeSeries)\n resultList = []\n lastEstimator = 0\n\n for idx in xrange(len(timeSeries)):\n t = timeSeries[idx][0]\n x_t = timeSeries[idx][1]\n if idx == 0:\n lastTrend = self.initialTrendSmoothingFactors(timeSeries)\n lastEstimator = x_t\n resultList.append([t, x_t])\n continue\n\n lastSeasonValue = seasonValues[idx % seasonLength]\n\n estimator = alpha * x_t/lastSeasonValue + (1 - alpha) * (lastEstimator + lastTrend)\n lastTrend = beta * (estimator - lastEstimator) + (1 - beta) * lastTrend\n seasonValues[idx % seasonLength] = gamma * x_t/estimator + (1 - gamma) * lastSeasonValue\n\n lastEstimator = estimator\n resultList.append([t, estimator])\n\n resultList += self._calculate_forecast(timeSeries, resultList, seasonValues, [lastEstimator, lastSeasonValue, lastTrend])\n return TimeSeries.from_twodim_list(resultList)", "response": "This method calculates the exponentially smoothed values of the given time series."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _calculate_forecast(self, originalTimeSeries, smoothedData, seasonValues, lastSmoothingParams):\n forecastResults = []\n\n lastEstimator, lastSeasonValue, lastTrend = lastSmoothingParams\n\n seasonLength = self.get_parameter(\"seasonLength\")\n\n #Forecasting. 
Determine the time difference between two points for extrapolation\n currentTime = smoothedData[-1][0]\n normalizedTimeDiff = currentTime - smoothedData[-2][0]\n\n for m in xrange(1, self._parameters[\"valuesToForecast\"] + 1):\n currentTime += normalizedTimeDiff\n lastSeasonValue = seasonValues[(len(originalTimeSeries) + m - 2) % seasonLength]\n forecast = (lastEstimator + m * lastTrend) * lastSeasonValue\n forecastResults.append([currentTime, forecast])\n\n return forecastResults", "response": "Calculates the actual forecast based on the input data."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef initSeasonFactors(self, timeSeries):\n\n seasonLength = self.get_parameter(\"seasonLength\")\n try:\n seasonValues = self.get_parameter(\"seasonValues\")\n assert seasonLength == len(seasonValues), \"Preset Season Values have to have to be of season's length\"\n return seasonValues\n except KeyError:\n pass\n\n seasonValues = []\n completeCycles = len(timeSeries) / seasonLength\n A = {} #cache values for A_j\n\n for i in xrange(seasonLength):\n c_i = 0\n for j in xrange(completeCycles):\n if j not in A:\n A[j] = self.computeA(j, timeSeries)\n c_i += timeSeries[(seasonLength * j) + i][1] / A[j] #wikipedia suggests j-1, but we worked with indices in the first place\n c_i /= completeCycles\n seasonValues.append(c_i)\n return seasonValues", "response": "Computes the initial season smoothing factors."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ncalculating the initial trend smoothing factor b0.", "response": "def initialTrendSmoothingFactors(self, timeSeries):\n \"\"\" Calculate the initial Trend smoothing Factor b0.\n\n Explanation:\n http://en.wikipedia.org/wiki/Exponential_smoothing#Triple_exponential_smoothing\n\n :return: Returns the initial trend smoothing factor b0\n \"\"\"\n\n result = 0.0\n seasonLength = self.get_parameter(\"seasonLength\")\n k = min(len(timeSeries) 
- seasonLength, seasonLength) #In case of only one full season, use average trend of the months that we have twice\n for i in xrange(0, k):\n result += (timeSeries[seasonLength + i][1] - timeSeries[i][1]) / seasonLength\n return result / k"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ncalculates A_j. Aj is the average value of x in the jth cycle of your data :return: A_j :rtype: numeric", "response": "def computeA(self, j, timeSeries):\n \"\"\" Calculates A_j. Aj is the average value of x in the jth cycle of your data\n\n :return: A_j\n :rtype: numeric\n \"\"\"\n seasonLength = self.get_parameter(\"seasonLength\")\n A_j = 0\n for i in range(seasonLength):\n A_j += timeSeries[(seasonLength * (j)) + i][1]\n return A_j / seasonLength"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _split_quoted(text):\n quoted = False\n escaped = False\n result = []\n for i, ch in enumerate(text):\n if escaped:\n escaped = False\n result.append(ch)\n elif ch == u'\\\\':\n escaped = True\n elif ch == u'\"':\n quoted = not quoted\n elif not quoted and ch == u' ':\n return u\"\".join(result), text[i:].lstrip()\n else:\n result.append(ch)\n return u\"\".join(result), u\"\"", "response": "Splits a unicode string on SPACE characters and escapes them with QUOTATION MARKs."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef freeze_of_gait(self, x):\n\n data = self.resample_signal(x).values\n\n f_res = self.sampling_frequency / self.window\n\n f_nr_LBs = int(self.loco_band[0] / f_res)\n f_nr_LBe = int(self.loco_band[1] / f_res)\n f_nr_FBs = int(self.freeze_band[0] / f_res)\n f_nr_FBe = int(self.freeze_band[1] / f_res)\n\n jPos = self.window + 1\n i = 0\n \n time = []\n sumLocoFreeze = []\n freezeIndex = []\n \n while jPos < len(data):\n \n jStart = jPos - self.window\n time.append(jPos)\n\n y = data[int(jStart):int(jPos)]\n y = y - 
np.mean(y)\n\n Y = np.fft.fft(y, int(self.window))\n Pyy = abs(Y*Y) / self.window\n\n areaLocoBand = numerical_integration( Pyy[f_nr_LBs-1 : f_nr_LBe], self.sampling_frequency )\n areaFreezeBand = numerical_integration( Pyy[f_nr_FBs-1 : f_nr_FBe], self.sampling_frequency )\n\n sumLocoFreeze.append(areaFreezeBand + areaLocoBand)\n\n freezeIndex.append(areaFreezeBand / areaLocoBand)\n\n jPos = jPos + self.step_size\n i = i + 1\n\n freeze_time = np.asarray(time, dtype=np.int32)\n freeze_index = np.asarray(freezeIndex, dtype=np.float32)\n locomotor_freeze_index = np.asarray(sumLocoFreeze, dtype=np.float32)\n\n return freeze_time, freeze_index, locomotor_freeze_index", "response": "This method assesses freeze of gait on."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef speed_of_gait(self, x, wavelet_type='db3', wavelet_level=6):\n\n coeffs = wavedec(x.values, wavelet=wavelet_type, level=wavelet_level)\n\n energy = [sum(coeffs[wavelet_level - i]**2) / len(coeffs[wavelet_level - i]) for i in range(wavelet_level)]\n\n WEd1 = energy[0] / (5 * np.sqrt(2))\n WEd2 = energy[1] / (4 * np.sqrt(2))\n WEd3 = energy[2] / (3 * np.sqrt(2))\n WEd4 = energy[3] / (2 * np.sqrt(2))\n WEd5 = energy[4] / np.sqrt(2)\n WEd6 = energy[5] / np.sqrt(2)\n\n gait_speed = 0.5 * np.sqrt(WEd1+(WEd2/2)+(WEd3/3)+(WEd4/4)+(WEd5/5)+(WEd6/6))\n\n return gait_speed", "response": "This method assess the speed of gait on the time series x."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef walk_direction_preheel(self, data_frame):\n\n # Sum of absolute values across accelerometer axes:\n data = data_frame.x.abs() + data_frame.y.abs() + data_frame.z.abs()\n\n # Find maximum peaks of smoothed data:\n dummy, ipeaks_smooth = self.heel_strikes(data)\n data = data.values\n\n # Compute number of samples between peaks using the real part of the FFT:\n interpeak = compute_interpeak(data, 
self.sampling_frequency)\n decel = np.int(np.round(self.stride_fraction * interpeak))\n\n # Find maximum peaks close to maximum peaks of smoothed data:\n ipeaks = []\n for ipeak_smooth in ipeaks_smooth:\n ipeak = np.argmax(data[ipeak_smooth - decel:ipeak_smooth + decel])\n ipeak += ipeak_smooth - decel\n ipeaks.append(ipeak)\n\n # Compute the average vector for each deceleration phase:\n vectors = []\n for ipeak in ipeaks:\n decel_vectors = np.asarray([[data_frame.x[i], data_frame.y[i], data_frame.z[i]]\n for i in range(ipeak - decel, ipeak)])\n vectors.append(np.mean(decel_vectors, axis=0))\n\n # Compute the average deceleration vector and take the opposite direction:\n direction = -1 * np.mean(vectors, axis=0)\n\n # Return the unit vector in this direction:\n direction /= np.sqrt(direction.dot(direction))\n\n return direction", "response": "Finding Walking Direction Leveraging Smartphone s Inertial Sensors"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nestimating the heel strike times between sign changes in accelerometer data.", "response": "def heel_strikes(self, x):\n \"\"\" Estimate heel strike times between sign changes in accelerometer data.\n\n :param x: The time series to assess freeze of gait on. 
This could be x, y, z or mag_sum_acc.\n :type x: pandas.Series\n :return strikes: Heel strike timings measured in seconds.\n :rtype strikes: numpy.ndarray\n :return strikes_idx: Heel strike timing indices of the time series.\n :rtype strikes_idx: numpy.ndarray\n \"\"\"\n \n # Demean data:\n data = x.values\n data -= data.mean()\n \n # TODO: fix this\n # Low-pass filter the AP accelerometer data by the 4th order zero lag\n # Butterworth filter whose cut frequency is set to 5 Hz:\n filtered = butter_lowpass_filter(data, self.sampling_frequency, self.cutoff_frequency, self.filter_order)\n\n # Find transitional positions where AP accelerometer changes from\n # positive to negative.\n transitions = crossings_nonzero_pos2neg(filtered)\n\n # Find the peaks of AP acceleration preceding the transitional positions,\n # and greater than the product of a threshold and the maximum value of\n # the AP acceleration:\n strike_indices_smooth = []\n filter_threshold = np.abs(self.delta * np.max(filtered))\n for i in range(1, np.size(transitions)):\n segment = range(transitions[i-1], transitions[i])\n imax = np.argmax(filtered[segment])\n if filtered[segment[imax]] > filter_threshold:\n strike_indices_smooth.append(segment[imax])\n\n # Compute number of samples between peaks using the real part of the FFT:\n interpeak = compute_interpeak(data, self.sampling_frequency)\n decel = np.int(interpeak / 2)\n\n # Find maximum peaks close to maximum peaks of smoothed data:\n strikes_idx = []\n for ismooth in strike_indices_smooth:\n istrike = np.argmax(data[ismooth - decel:ismooth + decel])\n istrike = istrike + ismooth - decel\n strikes_idx.append(istrike)\n\n strikes = np.asarray(strikes_idx)\n strikes -= strikes[0]\n strikes = strikes / self.sampling_frequency\n\n return strikes, np.array(strikes_idx)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef gait_regularity_symmetry(self, x, average_step_duration='autodetect', 
average_stride_duration='autodetect', unbias=1, normalize=2):\n if (average_step_duration=='autodetect') or (average_stride_duration=='autodetect'):\n\n strikes, _ = self.heel_strikes(x)\n\n step_durations = []\n for i in range(1, np.size(strikes)):\n step_durations.append(strikes[i] - strikes[i-1])\n\n average_step_duration = np.mean(step_durations)\n\n number_of_steps = np.size(strikes)\n\n strides1 = strikes[0::2]\n strides2 = strikes[1::2]\n\n stride_durations1 = []\n for i in range(1, np.size(strides1)):\n stride_durations1.append(strides1[i] - strides1[i-1])\n\n stride_durations2 = []\n for i in range(1, np.size(strides2)):\n stride_durations2.append(strides2[i] - strides2[i-1])\n\n strides = [strides1, strides2]\n stride_durations = [stride_durations1, stride_durations2]\n\n average_stride_duration = np.mean((np.mean(stride_durations1),\n np.mean(stride_durations2)))\n\n return self.gait_regularity_symmetry(x, average_step_duration, average_stride_duration)\n \n else:\n coefficients, _ = autocorrelate(x, unbias=1, normalize=2)\n\n step_period = np.int(np.round(1 / average_step_duration))\n stride_period = np.int(np.round(1 / average_stride_duration))\n\n step_regularity = coefficients[step_period]\n stride_regularity = coefficients[stride_period]\n symmetry = np.abs(stride_regularity - step_regularity)\n\n return step_regularity, stride_regularity, symmetry", "response": "Compute step and stride regularity and symmetry from accelerometer data."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef gait(self, x):\n \n data = x\n \n strikes, _ = self.heel_strikes(data)\n\n step_durations = []\n for i in range(1, np.size(strikes)):\n step_durations.append(strikes[i] - strikes[i-1])\n\n avg_step_duration = np.mean(step_durations)\n sd_step_durations = np.std(step_durations)\n\n number_of_steps = np.size(strikes)\n\n strides1 = strikes[0::2]\n strides2 = strikes[1::2]\n\n stride_durations1 = []\n 
for i in range(1, np.size(strides1)):\n stride_durations1.append(strides1[i] - strides1[i-1])\n\n stride_durations2 = []\n for i in range(1, np.size(strides2)):\n stride_durations2.append(strides2[i] - strides2[i-1])\n\n strides = [strides1, strides2]\n stride_durations = [stride_durations1, stride_durations2]\n\n avg_number_of_strides = np.mean([np.size(strides1), np.size(strides2)])\n avg_stride_duration = np.mean((np.mean(stride_durations1),\n np.mean(stride_durations2)))\n sd_stride_durations = np.mean((np.std(stride_durations1),\n np.std(stride_durations2)))\n\n step_period = np.int(np.round(1 / avg_step_duration))\n stride_period = np.int(np.round(1 / avg_stride_duration))\n\n step_regularity, stride_regularity, symmetry = self.gait_regularity_symmetry(data,\n average_step_duration=avg_step_duration, \n average_stride_duration=avg_stride_duration)\n\n cadence = None\n if self.duration:\n cadence = number_of_steps / self.duration\n \n\n velocity = None\n avg_step_length = None\n avg_stride_length = None\n \n if self.distance:\n velocity = self.distance / self.duration\n avg_step_length = number_of_steps / self.distance\n avg_stride_length = avg_number_of_strides / self.distance\n\n return [number_of_steps, cadence,\n velocity,\n avg_step_length,\n avg_stride_length,\n step_durations,\n avg_step_duration,\n sd_step_durations,\n strides,\n stride_durations,\n avg_number_of_strides,\n avg_stride_duration,\n sd_stride_durations,\n step_regularity,\n stride_regularity, \n symmetry]", "response": "Extracts gait features from estimated heel strikes and accelerometer data."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ndivides a univariate time - series data_frame into states contiguous segments using Bellman k - segmentation algorithm on the peak prominences of the data.", "response": "def bellman_segmentation(self, x, states):\n \"\"\" \n Divide a univariate time-series, data_frame, into states contiguous segments, using 
Bellman k-segmentation algorithm on the peak prominences of the data.\n \n :param x: The time series to assess freeze of gait on. This could be x, y, z or mag_sum_acc.\n :type x: pandas.Series\n :param states: Number of contiguous segments.\n :type states: int\n :return peaks: The peaks in our data_frame.\n :rtype peaks: list\n :return prominences: Peaks prominences.\n :rtype prominences: list\n :return bellman_idx: The indices of the segments.\n :rtype bellman_idx: list\n \"\"\"\n peaks, prominences = get_signal_peaks_and_prominences(x)\n bellman_idx = BellmanKSegment(prominences, states)\n \n return peaks, prominences, bellman_idx"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ndivides a univariate time - series data_frame into states contiguous segments using sk - learn clustering algorithms on the peak prominences of the data.", "response": "def sklearn_segmentation(self, x, cluster_fn):\n \"\"\" \n Divide a univariate time-series, data_frame, into states contiguous segments, using sk-learn clustering algorithms on the peak prominences of the data.\n \n :param x: The time series to assess freeze of gait on. This could be x, y, z or mag_sum_acc.\n :type x: pandas.Series\n :param cluster_fn: Any unsupervised learning algorithm from the sklearn library. 
It needs to have the `fit_predict` method.\n :param cluster_fn: sklearn.aglorithm\n :return peaks: The peaks in our data_frame.\n :rtype peaks: list\n :return prominences: Peaks prominences.\n :rtype prominences: list\n :return sklearn_idx: The indices of the segments.\n :rtype sklearn_idx: list\n \"\"\"\n peaks, prominences = get_signal_peaks_and_prominences(x)\n \n # sklearn fix: reshape to (-1, 1)\n sklearn_idx = cluster_fn.fit_predict(prominences.reshape(-1, 1))\n \n return peaks, prominences, sklearn_idx"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef error_wrapper(error, errorClass):\n http_status = 0\n if error.check(TwistedWebError):\n xml_payload = error.value.response\n if error.value.status:\n http_status = int(error.value.status)\n else:\n error.raiseException()\n if http_status >= 400:\n if not xml_payload:\n error.raiseException()\n try:\n fallback_error = errorClass(\n xml_payload, error.value.status, str(error.value),\n error.value.response)\n except (ParseError, AWSResponseParseError):\n error_message = http.RESPONSES.get(http_status)\n fallback_error = TwistedWebError(\n http_status, error_message, error.value.response)\n raise fallback_error\n elif 200 <= http_status < 300:\n return str(error.value)\n else:\n error.raiseException()", "response": "Wrapper for error messages from Amazon s EC2 API."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nreturn the un - urlencoded path component but joined by slashes.", "response": "def _get_joined_path(ctx):\n \"\"\"\n @type ctx: L{_URLContext}\n @param ctx: A URL context.\n\n @return: The path component, un-urlencoded, but joined by slashes.\n @rtype: L{bytes}\n \"\"\"\n return b'/' + b'/'.join(seg.encode('utf-8') for seg in ctx.path)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get_page(self, url, *args, **kwds):\n contextFactory = 
None\n scheme, host, port, path = parse(url)\n data = kwds.get('postdata', None)\n self._method = method = kwds.get('method', 'GET')\n self.request_headers = self._headers(kwds.get('headers', {}))\n if (self.body_producer is None) and (data is not None):\n self.body_producer = FileBodyProducer(StringIO(data))\n if self.endpoint.ssl_hostname_verification:\n contextFactory = None\n else:\n contextFactory = WebClientContextFactory()\n agent = _get_agent(scheme, host, self.reactor, contextFactory)\n if scheme == \"https\":\n self.client.url = url\n d = agent.request(method, url, self.request_headers, self.body_producer)\n d.addCallback(self._handle_response)\n return d", "response": "Get a page from the given url."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _headers(self, headers_dict):\n return Headers(dict((k,[v]) for (k,v) in headers_dict.items()))", "response": "Convert dictionary of headers into twisted. web. client. Headers object."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _unpack_headers(self, headers):\n return dict((k,v[0]) for (k,v) in headers.getAllRawHeaders())", "response": "Unpack twisted. web. client. Headers object to dict. 
This is to provide\n backwards compatibility."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _handle_response(self, response):\n self.client.status = response.code\n self.response_headers = response.headers\n # XXX This workaround (which needs to be improved at that) for possible\n # bug in Twisted with new client:\n # http://twistedmatrix.com/trac/ticket/5476\n if self._method.upper() == 'HEAD' or response.code == NO_CONTENT:\n return succeed('')\n receiver = self.receiver_factory()\n receiver.finished = d = Deferred()\n receiver.content_length = response.length\n response.deliverBody(receiver)\n if response.code >= 400:\n d.addCallback(self._fail_response, response)\n return d", "response": "Handle the HTTP response by memoizing the headers and then delivering the bytes."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncreate a JWT token for a GOV. UK Notify base token.", "response": "def create_jwt_token(secret, client_id):\n \"\"\"\n Create JWT token for GOV.UK Notify\n\n Tokens have standard header:\n {\n \"typ\": \"JWT\",\n \"alg\": \"HS256\"\n }\n\n Claims consist of:\n iss: identifier for the client\n iat: issued at in epoch seconds (UTC)\n\n :param secret: Application signing secret\n :param client_id: Identifier for the client\n :return: JWT token for this request\n \"\"\"\n assert secret, \"Missing secret key\"\n assert client_id, \"Missing client id\"\n\n headers = {\n \"typ\": __type__,\n \"alg\": __algorithm__\n }\n\n claims = {\n 'iss': client_id,\n 'iat': epoch_seconds()\n }\n\n return jwt.encode(payload=claims, key=secret, headers=headers).decode()"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreturn the issuer of a token", "response": "def get_token_issuer(token):\n \"\"\"\n Issuer of a token is the identifier used to recover the secret\n Need to extract this from token to ensure we can proceed to the signature 
validation stage\n Does not check validity of the token\n :param token: signed JWT token\n :return issuer: iss field of the JWT token\n :raises TokenIssuerError: if iss field not present\n :raises TokenDecodeError: if token does not conform to JWT spec\n \"\"\"\n try:\n unverified = decode_token(token)\n\n if 'iss' not in unverified:\n raise TokenIssuerError\n\n return unverified.get('iss')\n except jwt.DecodeError:\n raise TokenDecodeError"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nvalidating and decodes the JWT token and returns a boolean if the token is valid False otherwise", "response": "def decode_jwt_token(token, secret):\n \"\"\"\n Validates and decodes the JWT token\n Token checked for\n - signature of JWT token\n - token issued date is valid\n\n :param token: jwt token\n :param secret: client specific secret\n :return boolean: True if valid token, False otherwise\n :raises TokenIssuerError: if iss field not present\n :raises TokenIssuedAtError: if iat field not present\n :raises jwt.DecodeError: If signature validation fails\n \"\"\"\n try:\n # check signature of the token\n decoded_token = jwt.decode(\n token,\n key=secret.encode(),\n verify=True,\n algorithms=[__algorithm__],\n leeway=__bound__\n )\n # token has all the required fields\n if 'iss' not in decoded_token:\n raise TokenIssuerError\n if 'iat' not in decoded_token:\n raise TokenIssuedAtError\n\n # check iat time is within bounds\n now = epoch_seconds()\n iat = int(decoded_token['iat'])\n if now > (iat + __bound__):\n raise TokenExpiredError(\"Token has expired\", decoded_token)\n if iat > (now + __bound__):\n raise TokenExpiredError(\"Token can not be in the future\", decoded_token)\n\n return True\n except jwt.InvalidIssuedAtError:\n raise TokenExpiredError(\"Token has invalid iat field\", decode_token(token))\n except jwt.DecodeError:\n raise TokenDecodeError"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the 
documentation\ndef parse(stream, with_text=False): # type: (Iterator[str], bool) -> Iterator[Union[Tuple[str, LexicalUnit], LexicalUnit]]\n\n buffer = ''\n text_buffer = ''\n in_lexical_unit = False\n in_superblank = False\n\n for char in stream:\n if in_superblank:\n if char == ']':\n in_superblank = False\n text_buffer += char\n elif char == '\\\\':\n text_buffer += char\n text_buffer += next(stream)\n else:\n text_buffer += char\n elif in_lexical_unit:\n if char == '$':\n if with_text:\n yield (text_buffer, LexicalUnit(buffer))\n else:\n yield LexicalUnit(buffer)\n buffer = ''\n text_buffer = ''\n in_lexical_unit = False\n elif char == '\\\\':\n buffer += char\n buffer += next(stream)\n else:\n buffer += char\n else:\n if char == '[':\n in_superblank = True\n text_buffer += char\n elif char == '^':\n in_lexical_unit = True\n elif char == '\\\\':\n text_buffer += char\n text_buffer += next(stream)\n else:\n text_buffer += char", "response": "Generates a list of lexical units from a character stream."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef to_gnuplot_datafile(self, datafilepath):\n try:\n datafile = file(datafilepath, \"wb\")\n except Exception:\n return False\n\n if self._timestampFormat is None:\n self._timestampFormat = _STR_EPOCHS\n\n datafile.write(\"# time_as_<%s> value\\n\" % self._timestampFormat)\n\n convert = TimeSeries.convert_epoch_to_timestamp\n for datapoint in self._timeseriesData:\n timestamp, value = datapoint\n if self._timestampFormat is not None:\n timestamp = convert(timestamp, self._timestampFormat)\n\n datafile.write(\"%s %s\\n\" % (timestamp, value))\n\n datafile.close()\n return True", "response": "Dumps the TimeSeries into a gnuplot compatible data file."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ncreating a new TimeSeries instance from the data stored inside a two dimensional list.", "response": "def from_twodim_list(cls, 
datalist, tsformat=None):\n \"\"\"Creates a new TimeSeries instance from the data stored inside a two dimensional list.\n\n :param list datalist: List containing multiple iterables with at least two values.\n The first item will always be used as timestamp in the predefined format,\n the second represents the value. All other items in those sublists will be ignored.\n :param string tsformat: Format of the given timestamp. This is used to convert the\n timestamp into UNIX epochs, if necessary. For valid examples take a look into\n the :py:func:`time.strptime` documentation.\n\n :return: Returns a TimeSeries instance containing the data from datalist.\n :rtype: TimeSeries\n \"\"\"\n # create and fill the given TimeSeries\n ts = TimeSeries()\n ts.set_timeformat(tsformat)\n\n for entry in datalist:\n ts.add_entry(*entry[:2])\n\n # set the normalization level\n ts._normalized = ts.is_normalized()\n ts.sort_timeseries()\n\n return ts"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ninitializing the TimeSeries s data from the given SQL cursor.", "response": "def initialize_from_sql_cursor(self, sqlcursor):\n \"\"\"Initializes the TimeSeries's data from the given SQL cursor.\n\n You need to set the time stamp format using :py:meth:`TimeSeries.set_timeformat`.\n\n :param SQLCursor sqlcursor: Cursor that holds the SQL result for any given\n \"SELECT timestamp, value, ... 
FROM ...\" SQL query.\n Only the first two attributes of the SQL result will be used.\n\n :return: Returns the number of entries added to the TimeSeries.\n :rtype: integer\n \"\"\"\n # initialize the result\n tuples = 0\n\n # add the SQL result to the time series\n data = sqlcursor.fetchmany()\n while 0 < len(data):\n for entry in data:\n self.add_entry(str(entry[0]), entry[1])\n\n data = sqlcursor.fetchmany()\n\n # set the normalization level\n self._normalized = self._check_normalization\n\n # return the number of tuples added to the timeseries.\n return tuples"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nconverts the given timestamp into a float representing the UNIX - epochs for the given timestamp in the defined format.", "response": "def convert_timestamp_to_epoch(cls, timestamp, tsformat):\n \"\"\"Converts the given timestamp into a float representing UNIX-epochs.\n\n :param string timestamp: Timestamp in the defined format.\n :param string tsformat: Format of the given timestamp. This is used to convert the\n timestamp into UNIX epochs. 
For valid examples take a look into\n the :py:func:`time.strptime` documentation.\n\n\n :return: Returns a float, representing the UNIX-epochs for the given timestamp.\n :rtype: float\n \"\"\"\n return time.mktime(time.strptime(timestamp, tsformat))"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef convert_epoch_to_timestamp(cls, timestamp, tsformat):\n return time.strftime(tsformat, time.gmtime(timestamp))", "response": "Converts the given float representing UNIX - epochs into an actual timestamp."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nadd a new entry to the data list.", "response": "def add_entry(self, timestamp, data):\n \"\"\"Adds a new data entry to the TimeSeries.\n\n :param timestamp: Time stamp of the data.\n This has either to be a float representing the UNIX epochs\n or a string containing a timestamp in the given format.\n :param numeric data: Actual data value.\n \"\"\"\n self._normalized = self._predefinedNormalized\n self._sorted = self._predefinedSorted\n\n tsformat = self._timestampFormat\n if tsformat is not None:\n timestamp = TimeSeries.convert_timestamp_to_epoch(timestamp, tsformat)\n\n self._timeseriesData.append([float(timestamp), float(data)])"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nsort the data points within the time series according to their occurrence inline.", "response": "def sort_timeseries(self, ascending=True):\n \"\"\"Sorts the data points within the TimeSeries according to their occurrence inline.\n\n :param boolean ascending: Determines if the TimeSeries will be ordered ascending or\n descending. 
If this is set to descending once, the ordered parameter defined in\n :py:meth:`TimeSeries.__init__` will be set to False FOREVER.\n\n :return: Returns :py:obj:`self` for convenience.\n :rtype: TimeSeries\n \"\"\"\n # the time series is sorted by default\n if ascending and self._sorted:\n return\n\n sortorder = 1\n if not ascending:\n sortorder = -1\n self._predefinedSorted = False\n\n self._timeseriesData.sort(key=lambda i: sortorder * i[0])\n\n self._sorted = ascending\n\n return self"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn a sorted copy of the TimeSeries preserving the original one.", "response": "def sorted_timeseries(self, ascending=True):\n \"\"\"Returns a sorted copy of the TimeSeries, preserving the original one.\n\n As an assumption this new TimeSeries is not ordered anymore if a new value is added.\n\n :param boolean ascending: Determines if the TimeSeries will be ordered ascending\n or descending.\n\n :return: Returns a new TimeSeries instance sorted in the requested order.\n :rtype: TimeSeries\n \"\"\"\n sortorder = 1\n if not ascending:\n sortorder = -1\n\n data = sorted(self._timeseriesData, key=lambda i: sortorder * i[0])\n\n newTS = TimeSeries(self._normalized)\n for entry in data:\n newTS.add_entry(*entry)\n\n newTS._sorted = ascending\n\n return newTS"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nnormalizing the data points in a single time series.", "response": "def normalize(self, normalizationLevel=\"minute\", fusionMethod=\"mean\", interpolationMethod=\"linear\"):\n \"\"\"Normalizes the TimeSeries data points.\n\n If this function is called, the TimeSeries gets ordered ascending\n automatically. The new timestamps will represent the center of each time\n bucket. 
Within a normalized TimeSeries, the temporal distance between two consecutive data points is constant.\n\n :param string normalizationLevel: Level of normalization that has to be applied.\n The available normalization levels are defined in :py:data:`timeseries.NormalizationLevels`.\n :param string fusionMethod: Normalization method that has to be used if multiple data entries exist\n within the same normalization bucket. The available methods are defined in :py:data:`timeseries.FusionMethods`.\n :param string interpolationMethod: Interpolation method that is used if a data entry at a specific time\n is missing. The available interpolation methods are defined in :py:data:`timeseries.InterpolationMethods`.\n\n :raise: Raises a :py:exc:`ValueError` if a normalizationLevel, fusionMethod or interpolationMethod have an unknown value.\n \"\"\"\n # do not normalize the TimeSeries if it is already normalized, either by\n # definition or a prior call of normalize(*)\n if self._normalizationLevel == normalizationLevel:\n if self._normalized: # pragma: no cover\n return\n\n # check if all parameters are defined correctly\n if normalizationLevel not in NormalizationLevels:\n raise ValueError(\"Normalization level %s is unknown.\" % normalizationLevel)\n if fusionMethod not in FusionMethods:\n raise ValueError(\"Fusion method %s is unknown.\" % fusionMethod)\n if interpolationMethod not in InterpolationMethods:\n raise ValueError(\"Interpolation method %s is unknown.\" % interpolationMethod)\n\n # (nearly) empty TimeSeries instances do not require normalization\n if len(self) < 2:\n self._normalized = True\n return\n\n # get the defined methods and parameter\n self._normalizationLevel = normalizationLevel\n normalizationLevel = NormalizationLevels[normalizationLevel]\n fusionMethod = FusionMethods[fusionMethod]\n interpolationMethod = InterpolationMethods[interpolationMethod]\n\n # sort the TimeSeries\n self.sort_timeseries()\n\n # prepare the required buckets\n start = 
self._timeseriesData[0][0]\n end = self._timeseriesData[-1][0]\n span = end - start\n bucketcnt = int(span / normalizationLevel) + 1\n\n buckethalfwidth = normalizationLevel / 2.0\n bucketstart = start + buckethalfwidth\n buckets = [[bucketstart + idx * normalizationLevel] for idx in xrange(bucketcnt)]\n\n # Step One: Populate buckets\n # Initialize the timeseries data iterators\n tsdStartIdx = 0\n tsdEndIdx = 0\n tsdlength = len(self)\n\n for idx in xrange(bucketcnt):\n # get the bucket to avoid multiple calls of buckets.__getitem__()\n bucket = buckets[idx]\n\n # get the range for the given bucket\n bucketend = bucket[0] + buckethalfwidth\n\n while tsdEndIdx < tsdlength and self._timeseriesData[tsdEndIdx][0] < bucketend:\n tsdEndIdx += 1\n\n # continue, if no valid data entries exist\n if tsdStartIdx == tsdEndIdx:\n continue\n\n # use the given fusion method to calculate the fusioned value\n values = [i[1] for i in self._timeseriesData[tsdStartIdx:tsdEndIdx]]\n bucket.append(fusionMethod(values))\n\n # set the new timeseries data index\n tsdStartIdx = tsdEndIdx\n\n # Step Two: Fill missing buckets\n missingCount = 0\n lastIdx = 0\n for idx in xrange(bucketcnt):\n # bucket is empty\n if 1 == len(buckets[idx]):\n missingCount += 1\n continue\n\n # This is the first bucket. 
The first bucket is not empty by definition!\n if idx == 0:\n lastIdx = idx\n continue\n\n # update the lastIdx, if none was missing\n if 0 == missingCount:\n lastIdx = idx\n continue\n\n # calculate and fill in missing values\n missingValues = interpolationMethod(buckets[lastIdx][1], buckets[idx][1], missingCount)\n for idx2 in xrange(1, missingCount + 1):\n buckets[lastIdx + idx2].append(missingValues[idx2 - 1])\n\n lastIdx = idx\n missingCount = 0\n\n self._timeseriesData = buckets\n\n # at the end set self._normalized to True\n self._normalized = True"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _check_normalization(self):\n lastDistance = None\n distance = None\n for idx in xrange(len(self) - 1):\n distance = self[idx+1][0] - self[idx][0]\n\n # first run\n if lastDistance is None:\n lastDistance = distance\n continue\n\n if lastDistance != distance:\n return False\n\n lastDistance = distance\n\n return True", "response": "Checks if the TimeSeries is normalized."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\napply the given ForecastingAlgorithm or SmoothingMethod from the pycast. 
methods module to the TimeSeries.", "response": "def apply(self, method):\n \"\"\"Applies the given ForecastingAlgorithm or SmoothingMethod from the :py:mod:`pycast.methods`\n module to the TimeSeries.\n\n :param BaseMethod method: Method that should be used with the TimeSeries.\n For more information about the methods take a look into their corresponding documentation.\n\n :raise: Raises a StandardError when the TimeSeries was not normalized and the method requires a\n normalized TimeSeries\n \"\"\"\n # check if the method's requirements are fulfilled\n if method.has_to_be_normalized() and not self._normalized:\n raise StandardError(\"method requires a normalized TimeSeries instance.\")\n\n if method.has_to_be_sorted():\n self.sort_timeseries()\n\n return method.execute(self)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nsamples with replacement from the TimeSeries. Returns the sample and the remaining timeseries.", "response": "def sample(self, percentage):\n \"\"\"Samples with replacement from the TimeSeries. 
Returns the sample and the remaining timeseries.\n The original timeseries is not changed.\n\n :param float percentage: How many percent of the original timeseries should be in the sample\n\n :return: A tuple containing (sample, rest) as two TimeSeries.\n :rtype: tuple(TimeSeries,TimeSeries)\n\n :raise: Raises a ValueError if percentage is not in (0.0, 1.0).\n \"\"\"\n if not (0.0 < percentage < 1.0):\n raise ValueError(\"Parameter percentage has to be in (0.0, 1.0).\")\n\n cls = self.__class__\n value_count = int(len(self) * percentage)\n values = random.sample(self, value_count)\n\n sample = cls.from_twodim_list(values)\n rest_values = self._timeseriesData[:]\n\n for value in values:\n rest_values.remove(value)\n\n rest = cls.from_twodim_list(rest_values)\n\n return sample, rest"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nadd a new entry to the data list.", "response": "def add_entry(self, timestamp, data):\n \"\"\"Adds a new data entry to the TimeSeries.\n\n :param timestamp: Time stamp of the data.\n This has either to be a float representing the UNIX epochs\n or a string containing a timestamp in the given format.\n :param list data: A list containing the actual dimension values.\n\n :raise: Raises a :py:exc:`ValueError` if data does not contain as many dimensions as\n defined in __init__.\n \"\"\"\n if not isinstance(data, list):\n data = [data]\n\n if len(data) != self._dimensionCount:\n raise ValueError(\"data does contain %s instead of %s dimensions.\\n %s\" % (len(data), self._dimensionCount, data))\n\n self._normalized = self._predefinedNormalized\n self._sorted = self._predefinedSorted\n\n tsformat = self._timestampFormat\n if tsformat is not None:\n timestamp = TimeSeries.convert_timestamp_to_epoch(timestamp, tsformat)\n\n self._timeseriesData.append([float(timestamp)] + [float(dimensionValue) for dimensionValue in data])"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that 
can\nserialize the MultiDimensionalTimeSeries data into a two dimensional list of [ timestamp values pairs.", "response": "def to_twodim_list(self):\n \"\"\"Serializes the MultiDimensionalTimeSeries data into a two dimensional list of [timestamp, [values]] pairs.\n\n :return: Returns a two dimensional list containing [timestamp, [values]] pairs.\n :rtype: list\n \"\"\"\n if self._timestampFormat is None:\n return self._timeseriesData\n\n datalist = []\n append = datalist.append\n convert = TimeSeries.convert_epoch_to_timestamp\n for entry in self._timeseriesData:\n append([convert(entry[0], self._timestampFormat), entry[1:]])\n\n return datalist"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ncreates a new MultiDimensionalTimeSeries instance from the data stored inside a two dimensional list.", "response": "def from_twodim_list(cls, datalist, tsformat=None, dimensions=1):\n \"\"\"Creates a new MultiDimensionalTimeSeries instance from the data stored inside a two dimensional list.\n\n :param list datalist: List containing multiple iterables with at least two values.\n The first item will always be used as timestamp in the predefined format,\n the second is a list, containing the dimension values.\n :param string format: Format of the given timestamp. This is used to convert the\n timestamp into UNIX epochs, if necessary. 
For valid examples take a look into\n the :py:func:`time.strptime` documentation.\n :param integer dimensions: Number of dimensions the MultiDimensionalTimeSeries contains.\n\n :return: Returns a MultiDimensionalTimeSeries instance containing the data from datalist.\n :rtype: MultiDimensionalTimeSeries\n \"\"\"\n # create and fill the given TimeSeries\n ts = MultiDimensionalTimeSeries(dimensions=dimensions)\n ts.set_timeformat(tsformat)\n\n for entry in datalist:\n ts.add_entry(entry[0], entry[1])\n\n # set the normalization level\n ts._normalized = ts.is_normalized()\n ts.sort_timeseries()\n\n return ts"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef to_gnuplot_datafile(self, datafilepath):\n try:\n datafile = open(datafilepath, \"w\")\n except Exception:\n return False\n\n if self._timestampFormat is None:\n self._timestampFormat = _STR_EPOCHS\n\n datafile.write(\"# time_as_<%s> value...\n\" % self._timestampFormat)\n\n convert = TimeSeries.convert_epoch_to_timestamp\n for datapoint in self._timeseriesData:\n timestamp = datapoint[0]\n values = datapoint[1:]\n if self._timestampFormat is not None:\n timestamp = convert(timestamp, self._timestampFormat)\n\n datafile.write(\"%s %s\n\" % (timestamp, \" \".join([str(entry) for entry in values])))\n\n datafile.close()\n return True", "response": "Dumps the TimeSeries into a gnuplot compatible data file."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef include(self, filename, is_file):\n if self.exclude_regex and is_file:\n if self.exclude_regex.match(filename):\n return False\n return True\n else:\n return True", "response": "Determines if a file should be included in a project for uploading."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nadding a Unix shell-style wildcard pattern underneath the specified directory.", "response": "def add_filename_pattern(self, dir_name, pattern):\n 
\"\"\"\n Adds a Unix shell-style wildcard pattern underneath the specified directory\n :param dir_name: str: directory that contains the pattern\n :param pattern: str: Unix shell-style wildcard pattern\n \"\"\"\n full_pattern = '{}{}{}'.format(dir_name, os.sep, pattern)\n filename_regex = fnmatch.translate(full_pattern)\n self.regex_list.append(re.compile(filename_regex))"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef include(self, path):\n for regex_item in self.regex_list:\n if regex_item.match(path):\n return False\n return True", "response": "Returns False if any pattern matches the path, True otherwise."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef load_directory(self, top_path, followlinks):\n for dir_name, child_dirs, child_files in os.walk(top_path, followlinks=followlinks):\n for child_filename in child_files:\n if child_filename == DDS_IGNORE_FILENAME:\n pattern_lines = self._read_non_empty_lines(dir_name, child_filename)\n self.add_patterns(dir_name, pattern_lines)", "response": "Load patterns from a directory and save them in the internal cache."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef add_patterns(self, dir_name, pattern_lines):\n for pattern_line in pattern_lines:\n self.pattern_list.add_filename_pattern(dir_name, pattern_line)", "response": "Add patterns that should apply below dir_name."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns False if any pattern matches the path", "response": "def include(self, path, is_file):\n \"\"\"\n Returns False if any pattern matches the path\n :param path: str: filename 
path to test\n :return: boolean: True if we should include this path\n \"\"\"\n return self.pattern_list.include(path) and self.file_filter.include(os.path.basename(path), is_file)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ncreates a project with the specified name and description.", "response": "def create_project(self, name, description):\n \"\"\"\n Create a project with the specified name and description\n :param name: str: unique name for this project\n :param description: str: long description of this project\n :return: str: name of the project\n \"\"\"\n self._cache_project_list_once()\n if name in [project.name for project in self.projects]:\n raise DuplicateNameError(\"There is already a project named {}\".format(name))\n self.client.create_project(name, description)\n self.clear_project_cache()\n return name"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ndeletes a project with the specified name. Raises ItemNotFound if no such project exists", "response": "def delete_project(self, project_name):\n \"\"\"\n Delete a project with the specified name. 
Raises ItemNotFound if no such project exists\n :param project_name: str: name of the project to delete\n :return:\n \"\"\"\n project = self._get_project_for_name(project_name)\n project.delete()\n self.clear_project_cache()"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreturn a list of file paths that make up project_name", "response": "def list_files(self, project_name):\n \"\"\"\n Return a list of file paths that make up project_name\n :param project_name: str: specifies the name of the project to list contents of\n :return: [str]: returns a list of remote paths for all files part of the specified project\n \"\"\"\n project = self._get_project_for_name(project_name)\n file_path_dict = self._get_file_path_dict_for_project(project)\n return list(file_path_dict)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef download_file(self, project_name, remote_path, local_path=None):\n project = self._get_project_for_name(project_name)\n file = project.get_child_for_path(remote_path)\n file.download_to_path(local_path)", "response": "Download a file from a project containing the given remote path."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef upload_file(self, project_name, local_path, remote_path=None):\n project = self._get_or_create_project(project_name)\n file_upload = FileUpload(project, remote_path, local_path)\n file_upload.run()", "response": "Upload a file into a project, creating the project if it does not exist."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef delete_file(self, project_name, remote_path):\n project = self._get_or_create_project(project_name)\n remote_file = project.get_child_for_path(remote_path)\n remote_file.delete()", "response": "Delete a file or folder from a project"} {"SOURCE": 
"codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _determineLength(self, fObj):\n try:\n seek = fObj.seek\n tell = fObj.tell\n except AttributeError:\n return UNKNOWN_LENGTH\n originalPosition = tell()\n seek(0, self._SEEK_END)\n end = tell()\n seek(originalPosition, self._SEEK_SET)\n return end - originalPosition", "response": "Determine how many bytes can be read out of a file."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nstart a cooperative task which will read all available bytes from the input file and write them to the output file and return a Deferred which fires after all bytes have been written.", "response": "def startProducing(self, consumer):\n \"\"\"\n Start a cooperative task which will read bytes from the input file and\n write them to C{consumer}. Return a L{Deferred} which fires after all\n bytes have been written.\n\n @param consumer: Any L{IConsumer} provider\n \"\"\"\n self._task = self._cooperate(self._writeloop(consumer))\n d = self._task.whenDone()\n def maybeStopped(reason):\n # IBodyProducer.startProducing's Deferred isn't support to fire if\n # stopProducing is called.\n reason.trap(task.TaskStopped)\n return defer.Deferred()\n d.addCallbacks(lambda ignored: None, maybeStopped)\n return d"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn an iterator which reads one chunk of bytes from the input file and writes them to the consumer for each time it is iterated.", "response": "def _writeloop(self, consumer):\n \"\"\"\n Return an iterator which reads one chunk of bytes from the input file\n and writes them to the consumer for each time it is iterated.\n \"\"\"\n while True:\n bytes = self._inputFile.read(self._readSize)\n if not bytes:\n self._inputFile.close()\n break\n consumer.write(bytes)\n yield None"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef 
_count_differences(self):\n different_items = LocalOnlyCounter(self.config.upload_bytes_per_chunk)\n different_items.walk_project(self.local_project)\n return different_items", "response": "Count how many things we will send from the local project we will send to"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef run(self):\n progress_printer = ProgressPrinter(self.different_items.total_items(), msg_verb='sending')\n upload_settings = UploadSettings(self.config, self.remote_store.data_service, progress_printer,\n self.project_name_or_id, self.file_upload_post_processor)\n project_uploader = ProjectUploader(upload_settings)\n project_uploader.run(self.local_project)\n progress_printer.finished()", "response": "Runs the remote store upload"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreturn a dry run report of the items that need to be uploaded or a message saying there are no files or folders that need to be uploaded.", "response": "def dry_run_report(self):\n \"\"\"\n Returns text displaying the items that need to be uploaded or a message saying there are no files/folders\n to upload.\n :return: str: report text\n \"\"\"\n project_uploader = ProjectUploadDryRun()\n project_uploader.run(self.local_project)\n items = project_uploader.upload_items\n if not items:\n return \"\\n\\nNo changes found. 
Nothing needs to be uploaded.\\n\\n\"\n else:\n result = \"\\n\\nFiles/Folders that need to be uploaded:\\n\"\n for item in items:\n result += \"{}\\n\".format(item)\n result += \"\\n\"\n return result"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get_upload_report(self):\n project = self.remote_store.fetch_remote_project(self.project_name_or_id,\n must_exist=True,\n include_children=False)\n report = UploadReport(project.name)\n report.walk_project(self.local_project)\n return report.get_content()", "response": "Generate and print a report onto stdout."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nprint url to view the project via dds portal.", "response": "def get_url_msg(self):\n \"\"\"\n Print url to view the project via dds portal.\n \"\"\"\n msg = 'URL to view project'\n project_id = self.local_project.remote_id\n url = '{}: https://{}/#/project/{}'.format(msg, self.config.get_portal_url_base(), project_id)\n return url"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef visit_file(self, item, parent):\n if item.need_to_send:\n self.files += 1\n self.chunks += item.count_chunks(self.bytes_per_chunk)", "response": "Increments counter if item needs to be sent."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef result_str(self):\n return '{}, {}, {}'.format(LocalOnlyCounter.plural_fmt('project', self.projects),\n LocalOnlyCounter.plural_fmt('folder', self.folders),\n LocalOnlyCounter.plural_fmt('file', self.files))", "response": "Return a string representing the totals contained herein."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\npluralizes name if necessary and combine with cnt", "response": "def plural_fmt(name, cnt):\n \"\"\"\n pluralize name if necessary and combine with cnt\n :param name: str 
name of the item type\n :param cnt: int number items of this type\n :return: str name and cnt joined\n \"\"\"\n if cnt == 1:\n return '{} {}'.format(cnt, name)\n else:\n return '{} {}s'.format(cnt, name)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nadd folder to the report if it was sent to remote.", "response": "def visit_folder(self, item, parent):\n \"\"\"\n Add folder to the report if it was sent.\n :param item: LocalFolder folder to possibly add\n :param parent: LocalFolder/LocalContent not used here\n \"\"\"\n if item.sent_to_remote:\n self._add_report_item(item.path, item.remote_id)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef visit_file(self, item, parent):\n if item.sent_to_remote:\n self._add_report_item(item.path, item.remote_id, item.size, item.get_hash_value())", "response": "Add file to the report if it was sent to remote."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef str_with_sizes(self, max_name, max_remote_id, max_size):\n name_str = self.name.ljust(max_name)\n remote_id_str = self.remote_id.ljust(max_remote_id)\n size_str = self.size.ljust(max_size)\n return u'{} {} {} {}'.format(name_str, remote_id_str, size_str, self.file_hash)", "response": "Create string for the internal properties of the item based on the sizes of the internal properties."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef upload(self, project_id, parent_kind, parent_id):\n path_data = self.local_file.get_path_data()\n hash_data = path_data.get_hash()\n self.upload_id = self.upload_operations.create_upload(project_id, path_data, hash_data,\n storage_provider_id=self.config.storage_provider_id)\n ParallelChunkProcessor(self).run()\n parent_data = ParentData(parent_kind, parent_id)\n remote_file_data = 
self.upload_operations.finish_upload(self.upload_id, hash_data, parent_data,\n self.local_file.remote_id)\n if self.file_upload_post_processor:\n self.file_upload_post_processor.run(self.data_service, remote_file_data)\n return remote_file_data['id']", "response": "Upload file contents to project within a specified parent."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ncreate upload for uploading multiple chunks or the non - chunked variety.", "response": "def _create_upload(self, project_id, path_data, hash_data, remote_filename=None, storage_provider_id=None,\n chunked=True):\n \"\"\"\n Create upload for uploading multiple chunks or the non-chunked variety (includes upload url).\n :param project_id: str: uuid of the project\n :param path_data: PathData: holds file system data about the file we are uploading\n :param hash_data: HashData: contains hash alg and value for the file we are uploading\n :param remote_filename: str: name to use for our remote file (defaults to path_data basename otherwise)\n :param storage_provider_id: str: optional storage provider id\n :param chunked: bool: should we create a chunked upload\n :return: str: uuid for the upload\n \"\"\"\n if not remote_filename:\n remote_filename = path_data.name()\n mime_type = path_data.mime_type()\n size = path_data.size()\n\n def func():\n return self.data_service.create_upload(project_id, remote_filename, mime_type, size,\n hash_data.value, hash_data.alg,\n storage_provider_id=storage_provider_id,\n chunked=chunked)\n\n resp = retry_until_resource_is_consistent(func, self.waiting_monitor)\n return resp.json()"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef create_upload(self, project_id, path_data, hash_data, remote_filename=None, storage_provider_id=None):\n upload_response = self._create_upload(project_id, path_data, hash_data, remote_filename=remote_filename,\n 
storage_provider_id=storage_provider_id, chunked=True)\n return upload_response['id']", "response": "Create a chunked upload id for the current file system entry"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncreate an upload that returns upload id and upload url.", "response": "def create_upload_and_chunk_url(self, project_id, path_data, hash_data, remote_filename=None,\n storage_provider_id=None):\n \"\"\"\n Create an non-chunked upload that returns upload id and upload url. This type of upload doesn't allow\n additional upload urls. For single chunk files this method is more efficient than\n create_upload/create_file_chunk_url.\n :param project_id: str: uuid of the project\n :param path_data: PathData: holds file system data about the file we are uploading\n :param hash_data: HashData: contains hash alg and value for the file we are uploading\n :param remote_filename: str: name to use for our remote file (defaults to path_data basename otherwise)\n :param storage_provider_id:str: optional storage provider id\n :return: str, dict: uuid for the upload, upload chunk url dict\n \"\"\"\n upload_response = self._create_upload(project_id, path_data, hash_data, remote_filename=remote_filename,\n storage_provider_id=storage_provider_id, chunked=False)\n return upload_response['id'], upload_response['signed_url']"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef create_file_chunk_url(self, upload_id, chunk_num, chunk):\n chunk_len = len(chunk)\n hash_data = HashData.create_from_chunk(chunk)\n one_based_index = chunk_num + 1\n\n def func():\n return self.data_service.create_upload_url(upload_id, one_based_index, chunk_len,\n hash_data.value, hash_data.alg)\n\n resp = retry_until_resource_is_consistent(func, self.waiting_monitor)\n return resp.json()", "response": "Create a url for uploading a particular chunk to the datastore."} {"SOURCE": "codesearchnet", 
"instruction": "Create a Python 3 function for\nsending a file to external store", "response": "def send_file_external(self, url_json, chunk):\n \"\"\"\n Send chunk to external store specified in url_json.\n Raises ValueError on upload failure.\n :param data_service: data service to use for sending chunk\n :param url_json: dict contains where/how to upload chunk\n :param chunk: data to be uploaded\n \"\"\"\n http_verb = url_json['http_verb']\n host = url_json['host']\n url = url_json['url']\n http_headers = url_json['http_headers']\n resp = self._send_file_external_with_retry(http_verb, host, url, http_headers, chunk)\n if resp.status_code != 200 and resp.status_code != 201:\n raise ValueError(\"Failed to send file to external store. Error:\" + str(resp.status_code) + host + url)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nsending a file to the remote host using http_verb.", "response": "def _send_file_external_with_retry(self, http_verb, host, url, http_headers, chunk):\n \"\"\"\n Send chunk to host, url using http_verb. If http_verb is PUT and a connection error occurs\n retry a few times. Pauses between retries. 
Raises if unsuccessful.\n \"\"\"\n count = 0\n retry_times = 1\n if http_verb == 'PUT':\n retry_times = SEND_EXTERNAL_PUT_RETRY_TIMES\n while True:\n try:\n return self.data_service.send_external(http_verb, host, url, http_headers, chunk)\n except requests.exceptions.ConnectionError:\n count += 1\n if count < retry_times:\n if count == 1: # Only show a warning the first time we fail to send a chunk\n self._show_retry_warning(host)\n time.sleep(SEND_EXTERNAL_RETRY_SECONDS)\n self.data_service.recreate_requests_session()\n else:\n raise"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ndisplays a message on stderr that we lost connection to a host.", "response": "def _show_retry_warning(host):\n \"\"\"\n Displays a message on stderr that we lost connection to a host and will retry.\n :param host: str: name of the host we are trying to communicate with\n \"\"\"\n sys.stderr.write(\"\\nConnection to {} failed. Retrying.\\n\".format(host))\n sys.stderr.flush()"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ncompletes the upload and create or update the file.", "response": "def finish_upload(self, upload_id, hash_data, parent_data, remote_file_id):\n \"\"\"\n Complete the upload and create or update the file.\n :param upload_id: str: uuid of the upload we are completing\n :param hash_data: HashData: hash info about the file\n :param parent_data: ParentData: info about the parent of this file\n :param remote_file_id: str: uuid of this file if it already exists or None if it is a new file\n :return: dict: DukeDS details about this file\n \"\"\"\n self.data_service.complete_upload(upload_id, hash_data.value, hash_data.alg)\n if remote_file_id:\n result = self.data_service.update_file(remote_file_id, upload_id)\n return result.json()\n else:\n result = self.data_service.create_file(parent_data.kind, parent_data.id, upload_id)\n return result.json()"} {"SOURCE": "codesearchnet", "instruction": 
"Can you generate a brief explanation for the following Python 3 code\ndef run(self):\n processes = []\n progress_queue = ProgressQueue(Queue())\n num_chunks = ParallelChunkProcessor.determine_num_chunks(self.config.upload_bytes_per_chunk,\n self.local_file.size)\n work_parcels = ParallelChunkProcessor.make_work_parcels(self.config.upload_workers, num_chunks)\n for (index, num_items) in work_parcels:\n processes.append(self.make_and_start_process(index, num_items, progress_queue))\n wait_for_processes(processes, num_chunks, progress_queue, self.watcher, self.local_file)", "response": "Runs the local file upload."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ndetermines how many pieces we can upload in a file.", "response": "def determine_num_chunks(chunk_size, file_size):\n \"\"\"\n Figure out how many pieces we are sending the file in.\n NOTE: duke-data-service requires an empty chunk to be uploaded for empty files.\n \"\"\"\n if file_size == 0:\n return 1\n return int(math.ceil(float(file_size) / float(chunk_size)))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nmake a list of parcels for the given number of workers and number of items.", "response": "def make_work_parcels(upload_workers, num_chunks):\n \"\"\"\n Make groups so we can split up num_chunks into similar sizes.\n Rounds up trying to keep work evenly split so sometimes it will not use all workers.\n For very small numbers it can result in (upload_workers-1) total workers.\n For example if there are too few items to distribute.\n :param upload_workers: int target number of workers\n :param num_chunks: int number of total items we need to send\n :return [(index, num_items)] - an array of tuples where each array element will be handled in a separate process.\n \"\"\"\n chunks_per_worker = int(math.ceil(float(num_chunks) / float(upload_workers)))\n return ParallelChunkProcessor.divide_work(range(num_chunks), chunks_per_worker)"} {"SOURCE": 
"codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef divide_work(list_of_indexes, batch_size):\n grouped_indexes = [list_of_indexes[i:i + batch_size] for i in range(0, len(list_of_indexes), batch_size)]\n return [(batch[0], len(batch)) for batch in grouped_indexes]", "response": "Given a sequential list of indexes split them into num_parts."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef make_and_start_process(self, index, num_items, progress_queue):\n process = Process(target=upload_async,\n args=(self.data_service.auth.get_auth_data(), self.config, self.upload_id,\n self.local_file.path, index, num_items, progress_queue))\n process.start()\n return process", "response": "Create and start a process to upload num_items chunks from our local file starting at index."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nsend the file to the server.", "response": "def send(self):\n \"\"\"\n For each chunk we need to send, create upload url and send bytes. 
Raises exception on error.\n \"\"\"\n sent_chunks = 0\n chunk_num = self.index\n with open(self.filename, 'rb') as infile:\n infile.seek(self.index * self.chunk_size)\n while sent_chunks != self.num_chunks_to_send:\n chunk = infile.read(self.chunk_size)\n self._send_chunk(chunk, chunk_num)\n self.progress_queue.processed(1)\n chunk_num += 1\n sent_chunks += 1"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _send_chunk(self, chunk, chunk_num):\n url_info = self.upload_operations.create_file_chunk_url(self.upload_id, chunk_num, chunk)\n self.upload_operations.send_file_external(url_info, chunk)", "response": "Send a single chunk to the remote service."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns True if the given parameter is within its valid interval.", "response": "def _in_valid_interval(self, parameter, value):\n \"\"\"Returns if the parameter is within its valid interval.\n\n :param string parameter: Name of the parameter that has to be checked.\n :param numeric value: Value of the parameter.\n\n :return: Returns :py:const:`True` it the value for the given parameter is valid,\n :py:const:`False` otherwise.\n :rtype: boolean\n \"\"\"\n # return True, if not interval is defined for the parameter\n if parameter not in self._parameterIntervals:\n return True\n\n interval = self._parameterIntervals[parameter]\n\n if interval[2] and interval[3]:\n return interval[0] <= value <= interval[1]\n\n if not interval[2] and interval[3]:\n return interval[0] < value <= interval[1]\n\n if interval[2] and not interval[3]:\n return interval[0] <= value < interval[1]\n\n #if False == interval[2] and False == interval[3]:\n return interval[0] < value < interval[1]"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _get_value_error_message_for_invalid_prarameter(self, parameter, value):\n # return if not interval 
is defined for the parameter\n if parameter not in self._parameterIntervals:\n return\n\n interval = self._parameterIntervals[parameter]\n return \"%s has to be in %s%s, %s%s. Current value is %s.\" % (\n parameter,\n BaseMethod._interval_definitions[interval[2]][0],\n interval[0], interval[1],\n BaseMethod._interval_definitions[interval[3]][1],\n value\n )", "response": "Returns the error message for the given parameter and value."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef set_parameter(self, name, value):\n if not self._in_valid_interval(name, value):\n raise ValueError(self._get_value_error_message_for_invalid_prarameter(name, value))\n\n #if name in self._parameters:\n # print \"Parameter %s already existed. It's old value will be replaced with %s\" % (name, value)\n\n self._parameters[name] = value", "response": "Sets a parameter for the BaseMethod."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn if the method can already be executed.", "response": "def can_be_executed(self):\n \"\"\"Returns if the method can already be executed.\n\n :return: Returns :py:const:`True` if all required parameters where already set, False otherwise.\n :rtype: boolean\n \"\"\"\n missingParams = filter(lambda rp: rp not in self._parameters, self._requiredParameters)\n return len(missingParams) == 0"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nset a parameter for the BaseForecastingMethod.", "response": "def set_parameter(self, name, value):\n \"\"\"Sets a parameter for the BaseForecastingMethod.\n\n :param string name: Name of the parameter.\n :param numeric value: Value of the parameter.\n \"\"\"\n # set the furecast until variable to None if necessary\n if name == \"valuesToForecast\":\n self._forecastUntil = None\n\n # continue with the parents implementation\n return super(BaseForecastingMethod, self).set_parameter(name, value)"} 
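The interval convention used by `_in_valid_interval` above stores each parameter interval as a tuple `(lower, upper, lower_closed, upper_closed)`, where the two booleans choose between `<` and `<=` at each bound. A minimal standalone sketch of that check follows; the `intervals` dict and the parameter names in it are illustrative, not taken from the dataset:

```python
# Hypothetical standalone version of the (lower, upper, lower_closed,
# upper_closed) interval check used by BaseMethod._in_valid_interval above.

def in_valid_interval(intervals, parameter, value):
    """Return True if value lies in the interval registered for parameter.

    Parameters without a registered interval are always considered valid,
    mirroring the behaviour of the method in the records above.
    """
    if parameter not in intervals:
        return True
    lower, upper, lower_closed, upper_closed = intervals[parameter]
    above = value >= lower if lower_closed else value > lower
    below = value <= upper if upper_closed else value < upper
    return above and below

# Illustrative registry: a smoothing factor restricted to the open interval (0, 1).
intervals = {"smoothingFactor": (0.0, 1.0, False, False)}
print(in_valid_interval(intervals, "smoothingFactor", 0.3))  # True
print(in_valid_interval(intervals, "smoothingFactor", 1.0))  # False: open upper bound
print(in_valid_interval(intervals, "windowsize", 5))         # True: no interval registered
```

Collapsing the four `if` branches into two conditional comparisons keeps the open/closed logic in one place instead of enumerating every combination.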
{"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nset the forecasting goal (timestamp-wise).", "response": "def forecast_until(self, timestamp, tsformat=None):\n \"\"\"Sets the forecasting goal (timestamp wise).\n\n This function enables the automatic determination of valuesToForecast.\n\n :param timestamp: timestamp containing the end date of the forecast.\n :param string tsformat: Format of the timestamp. This is used to convert the\n timestamp from UNIX epochs, if necessary. For valid examples\n take a look into the :py:func:`time.strptime` documentation.\n \"\"\"\n if tsformat is not None:\n timestamp = TimeSeries.convert_timestamp_to_epoch(timestamp, tsformat)\n\n self._forecastUntil = timestamp"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncalculate the number of values that need to be forecasted to match the goal set in forecast_until.", "response": "def _calculate_values_to_forecast(self, timeSeries):\n \"\"\"Calculates the number of values that need to be forecasted to match the goal set in forecast_until.\n\n This sets the parameter \"valuesToForecast\" and should be called at the beginning of the :py:meth:`BaseMethod.execute` implementation.\n\n :param TimeSeries timeSeries: Should be a sorted and normalized TimeSeries instance.\n\n :raise: Raises a :py:exc:`ValueError` if the TimeSeries is either not normalized or sorted.\n \"\"\"\n # do not set anything, if it is not required\n if self._forecastUntil is None:\n return\n\n # check the TimeSeries for correctness\n if not timeSeries.is_sorted():\n raise ValueError(\"timeSeries has to be sorted.\")\n if not timeSeries.is_normalized():\n raise ValueError(\"timeSeries has to be normalized.\")\n\n timediff = timeSeries[-1][0] - timeSeries[-2][0]\n forecastSpan = self._forecastUntil - timeSeries[-1][0]\n\n self.set_parameter(\"valuesToForecast\", int(forecastSpan / timediff) + 1)"} {"SOURCE": "codesearchnet", "instruction": "Implement a 
function in Python 3 to\nexecute the simple moving average.", "response": "def execute(self, timeSeries):\n \"\"\"Creates a new TimeSeries containing the SMA values for the predefined windowsize.\n\n :param TimeSeries timeSeries: The TimeSeries used to calculate the simple moving average values.\n\n :return: TimeSeries object containing the simple moving average.\n :rtype: TimeSeries\n\n :raise: Raises a :py:exc:`ValueError` if the defined windowsize is larger than the number of elements\n in timeSeries\n\n :note: This implementation aims to support independent for loop execution.\n \"\"\"\n windowsize = self._parameters[\"windowsize\"]\n\n if len(timeSeries) < windowsize:\n raise ValueError(\"windowsize is larger than the number of elements in timeSeries.\")\n\n tsLength = len(timeSeries)\n nbrOfLoopRuns = tsLength - windowsize + 1\n\n res = TimeSeries()\n for idx in range(nbrOfLoopRuns):\n end = idx + windowsize\n data = timeSeries[idx:end]\n\n timestamp = data[windowsize//2][0]\n value = sum([i[1] for i in data])/windowsize\n\n res.add_entry(timestamp, value)\n\n res.sort_timeseries()\n return res"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef add(self, child, min_occurs=1):\n if min_occurs not in (0, 1):\n raise RuntimeError(\"Unexpected min bound for node schema\")\n self.children[child.tag] = child\n self.children_min_occurs[child.tag] = min_occurs\n return child", "response": "Add a child node."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _create_child(self, tag):\n return etree.SubElement(self._root, self._get_namespace_tag(tag))", "response": "Create a new child element with the given tag."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _find_child(self, tag):\n tag = self._get_namespace_tag(tag)\n children = self._root.findall(tag)\n if len(children) > 1:\n 
raise WSDLParseError(\"Duplicate tag '%s'\" % tag)\n if len(children) == 0:\n return None\n return children[0]", "response": "Find the child with the matching tag."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nensures that the element matching the given tag can have the given value.", "response": "def _check_value(self, tag, value):\n \"\"\"Ensure that the element matching C{tag} can have the given C{value}.\n\n @param tag: The tag to consider.\n @param value: The value to check\n @return: The unchanged L{value}, if valid.\n @raises L{WSDLParseError}: If the value is invalid.\n \"\"\"\n if value is None:\n if self._schema.children_min_occurs[tag] > 0:\n raise WSDLParseError(\"Missing tag '%s'\" % tag)\n return value\n return value"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ngets the L { NodeItem } attribute name for the given C { tag.", "response": "def _get_tag(self, name):\n \"\"\"Get the L{NodeItem} attribute name for the given C{tag}.\"\"\"\n if name.endswith(\"_\"):\n if name[:-1] in self._schema.reserved:\n return name[:-1]\n return name"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _get_namespace_tag(self, tag):\n if self._namespace is not None:\n tag = \"{%s}%s\" % (self._namespace, tag)\n return tag", "response": "Return the given C { tag with the namespace prefix added if any."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning the child schema for the given tag.", "response": "def _get_schema(self, tag):\n \"\"\"Return the child schema for the given C{tag}.\n\n @raises L{WSDLParseError}: If the tag doesn't belong to the schema.\n \"\"\"\n schema = self._schema.children.get(tag)\n if not schema:\n raise WSDLParseError(\"Unknown tag '%s'\" % tag)\n return schema"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ncreating a 
sequence element with the given root.", "response": "def create(self, root=None, namespace=None):\n \"\"\"Create a sequence element with the given root.\n\n @param root: The C{etree.Element} to root the sequence at, if C{None} a\n new one will be created..\n @result: A L{SequenceItem} with the given root.\n @raises L{ECResponseError}: If the given C{root} has a bad tag.\n \"\"\"\n if root is not None:\n tag = root.tag\n if root.nsmap:\n namespace = root.nsmap[None]\n tag = tag[len(namespace) + 2:]\n if tag != self.tag:\n raise WSDLParseError(\"Expected response with tag '%s', but \"\n \"got '%s' instead\" % (self.tag, tag))\n return SequenceItem(self, root, namespace)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef set(self, child, min_occurs=1, max_occurs=1):\n if isinstance(child, LeafSchema):\n raise RuntimeError(\"Sequence can't have leaf children\")\n if self.child is not None:\n raise RuntimeError(\"Sequence has already a child\")\n if min_occurs is None or max_occurs is None:\n raise RuntimeError(\"Sequence node without min or max\")\n if isinstance(child, LeafSchema):\n raise RuntimeError(\"Sequence node with leaf child type\")\n if not child.tag == \"item\":\n raise RuntimeError(\"Sequence node with bad child tag\")\n\n self.child = child\n self.min_occurs = min_occurs\n self.max_occurs = max_occurs\n return child", "response": "Set the schema for the sequence children."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nappend a new item to the sequence.", "response": "def append(self):\n \"\"\"Append a new item to the sequence, appending it to the end.\n\n @return: The newly created item.\n @raises L{WSDLParseError}: If the operation would result in having\n more child elements than the allowed max.\n \"\"\"\n tag = self._schema.tag\n children = self._root.getchildren()\n if len(children) >= self._schema.max_occurs:\n raise WSDLParseError(\"Too many 
items in tag '%s'\" % tag)\n schema = self._schema.child\n tag = \"item\"\n if self._namespace is not None:\n tag = \"{%s}%s\" % (self._namespace, tag)\n child = etree.SubElement(self._root, tag)\n return schema.create(child)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef remove(self, item):\n for index, child in enumerate(self._root.getchildren()):\n if child is item._root:\n del self[index]\n return item\n raise WSDLParseError(\"Non existing item in tag '%s'\" %\n self._schema.tag)", "response": "Removes the given item from the sequence."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns the child with the given index.", "response": "def _get_child(self, children, index):\n \"\"\"Return the child with the given index.\"\"\"\n try:\n return children[index]\n except IndexError:\n raise WSDLParseError(\"Non existing item in tag '%s'\" %\n self._schema.tag)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef parse(self, wsdl):\n parser = etree.XMLParser(remove_blank_text=True, remove_comments=True)\n root = etree.fromstring(wsdl, parser=parser)\n types = {}\n responses = {}\n schemas = {}\n namespace = root.attrib[\"targetNamespace\"]\n\n for element in root[0][0]:\n self._remove_namespace_from_tag(element)\n if element.tag in [\"annotation\", \"group\"]:\n continue\n name = element.attrib[\"name\"]\n if element.tag == \"element\":\n if name.endswith(\"Response\"):\n if name in responses:\n raise RuntimeError(\"Schema already defined\")\n responses[name] = element\n elif element.tag == \"complexType\":\n types[name] = [element, False]\n else:\n raise RuntimeError(\"Top-level element with unexpected tag\")\n\n for name, element in responses.items():\n schemas[name] = self._parse_type(element, types)\n schemas[name].namespace = namespace\n\n return schemas", "response": "Parse the given WSDL 
definition\n and build the associated schemas."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _parse_type(self, element, types):\n name = element.attrib[\"name\"]\n type = element.attrib[\"type\"]\n if not type.startswith(\"tns:\"):\n raise RuntimeError(\"Unexpected element type %s\" % type)\n type = type[4:]\n\n [children] = types[type][0]\n types[type][1] = True\n\n self._remove_namespace_from_tag(children)\n if children.tag not in (\"sequence\", \"choice\"):\n raise RuntimeError(\"Unexpected children type %s\" % children.tag)\n\n if children[0].attrib[\"name\"] == \"item\":\n schema = SequenceSchema(name)\n else:\n schema = NodeSchema(name)\n\n for child in children:\n self._remove_namespace_from_tag(child)\n if child.tag == \"element\":\n name, type, min_occurs, max_occurs = self._parse_child(child)\n if type in self.leaf_types:\n if max_occurs != 1:\n raise RuntimeError(\"Unexpected max value for leaf\")\n if not isinstance(schema, NodeSchema):\n raise RuntimeError(\"Attempt to add leaf to a non-node\")\n schema.add(LeafSchema(name), min_occurs=min_occurs)\n else:\n if name == \"item\": # sequence\n if not isinstance(schema, SequenceSchema):\n raise RuntimeError(\"Attempt to set child for \"\n \"non-sequence\")\n schema.set(self._parse_type(child, types),\n min_occurs=min_occurs,\n max_occurs=max_occurs)\n else:\n if max_occurs != 1:\n raise RuntimeError(\"Unexpected max for node\")\n if not isinstance(schema, NodeSchema):\n raise RuntimeError(\"Unexpected schema type\")\n schema.add(self._parse_type(child, types),\n min_occurs=min_occurs)\n elif child.tag == \"choice\":\n pass\n else:\n raise RuntimeError(\"Unexpected child type\")\n return schema", "response": "Parse a complexType element."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _parse_child(self, child):\n if set(child.attrib) - set([\"name\", \"type\", 
\"minOccurs\", \"maxOccurs\"]):\n raise RuntimeError(\"Unexpected attribute in child\")\n name = child.attrib[\"name\"]\n type = child.attrib[\"type\"].split(\":\")[1]\n min_occurs = child.attrib.get(\"minOccurs\")\n max_occurs = child.attrib.get(\"maxOccurs\")\n if min_occurs is None:\n min_occurs = \"1\"\n min_occurs = int(min_occurs)\n if max_occurs is None:\n max_occurs = \"1\"\n if max_occurs != \"unbounded\":\n max_occurs = int(max_occurs)\n return name, type, min_occurs, max_occurs", "response": "Parse a single child element."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ngets existing item from remote service based on keys.", "response": "def get_existing_item(self, item):\n \"\"\"\n Lookup item in remote service based on keys.\n :param item: D4S2Item data contains keys we will use for lookup.\n :return: requests.Response containing the successful result\n \"\"\"\n params = {\n 'project_id': item.project_id,\n 'from_user_id': item.from_user_id,\n 'to_user_id': item.to_user_id,\n }\n resp = requests.get(self.make_url(item.destination), headers=self.json_headers, params=params)\n self.check_response(resp)\n return resp"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef create_item(self, item):\n item_dict = {\n 'project_id': item.project_id,\n 'from_user_id': item.from_user_id,\n 'to_user_id': item.to_user_id,\n 'role': item.auth_role,\n 'user_message': item.user_message\n }\n if item.share_user_ids:\n item_dict['share_user_ids'] = item.share_user_ids\n data = json.dumps(item_dict)\n resp = requests.post(self.make_url(item.destination), headers=self.json_headers, data=data)\n self.check_response(resp)\n return resp", "response": "Create a new item in D4S2 service for the specified item."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nsend an item to a specific service", "response": "def send_item(self, destination, item_id, 
force_send):\n \"\"\"\n Run send method for item_id at destination.\n :param destination: str which type of operation are we doing (SHARE_DESTINATION or DELIVER_DESTINATION)\n :param item_id: str D4S2 service id representing the item we want to send\n :param force_send: bool it's ok to email the item again\n :return: requests.Response containing the successful result\n \"\"\"\n data = json.dumps({\n 'force': force_send,\n })\n url_suffix = \"{}/send/\".format(item_id)\n resp = requests.post(self.make_url(destination, url_suffix), headers=self.json_headers, data=data)\n self.check_response(resp)\n return resp"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncheck if the response is successful. Raises error if the response is not 200.", "response": "def check_response(self, response):\n \"\"\"\n Raises error if the response isn't successful.\n :param response: requests.Response response to be checked\n \"\"\"\n if response.status_code == 401:\n raise D4S2Error(UNAUTHORIZED_MESSAGE)\n if not 200 <= response.status_code < 300:\n raise D4S2Error(\"Request to {} failed with {}:\\n{}.\".format(response.url, response.status_code,\n response.text))"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef send(self, api, force_send):\n item_id = self.get_existing_item_id(api)\n if not item_id:\n item_id = self.create_item_returning_id(api)\n api.send_item(self.destination, item_id, force_send)\n else:\n if force_send:\n api.send_item(self.destination, item_id, force_send)\n else:\n item_type = D4S2Api.DEST_TO_NAME.get(self.destination, \"Item\")\n msg = \"{} already sent. 
Run with --resend argument to resend.\"\n raise D4S2Error(msg.format(item_type), warning=True)", "response": "Send this item using D4S2Api"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ngets the id of the item if it exists else None.", "response": "def get_existing_item_id(self, api):\n \"\"\"\n Lookup the id for this item via the D4S2 service.\n :param api: D4S2Api object who communicates with D4S2 server.\n :return str id of this item or None if not found\n \"\"\"\n resp = api.get_existing_item(self)\n items = resp.json()\n num_items = len(items)\n if num_items == 0:\n return None\n else:\n return items[0]['id']"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef create_item_returning_id(self, api):\n resp = api.create_item(self)\n item = resp.json()\n return item['id']", "response": "Create this item in the D4S2 service and return the id of the newly created item"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef share(self, project, to_user, force_send, auth_role, user_message):\n if self._is_current_user(to_user):\n raise ShareWithSelfError(SHARE_WITH_SELF_MESSAGE.format(\"share\"))\n if not to_user.email:\n self._raise_user_missing_email_exception(\"share\")\n self.set_user_project_permission(project, to_user, auth_role)\n return self._share_project(D4S2Api.SHARE_DESTINATION, project, to_user, force_send, auth_role, user_message)", "response": "Send mail and give user specified access to the project."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef set_user_project_permission(self, project, user, auth_role):\n self.remote_store.set_user_project_permission(project, user, auth_role)", "response": "Set user access permissions for a project."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nsending a 
message to the service to email user for the project to_user.", "response": "def deliver(self, project, new_project_name, to_user, share_users, force_send, path_filter, user_message):\n \"\"\"\n Remove access to project_name for to_user, copy to new_project_name if not None,\n send message to service to email user so they can have access.\n :param project: RemoteProject pre-existing project to be delivered\n :param new_project_name: str name of non-existing project to copy project_name to, if None we don't copy\n :param to_user: RemoteUser user we are handing over the project to\n :param share_users: [RemoteUser] who will have project shared with them once to_user accepts the project\n :param force_send: boolean enables resending of email for existing projects\n :param path_filter: PathFilter: filters what files are shared\n :param user_message: str message to be sent with the share\n :return: str email we sent deliver to\n \"\"\"\n if self._is_current_user(to_user):\n raise ShareWithSelfError(SHARE_WITH_SELF_MESSAGE.format(\"deliver\"))\n if not to_user.email:\n self._raise_user_missing_email_exception(\"deliver\")\n self.remove_user_permission(project, to_user)\n if new_project_name:\n project = self._copy_project(project, new_project_name, path_filter)\n return self._share_project(D4S2Api.DELIVER_DESTINATION, project, to_user,\n force_send, user_message=user_message, share_users=share_users)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _share_project(self, destination, project, to_user, force_send, auth_role='', user_message='',\n share_users=None):\n \"\"\"\n Send message to remote service to email/share project with to_user.\n :param destination: str which type of sharing we are doing (SHARE_DESTINATION or DELIVER_DESTINATION)\n :param project: RemoteProject project we are sharing\n :param to_user: RemoteUser user we are sharing with\n :param auth_role: str project role eg 'project_admin' email is customized 
based on this setting.\n :param user_message: str message to be sent with the share\n :param share_users: [RemoteUser] users to have this project shared with after delivery (delivery only)\n :return: the email the user should receive a message on soon\n \"\"\"\n from_user = self.remote_store.get_current_user()\n share_user_ids = None\n if share_users:\n share_user_ids = [share_user.id for share_user in share_users]\n item = D4S2Item(destination=destination,\n from_user_id=from_user.id,\n to_user_id=to_user.id,\n project_id=project.id,\n project_name=project.name,\n auth_role=auth_role,\n user_message=user_message,\n share_user_ids=share_user_ids)\n item.send(self.api, force_send)\n return to_user.email", "response": "Send message to remote service to email to share project with to_user."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ncopies project to new_project_name.", "response": "def _copy_project(self, project, new_project_name, path_filter):\n \"\"\"\n Copy pre-existing project with name project_name to non-existing project new_project_name.\n :param project: remotestore.RemoteProject project to copy from\n :param new_project_name: str project to copy to\n :param path_filter: PathFilter: filters what files are shared\n :return: RemoteProject new project we copied data to\n \"\"\"\n temp_directory = tempfile.mkdtemp()\n new_project_name_or_id = ProjectNameOrId.create_from_name(new_project_name)\n remote_project = self.remote_store.fetch_remote_project(new_project_name_or_id)\n if remote_project:\n raise ValueError(\"A project with name '{}' already exists.\".format(new_project_name))\n activity = CopyActivity(self.remote_store.data_service, project, new_project_name)\n self._download_project(activity, project, temp_directory, path_filter)\n self._upload_project(activity, new_project_name, temp_directory)\n activity.finished()\n shutil.rmtree(temp_directory)\n return 
self.remote_store.fetch_remote_project(new_project_name_or_id, must_exist=True)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ndownloading the project with project_name to temp_directory.", "response": "def _download_project(self, activity, project, temp_directory, path_filter):\n \"\"\"\n Download the project with project_name to temp_directory.\n :param activity: CopyActivity: info about the copy activity we are downloading for\n :param project: remotestore.RemoteProject project to download\n :param temp_directory: str path to directory we can download into\n :param path_filter: PathFilter: filters what files are shared\n \"\"\"\n self.print_func(\"Downloading a copy of '{}'.\".format(project.name))\n project_download = ProjectDownload(self.remote_store, project, temp_directory, path_filter,\n file_download_pre_processor=DownloadedFileRelations(activity))\n project_download.run()"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nuploading the contents of temp_directory into project_name", "response": "def _upload_project(self, activity, project_name, temp_directory):\n \"\"\"\n Upload the contents of temp_directory into project_name\n :param activity: CopyActivity: info about the copy activity we are uploading for\n :param project_name: str project name we will upload files to\n :param temp_directory: str path to directory whose files we will upload\n \"\"\"\n self.print_func(\"Uploading to '{}'.\".format(project_name))\n items_to_send = [os.path.join(temp_directory, item) for item in os.listdir(os.path.abspath(temp_directory))]\n project_name_or_id = ProjectNameOrId.create_from_name(project_name)\n project_upload = ProjectUpload(self.config, project_name_or_id, items_to_send,\n file_upload_post_processor=UploadedFileRelations(activity))\n project_upload.run()"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncheck if the current user is the passed in user", "response": "def 
_is_current_user(self, some_user):\n \"\"\"\n Is the specified user the current user?\n :param some_user: RemoteUser user we want to check against the current user\n :return: boolean: True if the current user is the passed in user\n \"\"\"\n current_user = self.remote_store.get_current_user()\n return current_user.id == some_user.id"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef finished(self):\n self.data_service.update_activity(self.id, self.name, self.desc,\n started_on=self.started,\n ended_on=self._current_timestamp_str())", "response": "Mark the activity as finished."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef run(self, data_service, project_file):\n remote_path = project_file.path\n file_dict = data_service.get_file(project_file.id).json()\n file_version_id = file_dict['current_version']['id']\n data_service.create_used_relation(self.activity.id, KindType.file_str, file_version_id)\n self.activity.remote_path_to_file_version_id[remote_path] = file_version_id", "response": "Attach a remote file to activity with used relationship."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef run(self, data_service, file_details):\n file_version_id = file_details['current_version']['id']\n data_service.create_was_generated_by_relation(self.activity.id, KindType.file_str, file_version_id)\n used_entity_id = self._lookup_used_entity_id(file_details)\n data_service.create_was_derived_from_relation(used_entity_id, KindType.file_str,\n file_version_id, KindType.file_str)", "response": "Attach a remote file to activity with was generated by relationship."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _lookup_used_entity_id(self, file_details):\n # Since this uses the response from POST to /files/ this will 
include the ancestors and not be\n # affected by exclude_response_fields that were used when listing the project\n name_parts = [ancestor['name'] for ancestor in file_details['ancestors']\n if ancestor['kind'] == KindType.folder_str]\n name_parts.append(file_details['name'])\n remote_path = os.sep.join(name_parts)\n return self.activity.remote_path_to_file_version_id[remote_path]", "response": "Return the file_version_id associated with the path from file_details."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef create_config(allow_insecure_config_file=False):\n config = Config()\n config.add_properties(GLOBAL_CONFIG_FILENAME)\n user_config_filename = get_user_config_filename()\n if user_config_filename == LOCAL_CONFIG_FILENAME and not allow_insecure_config_file:\n verify_file_private(user_config_filename)\n config.add_properties(user_config_filename)\n return config", "response": "Create config based on the user's configuration file."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef add_properties(self, filename):\n filename = os.path.expanduser(filename)\n if os.path.exists(filename):\n with open(filename, 'r') as yaml_file:\n self.update_properties(yaml.safe_load(yaml_file))", "response": "Add properties from a YAML file to the config based on filename replacing previous values."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_portal_url_base(self):\n api_url = urlparse(self.url).hostname\n portal_url = re.sub(r'^api\\.', '', api_url)\n portal_url = re.sub(r'api', '', portal_url)\n return portal_url", "response": "Determine the portal url base from the data service api url."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef auth(self):\n return self.values.get(Config.AUTH, os.environ.get(AUTH_ENV_KEY_NAME, None))", 
"response": "Returns the auth token for use with connecting to the dataservice."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreturns the number of bytes per chunk to be sent to external store.", "response": "def upload_bytes_per_chunk(self):\n \"\"\"\n Return the bytes per chunk to be sent to external store.\n :return: int bytes per upload chunk\n \"\"\"\n value = self.values.get(Config.UPLOAD_BYTES_PER_CHUNK, DDS_DEFAULT_UPLOAD_CHUNKS)\n return Config.parse_bytes_str(value)"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning the number of parallel works to use when downloading a file.", "response": "def download_workers(self):\n \"\"\"\n Return the number of parallel works to use when downloading a file.\n :return: int number of workers. Specify None or 1 to disable parallel downloading\n \"\"\"\n # Profiling download on different servers showed half the number of CPUs to be optimum for speed.\n default_workers = int(math.ceil(default_num_workers() / 2))\n return self.values.get(Config.DOWNLOAD_WORKERS, default_workers)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef parse_bytes_str(value):\n if type(value) == str:\n if \"MB\" in value:\n return int(value.replace(\"MB\", \"\")) * MB_TO_BYTES\n else:\n return int(value)\n else:\n return value", "response": "Given a value return the integer number of bytes it represents. 
A trailing MB causes the value to be multiplied by 1024 * 1024."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ncalculates the error between two given values.", "response": "def local_error(self, originalValue, calculatedValue):\n \"\"\"Calculates the error between the two given values.\n\n :param list originalValue: List containing the values of the original data.\n :param list calculatedValue: List containing the values of the calculated TimeSeries that\n corresponds to originalValue.\n\n :return: Returns the error measure of the two given values.\n :rtype: numeric\n \"\"\"\n originalValue = originalValue[0]\n calculatedValue = calculatedValue[0]\n\n return abs(originalValue - calculatedValue)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef run_instances(self, image_id, min_count, max_count,\n security_groups=None, key_name=None, instance_type=None,\n user_data=None, availability_zone=None, kernel_id=None,\n ramdisk_id=None, subnet_id=None, security_group_ids=None):\n \"\"\"Run new instances.\n\n TODO: blockDeviceMapping, monitoring, subnetId\n \"\"\"\n params = {\"ImageId\": image_id, \"MinCount\": str(min_count),\n \"MaxCount\": str(max_count)}\n if key_name is not None:\n params[\"KeyName\"] = key_name\n if subnet_id is not None:\n params[\"SubnetId\"] = subnet_id\n if security_group_ids is not None:\n for i, id in enumerate(security_group_ids):\n params[\"SecurityGroupId.%d\" % (i + 1)] = id\n else:\n msg = \"You must specify the security_group_ids with the subnet_id\"\n raise ValueError(msg)\n elif security_groups is not None:\n for i, name in enumerate(security_groups):\n params[\"SecurityGroup.%d\" % (i + 1)] = name\n else:\n msg = (\"You must specify either the subnet_id and \"\n \"security_group_ids or security_groups\")\n raise ValueError(msg)\n if user_data is not None:\n params[\"UserData\"] = b64encode(user_data)\n if instance_type is not None:\n 
params[\"InstanceType\"] = instance_type\n if availability_zone is not None:\n params[\"Placement.AvailabilityZone\"] = availability_zone\n if kernel_id is not None:\n params[\"KernelId\"] = kernel_id\n if ramdisk_id is not None:\n params[\"RamdiskId\"] = ramdisk_id\n query = self.query_factory(\n action=\"RunInstances\", creds=self.creds, endpoint=self.endpoint,\n other_params=params)\n d = query.submit()\n return d.addCallback(self.parser.run_instances)", "response": "Run new instances.\n\n TODO: blockDeviceMapping, monitoring, subnetId"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_console_output(self, instance_id):\n InstanceIDParam = {\"InstanceId\": instance_id}\n query = self.query_factory(\n action=\"GetConsoleOutput\", creds=self.creds,\n endpoint=self.endpoint, other_params=InstanceIDParam)\n d = query.submit()\n return d.addCallback(self.parser.get_console_output)", "response": "Get the console output for a single instance."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef describe_security_groups(self, *names):\n group_names = {}\n if names:\n group_names = dict([(\"GroupName.%d\" % (i + 1), name)\n for i, name in enumerate(names)])\n query = self.query_factory(\n action=\"DescribeSecurityGroups\", creds=self.creds,\n endpoint=self.endpoint, other_params=group_names)\n d = query.submit()\n return d.addCallback(self.parser.describe_security_groups)", "response": "Describe security groups.\n\n @param names: Optionally, a list of security group names to describe.\n Defaults to all security groups in the account.\n @return: A C{Deferred} that will fire with a list of L{SecurityGroup}s\n retrieved from the cloud."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef create_security_group(self, name, description, vpc_id=None):\n parameters = {\"GroupName\": name, 
\"GroupDescription\": description}\n if vpc_id:\n parameters[\"VpcId\"] = vpc_id\n query = self.query_factory(\n action=\"CreateSecurityGroup\", creds=self.creds,\n endpoint=self.endpoint, other_params=parameters)\n d = query.submit()\n return d.addCallback(self.parser.create_security_group)", "response": "Create a security group."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef authorize_security_group(\n self, group_name=None, group_id=None, source_group_name=\"\", source_group_owner_id=\"\",\n ip_protocol=\"\", from_port=\"\", to_port=\"\", cidr_ip=\"\"):\n \"\"\"\n There are two ways to use C{authorize_security_group}:\n 1) associate an existing group (source group) with the one that you\n are targeting (group_name) with an authorization update; or\n 2) associate a set of IP permissions with the group you are\n targeting with an authorization update.\n\n @param group_name: The group you will be modifying with a new\n authorization.\n @param group_id: The id of the group you will be modifying with\n a new authorization.\n\n Optionally, the following parameters:\n @param source_group_name: Name of security group to authorize access to\n when operating on a user/group pair.\n @param source_group_owner_id: Owner of security group to authorize\n access to when operating on a user/group pair.\n\n If those parameters are not specified, then the following must be:\n @param ip_protocol: IP protocol to authorize access to when operating\n on a CIDR IP.\n @param from_port: Bottom of port range to authorize access to when\n operating on a CIDR IP. This contains the ICMP type if ICMP is\n being authorized.\n @param to_port: Top of port range to authorize access to when operating\n on a CIDR IP. 
This contains the ICMP code if ICMP is being\n authorized.\n @param cidr_ip: CIDR IP range to authorize access to when operating on\n a CIDR IP.\n\n @return: A C{Deferred} that will fire with a truth value for the\n success of the operation.\n \"\"\"\n if source_group_name and source_group_owner_id:\n parameters = {\n \"SourceSecurityGroupName\": source_group_name,\n \"SourceSecurityGroupOwnerId\": source_group_owner_id,\n }\n elif ip_protocol and from_port and to_port and cidr_ip:\n parameters = {\n \"IpProtocol\": ip_protocol,\n \"FromPort\": from_port,\n \"ToPort\": to_port,\n \"CidrIp\": cidr_ip,\n }\n else:\n msg = (\"You must specify either both group parameters or \"\n \"all the ip parameters.\")\n raise ValueError(msg)\n if group_id:\n parameters[\"GroupId\"] = group_id\n elif group_name:\n parameters[\"GroupName\"] = group_name\n else:\n raise ValueError(\"You must specify either the group name or the group id.\")\n query = self.query_factory(\n action=\"AuthorizeSecurityGroupIngress\", creds=self.creds,\n endpoint=self.endpoint, other_params=parameters)\n d = query.submit()\n return d.addCallback(self.parser.truth_return)", "response": "Update a security group with a new ingress authorization, either from a source group or from a set of IP permissions."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef authorize_ip_permission(\n self, group_name, ip_protocol, from_port, to_port, cidr_ip):\n \"\"\"\n This is a convenience function that wraps the \"authorize ip\n permission\" functionality of the C{authorize_security_group} method.\n\n For an explanation of the parameters, see C{authorize_security_group}.\n \"\"\"\n d = self.authorize_security_group(\n group_name,\n ip_protocol=ip_protocol, from_port=from_port, to_port=to_port,\n cidr_ip=cidr_ip)\n return d", "response": "This is a convenience method that wraps the authorize ip\n permission functionality of the C{authorize_security_group} method."} {"SOURCE": "codesearchnet", 
"instruction": "Can you tell what is the following Python 3 function doing\ndef revoke_group_permission(\n self, group_name, source_group_name, source_group_owner_id):\n \"\"\"\n This is a convenience function that wraps the \"authorize group\"\n functionality of the C{authorize_security_group} method.\n\n For an explanation of the parameters, see C{revoke_security_group}.\n \"\"\"\n d = self.revoke_security_group(\n group_name,\n source_group_name=source_group_name,\n source_group_owner_id=source_group_owner_id)\n return d", "response": "This method is used to revoke group permissions for a group."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ncreates a new volume.", "response": "def create_volume(self, availability_zone, size=None, snapshot_id=None):\n \"\"\"Create a new volume.\"\"\"\n params = {\"AvailabilityZone\": availability_zone}\n if ((snapshot_id is None and size is None) or\n (snapshot_id is not None and size is not None)):\n raise ValueError(\"Please provide either size or snapshot_id\")\n if size is not None:\n params[\"Size\"] = str(size)\n if snapshot_id is not None:\n params[\"SnapshotId\"] = snapshot_id\n query = self.query_factory(\n action=\"CreateVolume\", creds=self.creds, endpoint=self.endpoint,\n other_params=params)\n d = query.submit()\n return d.addCallback(self.parser.create_volume)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef describe_snapshots(self, *snapshot_ids):\n snapshot_set = {}\n for pos, snapshot_id in enumerate(snapshot_ids):\n snapshot_set[\"SnapshotId.%d\" % (pos + 1)] = snapshot_id\n query = self.query_factory(\n action=\"DescribeSnapshots\", creds=self.creds,\n endpoint=self.endpoint, other_params=snapshot_set)\n d = query.submit()\n return d.addCallback(self.parser.snapshots)", "response": "Describe available snapshots.\n\n TODO: ownerSet, restorableBySet"} {"SOURCE": "codesearchnet", "instruction": 
"Implement a function in Python 3 to\nremove a previously created snapshot.", "response": "def delete_snapshot(self, snapshot_id):\n \"\"\"Remove a previously created snapshot.\"\"\"\n query = self.query_factory(\n action=\"DeleteSnapshot\", creds=self.creds, endpoint=self.endpoint,\n other_params={\"SnapshotId\": snapshot_id})\n d = query.submit()\n return d.addCallback(self.parser.truth_return)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef attach_volume(self, volume_id, instance_id, device):\n query = self.query_factory(\n action=\"AttachVolume\", creds=self.creds, endpoint=self.endpoint,\n other_params={\"VolumeId\": volume_id, \"InstanceId\": instance_id,\n \"Device\": device})\n d = query.submit()\n return d.addCallback(self.parser.attach_volume)", "response": "Attach the given volume to the specified instance at C { device }."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef describe_keypairs(self, *keypair_names):\n keypairs = {}\n for index, keypair_name in enumerate(keypair_names):\n keypairs[\"KeyName.%d\" % (index + 1)] = keypair_name\n query = self.query_factory(\n action=\"DescribeKeyPairs\", creds=self.creds,\n endpoint=self.endpoint, other_params=keypairs)\n d = query.submit()\n return d.addCallback(self.parser.describe_keypairs)", "response": "Returns information about key pairs available."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef create_keypair(self, keypair_name):\n query = self.query_factory(\n action=\"CreateKeyPair\", creds=self.creds, endpoint=self.endpoint,\n other_params={\"KeyName\": keypair_name})\n d = query.submit()\n return d.addCallback(self.parser.create_keypair)", "response": "Create a new 2048 bit RSA key pair and return a unique ID that can be used to reference the created key pair."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the 
following Python 3 function doing\ndef import_keypair(self, keypair_name, key_material):\n query = self.query_factory(\n action=\"ImportKeyPair\", creds=self.creds, endpoint=self.endpoint,\n other_params={\"KeyName\": keypair_name,\n \"PublicKeyMaterial\": b64encode(key_material)})\n d = query.submit()\n return d.addCallback(self.parser.import_keypair, key_material)", "response": "Import an existing SSH key into EC2."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef allocate_address(self):\n # XXX remove empty other_params\n query = self.query_factory(\n action=\"AllocateAddress\", creds=self.creds, endpoint=self.endpoint,\n other_params={})\n d = query.submit()\n return d.addCallback(self.parser.allocate_address)", "response": "Acquire an elastic IP address to be attached subsequently to EC2\n instances."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nrelease a previously allocated address returned by C { allocate_address.", "response": "def release_address(self, address):\n \"\"\"\n Release a previously allocated address returned by C{allocate_address}.\n\n @return: C{True} if the operation succeeded.\n \"\"\"\n query = self.query_factory(\n action=\"ReleaseAddress\", creds=self.creds, endpoint=self.endpoint,\n other_params={\"PublicIp\": address})\n d = query.submit()\n return d.addCallback(self.parser.truth_return)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef describe_addresses(self, *addresses):\n address_set = {}\n for pos, address in enumerate(addresses):\n address_set[\"PublicIp.%d\" % (pos + 1)] = address\n query = self.query_factory(\n action=\"DescribeAddresses\", creds=self.creds,\n endpoint=self.endpoint, other_params=address_set)\n d = query.submit()\n return d.addCallback(self.parser.describe_addresses)", "response": "Describe the elastic IPs allocated in this account."} {"SOURCE": "codesearchnet", 
"instruction": "How would you code a function in Python 3 to\nparse the instancesSet XML payload and return a list of L { Instance s.", "response": "def instances_set(self, root, reservation):\n \"\"\"Parse instance data out of an XML payload.\n\n @param root: The root node of the XML payload.\n @param reservation: The L{Reservation} associated with the instances\n from the response.\n @return: A C{list} of L{Instance}s.\n \"\"\"\n instances = []\n for instance_data in root.find(\"instancesSet\"):\n instances.append(self.instance(instance_data, reservation))\n return instances"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nparsing the instance data out of an XML payload.", "response": "def instance(self, instance_data, reservation):\n \"\"\"Parse instance data out of an XML payload.\n\n @param instance_data: An XML node containing instance data.\n @param reservation: The L{Reservation} associated with the instance.\n @return: An L{Instance}.\n\n TODO: reason, platform, monitoring, subnetId, vpcId, privateIpAddress,\n ipAddress, stateReason, architecture, rootDeviceName,\n blockDeviceMapping, instanceLifecycle, spotInstanceRequestId.\n \"\"\"\n for group_data in instance_data.find(\"groupSet\"):\n group_id = group_data.findtext(\"groupId\")\n group_name = group_data.findtext(\"groupName\")\n reservation.groups.append((group_id, group_name))\n instance_id = instance_data.findtext(\"instanceId\")\n instance_state = instance_data.find(\n \"instanceState\").findtext(\"name\")\n private_dns_name = instance_data.findtext(\"privateDnsName\")\n dns_name = instance_data.findtext(\"dnsName\")\n private_ip_address = instance_data.findtext(\"privateIpAddress\")\n ip_address = instance_data.findtext(\"ipAddress\")\n key_name = instance_data.findtext(\"keyName\")\n ami_launch_index = instance_data.findtext(\"amiLaunchIndex\")\n products = []\n product_codes = instance_data.find(\"productCodes\")\n if product_codes is not None:\n for product_data in 
instance_data.find(\"productCodes\"):\n products.append(product_data.text)\n instance_type = instance_data.findtext(\"instanceType\")\n launch_time = instance_data.findtext(\"launchTime\")\n placement = instance_data.find(\"placement\").findtext(\n \"availabilityZone\")\n kernel_id = instance_data.findtext(\"kernelId\")\n ramdisk_id = instance_data.findtext(\"ramdiskId\")\n image_id = instance_data.findtext(\"imageId\")\n instance = model.Instance(\n instance_id, instance_state, instance_type, image_id,\n private_dns_name, dns_name, private_ip_address, ip_address,\n key_name, ami_launch_index, launch_time, placement, products,\n kernel_id, ramdisk_id, reservation=reservation)\n return instance"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef describe_instances(self, xml_bytes):\n root = XML(xml_bytes)\n results = []\n # May be a more elegant way to do this:\n for reservation_data in root.find(\"reservationSet\"):\n # Create a reservation object with the parsed data.\n reservation = model.Reservation(\n reservation_id=reservation_data.findtext(\"reservationId\"),\n owner_id=reservation_data.findtext(\"ownerId\"))\n # Get the list of instances.\n instances = self.instances_set(\n reservation_data, reservation)\n results.extend(instances)\n return results", "response": "Parse the reservations XML payload that is returned from an AWS CloudFormation describeInstances API call and return a list of model. 
Instances."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef run_instances(self, xml_bytes):\n root = XML(xml_bytes)\n # Get the security group information.\n groups = []\n for group_data in root.find(\"groupSet\"):\n group_id = group_data.findtext(\"groupId\")\n groups.append(group_id)\n # Create a reservation object with the parsed data.\n reservation = model.Reservation(\n reservation_id=root.findtext(\"reservationId\"),\n owner_id=root.findtext(\"ownerId\"),\n groups=groups)\n # Get the list of instances.\n instances = self.instances_set(root, reservation)\n return instances", "response": "Parse the reservations XML payload that is returned from an AWS\n RunInstances API call."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nparse the XML returned by the C { TerminateInstances function and return an iterable of tuples of instanceId previousState currentState", "response": "def terminate_instances(self, xml_bytes):\n \"\"\"Parse the XML returned by the C{TerminateInstances} function.\n\n @param xml_bytes: XML bytes with a C{TerminateInstancesResponse} root\n element.\n @return: An iterable of C{tuple} of (instanceId, previousState,\n currentState) for the ec2 instances that where terminated.\n \"\"\"\n root = XML(xml_bytes)\n result = []\n # May be a more elegant way to do this:\n instances = root.find(\"instancesSet\")\n if instances is not None:\n for instance in instances:\n instanceId = instance.findtext(\"instanceId\")\n previousState = instance.find(\"previousState\").findtext(\n \"name\")\n currentState = instance.find(\"currentState\").findtext(\n \"name\")\n result.append((instanceId, previousState, currentState))\n return result"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nparse the XML returned by the C { DescribeSecurityGroups} function and return a list of L { SecurityGroup } instances.", "response": "def 
describe_security_groups(self, xml_bytes):\n \"\"\"Parse the XML returned by the C{DescribeSecurityGroups} function.\n\n @param xml_bytes: XML bytes with a C{DescribeSecurityGroupsResponse}\n root element.\n @return: A list of L{SecurityGroup} instances.\n \"\"\"\n root = XML(xml_bytes)\n result = []\n for group_info in root.findall(\"securityGroupInfo/item\"):\n id = group_info.findtext(\"groupId\")\n name = group_info.findtext(\"groupName\")\n description = group_info.findtext(\"groupDescription\")\n owner_id = group_info.findtext(\"ownerId\")\n allowed_groups = []\n allowed_ips = []\n ip_permissions = group_info.find(\"ipPermissions\")\n if ip_permissions is None:\n ip_permissions = ()\n for ip_permission in ip_permissions:\n\n # openstack doesn't handle self authorized groups properly\n # XXX this is an upstream problem and should be addressed there\n # lp bug #829609\n ip_protocol = ip_permission.findtext(\"ipProtocol\")\n from_port = ip_permission.findtext(\"fromPort\")\n to_port = ip_permission.findtext(\"toPort\")\n\n if from_port:\n from_port = int(from_port)\n\n if to_port:\n to_port = int(to_port)\n\n for groups in ip_permission.findall(\"groups/item\") or ():\n user_id = groups.findtext(\"userId\")\n group_name = groups.findtext(\"groupName\")\n if user_id and group_name:\n if (user_id, group_name) not in allowed_groups:\n allowed_groups.append((user_id, group_name))\n for ip_ranges in ip_permission.findall(\"ipRanges/item\") or ():\n cidr_ip = ip_ranges.findtext(\"cidrIp\")\n allowed_ips.append(\n model.IPPermission(\n ip_protocol, from_port, to_port, cidr_ip))\n\n allowed_groups = [model.UserIDGroupPair(user_id, group_name)\n for user_id, group_name in allowed_groups]\n\n security_group = model.SecurityGroup(\n id, name, description, owner_id=owner_id,\n groups=allowed_groups, ips=allowed_ips)\n result.append(security_group)\n return result"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nparses the XML 
returned by the C{DescribeVolumes function and return a list of L { Volume } instances.", "response": "def describe_volumes(self, xml_bytes):\n \"\"\"Parse the XML returned by the C{DescribeVolumes} function.\n\n @param xml_bytes: XML bytes with a C{DescribeVolumesResponse} root\n element.\n @return: A list of L{Volume} instances.\n\n TODO: attachementSetItemResponseType#deleteOnTermination\n \"\"\"\n root = XML(xml_bytes)\n result = []\n for volume_data in root.find(\"volumeSet\"):\n volume_id = volume_data.findtext(\"volumeId\")\n size = int(volume_data.findtext(\"size\"))\n snapshot_id = volume_data.findtext(\"snapshotId\")\n availability_zone = volume_data.findtext(\"availabilityZone\")\n status = volume_data.findtext(\"status\")\n create_time = volume_data.findtext(\"createTime\")\n create_time = datetime.strptime(\n create_time[:19], \"%Y-%m-%dT%H:%M:%S\")\n volume = model.Volume(\n volume_id, size, status, create_time, availability_zone,\n snapshot_id)\n result.append(volume)\n for attachment_data in volume_data.find(\"attachmentSet\"):\n instance_id = attachment_data.findtext(\"instanceId\")\n device = attachment_data.findtext(\"device\")\n status = attachment_data.findtext(\"status\")\n attach_time = attachment_data.findtext(\"attachTime\")\n attach_time = datetime.strptime(\n attach_time[:19], \"%Y-%m-%dT%H:%M:%S\")\n attachment = model.Attachment(\n instance_id, device, status, attach_time)\n volume.attachments.append(attachment)\n return result"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nparse the XML returned by the C { CreateVolume function.", "response": "def create_volume(self, xml_bytes):\n \"\"\"Parse the XML returned by the C{CreateVolume} function.\n\n @param xml_bytes: XML bytes with a C{CreateVolumeResponse} root\n element.\n @return: The L{Volume} instance created.\n \"\"\"\n root = XML(xml_bytes)\n volume_id = root.findtext(\"volumeId\")\n size = int(root.findtext(\"size\"))\n snapshot_id = 
root.findtext(\"snapshotId\")\n availability_zone = root.findtext(\"availabilityZone\")\n status = root.findtext(\"status\")\n create_time = root.findtext(\"createTime\")\n create_time = datetime.strptime(\n create_time[:19], \"%Y-%m-%dT%H:%M:%S\")\n volume = model.Volume(\n volume_id, size, status, create_time, availability_zone,\n snapshot_id)\n return volume"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef snapshots(self, xml_bytes):\n root = XML(xml_bytes)\n result = []\n for snapshot_data in root.find(\"snapshotSet\"):\n snapshot_id = snapshot_data.findtext(\"snapshotId\")\n volume_id = snapshot_data.findtext(\"volumeId\")\n status = snapshot_data.findtext(\"status\")\n start_time = snapshot_data.findtext(\"startTime\")\n start_time = datetime.strptime(\n start_time[:19], \"%Y-%m-%dT%H:%M:%S\")\n progress = snapshot_data.findtext(\"progress\")[:-1]\n progress = float(progress or \"0\") / 100.\n snapshot = model.Snapshot(\n snapshot_id, volume_id, status, start_time, progress)\n result.append(snapshot)\n return result", "response": "Parse the XML returned by the DescribeSnapshots function and return a list of L { Snapshot } instances."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef create_snapshot(self, xml_bytes):\n root = XML(xml_bytes)\n snapshot_id = root.findtext(\"snapshotId\")\n volume_id = root.findtext(\"volumeId\")\n status = root.findtext(\"status\")\n start_time = root.findtext(\"startTime\")\n start_time = datetime.strptime(\n start_time[:19], \"%Y-%m-%dT%H:%M:%S\")\n progress = root.findtext(\"progress\")[:-1]\n progress = float(progress or \"0\") / 100.\n return model.Snapshot(\n snapshot_id, volume_id, status, start_time, progress)", "response": "Parse the XML returned by the C { CreateSnapshot function."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it 
does\ndef attach_volume(self, xml_bytes):\n root = XML(xml_bytes)\n status = root.findtext(\"status\")\n attach_time = root.findtext(\"attachTime\")\n attach_time = datetime.strptime(\n attach_time[:19], \"%Y-%m-%dT%H:%M:%S\")\n return {\"status\": status, \"attach_time\": attach_time}", "response": "Parse the XML returned by the C { AttachVolume function."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nparsing the XML returned by the C { DescribeKeyPairs function.", "response": "def describe_keypairs(self, xml_bytes):\n \"\"\"Parse the XML returned by the C{DescribeKeyPairs} function.\n\n @param xml_bytes: XML bytes with a C{DescribeKeyPairsResponse} root\n element.\n @return: a C{list} of L{Keypair}.\n \"\"\"\n results = []\n root = XML(xml_bytes)\n keypairs = root.find(\"keySet\")\n if keypairs is None:\n return results\n for keypair_data in keypairs:\n key_name = keypair_data.findtext(\"keyName\")\n key_fingerprint = keypair_data.findtext(\"keyFingerprint\")\n results.append(model.Keypair(key_name, key_fingerprint))\n return results"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nparse the XML returned by the C { CreateKeyPair function.", "response": "def create_keypair(self, xml_bytes):\n \"\"\"Parse the XML returned by the C{CreateKeyPair} function.\n\n @param xml_bytes: XML bytes with a C{CreateKeyPairResponse} root\n element.\n @return: The L{Keypair} instance created.\n \"\"\"\n keypair_data = XML(xml_bytes)\n key_name = keypair_data.findtext(\"keyName\")\n key_fingerprint = keypair_data.findtext(\"keyFingerprint\")\n key_material = keypair_data.findtext(\"keyMaterial\")\n return model.Keypair(key_name, key_fingerprint, key_material)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nparse the XML returned by the C { DescribeAddresses function.", "response": "def describe_addresses(self, xml_bytes):\n \"\"\"Parse the XML returned by the C{DescribeAddresses} function.\n\n 
@param xml_bytes: XML bytes with a C{DescribeAddressesResponse} root\n element.\n @return: a C{list} of L{tuple} of (publicIp, instancId).\n \"\"\"\n results = []\n root = XML(xml_bytes)\n for address_data in root.find(\"addressesSet\"):\n address = address_data.findtext(\"publicIp\")\n instance_id = address_data.findtext(\"instanceId\")\n results.append((address, instance_id))\n return results"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nparse the XML returned by the C { DescribeAvailibilityZonesResponse function.", "response": "def describe_availability_zones(self, xml_bytes):\n \"\"\"Parse the XML returned by the C{DescribeAvailibilityZones} function.\n\n @param xml_bytes: XML bytes with a C{DescribeAvailibilityZonesResponse}\n root element.\n @return: a C{list} of L{AvailabilityZone}.\n\n TODO: regionName, messageSet\n \"\"\"\n results = []\n root = XML(xml_bytes)\n for zone_data in root.find(\"availabilityZoneInfo\"):\n zone_name = zone_data.findtext(\"zoneName\")\n zone_state = zone_data.findtext(\"zoneState\")\n results.append(model.AvailabilityZone(zone_name, zone_state))\n return results"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nsigns this query using its built in credentials.", "response": "def sign(self, hash_type=\"sha256\"):\n \"\"\"Sign this query using its built in credentials.\n\n @param hash_type: if the SignatureVersion is 2, specify the type of\n hash to use, either \"sha1\" or \"sha256\". It defaults to the latter.\n\n This prepares it to be sent, and should be done as the last step before\n submitting the query. 
Signing is done automatically - this is a public\n method to facilitate testing.\n \"\"\"\n version = self.params[\"SignatureVersion\"]\n if version == \"2\":\n self.params[\"SignatureMethod\"] = \"Hmac%s\" % hash_type.upper()\n self.params[\"Signature\"] = self.signature.compute()"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef submit(self):\n self.sign()\n url = self.endpoint.get_uri()\n method = self.endpoint.method\n params = self.signature.get_canonical_query_params()\n headers = {}\n kwargs = {\"method\": method}\n if method == \"POST\":\n headers[\"Content-Type\"] = \"application/x-www-form-urlencoded\"\n kwargs[\"postdata\"] = params\n else:\n url += \"?%s\" % params\n if self.endpoint.get_host() != self.endpoint.get_canonical_host():\n headers[\"Host\"] = self.endpoint.get_canonical_host()\n if headers:\n kwargs[\"headers\"] = headers\n if self.timeout:\n kwargs[\"timeout\"] = self.timeout\n d = self.get_page(url, **kwargs)\n return d.addErrback(ec2_error_wrapper)", "response": "Submit this query.\n\n @return: A deferred from get_page"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef compute(self):\n if \"Signature\" in self.params:\n raise RuntimeError(\"Existing signature in parameters\")\n if self.signature_version is not None:\n version = self.signature_version\n else:\n version = self.params[\"SignatureVersion\"]\n if str(version) == \"1\":\n bytes = self.old_signing_text()\n hash_type = \"sha1\"\n elif str(version) == \"2\":\n bytes = self.signing_text()\n if self.signature_method is not None:\n signature_method = self.signature_method\n else:\n signature_method = self.params[\"SignatureMethod\"]\n hash_type = signature_method[len(\"Hmac\"):].lower()\n else:\n raise RuntimeError(\"Unsupported SignatureVersion: '%s'\" % version)\n return self.creds.sign(bytes, hash_type)", "response": "Compute and return the signature according to the 
given data."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef old_signing_text(self):\n result = []\n lower_cmp = lambda x, y: cmp(x[0].lower(), y[0].lower())\n for key, value in sorted(self.params.items(), cmp=lower_cmp):\n result.append(\"%s%s\" % (key, value))\n return \"\".join(result)", "response": "Return the text needed for signing using SignatureVersion 1."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning the text to be signed when signing the query.", "response": "def signing_text(self):\n \"\"\"Return the text to be signed when signing the query.\"\"\"\n result = \"%s\\n%s\\n%s\\n%s\" % (self.endpoint.method,\n self.endpoint.get_canonical_host(),\n self.endpoint.path,\n self.get_canonical_query_params())\n return result"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning the canonical query params used in signing.", "response": "def get_canonical_query_params(self):\n \"\"\"Return the canonical query params (used in signing).\"\"\"\n result = []\n for key, value in self.sorted_params():\n result.append(\"%s=%s\" % (self.encode(key), self.encode(value)))\n return \"&\".join(result)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nencodes a_string as per the canonicalisation encoding rules.", "response": "def encode(self, string):\n \"\"\"Encode a_string as per the canonicalisation encoding rules.\n\n See the AWS dev reference page 186 (2009-11-30 version).\n @return: a_string encoded.\n \"\"\"\n if isinstance(string, unicode):\n string = string.encode(\"utf-8\")\n return quote(string, safe=\"~\")"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncreate an instance of this from XML bytes.", "response": "def from_xml(cls, xml_bytes):\n \"\"\"\n Create an instance of this from XML bytes.\n\n @param xml_bytes: C{str} bytes of XML to parse\n @return: an instance of 
L{MultipartInitiationResponse}\n \"\"\"\n root = XML(xml_bytes)\n return cls(root.findtext('Bucket'),\n root.findtext('Key'),\n root.findtext('UploadId'))"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef pythag(a, b):\n absA = abs(a)\n absB = abs(b)\n if absA > absB:\n return absA * sqrt(1.0 + (absB / float(absA)) ** 2)\n elif absB == 0.0:\n return 0.0\n else:\n return absB * sqrt(1.0 + (absA / float(absB)) ** 2)", "response": "Compute the Pythagorean theorem"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _initialize_with_array(self, data, rowBased=True):\n if rowBased:\n self.matrix = []\n if len(data) != self._rows:\n raise ValueError(\"Size of Matrix does not match\")\n for col in xrange(self._columns):\n self.matrix.append([])\n for row in xrange(self._rows):\n if len(data[row]) != self._columns:\n raise ValueError(\"Size of Matrix does not match\")\n self.matrix[col].append(data[row][col])\n else:\n if len(data) != self._columns:\n raise ValueError(\"Size of Matrix does not match\")\n for col in data:\n if len(col) != self._rows:\n raise ValueError(\"Size of Matrix does not match\")\n self.matrix = copy.deepcopy(data)", "response": "Set the matrix values from a two dimensional list."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ncreating a new Matrix instance from a TimeSeries or MultiDimensionalTimeSeries.", "response": "def from_timeseries(cls, timeSeries):\n \"\"\"Create a new Matrix instance from a TimeSeries or MultiDimensionalTimeSeries\n\n :param TimeSeries timeSeries: The TimeSeries, which should be used to\n create a new Matrix.\n :return: A Matrix with the values of the timeSeries. 
Each row of\n the Matrix represents one entry of the timeSeries.\n The time of an entry is ignored in the matrix.\n :rtype: Matrix\n\n :raise: Raises an :py:exc:`ValueError`, if the timeSeries is empty.\n \"\"\"\n width = 1\n\n if isinstance(timeSeries, MultiDimensionalTimeSeries):\n width = timeSeries.dimension_count()\n\n matrixData = [[] for dummy in xrange(width)]\n\n for entry in timeSeries:\n for col in xrange(1, len(entry)):\n matrixData[col - 1].append(entry[col])\n if not matrixData[0]:\n raise ValueError(\"Cannot create Matrix from empty Timeseries\")\n mtrx = Matrix.from_two_dim_array(len(matrixData), len(matrixData[0]), matrixData)\n\n # mtrx.initialize(matrixData, rowBased=False)\n return mtrx"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef from_two_dim_array(cls, cols, rows, twoDimArray):\n return Matrix(cols, rows, twoDimArray, rowBased=False, isOneDimArray=False)", "response": "Create a new Matrix instance from a two dimensional array."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef to_multi_dim_timeseries(self):\n ts = MultiDimensionalTimeSeries(dimensions=self.get_width())\n for row in xrange(self.get_height()):\n newEntry = []\n for col in xrange(self.get_width()):\n newEntry.append(self.get_value(col, row))\n ts.add_entry(row, newEntry)\n return ts", "response": "Return a MultiDimensionalTimeSeries with the values of self"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_array(self, rowBased=True):\n if rowBased:\n array = []\n for row in xrange(self._rows):\n newRow = []\n for col in xrange(self._columns):\n newRow.append(self.get_value(col, row))\n array.append(newRow)\n return array\n return copy.deepcopy(self.matrix)", "response": "Return a two dimensional list with the values of the i'th column based."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the 
documentation for the following Python 3 function\ndef get_matrix_from_list(self, rows, columns, matrix_list, rowBased=True):\n resultMatrix = Matrix(columns, rows, matrix_list, rowBased)\n return resultMatrix", "response": "Create a Matrix instance from a list of values."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef set_value(self, column, row, value):\n self.matrix[column][row] = value", "response": "Set the value of the matrix at the specified column and row."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef invers(self):\n\n if self._columns != self._rows:\n raise ValueError(\"A square matrix is needed\")\n mArray = self.get_array(False)\n appList = [0] * self._columns\n\n # add identity matrix to array in order to use gauss jordan algorithm\n for col in xrange(self._columns):\n mArray.append(appList[:])\n mArray[self._columns + col][col] = 1\n # create new Matrix and execute gass jordan algorithm\n exMatrix = Matrix.from_two_dim_array(2 * self._columns, self._rows, mArray)\n gjResult = exMatrix.gauss_jordan()\n # remove identity matrix from left side\n # TODO Implement slicing directly for Matrix\n gjResult.matrix = gjResult.matrix[self._columns:]\n gjResult._columns = len(gjResult.matrix)\n return gjResult", "response": "Return the invers matrix if it can be calculated."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nmultiplying the matrix with the given matrix and return the result.", "response": "def matrix_multiplication(self, matrix):\n \"\"\"Multiply :py:obj:`self` with the given matrix and return result matrix.\n\n param Matrix matrix: The matrix, which should be multiplied.\n\n :return: Returns a new Matrix with the result of the multiplication\n :rtype: Matrix\n\n :note: Make sure, that the matrices can be multiplied.\n The number of columns of the Matrix instance must match with\n the 
number of rows of the Matrix given as parameter.\n Use is_matrix_mult_possible(matrix) to test.\n \"\"\"\n resultMatrix = Matrix(matrix.get_width(), self.get_height())\n for r_row in xrange(self._rows):\n for r_col in xrange(matrix.get_width()):\n #blockwise matrix multiplication hack\n if isinstance(self.get_array()[0][0], Matrix):\n blocksize = self.get_array()[0][0].get_width()\n valueT = Matrix(blocksize, blocksize)\n else:\n valueT = 0\n for column in xrange(matrix.get_height()):\n valueT += self.get_value(column, r_row) * matrix.get_value(r_col, column)\n resultMatrix.set_value(r_col, r_row, valueT)\n return resultMatrix"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nreturn a new Matrix with the columns of the current Matrix.", "response": "def flatten(self):\n \"\"\"\n If the current Matrix consists of Blockmatrixes as elementes method\n flattens the Matrix into one Matrix only consisting of the 2nd level\n elements\n\n [[[1 2] [[3 4] to [[1 2 3 4]\n [5 6]] [7 8]]] [5 6 7 8]]\n \"\"\"\n blocksize = self.get_array()[0][0].get_width()\n width = self.get_width() * blocksize\n\n columnsNew = [[] for dummy in xrange(width)]\n\n for row in self.get_array():\n index = 0\n for submatrix in row:\n for column in submatrix.get_array(False):\n columnsNew[index] += column\n index += 1\n\n columnsFlat = sum(columnsNew, [])\n return Matrix(width, len(columnsNew[0]), columnsFlat, rowBased=False)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nconvert a matrix into a n * m matrix.", "response": "def matrix_to_blockmatrix(self, blocksize):\n \"\"\"\n turns an n*m Matrix into a (n/blocksize)*(m/blocksize matrix).\n Each element is another blocksize*blocksize matrix.\n \"\"\"\n if self.get_width() % blocksize or self.get_height() % blocksize:\n raise ValueError(\"Number of rows and columns have to be evenly dividable by blocksize\")\n selfBlocks = []\n for columnIndex in range(0, self.get_width() - 
1, blocksize):\n for rowIndex in range(0, self.get_height() - 1, blocksize):\n currentBlock = []\n for blockRows in self.get_array(False)[columnIndex:columnIndex + blocksize]:\n currentBlock += blockRows[rowIndex:rowIndex + blocksize]\n selfBlocks.append(Matrix(blocksize, blocksize, currentBlock, rowBased=False))\n return Matrix(self.get_width() / blocksize, self.get_height() / blocksize, selfBlocks, rowBased=False)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef multiply(self, multiplicator):\n result = Matrix(self.get_width(), self.get_height())\n for row in xrange(self.get_height()):\n for col in xrange(self.get_width()):\n result.set_value(col, row, self.get_value(col, row) * multiplicator)\n return result", "response": "Return a new Matrix with each element multiplied by the given multiplicator."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef transform(self):\n t_matrix = Matrix(self._rows, self._columns)\n for col_i, col in enumerate(self.matrix):\n for row_i, entry in enumerate(col):\n t_matrix.set_value(row_i, col_i, entry)\n return t_matrix", "response": "Return the transpose as a new matrix."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef gauss_jordan(self):\n\n mArray = self.get_array(rowBased=False)\n width = self.get_width()\n height = self.get_height()\n if not height < width:\n raise ValueError(\"\"\"Not enough rows\"\"\")\n # Start with complete matrix and remove in each iteration\n # the first row and the first column\n for offset in xrange(height):\n # Switch lines, if current first value is 0\n if mArray[offset][offset] == 0:\n for i in xrange(offset + 1, height):\n if mArray[offset][i] != 0:\n tmp = []\n for j in xrange(offset, width):\n tmp.append(mArray[j][offset])\n # tmp = mArray[offset][offset:]\n for j in xrange(offset, width):\n mArray[j][offset] = mArray[j][i]\n mArray[j][i] = tmp[j - offset]\n # mArray[offset][offset:] = 
mArray[i][offset:]\n # mArray[i] = tmp\n break\n\n currentRow = [mArray[j][offset] for j in xrange(offset, width)]\n divider = float(currentRow[0])\n # If no line is found with a value != 0\n # the matrix is not invertible\n if divider == 0:\n raise ValueError(\"Matrix is not invertible\")\n transformedRow = []\n # Divide current row by first element of current row\n for value in currentRow:\n transformedRow.append(value / divider)\n # put transformed row back into matrix\n for j in xrange(offset, width):\n mArray[j][offset] = transformedRow[j - offset]\n # subtract multiples of the current row, from all remaining rows\n # in order to become a 0 at the current first column\n for i in xrange(offset + 1, height):\n multi = mArray[offset][i]\n for j in xrange(offset, width):\n mArray[j][i] = mArray[j][i] - mArray[j][offset] * multi\n for i in xrange(1, height):\n # subtract multiples of the i-th row from all above rows\n for j in xrange(0, i):\n multi = mArray[i][j]\n for col in xrange(i, width):\n mArray[col][j] = mArray[col][j] - mArray[col][i] * multi\n self.matrix = mArray\n return self", "response": "Reduce the matrix to row echelon form."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreturn Matrices u, b, v with self = ubv and b is in bidiagonal form.", "response": "def householder(self):\n \"\"\"Return Matrices u,b,v with self = ubv and b is in bidiagonal form\n\n The algorithm uses householder transformations.\n\n :return tuple (u,b,v): A tuple with the Matrix u, b and v.\n and self = ubv (except some rounding errors)\n u is a unitary matrix\n b is a bidiagonal matrix.\n v is a unitary matrix.\n :note: Currently the algorithm only works for square matrices\n\n :todo: Make sure, that the bidiagonal matrix is 0.0 except for the bidiagonal.\n Due to rounding errors, this is currently not ensured\n \"\"\"\n # copy instance to transform it to bidiagonal form.\n bidiagMatrix = Matrix.from_two_dim_array(self.get_width(), 
self.get_height(), self.matrix)\n\n # build identity matrix, which is used to calculate householder transformations\n identityMatrixRow = Matrix(self.get_height(), self.get_height())\n for i in xrange(self.get_height()):\n identityMatrixRow.set_value(i, i, 1.0)\n\n identityMatrixCol = Matrix(self.get_width(), self.get_width())\n for i in xrange(self.get_width()):\n identityMatrixCol.set_value(i, i, 1.0)\n\n # zero out the k'th column and row\n for k in xrange(self.get_width() - 1):\n # vector with the values of the k'th column (first k-1 rows are 0)\n x = Vector(self.get_height())\n y = Vector(self.get_height())\n if k > 0:\n x.set_value(0, k - 1, bidiagMatrix.get_value(k, k - 1))\n y.set_value(0, k - 1, bidiagMatrix.get_value(k, k - 1))\n s = 0.0\n for i in xrange(k, self.get_height()):\n val = bidiagMatrix.get_value(k, i)\n x.set_value(0, i, val)\n s += (val ** 2)\n s = sqrt(s)\n # y must have same length as x\n y.set_value(0, k, s)\n tmp = x - y\n norm = sqrt(sum(i[0] ** 2 for i in tmp.get_array()))\n # calculate w = (x-y)/(|x-y|)\n w = tmp / norm\n # uk is the k'th householder matrix for the column\n uk = identityMatrixRow - 2 * (w * w.transform())\n\n bidiagMatrix = uk * bidiagMatrix\n\n if k == 0:\n # set u in first iteration.\n u = uk\n else:\n u = u * uk\n\n # zero out the the row\n if k < self.get_width() - 2:\n x = Vector(self.get_width())\n y = Vector(self.get_width())\n x.set_value(0, k, bidiagMatrix.get_value(k, k))\n y.set_value(0, k, bidiagMatrix.get_value(k, k))\n\n s = 0.0\n for i in xrange(k + 1, bidiagMatrix.get_width()):\n val = bidiagMatrix.get_value(i, k)\n x.set_value(0, i, val)\n s += (val ** 2)\n # length of vector x ignoring the k'th value\n s = sqrt(s)\n # y must have same length as x, since k'th value is equal\n # set k+1 value to s\n y.set_value(0, k + 1, s)\n tmp = x - y\n norm = sqrt(sum(i[0] ** 2 for i in tmp.get_array()))\n w = tmp / norm\n # vk is the k'th householder matrix for the row\n vk = identityMatrixCol - (2 * (w * 
w.transform()))\n bidiagMatrix = bidiagMatrix * vk\n if k == 0:\n # set v in first iteration\n v = vk\n else:\n v = vk * v\n\n return (u, bidiagMatrix, v)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning the singular value decomposition of the Matrix object.", "response": "def svd(self, maxIteration=50):\n \"\"\"Return the singular value decomposition of the Matrix instance\n\n :param integer maxIteration: The maximum number of iterations,\n which are executed in the qr decomposition\n\n :return: A tuple with Matrices u, sigma, v\n so that u * sigma * v^T = self\n :rtype: tuple\n\n :raise: Raises a :py:exc:`ValueError` if the Matrix object has\n more columns than rows\n\n :note: Translation of the FORTRAN implementation of the SVD given\n in the NUMERICAL RECIPES IN FORTRAN 77. THE ART OF SCIENTIFIC\n COMPUTING.\n The algorithm is not yet numerically stable, so the results may\n not be in all cases as expected.\n \"\"\"\n if(self.get_width() > self.get_height()):\n raise ValueError(\"Matrix has more columns than rows.\")\n eps = 1.e-15\n tol = 1.e-64 / eps\n a = self.get_array(False)\n m = len(a[0])\n n = len(a)\n\n v = []\n for k in xrange(n):\n v.append([0.0] * n)\n # output diagonal\n w = [0.0] * n\n # upper diagonal (for bidiagonal form)\n rv1 = [0.0] * n\n\n # Householder Reduction to bidiagonal form\n g = 0.0\n anorm = 0.0\n\n for i in xrange(n):\n l = i + 1\n rv1[i] = g\n\n s = 0.0\n # calculate length of relevant row vector in matrix (part of i'th column)\n s = sum(a[i][k] ** 2 for k in xrange(i, m))\n if s <= tol:\n g = 0.0\n else:\n f = a[i][i]\n # square root to get actual length of vector\n g = sqrt(s) if f < 0 else -sqrt(s)\n h = f * g - s\n a[i][i] = f - g\n for j in xrange(l, n):\n s = sum(a[i][k] * a[j][k] for k in xrange(i, m))\n f = s / h\n for k in xrange(i, m):\n a[j][k] += (f * a[i][k])\n w[i] = g\n # calculate length of relevant column vector in matrix (part of i'th row)\n s = 0.0\n s = sum(a[k][i] ** 
2 for k in xrange(l, n))\n if s <= tol:\n g = 0.0\n else:\n f = a[l][i]\n g = sqrt(s) if f < 0 else -sqrt(s)\n h = f * g - s\n a[l][i] = f - g\n for k in xrange(l, n):\n rv1[k] = a[k][i] / h\n for j in xrange(l, m):\n s = sum(a[k][j] * a[k][i] for k in xrange(l, n))\n for k in xrange(l, n):\n a[k][j] += (s * rv1[k])\n anorm = max(anorm, (abs(w[i]) + abs(rv1[i])))\n\n # Accumulation of right hand transformations\n for i in xrange(n - 1, -1, -1):\n if g != 0.0:\n for j in xrange(l, n):\n v[i][j] = a[j][i] / (g * a[i + 1][i])\n for j in xrange(l, n):\n s = sum(a[k][i] * v[j][k] for k in xrange(l, n))\n for k in xrange(l, n):\n v[j][k] += (s * v[i][k])\n for j in xrange(l, n):\n v[j][i] = 0.0\n v[i][j] = 0.0\n v[i][i] = 1.0\n g = rv1[i]\n l = i\n\n # Accumulation of left hand transformations\n for i in xrange(n - 1, -1, -1):\n l = i + 1\n g = w[i]\n for j in xrange(l, n):\n a[j][i] = 0.0\n if g != 0.0:\n for j in xrange(l, n):\n s = sum(a[i][k] * a[j][k] for k in xrange(l, m))\n f = s / (a[i][i] * g)\n for k in xrange(i, m):\n a[j][k] += f * a[i][k]\n for j in xrange(i, m):\n a[i][j] /= g\n else:\n for j in xrange(i, m):\n a[i][j] = 0.0\n a[i][i] += 1.0\n\n eps *= anorm\n\n # Diagonalization of the bidiagonal form.\n # Loop over singular values and over allowed iterations\n for k in xrange(n - 1, -1, -1):\n for dummy in xrange(maxIteration):\n for l in xrange(k, -1, -1):\n convergenceTest = False\n if abs(rv1[l]) <= eps:\n convergenceTest = True\n break\n if abs(w[l - 1]) <= eps:\n # convergenceTest = False (already default)\n break\n if not convergenceTest:\n c = 0.0\n s = 1.0\n nm = l - 1\n for i in xrange(l, k + 1):\n f = s * rv1[i]\n rv1[i] = c * rv1[i]\n if abs(f) <= eps:\n break\n g = w[i]\n h = pythag(f, g)\n w[i] = h\n c = g / h\n s = -f / h\n for j in xrange(m):\n y = a[nm][j]\n z = a[i][j]\n a[nm][j] = (y * c) + (z * s)\n a[i][j] = -(y * s) + (z * c)\n z = w[k]\n if l == k:\n # convergence\n if z < 0.0:\n w[k] = -z\n for j in xrange(n):\n v[k][j] = -v[k][j]\n 
break\n\n x = w[l]\n y = w[k - 1]\n g = rv1[k - 1]\n h = rv1[k]\n f = ((y - z) * (y + z) + (g - h) * (g + h)) / (2.0 * h * y)\n g = pythag(f, 1.0)\n f = ((x - z) * (x + z) + h * ((y / (f + sign(g, f))) - h)) / x\n c = 1.0\n s = 1.0\n for i in xrange(l + 1, k + 1):\n g = rv1[i]\n y = w[i]\n h = s * g\n g = c * g\n z = pythag(f, h)\n rv1[i - 1] = z\n c = f / z\n s = h / z\n f = (x * c) + (g * s)\n g = -x * s + g * c\n h = y * s\n y = y * c\n for jj in xrange(n):\n x = v[i - 1][jj]\n z = v[i][jj]\n v[i - 1][jj] = (x * c) + (z * s)\n v[i][jj] = -(x * s) + (z * c)\n z = pythag(f, h)\n w[i - 1] = z\n if z != 0.0:\n z = 1.0 / z\n c = f * z\n s = h * z\n f = (c * g) + (s * y)\n x = -s * g + c * y\n for jj in xrange(m):\n y = a[i - 1][jj]\n z = a[i][jj]\n a[i - 1][jj] = (y * c) + (z * s)\n a[i][jj] = -(y * s) + (z * c)\n rv1[l] = 0.0\n rv1[k] = f\n w[k] = x\n # Build Matrix instances for the result\n uM = Matrix.from_two_dim_array(len(a), len(a[0]), a)\n\n diagMatrix = Matrix(len(w), len(w))\n for i in xrange(len(w)):\n diagMatrix.set_value(i, i, w[i])\n\n vM = Matrix.from_two_dim_array(len(v), len(v[0]), v)\n\n return uM, diagMatrix, vM"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef pseudoinverse(self):\n transform = False\n if self.get_width() > self.get_height():\n transform = True\n u, sigma, v = self.transform().svd()\n else:\n u, sigma, v = self.svd()\n # calculate inverse of sigma\n for i in xrange(min(sigma.get_height(), sigma.get_width())):\n val = sigma.get_value(i, i)\n # divide only if the value is not 0 or close to zero (rounding errors)\n eps = 1.e-15\n if eps < val or val < -eps:\n sigma.set_value(i, i, 1 / val)\n if transform:\n return (v * sigma * u.transform()).transform()\n else:\n return v * sigma * u.transform()", "response": "Return the pseudoinverse of the entry."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ncreating vector from matrix.", "response": "def 
initialize_from_matrix(cls, matrix, column):\n \"\"\"Create vector from matrix\n\n :param Matrix matrix: The Matrix, which should be used to create the vector.\n :param integer column: The column of the matrix, which should be used\n to create the new vector.\n :raise: Raises an :py:exc:`IndexError` if the matrix does not have the specified column.\n \"\"\"\n vec = Vector(matrix.get_height())\n for row in xrange(matrix.get_height()):\n vec.set_value(0, row, matrix.get_value(column, row))\n return vec"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nunify the vector. The length of the vector will be 1. :return: Return the instance itself :rtype: Vector", "response": "def unify(self):\n \"\"\"Unifies the vector. The length of the vector will be 1.\n\n :return: Return the instance itself\n :rtype: Vector\n \"\"\"\n length = float(self.norm())\n for row in xrange(self.get_height()):\n self.set_value(0, row, self.get_value(0, row) / length)\n return self"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef text(self):\n divisions = list(self.divisions)\n if len(divisions) == 0:\n return ''\n elif len(divisions) == 1:\n return divisions[0].text.strip()\n else:\n return super().text", "response": "Get the entire text content as str"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _calculate(self, startingPercentage, endPercentage, startDate, endDate):\n # get the defined subset of error values\n errorValues = self._get_error_values(startingPercentage, endPercentage, startDate, endDate)\n errorValues = filter(lambda item: item is not None, errorValues)\n\n return sorted(errorValues)[len(errorValues)//2]", "response": "Return the median of the non-None error values in the given percentage and date range."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef 
moving_frequency(self, data_frame):\n f = []\n for i in range(0, (data_frame.td[-1].astype('int') - self.window)):\n f.append(sum(data_frame.action_type[(data_frame.td >= i) & (data_frame.td < (i + self.window))] == 1) /\n float(self.window))\n\n diff_mov_freq = (np.array(f[1:-1]) - np.array(f[0:-2])) / np.array(f[0:-2])\n\n duration = math.ceil(data_frame.td[-1])\n\n return diff_mov_freq, duration", "response": "This method returns the moving frequency of the assessment."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef incoordination_score(self, data_frame):\n diff = data_frame.td[1:-1].values - data_frame.td[0:-2].values\n inc_s = np.var(diff[np.arange(1, len(diff), 2)], dtype=np.float64) * 1000.0\n duration = math.ceil(data_frame.td[-1])\n return inc_s, duration", "response": "This method calculates the variance of the time intervals in msec between taps."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef mean_alnt_target_distance(self, data_frame):\n dist = np.sqrt((data_frame.x[1:-1].values-data_frame.x[0:-2].values)**2+\n (data_frame.y[1:-1].values-data_frame.y[0:-2].values)**2)\n matd = np.mean(dist[np.arange(1,len(dist),2)])\n duration = math.ceil(data_frame.td[-1])\n return matd, duration", "response": "This method calculates the mean distance between alternate tapping targets, in pixels."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef __train(self, n_neighbors=3):\n\n # m = self.labels.drop(['id','MDS_UPDRSIII'], axis=1).values\n # print(itemfreq(m))\n #\n # for i, row in enumerate(self.labels.drop(['id','MDS_UPDRSIII'], axis=1).values):\n # print(np.bincount(row))\n\n try:\n for obs in self.observations:\n features, ids = self.__get_features_for_observation(observation=obs, skip_id=3497,\n last_column_is_id=True)\n 
normalised_data = whiten(features)\n\n x = pd.DataFrame(normalised_data)\n y = self.labels[obs].values\n # x_train, x_test, y_train, y_test = train_test_split(x, y, random_state=42)\n\n knn = KNeighborsClassifier(n_neighbors=n_neighbors, weights='distance')\n # knn.fit(x_train, y_train)\n knn.fit(x, y)\n\n # print('Accuracy of K-NN classifier: {:.2f}'.format(knn.score(x, y)))\n # print('Accuracy of K-NN classifier on training set: {:.2f}'.format(knn.score(x_train, y_train)))\n # print('Accuracy of K-NN classifier on test set: {:.2f}'.format(knn.score(x_test, y_test)))\n # print('------')\n\n if not self.knns:\n self.knns = [[obs, knn]]\n else:\n self.knns.append([obs, knn])\n except IOError as e:\n ierr = \"({}): {}\".format(e.errno, e.strerror)\n logging.error(\"Error training Clinical UPDRS, file not found, I/O error %s\", ierr)\n\n except ValueError as verr:\n logging.error(\"Error training Clinical UPDRS ValueError ->%s\", verr.message)\n\n except:\n logging.error(\"Unexpected error on training Clinical UPDRS init: %s\", sys.exc_info()[0])", "response": "Train the K - nearest neighbors vote for the current class."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef __get_features_for_observation(self, data_frame=None, observation='LA-LL',\n skip_id=None, last_column_is_id=False):\n \"\"\"\n Extract the features for a given observation from a data frame\n\n :param data_frame: data frame to get features from\n :type data_frame: pandas.DataFrame\n :param observation: observation name\n :type observation: string\n :param skip_id: skip any test with a given id (optional)\n :type skip_id: int\n :param last_column_is_id: skip the last column of the data frame (useful when id is last column - optional)\n :type last_column_is_id: bool\n :return features: the features\n :rtype features: np.array\n \"\"\"\n try:\n features = np.array([])\n\n if data_frame is None:\n data_frame = self.data_frame\n\n for index, row in 
data_frame.iterrows():\n if not skip_id == row['id']:\n features_row = np.nan_to_num(row[row.keys().str.contains(observation)].values)\n features_row = np.append(features_row, row['id'])\n features = np.vstack([features, features_row]) if features.size else features_row\n\n # not the same when getting a single point\n if last_column_is_id:\n if np.ndim(features) > 1:\n to_return = features[:,:-1]\n else:\n to_return = features[:-1]\n else:\n to_return = features\n\n return to_return, data_frame['id'].values\n except:\n logging.error(\" observation not found in data frame\")", "response": "Extract the features for a given observation from a data frame."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _namify_arguments(mapping):\n result = []\n for name, parameter in mapping.iteritems():\n parameter.name = name\n result.append(parameter)\n return result", "response": "Assign each name in the mapping to its parameter and return the parameters as a list."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _merge_associative_list(alist, path, value):\n for key in path[:-1]:\n for item in alist:\n if item[0] == key:\n alist = item[1]\n break\n else:\n subalist = []\n alist.append((key, subalist))\n alist = subalist\n alist.append((path[-1], value))", "response": "Merge a value into an associative list at the given path maintaining\n insertion order."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ncoercing a single value according to this parameter's settings.", "response": "def coerce(self, value):\n \"\"\"Coerce a single value according to this parameter's settings.\n\n @param value: A L{str}, or L{None}. 
If L{None} is passed - meaning no\n value is available at all, not even the empty string - and this\n parameter is optional, L{self.default} will be returned.\n \"\"\"\n if value is None:\n if self.optional:\n return self.default\n else:\n value = \"\"\n if value == \"\":\n if not self.allow_none:\n raise MissingParameterError(self.name, kind=self.kind)\n return self.default\n try:\n self._check_range(value)\n parsed = self.parse(value)\n if self.validator and not self.validator(parsed):\n raise ValueError(value)\n return parsed\n except ValueError:\n try:\n value = value.decode(\"utf-8\")\n message = \"Invalid %s value %s\" % (self.kind, value)\n except UnicodeDecodeError:\n message = \"Invalid %s value\" % self.kind\n raise InvalidParameterValueError(message)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncheck that the given value is in the expected range.", "response": "def _check_range(self, value):\n \"\"\"Check that the given C{value} is in the expected range.\"\"\"\n if self.min is None and self.max is None:\n return\n\n measure = self.measure(value)\n prefix = \"Value (%s) for parameter %s is invalid. 
%s\"\n\n if self.min is not None and measure < self.min:\n message = prefix % (value, self.name,\n self.lower_than_min_template % self.min)\n raise InvalidParameterValueError(message)\n\n if self.max is not None and measure > self.max:\n message = prefix % (value, self.name,\n self.greater_than_max_template % self.max)\n raise InvalidParameterValueError(message)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nconvert a dictionary of {relative index: value} to a list of parsed values.", "response": "def parse(self, value):\n \"\"\"\n Convert a dictionary of {relative index: value} to a list of parsed\n C{value}s.\n \"\"\"\n indices = []\n if not isinstance(value, dict):\n # We interpret non-list inputs as a list of one element, for\n # compatibility with certain EC2 APIs.\n return [self.item.coerce(value)]\n for index in value.keys():\n try:\n indices.append(int(index))\n except ValueError:\n raise UnknownParameterError(index)\n result = [None] * len(value)\n for index_index, index in enumerate(sorted(indices)):\n v = value[str(index)]\n if index < 0:\n raise UnknownParameterError(index)\n result[index_index] = self.item.coerce(v)\n return result"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef format(self, value):\n if isinstance(value, Arguments):\n return dict((str(i), self.item.format(v)) for i, v in value)\n return dict((str(i + 1), self.item.format(v))\n for i, v in enumerate(value))", "response": "Convert a list of values to a dictionary mapping one-based string indices to formatted values."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nconvert a dictionary of raw values to a dictionary of processed values.", "response": "def parse(self, value):\n \"\"\"\n Convert a dictionary of raw values to a dictionary of processed values.\n \"\"\"\n result = {}\n rest = {}\n for k, v in value.iteritems():\n if k in self.fields:\n if (isinstance(v, dict)\n and not 
self.fields[k].supports_multiple):\n if len(v) == 1:\n # We support \"foo.1\" as \"foo\" as long as there is only\n # one \"foo.#\" parameter provided.... -_-\n v = v.values()[0]\n else:\n raise InvalidParameterCombinationError(k)\n result[k] = self.fields[k].coerce(v)\n else:\n rest[k] = v\n for k, v in self.fields.iteritems():\n if k not in result:\n result[k] = v.coerce(None)\n if rest:\n raise UnknownParametersError(result, rest)\n return result"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef format(self, value):\n if not isinstance(value, Arguments):\n value = value.iteritems()\n return dict((k, self.fields[k].format(v)) for k, v in value)", "response": "Convert a dictionary of processed values to a dictionary of raw values."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _wrap(self, value):\n if isinstance(value, dict):\n if any(isinstance(name, int) for name in value.keys()):\n if not all(isinstance(name, int) for name in value.keys()):\n raise RuntimeError(\"Integer and non-integer keys: %r\"\n % value.keys())\n items = sorted(value.iteritems(), key=itemgetter(0))\n return [self._wrap(val) for _, val in items]\n else:\n return Arguments(value)\n elif isinstance(value, list):\n return [self._wrap(x) for x in value]\n else:\n return value", "response": "Wrap the given C{tree} with L{Arguments} as necessary."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef extract(self, params):\n structure = Structure(fields=dict([(p.name, p)\n for p in self._parameters]))\n try:\n tree = structure.coerce(self._convert_flat_to_nest(params))\n rest = {}\n except UnknownParametersError, error:\n tree = error.result\n rest = self._convert_nest_to_flat(error.unknown)\n return Arguments(tree), rest", "response": "Extract parameters from a raw C{dict} according to this schema."} {"SOURCE": "codesearchnet", "instruction": "Given the 
following Python 3 function, write the documentation\ndef bundle(self, *arguments, **extra):\n params = {}\n\n for argument in arguments:\n params.update(argument)\n\n params.update(extra)\n result = {}\n for name, value in params.iteritems():\n if value is None:\n continue\n segments = name.split('.')\n first = segments[0]\n parameter = self.get_parameter(first)\n if parameter is None:\n raise RuntimeError(\"Parameter '%s' not in schema\" % name)\n else:\n if value is None:\n result[name] = \"\"\n else:\n result[name] = parameter.format(value)\n\n return self._convert_nest_to_flat(result)", "response": "Bundle the given arguments in a C{dict} with EC2-style format."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_parameter(self, name):\n for parameter in self._parameters:\n if parameter.name == name:\n return parameter", "response": "Get the parameter with the given name."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _convert_flat_to_nest(self, params):\n result = {}\n for k, v in params.iteritems():\n last = result\n segments = k.split('.')\n for index, item in enumerate(segments):\n if index == len(segments) - 1:\n newd = v\n else:\n newd = {}\n if not isinstance(last, dict):\n raise InconsistentParameterError(k)\n if type(last.get(item)) is dict and type(newd) is not dict:\n raise InconsistentParameterError(k)\n last = last.setdefault(item, newd)\n return result", "response": "Convert a flat structure to a nested structure."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _convert_nest_to_flat(self, params, _result=None, _prefix=None):\n if _result is None:\n _result = {}\n for k, v in params.iteritems():\n if _prefix is None:\n path = k\n else:\n path = _prefix + '.' 
+ k\n if isinstance(v, dict):\n self._convert_nest_to_flat(v, _result=_result, _prefix=path)\n else:\n _result[path] = v\n return _result", "response": "Convert a nested dict to a flat dict."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nextend this schema with any number of schema items.", "response": "def extend(self, *schema_items, **kwargs):\n \"\"\"\n Add any number of schema items to a new schema.\n\n Takes the same arguments as the constructor, and returns a new\n L{Schema} instance.\n\n If parameters, result, or errors is specified, they will be merged with\n the existing parameters, result, or errors.\n \"\"\"\n new_kwargs = {\n 'name': self.name,\n 'doc': self.doc,\n 'parameters': self._parameters[:],\n 'result': self.result.copy() if self.result else {},\n 'errors': self.errors.copy() if self.errors else set()}\n if 'parameters' in kwargs:\n new_params = kwargs.pop('parameters')\n new_kwargs['parameters'].extend(new_params)\n new_kwargs['result'].update(kwargs.pop('result', {}))\n new_kwargs['errors'].update(kwargs.pop('errors', set()))\n new_kwargs.update(kwargs)\n\n if schema_items:\n parameters = self._convert_old_schema(schema_items)\n new_kwargs['parameters'].extend(parameters)\n return Schema(**new_kwargs)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nconverts an ugly old schema using dotted names to the hot new schema.", "response": "def _convert_old_schema(self, parameters):\n \"\"\"\n Convert an ugly old schema, using dotted names, to the hot new schema,\n using List and Structure.\n\n The old schema assumes that every other dot implies an array. 
So a list\n of two parameters,\n\n [Integer(\"foo.bar.baz.quux\"), Integer(\"foo.bar.shimmy\")]\n\n becomes::\n\n [List(\n \"foo\",\n item=Structure(\n fields={\"baz\": List(item=Integer()),\n \"shimmy\": Integer()}))]\n\n By design, the old schema syntax ignored the names \"bar\" and \"quux\".\n \"\"\"\n # 'merged' here is an associative list that maps parameter names to\n # Parameter instances, OR sub-associative lists which represent nested\n # lists and structures.\n # e.g.,\n # [Integer(\"foo\")]\n # becomes\n # [(\"foo\", Integer(\"foo\"))]\n # and\n # [Integer(\"foo.bar\")]\n # (which represents a list of integers called \"foo\" with a meaningless\n # index name of \"bar\") becomes\n # [(\"foo\", [(\"bar\", Integer(\"foo.bar\"))])].\n merged = []\n for parameter in parameters:\n segments = parameter.name.split('.')\n _merge_associative_list(merged, segments, parameter)\n result = [self._inner_convert_old_schema(node, 1) for node in merged]\n return result"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef wait_for_processes(processes, size, progress_queue, watcher, item):\n while size > 0:\n progress_type, value = progress_queue.get()\n if progress_type == ProgressQueue.PROCESSED:\n chunk_size = value\n watcher.transferring_item(item, increment_amt=chunk_size)\n size -= chunk_size\n elif progress_type == ProgressQueue.START_WAITING:\n watcher.start_waiting()\n elif progress_type == ProgressQueue.DONE_WAITING:\n watcher.done_waiting()\n else:\n error_message = value\n for process in processes:\n process.terminate()\n raise ValueError(error_message)\n for process in processes:\n process.join()", "response": "Wait for processes to finish downloading a file."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef verify_file_private(filename):\n if platform.system().upper() != 'WINDOWS':\n filename = os.path.expanduser(filename)\n if 
os.path.exists(filename):\n file_stat = os.stat(filename)\n if mode_allows_group_or_other(file_stat.st_mode):\n raise ValueError(CONFIG_FILE_PERMISSIONS_ERROR)", "response": "Verify that the file is private."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nupdate progress that the item is about to be transferred.", "response": "def transferring_item(self, item, increment_amt=1):\n \"\"\"\n Update progress that item is about to be transferred.\n :param item: LocalFile, LocalFolder, or LocalContent(project) that is about to be sent.\n :param increment_amt: int amount to increase our count(how much progress have we made)\n \"\"\"\n self.cnt += increment_amt\n percent_done = int(float(self.cnt) / float(self.total) * 100.0)\n if KindType.is_project(item):\n details = 'project'\n else:\n details = os.path.basename(item.path)\n self.progress_bar.update(percent_done, '{} {}'.format(self.msg_verb, details))\n self.progress_bar.show()"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef finished(self):\n self.progress_bar.set_state(ProgressBar.STATE_DONE)\n self.progress_bar.show()", "response": "Print final progress bar."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nstarts waiting progress bar until done_waiting is called.", "response": "def start_waiting(self):\n \"\"\"\n Show waiting progress bar until done_waiting is called.\n Only has an effect if we are in waiting state.\n \"\"\"\n if not self.waiting:\n self.waiting = True\n wait_msg = \"Waiting for project to become ready for {}\".format(self.msg_verb)\n self.progress_bar.show_waiting(wait_msg)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef done_waiting(self):\n if self.waiting:\n self.waiting = False\n self.progress_bar.show_running()", "response": "Show running progress bar if we are in waiting state."} {"SOURCE": 
"codesearchnet", "instruction": "Write a Python 3 script for\nshowing waiting progress bar until done_waiting is called.", "response": "def show_waiting(self, wait_msg):\n \"\"\"\n Show waiting progress bar until done_waiting is called.\n Only has an effect if we are in waiting state.\n :param wait_msg: str: message describing what we are waiting for\n \"\"\"\n self.wait_msg = wait_msg\n self.set_state(ProgressBar.STATE_WAITING)\n self.show()"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _visit_content(item, parent, visitor):\n if KindType.is_project(item):\n visitor.visit_project(item)\n elif KindType.is_folder(item):\n visitor.visit_folder(item, parent)\n else:\n visitor.visit_file(item, parent)\n if not KindType.is_file(item):\n for child in item.children:\n ProjectWalker._visit_content(child, item, visitor)", "response": "Recursively visit the content of the item."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ncreates a URL based on the given service endpoint bucket and object name.", "response": "def s3_url_context(service_endpoint, bucket=None, object_name=None):\n \"\"\"\n Create a URL based on the given service endpoint and suitable for\n the given bucket or object.\n\n @param service_endpoint: The service endpoint on which to base the\n resulting URL.\n @type service_endpoint: L{AWSServiceEndpoint}\n\n @param bucket: If given, the name of a bucket to reference.\n @type bucket: L{unicode}\n\n @param object_name: If given, the name of an object or object\n subresource to reference.\n @type object_name: L{unicode}\n \"\"\"\n\n # Define our own query parser which can handle the consequences of\n # `?acl` and such (subresources). 
At its best, parse_qsl doesn't\n # let us differentiate between these and empty values (such as\n # `?acl=`).\n def p(s):\n results = []\n args = s.split(u\"&\")\n for a in args:\n pieces = a.split(u\"=\")\n if len(pieces) == 1:\n results.append((unquote(pieces[0]),))\n elif len(pieces) == 2:\n results.append(tuple(map(unquote, pieces)))\n else:\n raise Exception(\"oh no\")\n return results\n\n query = []\n path = []\n if bucket is None:\n path.append(u\"\")\n else:\n if isinstance(bucket, bytes):\n bucket = bucket.decode(\"utf-8\")\n path.append(bucket)\n if object_name is None:\n path.append(u\"\")\n else:\n if isinstance(object_name, bytes):\n object_name = object_name.decode(\"utf-8\")\n if u\"?\" in object_name:\n object_name, query = object_name.split(u\"?\", 1)\n query = p(query)\n object_name_components = object_name.split(u\"/\")\n if object_name_components[0] == u\"\":\n object_name_components.pop(0)\n if object_name_components:\n path.extend(object_name_components)\n else:\n path.append(u\"\")\n return _S3URLContext(\n scheme=service_endpoint.scheme.decode(\"utf-8\"),\n host=service_endpoint.get_host().decode(\"utf-8\"),\n port=service_endpoint.port,\n path=path,\n query=query,\n )"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nlist all buckets owned by the authenticated sender of the request.", "response": "def list_buckets(self):\n \"\"\"\n List all buckets.\n\n Returns a list of all the buckets owned by the authenticated sender of\n the request.\n \"\"\"\n details = self._details(\n method=b\"GET\",\n url_context=self._url_context(),\n )\n query = self._query_factory(details)\n d = self._submit(query)\n d.addCallback(self._parse_list_buckets)\n return d"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nparsing XML bucket list response.", "response": "def _parse_list_buckets(self, (response, xml_bytes)):\n \"\"\"\n Parse XML bucket list response.\n \"\"\"\n root = XML(xml_bytes)\n buckets = []\n 
for bucket_data in root.find(\"Buckets\"):\n name = bucket_data.findtext(\"Name\")\n date_text = bucket_data.findtext(\"CreationDate\")\n date_time = parseTime(date_text)\n bucket = Bucket(name, date_time)\n buckets.append(bucket)\n return buckets"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ncreates a new bucket.", "response": "def create_bucket(self, bucket):\n \"\"\"\n Create a new bucket.\n \"\"\"\n details = self._details(\n method=b\"PUT\",\n url_context=self._url_context(bucket=bucket),\n )\n query = self._query_factory(details)\n return self._submit(query)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ndeletes a bucket. The bucket must be empty before it can be deleted.", "response": "def delete_bucket(self, bucket):\n \"\"\"\n Delete a bucket.\n\n The bucket must be empty before it can be deleted.\n \"\"\"\n details = self._details(\n method=b\"DELETE\",\n url_context=self._url_context(bucket=bucket),\n )\n query = self._query_factory(details)\n return self._submit(query)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_bucket(self, bucket, marker=None, max_keys=None, prefix=None):\n args = []\n if marker is not None:\n args.append((\"marker\", marker))\n if max_keys is not None:\n args.append((\"max-keys\", \"%d\" % (max_keys,)))\n if prefix is not None:\n args.append((\"prefix\", prefix))\n if args:\n object_name = \"?\" + urlencode(args)\n else:\n object_name = None\n details = self._details(\n method=b\"GET\",\n url_context=self._url_context(bucket=bucket, object_name=object_name),\n )\n d = self._submit(self._query_factory(details))\n d.addCallback(self._parse_get_bucket)\n return d", "response": "Get all the objects in a bucket."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ngets the location of a bucket.", "response": "def get_bucket_location(self, 
bucket):\n \"\"\"\n Get the location (region) of a bucket.\n\n @param bucket: The name of the bucket.\n @return: A C{Deferred} that will fire with the bucket's region.\n \"\"\"\n details = self._details(\n method=b\"GET\",\n url_context=self._url_context(bucket=bucket, object_name=\"?location\"),\n )\n d = self._submit(self._query_factory(details))\n d.addCallback(self._parse_bucket_location)\n return d"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get_bucket_lifecycle(self, bucket):\n details = self._details(\n method=b\"GET\",\n url_context=self._url_context(bucket=bucket, object_name=\"?lifecycle\"),\n )\n d = self._submit(self._query_factory(details))\n d.addCallback(self._parse_lifecycle_config)\n return d", "response": "Get the lifecycle configuration of a bucket."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nparsing a C{LifecycleConfiguration XML document.", "response": "def _parse_lifecycle_config(self, (response, xml_bytes)):\n \"\"\"Parse a C{LifecycleConfiguration} XML document.\"\"\"\n root = XML(xml_bytes)\n rules = []\n\n for content_data in root.findall(\"Rule\"):\n id = content_data.findtext(\"ID\")\n prefix = content_data.findtext(\"Prefix\")\n status = content_data.findtext(\"Status\")\n expiration = int(content_data.findtext(\"Expiration/Days\"))\n rules.append(\n LifecycleConfigurationRule(id, prefix, status, expiration))\n\n return LifecycleConfiguration(rules)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_bucket_website_config(self, bucket):\n details = self._details(\n method=b\"GET\",\n url_context=self._url_context(bucket=bucket, object_name='?website'),\n )\n d = self._submit(self._query_factory(details))\n d.addCallback(self._parse_website_config)\n return d", "response": "Get the website configuration of a bucket."} {"SOURCE": "codesearchnet", "instruction": "Write a 
Python 3 function that can\nparse a C{WebsiteConfiguration XML document.", "response": "def _parse_website_config(self, (response, xml_bytes)):\n \"\"\"Parse a C{WebsiteConfiguration} XML document.\"\"\"\n root = XML(xml_bytes)\n index_suffix = root.findtext(\"IndexDocument/Suffix\")\n error_key = root.findtext(\"ErrorDocument/Key\")\n\n return WebsiteConfiguration(index_suffix, error_key)"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ngets the notification configuration of a bucket.", "response": "def get_bucket_notification_config(self, bucket):\n \"\"\"\n Get the notification configuration of a bucket.\n\n @param bucket: The name of the bucket.\n @return: A C{Deferred} that will request the bucket's notification\n configuration.\n \"\"\"\n details = self._details(\n method=b\"GET\",\n url_context=self._url_context(bucket=bucket, object_name=\"?notification\"),\n )\n d = self._submit(self._query_factory(details))\n d.addCallback(self._parse_notification_config)\n return d"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nparsing a C{NotificationConfiguration XML document.", "response": "def _parse_notification_config(self, (response, xml_bytes)):\n \"\"\"Parse a C{NotificationConfiguration} XML document.\"\"\"\n root = XML(xml_bytes)\n topic = root.findtext(\"TopicConfiguration/Topic\")\n event = root.findtext(\"TopicConfiguration/Event\")\n\n return NotificationConfiguration(topic, event)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef get_bucket_versioning_config(self, bucket):\n details = self._details(\n method=b\"GET\",\n url_context=self._url_context(bucket=bucket, object_name=\"?versioning\"),\n )\n d = self._submit(self._query_factory(details))\n d.addCallback(self._parse_versioning_config)\n return d", "response": "Get the versioning configuration of a bucket."} {"SOURCE": "codesearchnet", "instruction": "Create a 
Python 3 function to\nparse a C{VersioningConfiguration XML document.", "response": "def _parse_versioning_config(self, (response, xml_bytes)):\n \"\"\"Parse a C{VersioningConfiguration} XML document.\"\"\"\n root = XML(xml_bytes)\n mfa_delete = root.findtext(\"MfaDelete\")\n status = root.findtext(\"Status\")\n\n return VersioningConfiguration(mfa_delete=mfa_delete, status=status)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_bucket_acl(self, bucket):\n details = self._details(\n method=b\"GET\",\n url_context=self._url_context(bucket=bucket, object_name=\"?acl\"),\n )\n d = self._submit(self._query_factory(details))\n d.addCallback(self._parse_acl)\n return d", "response": "Get the access control policy for a bucket."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef put_object(self, bucket, object_name, data=None, content_type=None,\n metadata={}, amz_headers={}, body_producer=None):\n \"\"\"\n Put an object in a bucket.\n\n An existing object with the same name will be replaced.\n\n @param bucket: The name of the bucket.\n @param object_name: The name of the object.\n @type object_name: L{unicode}\n @param data: The data to write.\n @param content_type: The type of data being written.\n @param metadata: A C{dict} used to build C{x-amz-meta-*} headers.\n @param amz_headers: A C{dict} used to build C{x-amz-*} headers.\n @return: A C{Deferred} that will fire with the result of request.\n \"\"\"\n details = self._details(\n method=b\"PUT\",\n url_context=self._url_context(bucket=bucket, object_name=object_name),\n headers=self._headers(content_type),\n metadata=metadata,\n amz_headers=amz_headers,\n body=data,\n body_producer=body_producer,\n )\n d = self._submit(self._query_factory(details))\n d.addCallback(itemgetter(1))\n return d", "response": "Put an object in a bucket."} {"SOURCE": "codesearchnet", "instruction": "Explain what the 
following Python 3 code does\ndef copy_object(self, source_bucket, source_object_name, dest_bucket=None,\n dest_object_name=None, metadata={}, amz_headers={}):\n \"\"\"\n Copy an object stored in S3 from a source bucket to a destination\n bucket.\n\n @param source_bucket: The S3 bucket to copy the object from.\n @param source_object_name: The name of the object to copy.\n @param dest_bucket: Optionally, the S3 bucket to copy the object to.\n Defaults to C{source_bucket}.\n @param dest_object_name: Optionally, the name of the new object.\n Defaults to C{source_object_name}.\n @param metadata: A C{dict} used to build C{x-amz-meta-*} headers.\n @param amz_headers: A C{dict} used to build C{x-amz-*} headers.\n @return: A C{Deferred} that will fire with the result of request.\n \"\"\"\n dest_bucket = dest_bucket or source_bucket\n dest_object_name = dest_object_name or source_object_name\n amz_headers[\"copy-source\"] = \"/%s/%s\" % (source_bucket,\n source_object_name)\n details = self._details(\n method=b\"PUT\",\n url_context=self._url_context(\n bucket=dest_bucket, object_name=dest_object_name,\n ),\n metadata=metadata,\n amz_headers=amz_headers,\n )\n d = self._submit(self._query_factory(details))\n return d", "response": "Copy an object stored in S3 from a source bucket to a destination bucket."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef get_object(self, bucket, object_name):\n details = self._details(\n method=b\"GET\",\n url_context=self._url_context(bucket=bucket, object_name=object_name),\n )\n d = self._submit(self._query_factory(details))\n d.addCallback(itemgetter(1))\n return d", "response": "Get an object from a bucket."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef head_object(self, bucket, object_name):\n details = self._details(\n method=b\"HEAD\",\n url_context=self._url_context(bucket=bucket, 
object_name=object_name),\n )\n d = self._submit(self._query_factory(details))\n d.addCallback(lambda (response, body): _to_dict(response.responseHeaders))\n return d", "response": "Retrieve object metadata only."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ndeletes an object from a bucket.", "response": "def delete_object(self, bucket, object_name):\n \"\"\"\n Delete an object from a bucket.\n\n Once deleted, there is no method to restore or undelete an object.\n \"\"\"\n details = self._details(\n method=b\"DELETE\",\n url_context=self._url_context(bucket=bucket, object_name=object_name),\n )\n d = self._submit(self._query_factory(details))\n return d"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef put_object_acl(self, bucket, object_name, access_control_policy):\n data = access_control_policy.to_xml()\n details = self._details(\n method=b\"PUT\",\n url_context=self._url_context(\n bucket=bucket, object_name='%s?acl' % (object_name,),\n ),\n body=data,\n )\n query = self._query_factory(details)\n d = self._submit(query)\n d.addCallback(self._parse_acl)\n return d", "response": "Set access control policy on an object."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nsets request payment configuration on bucket to payer.", "response": "def put_request_payment(self, bucket, payer):\n \"\"\"\n Set request payment configuration on bucket to payer.\n\n @param bucket: The name of the bucket.\n @param payer: The name of the payer.\n @return: A C{Deferred} that will fire with the result of the request.\n \"\"\"\n data = RequestPayment(payer).to_xml()\n details = self._details(\n method=b\"PUT\",\n url_context=self._url_context(bucket=bucket, object_name=\"?requestPayment\"),\n body=data,\n )\n d = self._submit(self._query_factory(details))\n return d"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 
3 to\nget the request payment configuration on a bucket.", "response": "def get_request_payment(self, bucket):\n \"\"\"\n Get the request payment configuration on a bucket.\n\n @param bucket: The name of the bucket.\n @return: A C{Deferred} that will fire with the name of the payer.\n \"\"\"\n details = self._details(\n method=b\"GET\",\n url_context=self._url_context(bucket=bucket, object_name=\"?requestPayment\"),\n )\n d = self._submit(self._query_factory(details))\n d.addCallback(self._parse_get_request_payment)\n return d"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ninitiates a multipart upload to a bucket.", "response": "def init_multipart_upload(self, bucket, object_name, content_type=None,\n amz_headers={}, metadata={}):\n \"\"\"\n Initiate a multipart upload to a bucket.\n\n @param bucket: The name of the bucket\n @param object_name: The object name\n @param content_type: The Content-Type for the object\n @param metadata: C{dict} containing additional metadata\n @param amz_headers: A C{dict} used to build C{x-amz-*} headers.\n @return: C{str} upload_id\n \"\"\"\n objectname_plus = '%s?uploads' % object_name\n details = self._details(\n method=b\"POST\",\n url_context=self._url_context(bucket=bucket, object_name=objectname_plus),\n headers=self._headers(content_type),\n metadata=metadata,\n amz_headers=amz_headers,\n )\n d = self._submit(self._query_factory(details))\n d.addCallback(\n lambda (response, body): MultipartInitiationResponse.from_xml(body)\n )\n return d"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef upload_part(self, bucket, object_name, upload_id, part_number,\n data=None, content_type=None, metadata={},\n body_producer=None):\n \"\"\"\n Upload a part of data corresponding to a multipart upload.\n\n @param bucket: The bucket name\n @param object_name: The object name\n @param upload_id: The multipart upload id\n @param part_number: The part 
number\n @param data: Data (optional, requires body_producer if not specified)\n @param content_type: The Content-Type\n @param metadata: Additional metadata\n @param body_producer: an C{IBodyProducer} (optional, requires data if\n not specified)\n @return: the C{Deferred} from underlying query.submit() call\n \"\"\"\n parms = 'partNumber=%s&uploadId=%s' % (str(part_number), upload_id)\n objectname_plus = '%s?%s' % (object_name, parms)\n details = self._details(\n method=b\"PUT\",\n url_context=self._url_context(bucket=bucket, object_name=objectname_plus),\n headers=self._headers(content_type),\n metadata=metadata,\n body=data,\n )\n d = self._submit(self._query_factory(details))\n d.addCallback(lambda (response, data): _to_dict(response.responseHeaders))\n return d", "response": "Upload a part of data corresponding to a multipart upload."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef complete_multipart_upload(self, bucket, object_name, upload_id,\n parts_list, content_type=None, metadata={}):\n \"\"\"\n Complete a multipart upload.\n\n N.B. 
This can possibly be a slow operation.\n\n @param bucket: The bucket name\n @param object_name: The object name\n @param upload_id: The multipart upload id\n @param parts_list: A List of all the parts\n (2-tuples of part sequence number and etag)\n @param content_type: The Content-Type of the object\n @param metadata: C{dict} containing additional metadata\n @return: a C{Deferred} that fires after request is complete\n \"\"\"\n data = self._build_complete_multipart_upload_xml(parts_list)\n objectname_plus = '%s?uploadId=%s' % (object_name, upload_id)\n details = self._details(\n method=b\"POST\",\n url_context=self._url_context(bucket=bucket, object_name=objectname_plus),\n headers=self._headers(content_type),\n metadata=metadata,\n body=data,\n )\n d = self._submit(self._query_factory(details))\n # TODO - handle error responses\n d.addCallback(\n lambda (response, body): MultipartCompletionResponse.from_xml(body)\n )\n return d", "response": "Complete a multipart upload."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nsetting the content type based on the file extension used in the object.", "response": "def set_content_type(self):\n \"\"\"\n Set the content type based on the file extension used in the object\n name.\n \"\"\"\n if self.object_name and not self.content_type:\n # XXX nothing is currently done with the encoding... 
we may\n # need to in the future\n self.content_type, encoding = mimetypes.guess_type(\n self.object_name, strict=False)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nbuilds the list of headers needed in order to perform S3 operations.", "response": "def get_headers(self, instant):\n \"\"\"\n Build the list of headers needed in order to perform S3 operations.\n \"\"\"\n headers = {'x-amz-date': _auth_v4.makeAMZDate(instant)}\n if self.body_producer is None:\n data = self.data\n if data is None:\n data = b\"\"\n headers[\"x-amz-content-sha256\"] = hashlib.sha256(data).hexdigest()\n else:\n data = None\n headers[\"x-amz-content-sha256\"] = b\"UNSIGNED-PAYLOAD\"\n for key, value in self.metadata.iteritems():\n headers[\"x-amz-meta-\" + key] = value\n for key, value in self.amz_headers.iteritems():\n headers[\"x-amz-\" + key] = value\n\n # Before we check if the content type is set, let's see if we can set\n # it by guessing the mimetype.\n self.set_content_type()\n if self.content_type is not None:\n headers[\"Content-Type\"] = self.content_type\n if self.creds is not None:\n headers[\"Authorization\"] = self.sign(\n headers,\n data,\n s3_url_context(self.endpoint, self.bucket, self.object_name),\n instant,\n method=self.action)\n return headers"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef sign(self, headers, data, url_context, instant, method,\n region=REGION_US_EAST_1):\n \"\"\"Sign this query using its built in credentials.\"\"\"\n headers[\"host\"] = url_context.get_encoded_host()\n\n if data is None:\n request = _auth_v4._CanonicalRequest.from_request_components(\n method=method,\n url=url_context.get_encoded_path(),\n headers=headers,\n headers_to_sign=('host', 'x-amz-date'),\n payload_hash=None,\n )\n else:\n request = _auth_v4._CanonicalRequest.from_request_components_and_payload(\n method=method,\n url=url_context.get_encoded_path(),\n 
headers=headers,\n headers_to_sign=('host', 'x-amz-date'),\n payload=data,\n )\n\n return _auth_v4._make_authorization_header(\n region=region,\n service=\"s3\",\n canonical_request=request,\n credentials=self.creds,\n instant=instant)", "response": "Sign this request using the built in credentials."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef submit(self, url_context=None, utcnow=datetime.datetime.utcnow):\n if not url_context:\n url_context = s3_url_context(\n self.endpoint, self.bucket, self.object_name)\n d = self.get_page(\n url_context.get_encoded_url(),\n method=self.action,\n postdata=self.data or b\"\",\n headers=self.get_headers(utcnow()),\n )\n\n return d.addErrback(s3_error_wrapper)", "response": "Submit this query.\n\n @return: A deferred from get_page"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef attributes(self):\n if 'id' in self.node.attrib:\n yield PlaceholderAttribute('id', self.node.attrib['id'])\n\n if 'tei-tag' in self.node.attrib:\n yield PlaceholderAttribute('tei-tag', self.node.attrib['tei-tag'])\n\n \"\"\"Contain attributes applicable to this element\"\"\"\n for attributes in self.node.iterchildren('attributes'):\n for attribute in self.__iter_attributes__(attributes):\n yield attribute", "response": "Iterate over the attributes applicable to this element."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nyields all the text divisions of this element.", "response": "def divisions(self):\n \"\"\"\n Recursively get all the text divisions directly part of this element. If an element contains parts or text without tag. 
Those will be returned in order and wrapped with a TextDivision.\n \"\"\"\n\n from .placeholder_division import PlaceholderDivision\n \n placeholder = None\n for item in self.__parts_and_divisions:\n if item.tag == 'part':\n if not placeholder:\n placeholder = PlaceholderDivision()\n placeholder.parts.append(item)\n else:\n if placeholder:\n yield placeholder\n placeholder = None\n yield item\n if placeholder:\n yield placeholder"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ngetting the parts below this element.", "response": "def parts(self):\n \"\"\"\n Get the parts directly below this element.\n \"\"\"\n\n for item in self.__parts_and_divisions:\n if item.tag == 'part':\n yield item\n else:\n # Divisions shouldn't be beneath a part, but here's a fallback\n # for if this does happen\n for part in item.parts:\n yield part"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nyield the parts and divisions of this element.", "response": "def __parts_and_divisions(self):\n \"\"\"\n The parts and divisions directly part of this element.\n \"\"\"\n\n from .division import Division\n from .part import Part\n from .placeholder_part import PlaceholderPart\n\n text = self.node.text\n if text:\n stripped_text = text.replace('\\n', '')\n if stripped_text.strip():\n yield PlaceholderPart(stripped_text)\n\n for item in self.node:\n if item.tag == 'part':\n yield Part(item)\n elif item.tag == 'div':\n yield Division(item)\n \n if item.tail:\n stripped_tail = item.tail.replace('\\n', '')\n if stripped_tail.strip():\n yield PlaceholderPart(stripped_tail)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nconverting an element to a single string and allow the passed inject method to place content before any element.", "response": "def tostring(self, inject):\n \"\"\"\n Convert an element to a single string and allow the passed inject method to place content before any\n element.\n \"\"\"\n\n return 
inject(self, '\\n'.join(f'{division.tostring(inject)}' for division in self.divisions))"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef RCL(input_shape,\n rec_conv_layers, \n dense_layers,\n output_layer=[1, 'sigmoid'],\n padding='same', \n optimizer='adam',\n loss='binary_crossentropy'):\n \"\"\"Summary\n \n Args:\n input_shape (tuple): The shape of the input layer.\n output_nodes (int): Number of nodes in the output layer. It depends on the loss function used.\n rec_conv_layers (list): RCL descriptor \n [\n [\n [(filter, kernel), (pool_size, stride), leak, drop],\n [(filter, kernel), (pool_size, stride), leak, drop],\n [(filter, kernel), (pool_size, stride), leak, drop, timesteps],\n ],\n ...\n [\n [],[],[]\n ]\n ]\n dense_layers (TYPE): Dense layer descriptor [[fully_connected, leak, drop], ... []]\n padding (str, optional): Type of padding for conv and pooling layers\n optimizer (str or object optional): Keras optimizer as string or keras optimizer\n \n Returns:\n model: The compiled Kears model, ready for training.\n \"\"\"\n \n inputs = Input(shape=input_shape)\n \n for i, c in enumerate(rec_conv_layers):\n \n conv = Conv1D(c[0][0][0], c[0][0][1], padding=padding)(inputs)\n batch = BatchNormalization()(conv)\n act = LeakyReLU(alpha=c[0][2])(batch)\n pool = MaxPooling1D(pool_size=c[0][1][0], strides=c[0][1][1], padding=padding)(act)\n d1 = Dropout(c[0][3])(pool)\n \n inner = time_steps( input=d1,\n conv_layer=c[1],\n time_conv_layer=c[2],\n padding=padding)\n \n drop = Flatten()(inner)\n\n for i, d in enumerate(dense_layers):\n dense = Dense(d[0], activation='relu')(drop)\n bn = BatchNormalization()(dense)\n act = LeakyReLU(alpha=d[1])(bn)\n drop = Dropout(d[2])(act)\n \n output = Dense(output_layer[0], activation=output_layer[1])(drop)\n \n model = Model(inputs=inputs, outputs=output)\n \n model.compile(loss=loss, optimizer=optimizer, metrics=['accuracy'])\n \n return model", "response": 
"Summary function for training a new RCL model."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef VOICE(input_shape,\n conv_layers,\n dense_layers,\n output_layer=[1, 'sigmoid'],\n padding='same', \n optimizer='adam',\n loss='binary_crossentropy'):\n \"\"\"Conv1D CNN used primarily for voice data.\n \n Args:\n input_shape (tuple): The shape of the input layer\n targets (int): Number of targets\n conv_layers (list): Conv layer descriptor [[(filter, kernel), (pool_size, stride), leak, drop], ... []]\n dense_layers (TYPE): Dense layer descriptor [[fully_connected, leak, drop]]\n padding (str, optional): Type of padding for conv and pooling layers\n optimizer (str or object optional): Keras optimizer as string or keras optimizer\n \n Returns:\n TYPE: model, build_arguments\n \"\"\"\n\n inputs = Input(shape=input_shape)\n \n for i, c in enumerate(conv_layers):\n if i == 0:\n conv = Conv1D(c[0][0], c[0][1], padding=padding)(inputs)\n else:\n conv = Conv1D(c[0][0], c[0][1], padding=padding)(drop)\n bn = BatchNormalization()(conv)\n act = LeakyReLU(alpha=c[2])(bn)\n pool = MaxPooling1D(pool_size=c[1][0], strides=c[1][1], padding=padding)(act)\n drop = Dropout(c[3])(pool)\n \n drop = Flatten()(drop)\n\n for i, d in enumerate(dense_layers):\n dense = Dense(d[0], activation='relu')(drop)\n bn = BatchNormalization()(dense)\n act = LeakyReLU(alpha=d[1])(bn)\n drop = Dropout(d[2])(act)\n \n output = Dense(output_layer[0], activation=output_layer[1])(drop)\n \n model = Model(inputs=inputs, outputs=output)\n model.compile(loss=loss, optimizer=optimizer, metrics=['accuracy'])\n \n return model", "response": "Function to build a single voice data base on the input_shape."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ninitialize the ErrorMeasure. During initialization, all :py:meth:`BaseErrorMeasure.local_errors` are calculated. 
:param TimeSeries originalTimeSeries: TimeSeries containing the original data. :param TimeSeries calculatedTimeSeries: TimeSeries containing calculated data. Calculated data is smoothed or forecasted data. :return: Return :py:const:`True` if the error could be calculated, :py:const:`False` otherwise based on the minimalErrorCalculationPercentage. :rtype: boolean :raise: Raises a :py:exc:`StandardError` if the error measure is initialized multiple times.", "response": "def initialize(self, originalTimeSeries, calculatedTimeSeries):\n \"\"\"Initializes the ErrorMeasure.\n\n During initialization, all :py:meth:`BaseErrorMeasure.local_errors` are calculated.\n\n :param TimeSeries originalTimeSeries: TimeSeries containing the original data.\n :param TimeSeries calculatedTimeSeries: TimeSeries containing calculated data.\n Calculated data is smoothed or forecasted data.\n\n :return: Return :py:const:`True` if the error could be calculated, :py:const:`False`\n otherwise based on the minimalErrorCalculationPercentage.\n :rtype: boolean\n\n :raise: Raises a :py:exc:`StandardError` if the error measure is initialized multiple times.\n \"\"\"\n # ErrorMeasure was already initialized.\n if 0 < len(self._errorValues):\n raise StandardError(\"An ErrorMeasure can only be initialized once.\")\n\n # sort the TimeSeries to reduce the required comparison operations\n originalTimeSeries.sort_timeseries()\n calculatedTimeSeries.sort_timeseries()\n\n # Performance optimization\n append = self._errorValues.append\n appendDate = self._errorDates.append\n local_error = self.local_error\n\n minCalcIdx = 0\n\n # calculate all valid local errors\n for orgPair in originalTimeSeries:\n for calcIdx in xrange(minCalcIdx, len(calculatedTimeSeries)):\n calcPair = calculatedTimeSeries[calcIdx]\n\n # Skip values that can not be compared\n if calcPair[0] != orgPair[0]:\n continue\n\n append(local_error(orgPair[1:], calcPair[1:]))\n appendDate(orgPair[0])\n\n # return False, if the error cannot be 
calculated\n calculatedErrors = len(filter(lambda item: item is not None, self._errorValues))\n minCalculatedErrors = self._minimalErrorCalculationPercentage * len(originalTimeSeries)\n\n if calculatedErrors < minCalculatedErrors:\n self._errorValues = []\n self._errorDates = []\n return False\n\n return True"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn the defined subset of self. _errorValues.", "response": "def _get_error_values(self, startingPercentage, endPercentage, startDate, endDate):\n \"\"\"Gets the defined subset of self._errorValues.\n\n Both parameters will be correct at this time.\n\n :param float startingPercentage: Defines the start of the interval. This has to be a value in [0.0, 100.0].\n It represents the value, where the error calculation should be started.\n 25.0 for example means that the first 25% of all calculated errors will be ignored.\n :param float endPercentage: Defines the end of the interval. This has to be a value in [0.0, 100.0].\n It represents the value, after which all error values will be ignored. 
90.0 for example means that\n the last 10% of all local errors will be ignored.\n :param float startDate: Epoch representing the start date used for error calculation.\n :param float endDate: Epoch representing the end date used in the error calculation.\n\n :return: Returns a list with the defined error values.\n :rtype: list\n\n :raise: Raises a ValueError if startDate or endDate do not represent correct boundaries for error calculation.\n \"\"\"\n if startDate is not None:\n possibleDates = filter(lambda date: date >= startDate, self._errorDates)\n if 0 == len(possibleDates):\n raise ValueError(\"%s does not represent a valid startDate.\" % startDate)\n\n startIdx = self._errorDates.index(min(possibleDates))\n else:\n startIdx = int((startingPercentage * len(self._errorValues)) / 100.0)\n\n if endDate is not None:\n possibleDates = filter(lambda date: date <= endDate, self._errorDates)\n if 0 == len(possibleDates):\n raise ValueError(\"%s does not represent a valid endDate.\" % endDate)\n\n endIdx = self._errorDates.index(max(possibleDates)) + 1\n else:\n endIdx = int((endPercentage * len(self._errorValues)) / 100.0)\n\n return self._errorValues[startIdx:endIdx]"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncalculate the error for the given interval startingPercentage and endPercentage.", "response": "def get_error(self, startingPercentage=0.0, endPercentage=100.0, startDate=None, endDate=None):\n \"\"\"Calculates the error for the given interval (startingPercentage, endPercentage) between the TimeSeries\n given during :py:meth:`BaseErrorMeasure.initialize`.\n\n :param float startingPercentage: Defines the start of the interval. This has to be a value in [0.0, 100.0].\n It represents the value, where the error calculation should be started.\n 25.0 for example means that the first 25% of all calculated errors will be ignored.\n :param float endPercentage: Defines the end of the interval. 
This has to be a value in [0.0, 100.0].\n It represents the value, after which all error values will be ignored. 90.0 for example means that\n the last 10% of all local errors will be ignored.\n :param float startDate: Epoch representing the start date used for error calculation.\n :param float endDate: Epoch representing the end date used in the error calculation.\n\n :return: Returns a float representing the error.\n :rtype: float\n\n :raise: Raises a :py:exc:`ValueError` in one of the following cases:\n\n - startingPercentage not in [0.0, 100.0]\n - endPercentage not in [0.0, 100.0]\n - endPercentage < startingPercentage\n\n :raise: Raises a :py:exc:`StandardError` if :py:meth:`BaseErrorMeasure.initialize` was not successful before.\n \"\"\"\n # not initialized:\n if len(self._errorValues) == 0:\n raise StandardError(\"The last call of initialize(...) was not successful.\")\n\n # check for wrong parameters\n if not (0.0 <= startingPercentage <= 100.0):\n raise ValueError(\"startingPercentage has to be in [0.0, 100.0].\")\n if not (0.0 <= endPercentage <= 100.0):\n raise ValueError(\"endPercentage has to be in [0.0, 100.0].\")\n if endPercentage < startingPercentage:\n raise ValueError(\"endPercentage has to be greater than or equal to startingPercentage.\")\n\n return self._calculate(startingPercentage, endPercentage, startDate, endDate)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef confidence_interval(self, confidenceLevel):\n\n if not (confidenceLevel >= 0 and confidenceLevel <= 1):\n raise ValueError(\"Parameter percentage has to be in [0,1]\")\n\n underestimations = []\n overestimations = []\n for error in self._errorValues:\n if error is None:\n # None was in the lists causing some confidenceLevels not be calculated, not sure if that was intended, I suggested ignoring None values\n continue\n #Want 0 errors in both lists!\n if error >= 0:\n overestimations.append(error)\n if error <= 
0:\n underestimations.append(error)\n\n #sort and cut off at confidence level.\n overestimations.sort()\n underestimations.sort(reverse=True)\n\n overIdx = int(len(overestimations) * confidenceLevel) - 1\n underIdx = int(len(underestimations) * confidenceLevel) - 1\n\n overestimation = 0.0\n underestimation = 0.0\n\n if overIdx >= 0:\n overestimation = overestimations[overIdx]\n else:\n print len(overestimations), confidenceLevel\n\n if underIdx >= 0:\n underestimation = underestimations[underIdx]\n\n return underestimation, overestimation", "response": "Calculates the interval of the underestimations and overestimations for the given error level."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef load_segmented_data(filename):\n data = pd.read_csv(filename, index_col=0)\n data.index = data.index.astype(np.datetime64)\n \n return data", "response": "Helper function to load segmented gait time series data."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef load_finger_tapping_cloudupdrs_data(filename, convert_times=1000.0):\n data_m = np.genfromtxt(filename, delimiter=',', invalid_raise=False, skip_footer=1)\n date_times = pd.to_datetime((data_m[:, 0] - data_m[0, 0]))\n time_difference = (data_m[:, 0] - data_m[0, 0]) / convert_times\n data = {'td': time_difference, 'action_type': data_m[:, 2],'x': data_m[:, 3], 'y': data_m[:, 4],\n 'x_target': data_m[:, 7], 'y_target': data_m[:, 8]}\n data_frame = pd.DataFrame(data, index=date_times, columns=['td', 'action_type','x', 'y', 'x_target', 'y_target'])\n return data_frame", "response": "This method loads the data from a CSV file and returns a DataFrame with the data in the cloudupdrs format."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef numerical_integration(signal, sampling_frequency):\n \n integrate = sum(signal[1:]) / 
sampling_frequency + sum(signal[:-1])\n integrate /= sampling_frequency * 2\n \n return np.array(integrate)", "response": "Numerically integrates a signal with its sampling frequency."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nfinds the local maxima and minima (peaks) in a 1-dimensional signal. Converted from MATLAB script :param array signal: A 1-dimensional array or list (the signal). :type signal: array :param delta: The peak threshold. A point is considered a maximum peak if it has the maximal value, and was preceded (to the left) by a value lower by delta. :type delta: float :param x: Indices in local maxima and minima are replaced with the corresponding values in x (None default). :type x: array :return maxtab: The highest peaks. :rtype maxtab: numpy.ndarray :return mintab: The lowest peaks. :rtype mintab: numpy.ndarray", "response": "def peakdet(signal, delta, x=None):\n \"\"\"\n Find the local maxima and minima (peaks) in a 1-dimensional signal.\n Converted from MATLAB script \n\n :param array signal: A 1-dimensional array or list (the signal).\n :type signal: array\n :param delta: The peak threshold. 
A point is considered a maximum peak if it has the maximal value, and was preceded (to the left) by a value lower by delta.\n :type delta: float\n :param x: Indices in local maxima and minima are replaced with the corresponding values in x (None default).\n :type x: array\n :return maxtab: The highest peaks.\n :rtype maxtab: numpy.ndarray\n :return mintab: The lowest peaks.\n :rtype mintab: numpy.ndarray\n \"\"\"\n \n maxtab = []\n mintab = []\n\n if x is None:\n x = np.arange(len(signal))\n\n v = np.asarray(signal)\n\n if len(v) != len(x):\n sys.exit('Input vectors v and x must have same length')\n\n if not np.isscalar(delta):\n sys.exit('Input argument delta must be a scalar')\n\n if delta <= 0:\n sys.exit('Input argument delta must be positive')\n\n mn, mx = np.inf, -np.inf\n mnpos, mxpos = np.nan, np.nan\n\n lookformax = True\n\n for i in np.arange(len(v)):\n this = v[i]\n if this > mx:\n mx = this\n mxpos = x[i]\n if this < mn:\n mn = this\n mnpos = x[i]\n\n if lookformax:\n if this < mx - delta:\n maxtab.append((mxpos, mx))\n mn = this\n mnpos = x[i]\n lookformax = False\n else:\n if this > mn + delta:\n mintab.append((mnpos, mn))\n mx = this\n mxpos = x[i]\n lookformax = True\n\n return np.array(maxtab), np.array(mintab)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ncomputes number of samples between signal peaks using the real part of FFT.", "response": "def compute_interpeak(data, sample_rate):\n \"\"\"\n Compute number of samples between signal peaks using the real part of FFT.\n\n :param data: 1-dimensional time series data.\n :type data: array\n :param sample_rate: Sample rate of accelerometer reading (Hz)\n :type sample_rate: float\n :return interpeak: Number of samples between peaks \n :rtype interpeak: int\n\n :Examples:\n\n >>> import numpy as np\n >>> from mhealthx.signals import compute_interpeak\n >>> data = np.random.random(10000)\n >>> sample_rate = 100\n >>> interpeak = compute_interpeak(data, 
sample_rate)\n \"\"\"\n\n # Real part of FFT:\n freqs = fftfreq(data.size, d=1.0/sample_rate)\n f_signal = rfft(data)\n\n # Maximum non-zero frequency:\n imax_freq = np.argsort(f_signal)[-2]\n freq = np.abs(freqs[imax_freq])\n\n # Inter-peak samples:\n interpeak = np.int(np.round(sample_rate / freq))\n\n return interpeak"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef butter_lowpass_filter(data, sample_rate, cutoff=10, order=4, plot=False):\n \n\n nyquist = 0.5 * sample_rate\n normal_cutoff = cutoff / nyquist\n b, a = butter(order, normal_cutoff, btype='low', analog=False)\n \n if plot:\n w, h = freqz(b, a, worN=8000)\n plt.subplot(2, 1, 1)\n plt.plot(0.5*sample_rate*w/np.pi, np.abs(h), 'b')\n plt.plot(cutoff, 0.5*np.sqrt(2), 'ko')\n plt.axvline(cutoff, color='k')\n plt.xlim(0, 0.5*sample_rate)\n plt.title(\"Lowpass Filter Frequency Response\")\n plt.xlabel('Frequency [Hz]')\n plt.grid()\n plt.show()\n\n y = lfilter(b, a, data)\n\n return y", "response": "Low - pass filter for the given sample rate and cutoff."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nfinding indices of zero crossings from positive to negative values _", "response": "def crossings_nonzero_pos2neg(data):\n \"\"\"\n Find `indices of zero crossings from positive to negative values `_.\n\n :param data: numpy array of floats\n :type data: numpy array of floats\n :return crossings: crossing indices to data\n :rtype crossings: numpy array of integers\n\n :Examples:\n\n >>> import numpy as np\n >>> from mhealthx.signals import crossings_nonzero_pos2neg\n >>> data = np.random.random(100)\n >>> crossings = crossings_nonzero_pos2neg(data)\n \n \"\"\"\n import numpy as np\n\n if isinstance(data, np.ndarray):\n pass\n elif isinstance(data, list):\n data = np.asarray(data)\n else:\n raise IOError('data should be a numpy array')\n\n pos = data > 0\n\n crossings = (pos[:-1] & ~pos[1:]).nonzero()[0]\n\n return 
crossings"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef autocorrelate(data, unbias=2, normalize=2):\n\n # Autocorrelation:\n coefficients = correlate(data, data, 'full')\n size = np.int(coefficients.size/2)\n coefficients = coefficients[size:]\n N = coefficients.size\n\n # Unbiased:\n if unbias:\n if unbias == 1:\n coefficients /= (N - np.arange(N))\n elif unbias == 2:\n coefficient_ratio = coefficients[0]/coefficients[-1]\n coefficients /= np.linspace(coefficient_ratio, 1, N)\n else:\n raise IOError(\"unbias should be set to 1, 2, or None\")\n\n # Normalize:\n if normalize:\n if normalize == 1:\n coefficients /= np.abs(coefficients[0])\n elif normalize == 2:\n coefficients /= np.max(np.abs(coefficients))\n else:\n raise IOError(\"normalize should be set to 1, 2, or None\")\n\n return coefficients, N", "response": "Compute the autocorrelation coefficients for a time series data."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get_signal_peaks_and_prominences(data):\n peaks, _ = sig.find_peaks(data)\n prominences = sig.peak_prominences(data, peaks)[0]\n \n return peaks, prominences", "response": "Get the signal peaks and peak prominences."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef smoothing_window(data, window=[1, 1, 1]):\n \n for i in range(len(data) - sum(window)):\n \n start_window_from = i\n start_window_to = i+window[0]\n\n end_window_from = start_window_to + window[1]\n end_window_to = end_window_from + window[2]\n\n if np.all(data[start_window_from: start_window_to] == data[end_window_from: end_window_to]):\n data[start_window_from: end_window_to] = data[start_window_from]\n \n return data", "response": "This function is a smoother function so we can fix misclassifications. 
It will take a sliding window of the data and apply the border to the middle of the original data."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef plot_segmentation(data, peaks, segment_indexes, figsize=(10, 5)):\n fig, ax = plt.subplots(figsize=figsize)\n plt.plot(data);\n \n for segment in np.unique(segment_indexes):\n plt.plot(peaks[np.where(segment_indexes == segment)[0]], data[peaks][np.where(segment_indexes == segment)[0]], 'o')\n \n plt.show()", "response": "Plots the data and segmentation based on the peaks and segment indexes."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef centroid_sort(centroids):\n\n dimensions = len(centroids[0])\n negative_base_point = array(dimensions*[-100])\n\n decorated = [\n (euclidean(centroid, negative_base_point), centroid)\n for centroid in centroids\n ]\n decorated.sort()\n\n return array([centroid for dist, centroid in decorated])", "response": "Sort the centroids of the current set of cluster centroids."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef non_zero_index(arr):\n for index, row in enumerate(arr):\n if non_zero_row(row):\n return index\n raise ValueError('No non-zero values')", "response": "Returns the index of the first non - zero entry in an array."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn True if the row is completely free of zeros.", "response": "def non_zero_row(arr):\n \"\"\"\n 0. Empty row returns False.\n\n >>> arr = array([])\n >>> non_zero_row(arr)\n\n False\n\n 1. Row with a zero returns False.\n\n >>> arr = array([1, 4, 3, 0, 5, -1, -2])\n >>> non_zero_row(arr)\n\n False\n 2. 
Row with no zeros returns True.\n\n >>> arr = array([-1, -0.1, 0.001, 2])\n >>> non_zero_row(arr)\n\n True\n\n :param arr: array\n :type arr: numpy array\n :return empty: If row is completely free of zeros\n :rtype empty: bool\n \"\"\"\n\n if len(arr) == 0:\n return False\n\n for item in arr:\n if item == 0:\n return False\n\n return True"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ngenerating indexes for sliding windows with overlap", "response": "def window_features(idx, window_size=100, overlap=10):\n \"\"\" Generate indexes for a sliding window with overlap\n \n :param array idx: The indexes that need to be windowed.\n :param int window_size: The size of the window.\n :param int overlap: How much should each window overlap.\n \n :return array view: The indexes for the windows with overlap.\n \"\"\"\n overlap = window_size - overlap\n sh = (idx.size - window_size + 1, window_size)\n st = idx.strides * 2\n view = np.lib.stride_tricks.as_strided(idx, strides=st, shape=sh)[0::overlap]\n \n return view"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn the version info from pypi for this app.", "response": "def get_pypi_version():\n \"\"\"\n Returns the version info from pypi for this app.\n \"\"\"\n try:\n response = requests.get(PYPI_URL, timeout=HALF_SECOND_TIMEOUT)\n response.raise_for_status()\n data = response.json()\n version_str = data[\"info\"][\"version\"]\n return _parse_version_str(version_str)\n except requests.exceptions.ConnectionError:\n raise VersionException(UNABLE_TO_ACCESS_PYPI + \" Failed to connect.\")\n except requests.exceptions.Timeout:\n raise VersionException(UNABLE_TO_ACCESS_PYPI + \" Timeout\")"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef get_project_by_id(self, project_id):\n return self._create_item_response(\n self.data_service.get_project_by_id(project_id),\n Project)", "response": "Get 
details about project with the specified uuid."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef create_project(self, name, description):\n return self._create_item_response(\n self.data_service.create_project(name, description),\n Project)", "response": "Create a new project with the specified name and description."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef create_folder(self, folder_name, parent_kind_str, parent_uuid):\n return self._create_item_response(\n self.data_service.create_folder(folder_name, parent_kind_str, parent_uuid),\n Folder\n )", "response": "Create a folder under a particular parent"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ngetting direct files and folders of a project.", "response": "def get_project_children(self, project_id, name_contains=None):\n \"\"\"\n Get direct files and folders of a project.\n :param project_id: str: uuid of the project to list contents\n :param name_contains: str: filter children based on a pattern\n :return: [File|Folder]: list of Files/Folders contained by the project\n \"\"\"\n return self._create_array_response(\n self.data_service.get_project_children(\n project_id, name_contains\n ),\n DDSConnection._folder_or_file_constructor\n )"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef get_folder_children(self, folder_id, name_contains=None):\n return self._create_array_response(\n self.data_service.get_folder_children(\n folder_id, name_contains\n ),\n DDSConnection._folder_or_file_constructor\n )", "response": "Get direct files and folders of a folder."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nget a FileDownload object that contains temporary url settings needed to download the contents of a file.", "response": "def 
get_file_download(self, file_id):\n \"\"\"\n Get a file download object that contains temporary url settings needed to download the contents of a file.\n :param file_id: str: uuid of the file\n :return: FileDownload\n \"\"\"\n return self._create_item_response(\n self.data_service.get_file_url(file_id),\n FileDownload\n )"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef upload_file(self, local_path, project_id, parent_data, existing_file_id=None, remote_filename=None):\n path_data = PathData(local_path)\n hash_data = path_data.get_hash()\n file_upload_operations = FileUploadOperations(self.data_service, None)\n upload_id = file_upload_operations.create_upload(project_id, path_data, hash_data,\n remote_filename=remote_filename,\n storage_provider=self.config.storage_provider_id)\n context = UploadContext(self.config, self.data_service, upload_id, path_data)\n ParallelChunkProcessor(context).run()\n remote_file_data = file_upload_operations.finish_upload(upload_id, hash_data, parent_data, existing_file_id)\n return File(self, remote_file_data)", "response": "Uploads a file under a specific location in DDSConnection possibly replacing an existing file."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _folder_or_file_constructor(dds_connection, data_dict):\n kind = data_dict['kind']\n if kind == KindType.folder_str:\n return Folder(dds_connection, data_dict)\n elif data_dict['kind'] == KindType.file_str:\n return File(dds_connection, data_dict)", "response": "Create a File or Folder based on the kind value in data_dict."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nget details for a folder id.", "response": "def get_folder_by_id(self, folder_id):\n \"\"\"\n Get folder details for a folder id.\n :param folder_id: str: uuid of the folder\n :return: Folder\n \"\"\"\n return self._create_item_response(\n 
self.data_service.get_folder(folder_id),\n Folder\n )"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get_file_by_id(self, file_id):\n return self._create_item_response(\n self.data_service.get_file(file_id),\n File\n )", "response": "Get file details by id."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nsaving the contents of the remote file to a local file.", "response": "def save_to_path(self, file_path, chunk_size=DOWNLOAD_FILE_CHUNK_SIZE):\n \"\"\"\n Save the contents of the remote file to a local path.\n :param file_path: str: file path\n :param chunk_size: chunk size used to write local file\n \"\"\"\n response = self._get_download_response()\n with open(file_path, 'wb') as f:\n for chunk in response.iter_content(chunk_size=chunk_size):\n if chunk: # filter out keep-alive new chunks\n f.write(chunk)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get_child(self):\n path_parts = self.remote_path.split(os.sep)\n return self._get_child_recurse(path_parts, self.node)", "response": "Find file or folder at the remote_path"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nadds this task to the lookup based on its wait_for_task_id property.", "response": "def add(self, task):\n \"\"\"\n Add this task to the lookup based on its wait_for_task_id property.\n :param task: Task: task to add to the list\n \"\"\"\n wait_id = task.wait_for_task_id\n task_list = self.wait_id_to_task.get(wait_id, [])\n task_list.append(task)\n self.wait_id_to_task[wait_id] = task_list"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncreate a task for the command that will wait for parent_task_id 
before starting.\n :param parent_task_id: int: id of task to wait for or None if it can start immediately\n :param command: TaskCommand: contains data function to run\n :return: int: task id we created for this command\n \"\"\"\n task_id = self._claim_next_id()\n self.waiting_task_list.add(Task(task_id, parent_task_id, command))\n return task_id"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef run(self):\n for task in self.get_next_tasks(None):\n self.executor.add_task(task, None)\n while not self.executor.is_done():\n done_task_and_result = self.executor.wait_for_tasks()\n for task, task_result in done_task_and_result:\n self._add_sub_tasks_to_executor(task, task_result)", "response": "Runs all tasks in this runner on the executor."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nadd all subtasks for parent_task to the executor.", "response": "def _add_sub_tasks_to_executor(self, parent_task, parent_task_result):\n \"\"\"\n Add all subtasks for parent_task to the executor.\n :param parent_task: Task: task that has just finished\n :param parent_task_result: object: result of task that is finished\n \"\"\"\n for sub_task in self.waiting_task_list.get_next_tasks(parent_task.id):\n self.executor.add_task(sub_task, parent_task_result)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef add_task(self, task, parent_task_result):\n self.tasks.append((task, parent_task_result))\n self.task_id_to_task[task.id] = task", "response": "Add a task to run with the specified result from this tasks parent"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef wait_for_tasks(self):\n finished_tasks_and_results = []\n while len(finished_tasks_and_results) == 0:\n if self.is_done():\n break\n self.start_tasks()\n self.process_all_messages_in_queue()\n finished_tasks_and_results = 
self.get_finished_results()\n return finished_tasks_and_results", "response": "Wait for one or more tasks to finish or return empty list."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef start_tasks(self):\n while self.tasks_at_once > len(self.pending_results) and self._has_more_tasks():\n task, parent_result = self.tasks.popleft()\n self.execute_task(task, parent_result)", "response": "Start tasks until we have enough results."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nexecutes a single task in another process saving the result to our pending_results list.", "response": "def execute_task(self, task, parent_result):\n \"\"\"\n Run a single task in another process saving the result to our list of pending results.\n :param task: Task: function and data we can run in another process\n :param parent_result: object: result from our parent task\n \"\"\"\n task.before_run(parent_result)\n context = task.create_context(self.message_queue)\n pending_result = self.pool.apply_async(execute_task_async, (task.func, task.id, context))\n self.pending_results.append(pending_result)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef process_single_message_from_queue(self):\n try:\n message = self.message_queue.get_nowait()\n task_id, data = message\n task = self.task_id_to_task[task_id]\n task.on_message(data)\n return True\n except queue.Empty:\n return False", "response": "Tries to read a single message from the queue and let the associated task process it."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef get_finished_results(self):\n task_and_results = []\n for pending_result in self.pending_results:\n if pending_result.ready():\n ret = pending_result.get()\n task_id, result = ret\n task = self.task_id_to_task[task_id]\n # process any 
pending messages for this task (will also process other tasks messages)\n self.process_all_messages_in_queue()\n task.after_run(result)\n task_and_results.append((task, result))\n self.pending_results.remove(pending_result)\n return task_and_results", "response": "Get the results for all tasks that have finished."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef to_xml(body_element):\n doctype = b\"\"\"\\n\"\"\"\n if body_element is None:\n return succeed(b\"\")\n d = flattenString(None, body_element)\n d.addCallback(lambda flattened: doctype + flattened)\n return d", "response": "Serialize a L{twisted.web.template.Tag} to a UTF-8 encoded XML document\n with an XML doctype header."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ngetting a Route53 client.", "response": "def get_route53_client(agent, region, cooperator=None):\n \"\"\"\n Get a non-registration Route53 client.\n \"\"\"\n if cooperator is None:\n cooperator = task\n return region.get_client(\n _Route53Client,\n agent=agent,\n creds=region.creds,\n region=REGION_US_EAST_1,\n endpoint=AWSServiceEndpoint(_OTHER_ENDPOINT),\n cooperator=cooperator,\n )"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nconstructs an L { _Operation representing a Route53 service API call.", "response": "def _route53_op(body=None, **kw):\n \"\"\"\n Construct an L{_Operation} representing a I{Route53} service API call.\n \"\"\"\n op = _Operation(service=b\"route53\", **kw)\n if body is None:\n return succeed(op)\n d = to_xml(body)\n d.addCallback(lambda body: attr.assoc(op, body=body))\n return d"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef hostedzone_from_element(zone):\n return HostedZone(\n name=maybe_bytes_to_unicode(zone.find(\"Name\").text).encode(\"ascii\").decode(\"idna\"),\n 
identifier=maybe_bytes_to_unicode(zone.find(\"Id\").text).replace(u\"/hostedzone/\", u\"\"),\n rrset_count=int(zone.find(\"ResourceRecordSetCount\").text),\n reference=maybe_bytes_to_unicode(zone.find(\"CallerReference\").text),\n )", "response": "Construct a HostedZone instance from an I{HostedZone} XML element."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef to_element(change):\n return tags.Change(\n tags.Action(\n change.action,\n ),\n tags.ResourceRecordSet(\n tags.Name(\n unicode(change.rrset.label),\n ),\n tags.Type(\n change.rrset.type,\n ),\n tags.TTL(\n u\"{}\".format(change.rrset.ttl),\n ),\n tags.ResourceRecords(list(\n tags.ResourceRecord(tags.Value(rr.to_text()))\n for rr\n in sorted(change.rrset.records)\n ))\n ),\n )", "response": "Converts an IRRSetChange to an Element which describes this change."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nadding a method class to the registry.", "response": "def add(self, method_class, action, version=None):\n \"\"\"Add a method class to the registry.\n\n @param method_class: The method class to add\n @param action: The action that the method class can handle\n @param version: The version that the method class can handle\n \"\"\"\n by_version = self._by_action.setdefault(action, {})\n if version in by_version:\n raise RuntimeError(\"A method was already registered for action\"\n \" %s in version %s\" % (action, version))\n by_version[version] = method_class"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef check(self, action, version=None):\n if action not in self._by_action:\n raise APIError(400, \"InvalidAction\", \"The action %s is not valid \"\n \"for this web service.\" % action)\n by_version = self._by_action[action]\n if None not in by_version:\n # There's no catch-all method, let's try the version-specific one\n if version not in by_version:\n raise 
APIError(400, \"InvalidVersion\", \"Invalid API version.\")", "response": "Check if the given action is supported in the given version."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ngets the method class handling the given action and version.", "response": "def get(self, action, version=None):\n \"\"\"Get the method class handling the given action and version.\"\"\"\n by_version = self._by_action[action]\n if version in by_version:\n return by_version[version]\n else:\n return by_version[None]"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef scan(self, module, onerror=None, ignore=None):\n from venusian import Scanner\n scanner = Scanner(registry=self)\n kwargs = {\"onerror\": onerror, \"categories\": [\"method\"]}\n if ignore is not None:\n # Only pass it if specified, for backward compatibility\n kwargs[\"ignore\"] = ignore\n scanner.scan(module, **kwargs)", "response": "Scan the given module object for L{Method}s and register them."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef calculate_parameters(self, independentTs, dependentTs):\n listX, listY = self.match_time_series(independentTs, dependentTs)\n\n if len(listX) == 0 or len(listY) == 0:\n raise ValueError(\"Lists need to have some equal dates or cannot be empty\")\n if len(listX) != len(listY):\n raise ValueError(\"Each Timeseries needs to have distinct dates\")\n\n xValues = map(lambda item: item[1], listX)\n yValues = map(lambda item: item[1], listY)\n\n xMean = FusionMethods[\"mean\"](xValues)\n yMean = FusionMethods[\"mean\"](yValues)\n\n xDeviation = map(lambda item: (item - xMean), xValues)\n yDeviation = map(lambda item: (item - yMean), yValues)\n\n try:\n parameter1 = sum(x * y for x, y in zip(xDeviation, yDeviation)) / sum(x * x for x in xDeviation)\n except ZeroDivisionError:\n # error occurs if xDeviation is always 0, which means that all x 
values are the same\n raise ValueError(\"Not enough distinct x values\")\n parameter0 = yMean - (parameter1 * xMean)\n\n return (parameter0, parameter1)", "response": "Calculate and return the parameters for the regression line."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef predict(self, timeseriesX, n, m):\n new_entries = []\n for entry in timeseriesX:\n predicted_value = m * entry[1] + n\n new_entries.append([entry[0], predicted_value])\n return TimeSeries.from_twodim_list(new_entries)", "response": "Calculates the dependent timeseries Y for the given parameters and independent timeseries X."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef match_time_series(self, timeseries1, timeseries2):\n time1 = map(lambda item: item[0], timeseries1.to_twodim_list())\n time2 = map(lambda item: item[0], timeseries2.to_twodim_list())\n\n matches = filter(lambda x: (x in time1), time2)\n listX = filter(lambda x: (x[0] in matches), timeseries1.to_twodim_list())\n listY = filter(lambda x: (x[0] in matches), timeseries2.to_twodim_list())\n\n return listX, listY", "response": "Return two lists containing the matched values and the dates of the two input time series."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn the least - squares solution to a linear matrix equation.", "response": "def lstsq(cls, a, b):\n \"\"\"Return the least-squares solution to a linear matrix equation.\n\n :param Matrix a: Design matrix with the values of the independent variables.\n :param Matrix b: Matrix with the \"dependent variable\" values.\n b can only have one column.\n\n :raise: Raises an :py:exc:`ValueError`, if\n - the number of rows of a and b does not match.\n - b has more than one column.\n :note: The algorithm solves the following equations.\n beta = a^+ b.\n \"\"\"\n # Check if the size of the input matrices matches\n if 
a.get_height() != b.get_height():\n raise ValueError(\"Size of input matrices does not match\")\n if b.get_width() != 1:\n raise ValueError(\"Matrix with dependent variable has more than 1 column\")\n aPseudo = a.pseudoinverse()\n # The following code could be used if c is regular.\n # aTrans = a.transform()\n # c = aTrans * a\n # invers() raises an ValueError, if c is not invertible\n # cInvers = c.invers()\n # beta = cInvers * aTrans * b\n beta = aPseudo * b\n return beta"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _calculate(self, startingPercentage, endPercentage, startDate, endDate):\n # get the defined subset of error values\n errorValues = self._get_error_values(startingPercentage, endPercentage, startDate, endDate)\n errorValues = filter(lambda item: item is not None, errorValues)\n\n return float(sum(errorValues)) / float(len(errorValues))", "response": "This method calculates the error value for the current language."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncalculate the error between two values.", "response": "def local_error(self, originalValue, calculatedValue):\n \"\"\"Calculates the error between the two given values.\n\n :param list originalValue: List containing the values of the original data.\n :param list calculatedValue: List containing the values of the calculated TimeSeries that\n corresponds to originalValue.\n\n :return: Returns the error measure of the two given values.\n :rtype: numeric\n \"\"\"\n originalValue = originalValue[0]\n calculatedValue = calculatedValue[0]\n\n if 0 == originalValue:\n return None\n\n return (math.fabs((calculatedValue - originalValue)/float(originalValue))) * 100.0"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _get_historic_means(self, timeSeries):\n # calculate the history values\n historyLength = self._historyLength\n 
historicMeans = []\n append = historicMeans.append\n\n # not most optimized loop in case of calculation operations\n for startIdx in xrange(len(timeSeries) - historyLength - 1):\n value = 0\n for idx in xrange(startIdx, startIdx + historyLength):\n value += abs(timeSeries[idx+1][1] - timeSeries[idx][1])\n\n append(value / float(historyLength))\n\n return historicMeans", "response": "Calculates the mean value for the history of the MeanAbsoluteScaledError."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef initialize(self, originalTimeSeries, calculatedTimeSeries):\n # ErrorMeasure was already initialized.\n if 0 < len(self._errorValues):\n raise StandardError(\"An ErrorMeasure can only be initialized once.\")\n\n # calculating the number of datapoints used within the history\n if isinstance(self._historyLength, float):\n self._historyLength = int((self._historyLength * len(originalTimeSeries)) / 100.0)\n\n # sort the TimeSeries to reduce the required comparison operations\n originalTimeSeries.sort_timeseries()\n calculatedTimeSeries.sort_timeseries()\n\n self._historicMeans = self._get_historic_means(originalTimeSeries)\n\n # Performance optimization\n append = self._errorValues.append\n appendDates = self._errorDates.append\n local_error = self.local_error\n minCalcIdx = self._historyLength + 1\n\n # calculate all valid local errors\n for orgPair in originalTimeSeries[minCalcIdx:]:\n for calcIdx in xrange(minCalcIdx, len(calculatedTimeSeries)):\n calcPair = calculatedTimeSeries[calcIdx]\n\n # Skip values that can not be compared\n if calcPair[0] != orgPair[0]:\n continue\n\n append(local_error(orgPair[1:], calcPair[1:]))\n appendDates(orgPair[0])\n\n # return False, if the error cannot be calculated\n if len(filter(lambda item: item is not None, self._errorValues)) < self._minimalErrorCalculationPercentage * len(originalTimeSeries):\n self._errorValues = []\n self._errorDates = []\n 
self._historicMeans = []\n return False\n\n return True", "response": "Initializes the ErrorMeasure.\n\n During initialization, all :py:meth:`BaseErrorMeasure.local_error()` are calculated.\n\n :param TimeSeries originalTimeSeries: TimeSeries containing the original data.\n :param TimeSeries calculatedTimeSeries: TimeSeries containing calculated data.\n Calculated data is smoothed or forecasted data.\n\n :return: Return :py:const:`True` if the error could be calculated, :py:const:`False`\n otherwise based on the minimalErrorCalculationPercentage.\n :rtype: boolean\n\n :raise: Raises a :py:exc:`StandardError` if the error measure is initialized multiple times."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _calculate(self, startingPercentage, endPercentage, startDate, endDate):\n # get the defined subset of error values\n errorValues = self._get_error_values(startingPercentage, endPercentage, startDate, endDate)\n\n # get the historic mean\n if startDate is not None:\n possibleDates = filter(lambda date: date >= startDate, self._errorDates)\n\n # This piece of code is not required, because _get_error_values already ensured that the startDate\n # was correct. 
Otherwise it would have thrown an exception.\n #if 0 == len(possibleDates):\n # raise ValueError(\"%s does not represent a valid startDate.\" % startDate)\n\n meanIdx = self._errorDates.index(min(possibleDates))\n else:\n meanIdx = int((startingPercentage * len(self._errorValues)) / 100.0)\n\n mad = sum(errorValues) / float(len(errorValues))\n historicMean = self._historicMeans[meanIdx]\n\n return mad / historicMean", "response": "This method calculates the error value for a given date range."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef __train(self, n_clusters=4):\n\n try:\n for obs in self.observations:\n features, ids = self.__get_features_for_observation(observation=obs, last_column_is_id=True)\n # the last column is the observation id\n normalised_data = whiten(features)\n\n # skip any rows that contain just zero values... they create nans\n first_safe_row = pdkit.utils.non_zero_index(normalised_data)\n observation_ids = features.tolist()\n sd = features[first_safe_row] / normalised_data[first_safe_row]\n\n # Calculate centroids and sort result\n centroids_array, _ = kmeans(normalised_data, n_clusters)\n sorted_centroids = pdkit.utils.centroid_sort(centroids_array)\n\n if not self.clusters:\n self.clusters = [[obs, sd.tolist(), sorted_centroids.tolist()]]\n else:\n self.clusters.append([obs, sd.tolist(),sorted_centroids.tolist()])\n except IOError as e:\n ierr = \"({}): {}\".format(e.errno, e.strerror)\n logging.error(\"Error training UPDRS, file not found, I/O error %s\", ierr)\n\n except ValueError as verr:\n logging.error(\"Error training UPDRS ValueError ->%s\", verr.message)\n\n except:\n logging.error(\"Unexpected error on training UPDRS init: %s\", sys.exc_info()[0])", "response": "This method is used to train the UPDRS cluster."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_single_score(self, point, centroids=None, 
sd=None):\n normalised_point = array(point) / array(sd)\n\n observation_score = {\n 'original': point,\n 'normalised': normalised_point.tolist(),\n }\n\n distances = [\n euclidean(normalised_point, centroid)\n for centroid in centroids\n ]\n\n return int(distances.index(min(distances)))", "response": "This method returns a score for a given point and a list of current centroids."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef score(self, measurement, output_format='array'):\n scores = np.array([])\n for obs in self.observations:\n c, sd = self.__get_centroids_sd(obs)\n p, ids = self.__get_features_for_observation(data_frame = measurement, observation=obs,\n last_column_is_id=True)\n\n scores = np.append(scores, [self.get_single_score(p, centroids=c, sd=sd)], axis=0)\n\n if output_format == 'array':\n return scores.astype(int)\n else:\n return np.array_str(scores.astype(int))", "response": "Method to score a given measurement against the trained KNN clusters."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nfunctioning run by CreateProjectCommand to create the project.", "response": "def download_file_part_run(download_context):\n \"\"\"\n Function run by CreateProjectCommand to create the project.\n Runs in a background process.\n :param download_context: UploadContext: contains data service setup and project name to create.\n \"\"\"\n destination_dir, file_url_data_dict, seek_amt, bytes_to_read = download_context.params\n project_file = ProjectFile(file_url_data_dict)\n local_path = project_file.get_local_path(destination_dir)\n retry_chunk_downloader = RetryChunkDownloader(project_file, local_path,\n seek_amt, bytes_to_read,\n download_context)\n retry_chunk_downloader.run()\n return 'ok'"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ndownloading the contents of the specified project name or id to dest_directory.", "response": "def 
run(self):\n \"\"\"\n Download the contents of the specified project name or id to dest_directory.\n \"\"\"\n files_to_download = self.get_files_to_download()\n total_files_size = self.get_total_files_size(files_to_download)\n if self.file_download_pre_processor:\n self.run_preprocessor(files_to_download)\n\n self.try_create_dir(self.dest_directory)\n watcher = ProgressPrinter(total_files_size, msg_verb='downloading')\n self.download_files(files_to_download, watcher)\n watcher.finished()\n\n warnings = self.check_warnings()\n if warnings:\n watcher.show_warning(warnings)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef run_preprocessor(self, files_to_download):\n for project_file in files_to_download:\n self.file_download_pre_processor.run(self.remote_store.data_service, project_file)", "response": "Run file_download_pre_processor for each file we are about to download."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef try_create_dir(self, path):\n if not os.path.exists(path):\n os.mkdir(path)\n elif not os.path.isdir(path):\n raise ValueError(\"Unable to create directory:\" + path + \" because a file already exists with the same name.\")", "response": "Try to create a directory if it doesn't exist and raise an error if a non-directory with the same name already exists."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ngetting list of remote folders based on the list of all file urls", "response": "def _get_parent_remote_paths(self):\n \"\"\"\n Get list of remote folders based on the list of all file urls\n :return: set([str]): set of remote folders (that contain files)\n \"\"\"\n parent_paths = set([item.get_remote_parent_path() for item in self.file_urls])\n if '' in parent_paths:\n parent_paths.remove('')\n return parent_paths"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 
3 function\ndef make_local_directories(self):\n for remote_path in self._get_parent_remote_paths():\n local_path = os.path.join(self.dest_directory, remote_path)\n self._assure_dir_exists(local_path)", "response": "Create directories necessary to download the files into dest_directory"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef make_big_empty_files(self):\n for file_url in self.file_urls:\n local_path = file_url.get_local_path(self.dest_directory)\n with open(local_path, \"wb\") as outfile:\n if file_url.size > 0:\n outfile.seek(int(file_url.size) - 1)\n outfile.write(b'\\0')", "response": "Write out a empty file so the workers can seek to where they should write their data."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ndivide file_url size into an array of start and end tuples.", "response": "def make_ranges(self, file_url):\n \"\"\"\n Divides file_url size into an array of ranges to be downloaded by workers.\n :param: file_url: ProjectFileUrl: file url to download\n :return: [(int,int)]: array of (start, end) tuples\n \"\"\"\n size = file_url.size\n bytes_per_chunk = self.determine_bytes_per_chunk(size)\n start = 0\n ranges = []\n while size > 0:\n amount = bytes_per_chunk\n if amount > size:\n amount = size\n ranges.append((start, start + amount - 1))\n start += amount\n size -= amount\n return ranges"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef determine_bytes_per_chunk(self, size):\n workers = self.settings.config.download_workers\n if not workers or workers == 'None':\n workers = 1\n bytes_per_chunk = int(math.ceil(size / float(workers)))\n if bytes_per_chunk < self.bytes_per_chunk:\n bytes_per_chunk = self.bytes_per_chunk\n return bytes_per_chunk", "response": "Determines the size of a worker s memory chunk."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreturn tuple 
that contains a list of large files and a list of small files based on size parameter :param size: int: size (in bytes) that determines if a file is large or small :return: ([ProjectFileUrl],[ProjectFileUrl]): (large file urls, small file urls)", "response": "def split_file_urls_by_size(self, size):\n \"\"\"\n Return tuple that contains a list of large files and a list of small files based on size parameter\n :param size: int: size (in bytes) that determines if a file is large or small\n :return: ([ProjectFileUrl],[ProjectFileUrl]): (large file urls, small file urls)\n \"\"\"\n large_items = []\n small_items = []\n for file_url in self.file_urls:\n if file_url.size >= size:\n large_items.append(file_url)\n else:\n small_items.append(file_url)\n return large_items, small_items"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef check_downloaded_files_sizes(self):\n for file_url in self.file_urls:\n local_path = file_url.get_local_path(self.dest_directory)\n self.check_file_size(file_url.size, local_path)", "response": "Check that the file sizes are correct."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef check_file_size(file_size, path):\n stat_info = os.stat(path)\n if stat_info.st_size != file_size:\n format_str = \"Error occurred downloading {}. Got a file size {}. 
Expected file size:{}\"\n msg = format_str.format(path, stat_info.st_size, file_size)\n raise ValueError(msg)", "response": "Raise an error if we didn t get all of the file."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ncreates data needed by upload_project_run", "response": "def create_context(self, message_queue, task_id):\n \"\"\"\n Create data needed by upload_project_run(DukeDS connection info).\n :param message_queue: Queue: queue background process can send messages to us on\n :param task_id: int: id of this command's task so message will be routed correctly\n \"\"\"\n params = (self.settings.dest_directory, self.file_url.json_data, self.seek_amt, self.bytes_to_read)\n return DownloadContext(self.settings, params, message_queue, task_id)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nreturn the url and headers to use for downloading part of a file.", "response": "def get_url_and_headers_for_range(self, file_download):\n \"\"\"\n Return url and headers to use for downloading part of a file, adding range headers.\n :param file_download: FileDownload: contains data about file we will download\n :return: str, dict: url to download and headers to use\n \"\"\"\n headers = self.get_range_headers()\n if file_download.http_headers:\n headers.update(file_download.http_headers)\n separator = \"\"\n if not file_download.url.startswith(\"/\"):\n separator = \"/\"\n url = '{}{}{}'.format(file_download.host, separator, file_download.url)\n return url, headers"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef download_chunk(self, url, headers):\n response = requests.get(url, headers=headers, stream=True)\n if response.status_code == SWIFT_EXPIRED_STATUS_CODE \\\n or response.status_code == S3_EXPIRED_STATUS_CODE:\n raise DownloadInconsistentError(response.text)\n response.raise_for_status()\n self.actual_bytes_read = 0\n 
self._write_response_to_file(response)\n self._verify_download_complete()", "response": "Download a chunk of a file and write it to our file."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _write_response_to_file(self, response):\n with open(self.local_path, 'r+b') as outfile: # open file for read/write (no truncate)\n outfile.seek(self.seek_amt)\n for chunk in response.iter_content(chunk_size=self.bytes_per_chunk):\n if chunk: # filter out keep-alive chunks\n outfile.write(chunk)\n self._on_bytes_read(len(chunk))", "response": "Write the response to the appropriate section of the file at self. local_path."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _on_bytes_read(self, num_bytes_read):\n self.actual_bytes_read += num_bytes_read\n if self.actual_bytes_read > self.bytes_to_read:\n raise TooLargeChunkDownloadError(self.actual_bytes_read, self.bytes_to_read, self.local_path)\n self.download_context.send_processed_message(num_bytes_read)", "response": "Called when the bytes read of the data is received."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _verify_download_complete(self):\n if self.actual_bytes_read > self.bytes_to_read:\n raise TooLargeChunkDownloadError(self.actual_bytes_read, self.bytes_to_read, self.local_path)\n elif self.actual_bytes_read < self.bytes_to_read:\n raise PartialChunkDownloadError(self.actual_bytes_read, self.bytes_to_read, self.local_path)", "response": "Verify that the download is complete."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef optimize(self, timeSeries, forecastingMethods=None, startingPercentage=0.0, endPercentage=100.0):\n\n if forecastingMethods is None or len(forecastingMethods) == 0:\n raise ValueError(\"forecastingMethods cannot be empty.\")\n\n self._startingPercentage 
= startingPercentage\n self._endPercentage = endPercentage\n\n results = []\n for forecastingMethod in forecastingMethods:\n results.append([forecastingMethod] + self.optimize_forecasting_method(timeSeries, forecastingMethod))\n\n # get the forecasting method with the smallest error\n bestForecastingMethod = min(results, key=lambda item: item[1].get_error(self._startingPercentage, self._endPercentage))\n\n for parameter in bestForecastingMethod[2]:\n bestForecastingMethod[0].set_parameter(parameter, bestForecastingMethod[2][parameter])\n\n return bestForecastingMethod", "response": "Runs the optimization of the given TimeSeries."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\noptimizing the parameters for the given timeSeries and forecastingMethod. :param TimeSeries timeSeries: TimeSeries instance, containing the original data. :param BaseForecastingMethod forecastingMethod: ForecastingMethod that is used to optimize the parameters. :return: Returns a tuple containing only the smallest BaseErrorMeasure instance as defined in :py:meth:`BaseOptimizationMethod.__init__` and the forecastingMethods parameter. 
:rtype: tuple", "response": "def optimize_forecasting_method(self, timeSeries, forecastingMethod):\n \"\"\"Optimizes the parameters for the given timeSeries and forecastingMethod.\n\n :param TimeSeries timeSeries: TimeSeries instance, containing the original data.\n :param BaseForecastingMethod forecastingMethod: ForecastingMethod that is used to optimize the parameters.\n\n :return: Returns a tuple containing only the smallest BaseErrorMeasure instance as defined in\n :py:meth:`BaseOptimizationMethod.__init__` and the forecastingMethods parameter.\n :rtype: tuple\n \"\"\"\n tuneableParameters = forecastingMethod.get_optimizable_parameters()\n\n remainingParameters = []\n for tuneableParameter in tuneableParameters:\n remainingParameters.append([tuneableParameter, [item for item in self._generate_next_parameter_value(tuneableParameter, forecastingMethod)]])\n\n # Collect the forecasting results\n forecastingResults = self.optimization_loop(timeSeries, forecastingMethod, remainingParameters)\n\n # Debugging GridSearchTest.inner_optimization_result_test\n #print \"\"\n #print \"GridSearch\"\n #print \"Instance / SMAPE / Alpha\"\n #for item in forecastingResults:\n # print \"%s / %s / %s\" % (\n # str(item[0])[-12:-1],\n # str(item[0].get_error(self._startingPercentage, self._endPercentage))[:8],\n # item[1][\"smoothingFactor\"]\n #)\n #print \"\"\n\n # Collect the parameters that resulted in the smallest error\n bestForecastingResult = min(forecastingResults, key=lambda item: item[0].get_error(self._startingPercentage, self._endPercentage))\n\n # return the determined parameters\n return bestForecastingResult"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef energy_data():\n cur = db.cursor().execute(\"\"\"SELECT timestamp, current FROM Readings\"\"\")\n original = TimeSeries()\n original.initialize_from_sql_cursor(cur)\n original.normalize(\"day\", fusionMethod = \"sum\")\n return 
itty.Response(json.dumps(original, cls=PycastEncoder), content_type='application/json')", "response": "Get the energy data for the current readings for the device 8."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nperforming Holt Winters Parameter Optimization on the given data.", "response": "def optimize(request):\n \"\"\"\n Performs Holt Winters Parameter Optimization on the given post data.\n Expects the following values set in the post of the request:\n seasonLength - integer\n valuesToForecast - integer\n data - two dimensional array of [timestamp, value]\n \"\"\"\n #Parse arguments\n seasonLength = int(request.POST.get('seasonLength', 6))\n valuesToForecast = int(request.POST.get('valuesToForecast', 0))\n data = json.loads(request.POST.get('data', []))\n\n original = TimeSeries.from_twodim_list(data)\n original.normalize(\"day\") #due to bug in TimeSeries.apply\n original.set_timeformat(\"%d.%m\")\n\n #optimize smoothing\n hwm = HoltWintersMethod(seasonLength = seasonLength, valuesToForecast = valuesToForecast)\n gridSearch = GridSearch(SMAPE)\n optimal_forecasting, error, optimal_params = gridSearch.optimize(original, [hwm])\n\n #perform smoothing\n smoothed = optimal_forecasting.execute(original)\n smoothed.set_timeformat(\"%d.%m\")\n result = { 'params': optimal_params,\n 'original': original,\n 'smoothed': smoothed,\n 'error': round(error.get_error(), 3)\n }\n return itty.Response(json.dumps(result, cls=PycastEncoder), content_type='application/json')"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nperforming Holt Winters Smoothing on the given post data. 
Expects the following values set in the post of the request: smoothingFactor - float trendSmoothingFactor - float seasonSmoothingFactor - float seasonLength - integer valuesToForecast - integer data - two dimensional array of [timestamp, value]", "response": "def holtWinters(request):\n \"\"\"\n Performs Holt Winters Smoothing on the given post data.\n Expects the following values set in the post of the request:\n smoothingFactor - float\n trendSmoothingFactor - float\n seasonSmoothingFactor - float\n seasonLength - integer\n valuesToForecast - integer\n data - two dimensional array of [timestamp, value]\n \"\"\"\n #Parse arguments\n smoothingFactor = float(request.POST.get('smoothingFactor', 0.2))\n trendSmoothingFactor = float(request.POST.get('trendSmoothingFactor', 0.3))\n seasonSmoothingFactor = float(request.POST.get('seasonSmoothingFactor', 0.4))\n seasonLength = int(request.POST.get('seasonLength', 6))\n valuesToForecast = int(request.POST.get('valuesToForecast', 0))\n data = json.loads(request.POST.get('data', []))\n\n #perform smoothing\n hwm = HoltWintersMethod(smoothingFactor = smoothingFactor,\n trendSmoothingFactor = trendSmoothingFactor,\n seasonSmoothingFactor = seasonSmoothingFactor,\n seasonLength = seasonLength,\n valuesToForecast = valuesToForecast)\n original = TimeSeries.from_twodim_list(data)\n original.set_timeformat(\"%d.%m\")\n smoothed = hwm.execute(original)\n smoothed.set_timeformat(\"%d.%m\")\n\n error = SMAPE()\n error.initialize(original, smoothed)\n\n #process the result\n result = { 'original': original,\n 'smoothed': smoothed,\n 'error': round(error.get_error(), 3)\n }\n return itty.Response(json.dumps(result, cls=PycastEncoder), content_type='application/json')"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef parse(self, schema, strict=True):\n self.args, self.rest = schema.extract(self._raw_params)\n if strict and self.rest:\n raise APIError(400, 
\"UnknownParameter\",\n \"The parameter %s is not \"\n \"recognized\" % self.rest.keys()[0])", "response": "Update the internal state of the object with the parsed parameters."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _name_to_child_map(children):\n name_to_child = {}\n for child in children:\n name_to_child[child.name] = child\n return name_to_child", "response": "Create a map of name to child based on a list of children."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nupdating remote_ids based on the names of children.", "response": "def _update_remote_children(remote_parent, children):\n \"\"\"\n Update remote_ids based on parent matching up the names of children.\n :param remote_parent: RemoteProject/RemoteFolder who has children\n :param children: [LocalFolder,LocalFile] children to set remote_ids based on remote children\n \"\"\"\n name_to_child = _name_to_child_map(children)\n for remote_child in remote_parent.children:\n local_child = name_to_child.get(remote_child.name)\n if local_child:\n local_child.update_remote_ids(remote_child)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nbuild a tree of LocalFile or LocalFolder based on a path.", "response": "def _build_project_tree(path, followsymlinks, file_filter):\n \"\"\"\n Build a tree of LocalFolder with children or just a LocalFile based on a path.\n :param path: str path to a directory to walk\n :param followsymlinks: bool should we follow symlinks when walking\n :param file_filter: FileFilter: include method returns True if we should include a file/folder\n :return: the top node of the tree LocalFile or LocalFolder\n \"\"\"\n result = None\n if os.path.isfile(path):\n result = LocalFile(path)\n else:\n result = _build_folder_tree(os.path.abspath(path), followsymlinks, file_filter)\n return result"} {"SOURCE": "codesearchnet", "instruction": "Can you implement 
a function in Python 3 that\nbuilds a tree of LocalFolder objects based on a path.", "response": "def _build_folder_tree(top_abspath, followsymlinks, file_filter):\n \"\"\"\n Build a tree of LocalFolder with children based on a path.\n :param top_abspath: str path to a directory to walk\n :param followsymlinks: bool should we follow symlinks when walking\n :param file_filter: FileFilter: include method returns True if we should include a file/folder\n :return: the top node of the tree LocalFolder\n \"\"\"\n path_to_content = {}\n child_to_parent = {}\n ignore_file_patterns = IgnoreFilePatterns(file_filter)\n ignore_file_patterns.load_directory(top_abspath, followsymlinks)\n for dir_name, child_dirs, child_files in os.walk(top_abspath, followlinks=followsymlinks):\n abspath = os.path.abspath(dir_name)\n folder = LocalFolder(abspath)\n path_to_content[abspath] = folder\n # If we have a parent add us to it.\n parent_path = child_to_parent.get(abspath)\n if parent_path:\n path_to_content[parent_path].add_child(folder)\n remove_child_dirs = []\n for child_dir in child_dirs:\n # Record dir_name as the parent of child_dir so we can call add_child when get to it.\n abs_child_path = os.path.abspath(os.path.join(dir_name, child_dir))\n if ignore_file_patterns.include(abs_child_path, is_file=False):\n child_to_parent[abs_child_path] = abspath\n else:\n remove_child_dirs.append(child_dir)\n for remove_child_dir in remove_child_dirs:\n child_dirs.remove(remove_child_dir)\n for child_filename in child_files:\n abs_child_filename = os.path.join(dir_name, child_filename)\n if ignore_file_patterns.include(abs_child_filename, is_file=True):\n folder.add_child(LocalFile(abs_child_filename))\n return path_to_content.get(top_abspath)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nadds the path and any children files and folders to the list of content.", "response": "def add_path(self, path):\n \"\"\"\n Add the path and any children 
files/folders to the list of content.\n :param path: str path to add\n \"\"\"\n abspath = os.path.abspath(path)\n self.children.append(_build_project_tree(abspath, self.followsymlinks, self.file_filter))"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef update_remote_ids(self, remote_project):\n if remote_project:\n self.remote_id = remote_project.id\n _update_remote_children(remote_project, self.children)", "response": "Compare against remote_project saving off the matching uuids of matching content."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nupdating the remote id based on remote_folder and check children against this folder s children.", "response": "def update_remote_ids(self, remote_folder):\n \"\"\"\n Set remote id based on remote_folder and check children against this folder's children.\n :param remote_folder: RemoteFolder to compare against\n \"\"\"\n self.remote_id = remote_folder.id\n _update_remote_children(remote_folder, self.children)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nupdating the remote_id and need_to_send flag based on the hash info.", "response": "def update_remote_ids(self, remote_file):\n \"\"\"\n Based on a remote file try to assign a remote_id and compare hash info.\n :param remote_file: RemoteFile remote data pull remote_id from\n \"\"\"\n self.remote_id = remote_file.id\n hash_data = self.path_data.get_hash()\n if hash_data.matches(remote_file.hash_alg, remote_file.file_hash):\n self.need_to_send = False"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef count_chunks(self, bytes_per_chunk):\n chunks = math.ceil(float(self.size) / float(bytes_per_chunk))\n return max(chunks, 1)", "response": "Count how many chunks we need to upload into a specific file."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef 
matches(self, hash_alg, hash_value):\n return self.alg == hash_alg and self.value == hash_value", "response": "Returns True if the specified hash algorithm and hash value match the specified arguments."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nguess the mimetype of a file or application / octet - stream", "response": "def mime_type(self):\n \"\"\"\n Guess the mimetype of a file or 'application/octet-stream' if unable to guess.\n :return: str: mimetype\n \"\"\"\n mime_type, encoding = mimetypes.guess_type(self.path)\n if not mime_type:\n mime_type = 'application/octet-stream'\n return mime_type"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreads the whole file into memory.", "response": "def read_whole_file(self):\n \"\"\"\n Slurp the whole file into memory.\n Should only be used with relatively small files.\n :return: str: file contents\n \"\"\"\n chunk = None\n with open(self.path, 'rb') as infile:\n chunk = infile.read()\n return chunk"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nadding a file to the hash.", "response": "def add_file(self, filename, block_size=4096):\n \"\"\"\n Add an entire file to this hash.\n :param filename: str filename of the file to hash\n :param block_size: int size of chunks when reading the file\n \"\"\"\n with open(filename, \"rb\") as f:\n for chunk in iter(lambda: f.read(block_size), b\"\"):\n self.hash.update(chunk)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ninherit info from parent and return new object", "response": "def get(cls, parent=None, id=None, data=None):\n \"\"\"Inherit info from parent and return new object\"\"\"\n # TODO - allow fetching of parent based on child?\n\n if parent is not None:\n route = copy(parent.route)\n else:\n route = {}\n\n if id is not None and cls.ID_NAME is not None:\n route[cls.ID_NAME] = id\n\n obj = cls(key=parent.key, route=route, 
config=parent.config)\n\n if data:\n # This is used in \"get all\" queries\n obj.data = data\n else:\n obj.fetch()\n\n return obj"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get_all(cls, parent=None, **params):\n\n if parent is not None:\n route = copy(parent.route)\n else:\n route = {}\n if cls.ID_NAME is not None:\n # Empty string triggers \"get all resources\"\n route[cls.ID_NAME] = \"\"\n\n base_obj = cls(key=parent.key, route=route, config=parent.config)\n \"\"\"Perform a read request against the resource\"\"\"\n\n start = datetime.now()\n r = requests.get(\n base_obj._url(), auth=(base_obj.key, \"\"), params=params)\n cls._delay_for_ratelimits(start)\n\n if r.status_code not in cls.TRUTHY_CODES:\n return base_obj._handle_request_exception(r)\n\n response = r.json()\n objects_data = response.get(base_obj.ENVELOPE or base_obj, [])\n\n return_objects = []\n for data in objects_data:\n # Note that this approach does not get meta data\n return_objects.append(\n cls.get(\n parent=parent,\n id=data.get(cls.ID_NAME, data.get(\"id\")),\n data=data))\n\n return return_objects", "response": "Fetch all objects of this resource type from the API and return them as a list."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ngetting the URL for the resource", "response": "def _url(self):\n \"\"\"Get the URL for the resource\"\"\"\n if self.ID_NAME not in self.route.keys() and \"id\" in self.data.keys():\n self.route[self.ID_NAME] = self.data[\"id\"]\n return self.config.BASE + self.PATH.format(**self.route)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _handle_request_exception(request):\n try:\n data = request.json()\n except:\n data = {}\n\n code = request.status_code\n if code == requests.codes.bad:\n raise BadRequestException(response=data)\n\n if code == requests.codes.unauthorized:\n raise UnauthorizedException(response=data)\n\n if code == 
requests.codes.not_found:\n raise NotFoundException(response=data)\n\n # Generic error fallback\n request.raise_for_status()", "response": "Raise the proper exception based on the response"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef fetch(self):\n start = datetime.now()\n r = requests.get(self._url(), auth=(self.key, \"\"))\n self._delay_for_ratelimits(start)\n\n if r.status_code not in self.TRUTHY_CODES:\n return self._handle_request_exception(r)\n\n response = r.json()\n if self.ENVELOPE:\n self.data = response.get(self.ENVELOPE, {})\n else:\n self.data = response\n\n # Move to separate function so it can be overridden\n self._process_meta(response)", "response": "Perform a read request against the resource"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _process_meta(self, response):\n for key in self.META_ENVELOPES:\n self.meta[key] = response.get(key)", "response": "Process additional data sent in response"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef delete(self):\n\n start = datetime.now()\n r = requests.delete(self._url(), auth=(self.key, \"\"))\n self._delay_for_ratelimits(start)\n\n if r.status_code not in self.TRUTHY_CODES:\n return self._handle_request_exception(r)", "response": "Delete the object from the server"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nchange attributes of the item", "response": "def patch(self, **kwargs):\n \"\"\"Change attributes of the item\"\"\"\n start = datetime.now()\n r = requests.patch(self._url(), auth=(self.key, \"\"), data=kwargs)\n self._delay_for_ratelimits(start)\n\n if r.status_code not in self.TRUTHY_CODES:\n return self._handle_request_exception(r)\n\n # Refetch for safety. 
We could modify based on response,\n # but I'm afraid of some edge cases and marshal functions.\n self.fetch()"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef create(cls, parent=None, **kwargs):\n\n if parent is None:\n raise Exception(\"Parent class is required\")\n\n route = copy(parent.route)\n if cls.ID_NAME is not None:\n route[cls.ID_NAME] = \"\"\n\n obj = cls(key=parent.key, route=route, config=parent.config)\n\n start = datetime.now()\n response = requests.post(obj._url(), auth=(obj.key, \"\"), data=kwargs)\n cls._delay_for_ratelimits(start)\n\n if response.status_code not in cls.TRUTHY_CODES:\n return cls._handle_request_exception(response)\n\n # No envelope on post requests\n data = response.json()\n obj.route[obj.ID_NAME] = data.get(\"id\", data.get(obj.ID_NAME))\n obj.data = data\n\n return obj", "response": "Create an object and return it"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ndelaying for ratelimits if request was shorter than max request time.", "response": "def _delay_for_ratelimits(cls, start):\n \"\"\"If request was shorter than max request time, delay\"\"\"\n stop = datetime.now()\n duration_microseconds = (stop - start).microseconds\n if duration_microseconds < cls.REQUEST_TIME_MICROSECONDS:\n time.sleep((cls.REQUEST_TIME_MICROSECONDS - duration_microseconds)\n / MICROSECONDS_PER_SECOND)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef load_data(filename, format_file='cloudupdrs'):\n \n logging.debug(\"{} data --> Loaded\".format(format_file))\n \n data_frame = load_data(filename, format_file)\n data_frame.sampling_rate = get_sampling_rate_from_timestamp(data_frame)\n \n return data_frame", "response": "This method loads the data from a file and returns a DataFrame with the data."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nruns the configured method and 
writes the HTTP response status and text to the output stream.", "response": "def run(self):\n \"\"\"\n Run the configured method and write the HTTP response status and text\n to the output stream.\n \"\"\"\n region = AWSServiceRegion(access_key=self.key, secret_key=self.secret,\n uri=self.endpoint)\n query = self.query_factory(action=self.action, creds=region.creds,\n endpoint=region.ec2_endpoint,\n other_params=self.parameters)\n\n def write_response(response):\n print >> self.output, \"URL: %s\" % query.client.url\n print >> self.output\n print >> self.output, \"HTTP status code: %s\" % query.client.status\n print >> self.output\n print >> self.output, response\n\n def write_error(failure):\n if failure.check(AWSError):\n message = failure.value.original\n else:\n message = failure.getErrorMessage()\n if message.startswith(\"Error Message: \"):\n message = message[len(\"Error Message: \"):]\n\n print >> self.output, \"URL: %s\" % query.client.url\n print >> self.output\n if getattr(query.client, \"status\", None) is not None:\n print >> self.output, \"HTTP status code: %s\" % (\n query.client.status,)\n print >> self.output\n print >> self.output, message\n\n if getattr(failure.value, \"response\", None) is not None:\n print >> self.output\n print >> self.output, failure.value.response\n\n deferred = query.submit()\n deferred.addCallback(write_response)\n deferred.addErrback(write_error)\n return deferred"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns the canonical host as for the Host HTTP header specification.", "response": "def get_canonical_host(self):\n \"\"\"\n Return the canonical host as for the Host HTTP header specification.\n \"\"\"\n host = self.host.lower()\n if self.port is not None:\n host = \"%s:%s\" % (host, self.port)\n return host"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nsets host and port from a canonical host string as for the Host HTTP header 
specification.", "response": "def set_canonical_host(self, canonical_host):\n \"\"\"\n Set host and port from a canonical host string as for the Host HTTP\n header specification.\n \"\"\"\n parts = canonical_host.lower().split(\":\")\n self.host = parts[0]\n if len(parts) > 1 and parts[1]:\n self.port = int(parts[1])\n else:\n self.port = None"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ngetting a URL representation of the service.", "response": "def get_uri(self):\n \"\"\"Get a URL representation of the service.\"\"\"\n uri = \"%s://%s%s\" % (self.scheme, self.get_canonical_host(), self.path)\n return uri"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nrun the optimization on the given TimeSeries.", "response": "def optimize(self, timeSeries, forecastingMethods=None, startingPercentage=0.0, endPercentage=100.0):\n \"\"\"Runs the optimization on the given TimeSeries.\n\n :param TimeSeries timeSeries: TimeSeries instance that requires an optimized forecast.\n :param list forecastingMethods: List of forecastingMethods that will be used for optimization.\n :param float startingPercentage: Defines the start of the interval. This has to be a value in [0.0, 100.0].\n It represents the value, where the error calculation should be started.\n 25.0 for example means that the first 25% of all calculated errors will be ignored.\n :param float endPercentage: Defines the end of the interval. This has to be a value in [0.0, 100.0].\n It represents the value, after which all error values will be ignored. 
90.0 for example means that\n the last 10% of all local errors will be ignored.\n\n\n :return: Returns the optimized forecasting method with the smallest error.\n :rtype: (BaseForecastingMethod, Dictionary)\n\n :raise: Raises a :py:exc:`ValueError` if forecastingMethods is empty.\n \"\"\"\n # no forecasting methods provided\n if forecastingMethods is None or len(forecastingMethods) == 0:\n raise ValueError(\"forecastingMethods cannot be empty.\")"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef include_path(self, path):\n self.seen_paths.add(path)\n return self.filter.include(path)", "response": "Returns True if we should include the path"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreturn which include_paths or exclude_paths that were not used via include_path method.", "response": "def get_unused_paths(self):\n \"\"\"\n Returns which include_paths or exclude_paths that were not used via include_path method.\n :return: [str] list of filtering paths that were not used.\n \"\"\"\n return [path for path in self.filter.paths if path not in self.seen_paths]"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ndetermine whether parent_path is a parent directory of child_path?", "response": "def is_child(child_path, parent_path):\n \"\"\"\n Is parent_path a parent(or grandparent) directory of child_path.\n :param child_path: str: remote file path\n :param parent_path: str: remote file path\n :return: bool: True when parent_path is child_path's parent\n \"\"\"\n parent_dir = os.path.join(parent_path, '')\n child_dir = os.path.join(child_path, '')\n return child_dir.startswith(parent_dir)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef parent_child_paths(path, some_path):\n return PathFilterUtil.is_child(path, some_path) or PathFilterUtil.is_child(some_path, path)", 
"response": "Return True if path and some_path are parent/child of each other, in either direction."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef method(method_class):\n\n def callback(scanner, name, method_class):\n if method_class.actions is not None:\n actions = method_class.actions\n else:\n actions = [name]\n\n if method_class.versions is not None:\n versions = method_class.versions\n else:\n versions = [None]\n\n for action in actions:\n for version in versions:\n scanner.registry.add(method_class,\n action=action,\n version=version)\n\n from venusian import attach\n attach(method_class, callback, category=\"method\")\n return method_class", "response": "Decorator to use to mark an API method."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ntake the byte content of a Python script and return a hex encoded version of it.", "response": "def hexlify(script, minify=False):\n \"\"\"\n Takes the byte content of a Python script and returns a hex encoded\n version of it.\n\n Based on the hexlify script in the microbit-micropython repository.\n \"\"\"\n if not script:\n return ''\n # Convert line endings in case the file was created on Windows.\n script = script.replace(b'\\r\\n', b'\\n')\n script = script.replace(b'\\r', b'\\n')\n if minify:\n if not can_minify:\n raise ValueError(\"No minifier is available\")\n script = nudatus.mangle(script.decode('utf-8')).encode('utf-8')\n # Add header, pad to multiple of 16 bytes.\n data = b'MP' + struct.pack('<H', len(script)) + script\n data = data + (b'\\x00' * (-len(data) % 16))\n if len(data) > _MAX_SIZE:\n # 'MP' = 2 bytes, script length is another 2 bytes.\n raise ValueError(\"Python script must be less than 8188 bytes.\")\n # Convert to .hex format.\n output = [':020000040003F7'] # extended linear address, 0x0003.\n addr = _SCRIPT_ADDR\n for i in range(0, len(data), 16):\n chunk = data[i:min(i + 16, len(data))]\n chunk = struct.pack('>BHB', len(chunk), addr & 0xffff, 0) + chunk\n checksum = (-(sum(bytearray(chunk)))) & 0xff\n hexline = ':%s%02X' % 
(strfunc(binascii.hexlify(chunk)).upper(),\n checksum)\n output.append(hexline)\n addr += 16\n return '\\n'.join(output)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef unhexlify(blob):\n lines = blob.split('\\n')[1:]\n output = []\n for line in lines:\n # Discard the address, length etc. and reverse the hexlification\n output.append(binascii.unhexlify(line[9:-2]))\n # Check the header is correct (\"MP\")\n if (output[0][0:2].decode('utf-8') != u'MP'):\n return ''\n # Strip off header\n output[0] = output[0][4:]\n # and strip any null bytes from the end\n output[-1] = output[-1].strip(b'\\x00')\n script = b''.join(output)\n try:\n result = script.decode('utf-8')\n return result\n except UnicodeDecodeError:\n # Return an empty string because in certain rare circumstances (where\n # the source hex doesn't include any embedded Python code) this\n # function may be passed in \"raw\" bytes from MicroPython.\n return ''", "response": "Takes a hexlified script and turns it back into a string of Python code."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ngive a string representing the MicroPython runtime hex embeds a Python script into the resulting string.", "response": "def embed_hex(runtime_hex, python_hex=None):\n \"\"\"\n Given a string representing the MicroPython runtime hex, will embed a\n string representing a hex encoded Python script into it.\n\n Returns a string representation of the resulting combination.\n\n Will raise a ValueError if the runtime_hex is missing.\n\n If the python_hex is missing, it will return the unmodified runtime_hex.\n \"\"\"\n if not runtime_hex:\n raise ValueError('MicroPython runtime hex required.')\n if not python_hex:\n return runtime_hex\n py_list = python_hex.split()\n runtime_list = runtime_hex.split()\n embedded_list = []\n # The embedded list should be the original runtime with the Python based\n # hex embedded two lines 
from the end.\n embedded_list.extend(runtime_list[:-5])\n embedded_list.extend(py_list)\n embedded_list.extend(runtime_list[-5:])\n return '\\n'.join(embedded_list) + '\\n'"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ngives a hex file containing the MicroPython runtime and an embedded Python script extract the original Python script.", "response": "def extract_script(embedded_hex):\n \"\"\"\n Given a hex file containing the MicroPython runtime and an embedded Python\n script, will extract the original Python script.\n\n Returns a string containing the original embedded script.\n \"\"\"\n hex_lines = embedded_hex.split('\\n')\n script_addr_high = hex((_SCRIPT_ADDR >> 16) & 0xffff)[2:].upper().zfill(4)\n script_addr_low = hex(_SCRIPT_ADDR & 0xffff)[2:].upper().zfill(4)\n start_script = None\n within_range = False\n # Look for the script start address\n for loc, val in enumerate(hex_lines):\n if val[0:9] == ':02000004':\n # Reached an extended address record, check if within script range\n within_range = val[9:13].upper() == script_addr_high\n elif within_range and val[0:3] == ':10' and \\\n val[3:7].upper() == script_addr_low:\n start_script = loc\n break\n if start_script:\n # Find the end of the script\n end_script = None\n for loc, val in enumerate(hex_lines[start_script:]):\n if val[9:41] == 'FFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF':\n end_script = loc + start_script\n break\n # Pass the extracted hex through unhexlify\n return unhexlify('\\n'.join(\n hex_lines[start_script - 1:end_script if end_script else -6]))\n return ''"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nfinds the microbit file that is to be flashed.", "response": "def find_microbit():\n \"\"\"\n Returns a path on the filesystem that represents the plugged in BBC\n micro:bit that is to be flashed. If no micro:bit is found, it returns\n None.\n\n Works on Linux, OSX and Windows. 
Will raise a NotImplementedError\n exception if run on any other operating system.\n \"\"\"\n # Check what sort of operating system we're on.\n if os.name == 'posix':\n # 'posix' means we're on Linux or OSX (Mac).\n # Call the unix \"mount\" command to list the mounted volumes.\n mount_output = check_output('mount').splitlines()\n mounted_volumes = [x.split()[2] for x in mount_output]\n for volume in mounted_volumes:\n if volume.endswith(b'MICROBIT'):\n return volume.decode('utf-8') # Return a string not bytes.\n elif os.name == 'nt':\n # 'nt' means we're on Windows.\n\n def get_volume_name(disk_name):\n \"\"\"\n Each disk or external device connected to windows has an attribute\n called \"volume name\". This function returns the volume name for\n the given disk/device.\n\n Code from http://stackoverflow.com/a/12056414\n \"\"\"\n vol_name_buf = ctypes.create_unicode_buffer(1024)\n ctypes.windll.kernel32.GetVolumeInformationW(\n ctypes.c_wchar_p(disk_name), vol_name_buf,\n ctypes.sizeof(vol_name_buf), None, None, None, None, 0)\n return vol_name_buf.value\n\n #\n # In certain circumstances, volumes are allocated to USB\n # storage devices which cause a Windows popup to raise if their\n # volume contains no media. 
Wrapping the check in SetErrorMode\n # with SEM_FAILCRITICALERRORS (1) prevents this popup.\n #\n old_mode = ctypes.windll.kernel32.SetErrorMode(1)\n try:\n for disk in 'ABCDEFGHIJKLMNOPQRSTUVWXYZ':\n path = '{}:\\\\'.format(disk)\n #\n # Don't bother looking if the drive isn't removable\n #\n if ctypes.windll.kernel32.GetDriveTypeW(path) != 2:\n continue\n if os.path.exists(path) and \\\n get_volume_name(path) == 'MICROBIT':\n return path\n finally:\n ctypes.windll.kernel32.SetErrorMode(old_mode)\n else:\n # No support for unknown operating systems.\n raise NotImplementedError('OS \"{}\" not supported.'.format(os.name))"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ngives a string representation of a hex file, this function copies it to the specified path thus causing the device mounted at that point to be flashed. If the hex_file is empty it will raise a ValueError. If the filename at the end of the path does not end in '.hex' it will raise a ValueError.", "response": "def save_hex(hex_file, path):\n \"\"\"\n Given a string representation of a hex file, this function copies it to\n the specified path thus causing the device mounted at that point to be\n flashed.\n\n If the hex_file is empty it will raise a ValueError.\n\n If the filename at the end of the path does not end in '.hex' it will raise\n a ValueError.\n \"\"\"\n if not hex_file:\n raise ValueError('Cannot flash an empty .hex file.')\n if not path.endswith('.hex'):\n raise ValueError('The path to flash must be for a .hex file.')\n with open(path, 'wb') as output:\n output.write(hex_file.encode('ascii'))"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef flash(path_to_python=None, paths_to_microbits=None,\n path_to_runtime=None, python_script=None, minify=False):\n \"\"\"\n Given a path to or source of a Python file will attempt to create a hex\n file and then flash it onto the referenced BBC 
micro:bit.\n\n If the path_to_python & python_script are unspecified it will simply flash\n the unmodified MicroPython runtime onto the device.\n\n If used, the python_script argument should be a bytes object representing\n a UTF-8 encoded string. For example::\n\n script = \"from microbit import *\\\\ndisplay.scroll('Hello, World!')\"\n uflash.flash(python_script=script.encode('utf-8'))\n\n If paths_to_microbits is unspecified it will attempt to find the device's\n path on the filesystem automatically.\n\n If the path_to_runtime is unspecified it will use the built in version of\n the MicroPython runtime. This feature is useful if a custom build of\n MicroPython is available.\n\n If the automatic discovery fails, then it will raise an IOError.\n \"\"\"\n # Check for the correct version of Python.\n if not ((sys.version_info[0] == 3 and sys.version_info[1] >= 3) or\n (sys.version_info[0] == 2 and sys.version_info[1] >= 7)):\n raise RuntimeError('Will only run on Python 2.7, or 3.3 and later.')\n # Grab the Python script (if needed).\n python_hex = ''\n if path_to_python:\n if not path_to_python.endswith('.py'):\n raise ValueError('Python files must end in \".py\".')\n with open(path_to_python, 'rb') as python_script:\n python_hex = hexlify(python_script.read(), minify)\n elif python_script:\n python_hex = hexlify(python_script, minify)\n\n runtime = _RUNTIME\n # Load the hex for the runtime.\n if path_to_runtime:\n with open(path_to_runtime) as runtime_file:\n runtime = runtime_file.read()\n # Generate the resulting hex file.\n micropython_hex = embed_hex(runtime, python_hex)\n # Find the micro:bit.\n if not paths_to_microbits:\n found_microbit = find_microbit()\n if found_microbit:\n paths_to_microbits = [found_microbit]\n # Attempt to write the hex file to the micro:bit.\n if paths_to_microbits:\n for path in paths_to_microbits:\n hex_path = os.path.join(path, 'micropython.hex')\n print('Flashing Python to: {}'.format(hex_path))\n save_hex(micropython_hex, 
hex_path)\n else:\n raise IOError('Unable to find micro:bit. Is it plugged in?')", "response": "This function will create a hex file and then flash it onto the device."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ngive a path_to_hex file this function will attempt to extract the embedded script from it and save it to output_path or stdout", "response": "def extract(path_to_hex, output_path=None):\n \"\"\"\n Given a path_to_hex file this function will attempt to extract the\n embedded script from it and save it either to output_path or stdout\n \"\"\"\n with open(path_to_hex, 'r') as hex_file:\n python_script = extract_script(hex_file.read())\n if output_path:\n with open(output_path, 'w') as output_file:\n output_file.write(python_script)\n else:\n print(python_script)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nwatch a file for changes by polling its last modification time.", "response": "def watch_file(path, func, *args, **kwargs):\n \"\"\"\n Watch a file for changes by polling its last modification time. 
Call the\n provided function with *args and **kwargs upon modification.\n \"\"\"\n if not path:\n raise ValueError('Please specify a file to watch')\n print('Watching \"{}\" for changes'.format(path))\n last_modification_time = os.path.getmtime(path)\n try:\n while True:\n time.sleep(1)\n new_modification_time = os.path.getmtime(path)\n if new_modification_time == last_modification_time:\n continue\n func(*args, **kwargs)\n last_modification_time = new_modification_time\n except KeyboardInterrupt:\n pass"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nmeasure the execution time of a function call and print the result.", "response": "def timing(func):\n \"\"\"Measure the execution time of a function call and print the result.\"\"\"\n @functools.wraps(func)\n def wrap(*args, **kw):\n t0 = time()\n result = func(*args, **kw)\n t1 = time()\n print('func:%r args:[%r, %r] took: %2.4f sec' %\n (func.__name__, args, kw, t1 - t0))\n return result\n return wrap"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef deprecated(func):\n @functools.wraps(func)\n def new_func(*args, **kwargs):\n if sys.version_info < (3, 0):\n warnings.warn_explicit(\n \"Call to deprecated function {}.\".format(func.__name__),\n category=DeprecationWarning,\n filename=func.func_code.co_filename,\n lineno=func.func_code.co_firstlineno + 1\n )\n else:\n warnings.warn_explicit(\n \"Call to deprecated function {}.\".format(func.__name__),\n category=DeprecationWarning,\n filename=func.__code__.co_filename,\n lineno=func.__code__.co_firstlineno + 1\n )\n return func(*args, **kwargs)\n return new_func", "response": "Mark functions as deprecated."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_currency(currency_str):\n path = 'units/currencies.csv' # always use slash in Python packages\n filepath = pkg_resources.resource_filename('mpu', 
path)\n with open(filepath, 'r') as fp:\n reader = csv.reader(fp, delimiter=',', quotechar='\"')\n next(reader, None) # skip the headers\n for row in reader:\n is_currency = currency_str in [row[0], row[1], row[2]]\n if is_currency:\n entity = row[0]\n name = row[1]\n code = row[2]\n numeric_code = row[3]\n symbol = row[4]\n if len(row[5]) == 0:\n exponent = None\n else:\n exponent = int(row[5])\n if len(row[6]) > 0:\n withdrawal_date = row[6]\n else:\n withdrawal_date = None\n subunits = row[7]\n return Currency(name=name,\n code=code,\n numeric_code=numeric_code,\n symbol=symbol,\n exponent=exponent,\n entities=[entity],\n withdrawal_date=withdrawal_date,\n subunits=subunits)\n raise ValueError('Could not find currency \\'{}\\''.format(currency_str))", "response": "Convert an identifier for a currency into a Currency object."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ncreates a Currency object from a JSON dump.", "response": "def from_json(cls, json):\n \"\"\"Create a Currency object from a JSON dump.\"\"\"\n obj = cls(name=json['name'],\n code=json['code'],\n numeric_code=json['numeric_code'],\n symbol=json['symbol'],\n exponent=json['exponent'],\n entities=json['entities'],\n withdrawal_date=json['withdrawal_date'],\n subunits=json['subunits'])\n return obj"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef print_table(table):\n table = [[str(cell) for cell in row] for row in table]\n column_widths = [len(cell) for cell in table[0]]\n for row in table:\n for x, cell in enumerate(row):\n column_widths[x] = max(column_widths[x], len(cell))\n\n formatters = []\n for width in column_widths:\n formatters.append('{:>' + str(width) + '}')\n formatter = ' '.join(formatters)\n for row in table:\n print(formatter.format(*row))", "response": "Print a list of rows (a list of lists) as a single column-aligned table."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in 
Python 3 that\nchecks if the input is a valid e-mail address.", "response": "def is_email(potential_email_address):\n \"\"\"\n Check if potential_email_address is a valid e-mail address.\n\n Please note that this function has no false-negatives but many\n false-positives. So if it returns that the input is not a valid\n e-mail address, it certainly isn't. If it returns True, it might still be\n invalid. For example, the domain could not be registered.\n\n Parameters\n ----------\n potential_email_address : str\n\n Returns\n -------\n is_email : bool\n\n Examples\n --------\n >>> is_email('')\n False\n >>> is_email('info@martin-thoma.de')\n True\n >>> is_email('info@math.martin-thoma.de')\n True\n >>> is_email('Martin Thoma <info@martin-thoma.de>')\n False\n >>> is_email('info@martin-thoma')\n False\n \"\"\"\n context, mail = parseaddr(potential_email_address)\n first_condition = len(context) == 0 and len(mail) != 0\n dot_after_at = ('@' in potential_email_address and\n '.' in potential_email_address.split('@')[1])\n return first_condition and dot_after_at"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nconverting a string to a boolean.", "response": "def str2bool(string_, default='raise'):\n \"\"\"\n Convert a string to a bool.\n\n Parameters\n ----------\n string_ : str\n default : {'raise', False}\n Default behaviour if none of the \"true\" strings is detected.\n\n Returns\n -------\n boolean : bool\n\n Examples\n --------\n >>> str2bool('True')\n True\n >>> str2bool('1')\n True\n >>> str2bool('0')\n False\n \"\"\"\n true = ['true', 't', '1', 'y', 'yes', 'enabled', 'enable', 'on']\n false = ['false', 'f', '0', 'n', 'no', 'disabled', 'disable', 'off']\n if string_.lower() in true:\n return True\n elif string_.lower() in false or (not default):\n return False\n else:\n raise ValueError('The value \\'{}\\' cannot be mapped to boolean.'\n .format(string_))"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nconverts a 
string to a bool or to None.", "response": "def str2bool_or_none(string_, default='raise'):\n \"\"\"\n Convert a string to a bool or to None.\n\n Parameters\n ----------\n string_ : str\n default : {'raise', False}\n Default behaviour if none of the \"true\" or \"none\" strings is detected.\n\n Returns\n -------\n bool_or_none : bool or None\n\n Examples\n --------\n >>> str2bool_or_none('True')\n True\n >>> str2bool_or_none('1')\n True\n >>> str2bool_or_none('0')\n False\n >>> str2bool_or_none('undefined')\n \"\"\"\n if is_none(string_, default=False):\n return None\n else:\n return str2bool(string_, default)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncheck if a string is equivalent to None.", "response": "def is_none(string_, default='raise'):\n \"\"\"\n Check if a string is equivalent to None.\n\n Parameters\n ----------\n string_ : str\n default : {'raise', False}\n Default behaviour if none of the \"None\" strings is detected.\n\n Returns\n -------\n is_none : bool\n\n Examples\n --------\n >>> is_none('2', default=False)\n False\n >>> is_none('undefined', default=False)\n True\n \"\"\"\n none = ['none', 'undefined', 'unknown', 'null', '']\n if string_.lower() in none:\n return True\n elif not default:\n return False\n else:\n raise ValueError('The value \\'{}\\' cannot be mapped to none.'\n .format(string_))"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef is_iban(potential_iban):\n path = 'data/iban.csv' # always use slash in Python packages\n filepath = pkg_resources.resource_filename('mpu', path)\n data = mpu.io.read(filepath, delimiter=';', format='dicts')\n potential_iban = potential_iban.replace(' ', '') # Remove spaces\n if len(potential_iban) < min([int(el['length']) for el in data]):\n return False\n country = None\n for element in data:\n if element['iban_fields'][:2] == potential_iban[:2]:\n country = element\n break\n if country is None:\n return 
False\n if len(potential_iban) != int(country['length']):\n return False\n if country['country_en'] == 'Germany':\n checksum_val = [value\n for field_type, value in\n zip(country['iban_fields'], potential_iban)\n if field_type == 'k']\n checksum_val = ''.join(checksum_val)\n checksum_exp = _calculate_german_iban_checksum(potential_iban,\n country['iban_fields'])\n return checksum_val == checksum_exp\n return True", "response": "Check if a string is a valid IBAN number."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ncalculating the checksum of the German IBAN format.", "response": "def _calculate_german_iban_checksum(iban,\n iban_fields='DEkkbbbbbbbbcccccccccc'):\n \"\"\"\n Calculate the checksum of the German IBAN format.\n\n Examples\n --------\n >>> iban = 'DE41500105170123456789'\n >>> _calculate_german_iban_checksum(iban)\n '41'\n \"\"\"\n number = [value\n for field_type, value in zip(iban_fields, iban)\n if field_type in ['b', 'c']]\n translate = {'0': '0', '1': '1', '2': '2', '3': '3', '4': '4', '5': '5',\n '6': '6', '7': '7', '8': '8', '9': '9'}\n for i in range(ord('A'), ord('Z') + 1):\n translate[chr(i)] = str(i - ord('A') + 10)\n for val in 'DE00':\n translated = translate[val]\n for char in translated:\n number.append(char)\n number = sum(int(value) * 10**i for i, value in enumerate(number[::-1]))\n checksum = 98 - (number % 97)\n return str(checksum)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef human_readable_bytes(nb_bytes, suffix='B'):\n for unit in ['', 'Ki', 'Mi', 'Gi', 'Ti', 'Pi', 'Ei', 'Zi']:\n if abs(nb_bytes) < 1024.0:\n return '%3.1f %s%s' % (nb_bytes, unit, suffix)\n nb_bytes /= 1024.0\n return '%.1f %s%s' % (nb_bytes, 'Yi', suffix)", "response": "Convert a byte number into a human readable format."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef list_files(bucket, profile_name=None):\n session
= boto3.Session(profile_name=profile_name)\n conn = session.client('s3')\n keys = []\n ret = conn.list_objects(Bucket=bucket)\n if 'Contents' not in ret:\n return []\n # Make this a generator in future and use the marker:\n # https://boto3.readthedocs.io/en/latest/reference/services/\n # s3.html#S3.Client.list_objects\n for key in ret['Contents']:\n keys.append('s3://' + bucket + '/' + key['Key'])\n return keys", "response": "List up to 1000 files in a bucket."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nread a file from an S3 source.", "response": "def s3_read(source, profile_name=None):\n \"\"\"\n Read a file from an S3 source.\n\n Parameters\n ----------\n source : str\n Path starting with s3://, e.g. 's3://bucket-name/key/foo.bar'\n profile_name : str, optional\n AWS profile\n\n Returns\n -------\n content : bytes\n\n Raises\n ------\n botocore.exceptions.NoCredentialsError\n Botocore is not able to find your credentials. Either specify\n profile_name or add the environment variables AWS_ACCESS_KEY_ID,\n AWS_SECRET_ACCESS_KEY and AWS_SESSION_TOKEN.\n See https://boto3.readthedocs.io/en/latest/guide/configuration.html\n \"\"\"\n session = boto3.Session(profile_name=profile_name)\n s3 = session.client('s3')\n bucket_name, key = _s3_path_split(source)\n s3_object = s3.get_object(Bucket=bucket_name, Key=key)\n body = s3_object['Body']\n return body.read()"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ncopies a file from an S3 source to a local destination.", "response": "def s3_download(source, destination,\n exists_strategy=ExistsStrategy.RAISE,\n profile_name=None):\n \"\"\"\n Copy a file from an S3 source to a local destination.\n\n Parameters\n ----------\n source : str\n Path starting with s3://, e.g.
's3://bucket-name/key/foo.bar'\n destination : str\n exists_strategy : {'raise', 'replace', 'abort'}\n What is done when the destination already exists?\n * `ExistsStrategy.RAISE` means a RuntimeError is raised,\n * `ExistsStrategy.REPLACE` means the local file is replaced,\n * `ExistsStrategy.ABORT` means the download is not done.\n profile_name : str, optional\n AWS profile\n\n Raises\n ------\n botocore.exceptions.NoCredentialsError\n Botocore is not able to find your credentials. Either specify\n profile_name or add the environment variables AWS_ACCESS_KEY_ID,\n AWS_SECRET_ACCESS_KEY and AWS_SESSION_TOKEN.\n See https://boto3.readthedocs.io/en/latest/guide/configuration.html\n \"\"\"\n if not isinstance(exists_strategy, ExistsStrategy):\n raise ValueError('exists_strategy \\'{}\\' is not in {}'\n .format(exists_strategy, ExistsStrategy))\n session = boto3.Session(profile_name=profile_name)\n s3 = session.resource('s3')\n bucket_name, key = _s3_path_split(source)\n if os.path.isfile(destination):\n if exists_strategy is ExistsStrategy.RAISE:\n raise RuntimeError('File \\'{}\\' already exists.'\n .format(destination))\n elif exists_strategy is ExistsStrategy.ABORT:\n return\n s3.Bucket(bucket_name).download_file(key, destination)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef s3_upload(source, destination, profile_name=None):\n session = boto3.Session(profile_name=profile_name)\n s3 = session.resource('s3')\n bucket_name, key = _s3_path_split(destination)\n with open(source, 'rb') as data:\n s3.Bucket(bucket_name).put_object(Key=key, Body=data)", "response": "Copy a file from a local source to an S3 destination."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _s3_path_split(s3_path):\n if not s3_path.startswith('s3://'):\n raise ValueError('s3_path is expected to start with \\'s3://\\', '\n 'but was {}'.format(s3_path))\n bucket_key = 
s3_path[len('s3://'):]\n bucket_name, key = bucket_key.split('/', 1)\n return S3Path(bucket_name, key)", "response": "Splits an S3 path into bucket and key."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nget meta-information of an image.", "response": "def get_meta(filepath):\n \"\"\"\n Get meta-information of an image.\n\n Parameters\n ----------\n filepath : str\n\n Returns\n -------\n meta : dict\n \"\"\"\n meta = {}\n try:\n from PIL import Image\n with Image.open(filepath) as img:\n width, height = img.size\n meta['width'] = width\n meta['height'] = height\n meta['channels'] = len(img.mode) # RGB, RGBA - does this always work?\n except ImportError:\n pass\n\n # Get times - creation, last edit, last open\n meta['file'] = mpu.io.get_file_meta(filepath)\n return meta"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nflattens an iterable of iterables into one list.", "response": "def flatten(iterable, string_flattening=False):\n \"\"\"\n Flatten a given iterable of iterables into one list.\n\n Parameters\n ----------\n iterable : iterable\n string_flattening : bool\n If this is False, then strings are NOT flattened\n\n Returns\n -------\n flat_list : List\n\n Examples\n --------\n >>> flatten([1, [2, [3]]])\n [1, 2, 3]\n\n >>> flatten(((1, 2), (3, 4), (5, 6)))\n [1, 2, 3, 4, 5, 6]\n\n >>> flatten(EList([EList([1, 2]), (3, [4, [[5]]])]))\n [1, 2, 3, 4, 5]\n \"\"\"\n flat_list = []\n for item in iterable:\n is_iterable = (isinstance(item, collections.abc.Iterable) and\n (string_flattening or not isinstance(item, str)))\n if is_iterable:\n # Pass string_flattening down so nested strings are\n # handled consistently\n flat_list.extend(flatten(item, string_flattening))\n else:\n flat_list.append(item)\n return flat_list"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef dict_merge(dict_left, dict_right, merge_method='take_left_shallow'):\n new_dict = {}\n if merge_method in ['take_right_shallow',
'take_right_deep']:\n return _dict_merge_right(dict_left, dict_right, merge_method)\n elif merge_method == 'take_left_shallow':\n return dict_merge(dict_right, dict_left, 'take_right_shallow')\n elif merge_method == 'take_left_deep':\n return dict_merge(dict_right, dict_left, 'take_right_deep')\n elif merge_method == 'sum':\n new_dict = deepcopy(dict_left)\n for key, value in dict_right.items():\n if key not in new_dict:\n new_dict[key] = value\n else:\n recurse = isinstance(value, dict)\n if recurse:\n new_dict[key] = dict_merge(dict_left[key],\n dict_right[key],\n merge_method='sum')\n else:\n new_dict[key] = dict_left[key] + dict_right[key]\n return new_dict\n else:\n raise NotImplementedError('merge_method=\\'{}\\' is not known.'\n .format(merge_method))", "response": "This method merges two dictionaries."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _dict_merge_right(dict_left, dict_right, merge_method):\n new_dict = deepcopy(dict_left)\n for key, value in dict_right.items():\n if key not in new_dict:\n new_dict[key] = value\n else:\n recurse = (merge_method == 'take_right_deep' and\n isinstance(dict_left[key], dict) and\n isinstance(dict_right[key], dict))\n if recurse:\n new_dict[key] = dict_merge(dict_left[key],\n dict_right[key],\n merge_method='take_right_deep')\n else:\n new_dict[key] = value\n return new_dict", "response": "See documentation of mpu.datastructures.dict_merge."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nsets a value in a nested dictionary by defining a list of keys.", "response": "def set_dict_value(dictionary, keys, value):\n \"\"\"\n Set a value in a (nested) dictionary by defining a list of keys.\n\n .. 
note:: Side-effects\n This function does not make a copy of dictionary, but directly\n edits it.\n\n Parameters\n ----------\n dictionary : dict\n keys : List[Any]\n value : object\n\n Returns\n -------\n dictionary : dict\n\n Examples\n --------\n >>> d = {'a': {'b': 'c', 'd': 'e'}}\n >>> expected = {'a': {'b': 'foobar', 'd': 'e'}}\n >>> set_dict_value(d, ['a', 'b'], 'foobar') == expected\n True\n \"\"\"\n orig = dictionary\n for key in keys[:-1]:\n dictionary = dictionary.setdefault(key, {})\n dictionary[keys[-1]] = value\n return orig"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nchecking if a sequence of keys exist in a nested dictionary.", "response": "def does_keychain_exist(dict_, list_):\n \"\"\"\n Check if a sequence of keys exist in a nested dictionary.\n\n Parameters\n ----------\n dict_ : Dict[str/int/tuple, Any]\n list_ : List[str/int/tuple]\n\n Returns\n -------\n keychain_exists : bool\n\n Examples\n --------\n >>> d = {'a': {'b': {'c': 'd'}}}\n >>> l_exists = ['a', 'b']\n >>> does_keychain_exist(d, l_exists)\n True\n\n >>> l_no_existant = ['a', 'c']\n >>> does_keychain_exist(d, l_no_existant)\n False\n \"\"\"\n for key in list_:\n if key not in dict_:\n return False\n dict_ = dict_[key]\n return True"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns a new list of items from the list that do not have the given indices.", "response": "def remove_indices(self, indices):\n \"\"\"\n Remove rows by which have the given indices.\n\n Parameters\n ----------\n indices : list\n\n Returns\n -------\n filtered_list : EList\n \"\"\"\n new_list = []\n for index, element in enumerate(self):\n if index not in indices:\n new_list.append(element)\n return EList(new_list)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef get_all_files(root, followlinks=False):\n filepaths = []\n for path, _, files in os.walk(root, 
followlinks=followlinks):\n for name in files:\n filepaths.append(os.path.abspath(os.path.join(path, name)))\n return filepaths", "response": "Get all files within a given root directory."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_from_package(package_name, path):\n filepath = pkg_resources.resource_filename(package_name, path)\n return os.path.abspath(filepath)", "response": "Get the absolute path to a file in a package."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncreate an example dataframe.", "response": "def example_df():\n \"\"\"Create an example dataframe.\"\"\"\n country_names = ['Germany',\n 'France',\n 'Indonesia',\n 'Ireland',\n 'Spain',\n 'Vatican']\n population = [82521653, 66991000, 255461700, 4761865, 46549045, None]\n population_time = [dt.datetime(2016, 12, 1),\n dt.datetime(2017, 1, 1),\n dt.datetime(2017, 1, 1),\n None, # Ireland\n dt.datetime(2017, 6, 1), # Spain\n None,\n ]\n euro = [True, True, False, True, True, True]\n df = pd.DataFrame({'country': country_names,\n 'population': population,\n 'population_time': population_time,\n 'EUR': euro})\n df = df[['country', 'population', 'population_time', 'EUR']]\n return df"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nprint a description of a Pandas dataframe.", "response": "def describe(df, dtype=None):\n \"\"\"\n Print a description of a Pandas dataframe.\n\n Parameters\n ----------\n df : Pandas.DataFrame\n dtype : dict\n Maps column names to types\n \"\"\"\n if dtype is None:\n dtype = {}\n print('Number of datapoints: {datapoints}'.format(datapoints=len(df)))\n column_info, column_info_meta = _get_column_info(df, dtype)\n\n if len(column_info['int']) > 0:\n _describe_int(df, column_info)\n\n if len(column_info['float']) > 0:\n _describe_float(df, column_info)\n\n if len(column_info['category']) > 0:\n _describe_category(df, column_info, column_info_meta)\n\n if 
len(column_info['time']) > 0:\n _describe_time(df, column_info, column_info_meta)\n\n if len(column_info['other']) > 0:\n _describe_other(df, column_info, column_info_meta)\n\n column_types = {}\n for column_type, columns in column_info.items():\n for column_name in columns:\n if column_type == 'other':\n column_type = 'str'\n column_types[column_name] = column_type\n return column_types"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef indices2one_hot(indices, nb_classes):\n if nb_classes < 1:\n raise ValueError('nb_classes={}, but positive number expected'\n .format(nb_classes))\n\n one_hot = []\n for index in indices:\n one_hot.append([0] * nb_classes)\n one_hot[-1][index] = 1\n return one_hot", "response": "Convert an iterable of indices to one - hot encoded list."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef one_hot2indices(one_hots):\n indices = []\n for one_hot in one_hots:\n indices.append(argmax(one_hot))\n return indices", "response": "Convert an iterable of one - hot encoded targets to a list of indices."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ngenerating an infinite sequence of prime numbers.", "response": "def generate_primes():\n \"\"\"\n Generate an infinite sequence of prime numbers.\n\n The algorithm was originally written by David Eppstein, UC Irvine. See:\n http://code.activestate.com/recipes/117119/\n\n Examples\n --------\n >>> g = generate_primes()\n >>> next(g)\n 2\n >>> next(g)\n 3\n >>> next(g)\n 5\n \"\"\"\n divisors = {} # map number to at least one divisor\n\n candidate = 2 # next potential prime\n\n while True:\n if candidate in divisors:\n # candidate is composite. divisors[candidate] is the list of primes\n # that divide it. 
Since we've reached candidate, we no longer need\n # it in the map, but we'll mark the next multiples of its witnesses\n # to prepare for larger numbers\n for p in divisors[candidate]:\n divisors.setdefault(p + candidate, []).append(p)\n del divisors[candidate]\n else:\n # candidate is a new prime\n yield candidate\n\n # mark its first multiple that isn't\n # already marked in previous iterations\n divisors[candidate * candidate] = [candidate]\n\n candidate += 1"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nreturn the prime factors of an integer except for 1.", "response": "def factorize(number):\n \"\"\"\n Get the prime factors of an integer except for 1.\n\n Parameters\n ----------\n number : int\n\n Returns\n -------\n primes : iterable\n\n Examples\n --------\n >>> factorize(-17)\n [-1, 17]\n >>> factorize(8)\n [2, 2, 2]\n >>> factorize(3**25)\n [3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3]\n >>> factorize(1)\n [1]\n \"\"\"\n if not isinstance(number, int):\n raise ValueError('integer expected, but type(number)={}'\n .format(type(number)))\n if number < 0:\n return [-1] + factorize(number * (-1))\n elif number == 0:\n raise ValueError('All primes are prime factors of 0.')\n else:\n for i in range(2, int(math_stl.ceil(number**0.5)) + 1):\n if number % i == 0:\n if i == number:\n return [i]\n else:\n return [i] + factorize(int(number / i))\n return [number]"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nfinding the first index of the biggest value in the iterable.", "response": "def argmax(iterable):\n \"\"\"\n Find the first index of the biggest value in the iterable.\n\n Parameters\n ----------\n iterable : iterable\n\n Returns\n -------\n argmax : int\n\n Examples\n --------\n >>> argmax([0, 0, 0])\n 0\n >>> argmax([1, 0, 0])\n 0\n >>> argmax([0, 1, 0])\n 1\n >>> argmax([])\n \"\"\"\n max_value = None\n max_index = None\n for index, value in enumerate(iterable):\n 
if (max_value is None) or max_value < value:\n max_value = value\n max_index = index\n return max_index"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nrounds a float down to decimal_places.", "response": "def round_down(x, decimal_places):\n \"\"\"\n Round a float down to decimal_places.\n\n Parameters\n ----------\n x : float\n decimal_places : int\n\n Returns\n -------\n rounded_float : float\n\n Examples\n --------\n >>> round_down(1.23456, 3)\n 1.234\n >>> round_down(1.23456, 2)\n 1.23\n \"\"\"\n from math import floor\n d = int('1' + ('0' * decimal_places))\n return floor(x * d) / d"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nadding time to a timezone - aware datetime object.", "response": "def add_time(datetime_obj, days=0, hours=0, minutes=0, seconds=0):\n \"\"\"\n Add time to a timezone-aware datetime object.\n\n This keeps the timezone correct, even if it changes due to daylight\n saving time (DST).\n\n Parameters\n ----------\n datetime_obj : datetime.datetime\n days : int\n hours : int\n minutes : int\n seconds : int\n\n Returns\n -------\n datetime : datetime.datetime\n \"\"\"\n seconds += minutes * 60\n seconds += hours * 60**2\n seconds += days * 24 * 60**2\n t14 = datetime_obj + dt.timedelta(seconds=seconds) # Invalid timezone!\n t14 = t14.astimezone(pytz.utc).astimezone(t14.tzinfo) # Fix the timezone\n return t14"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef generate(minimum, maximum, local_random=random.Random()):\n if not (minimum < maximum):\n raise ValueError('{} is not smaller than {}'.format(minimum, maximum))\n\n # Python 3 allows direct multiplication of timedelta with a float, but\n # Python 2.7 does not. 
Hence this work-around.\n time_d = maximum - minimum\n time_d_float = time_d.total_seconds()\n time_d_rand = dt.timedelta(seconds=time_d_float * local_random.random())\n generated = minimum + time_d_rand\n return generated", "response": "Generate a random date."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef run_init(args):\n root = args.root\n if root is None:\n root = '.'\n root = os.path.abspath(root)\n\n project_data = _get_package_data()\n project_name = project_data['project_name']\n\n directories = [os.path.join(root, 'bin'),\n os.path.join(root, 'docs'),\n os.path.join(root, 'tests'),\n os.path.join(root, project_name),\n ]\n for dir_path in directories:\n if not os.path.exists(dir_path):\n os.makedirs(dir_path)\n\n script_paths = [os.path.join(root, 'README.md'),\n os.path.join(root, 'tests/__init__.py'),\n ]\n for script_path in script_paths:\n if not os.path.exists(script_path):\n os.mknod(script_path)\n\n copy_samples = [(resource_filename('mpu', 'package/templates/tox.ini.txt'),\n os.path.join(root, 'tox.ini')),\n (resource_filename('mpu',\n 'package/templates/setup.cfg.txt'),\n os.path.join(root, 'setup.cfg')),\n (resource_filename('mpu',\n 'package/templates/setup.py.txt'),\n os.path.join(root, 'setup.py')),\n (resource_filename('mpu',\n 'package/templates/_version.py.txt'),\n os.path.join(root, project_name + '/_version.py')),\n (resource_filename('mpu',\n 'package/templates/coveragerc.txt'),\n os.path.join(root, '.coveragerc')),\n (resource_filename('mpu', 'package/templates/init.py.txt'),\n os.path.join(root, project_name + '/__init__.py')),\n ]\n translate = {'[[project_name]]': project_data['project_name'],\n '[[license]]': project_data['license'],\n '[[author]]': project_data['author'],\n '[[email]]': project_data['email'],\n }\n for source, destination in copy_samples:\n if not os.path.exists(destination):\n copyfile(source, destination)\n _adjust_template(destination, translate)", "response": 
"Run project initialization.\n\n This will ask the user for input.\n\n Parameters\n ----------\n args : argparse named arguments"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreplacing multiple things at once in a text.", "response": "def _multiple_replace(text, search_replace_dict):\n \"\"\"\n Replace multiple things at once in a text.\n\n Parameters\n ----------\n text : str\n search_replace_dict : dict\n\n Returns\n -------\n replaced_text : str\n\n Examples\n --------\n >>> d = {'a': 'b', 'b': 'c', 'c': 'd', 'd': 'e'}\n >>> _multiple_replace('abcdefghijklm', d)\n 'bcdeefghijklm'\n \"\"\"\n # Create a regular expression from all of the dictionary keys\n regex = re.compile(\"|\".join(map(re.escape, search_replace_dict.keys())))\n\n # For each match, look up the corresponding value in the dictionary\n return regex.sub(lambda match: search_replace_dict[match.group(0)], text)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _adjust_template(filepath, translate):\n with open(filepath, 'r') as file:\n filedata = file.read()\n\n filedata = _multiple_replace(filedata, translate)\n\n with open(filepath, 'w') as file:\n file.write(filedata)", "response": "Search and replace contents of a filepath."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef get_parser(parser=None):\n from argparse import ArgumentParser, ArgumentDefaultsHelpFormatter\n if parser is None:\n parser = ArgumentParser(description=__doc__,\n formatter_class=ArgumentDefaultsHelpFormatter)\n subparsers = parser.add_subparsers()\n pkg_init_parser = subparsers.add_parser('init')\n pkg_init_parser.add_argument(\"root\",\n nargs='?',\n help=\"project root - should be empty\")\n pkg_init_parser.set_defaults(func=run_init)\n return parser", "response": "Get parser for mpu."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, 
explain what it does\ndef parallel_for(loop_function, parameters, nb_threads=100):\n import multiprocessing.pool\n from contextlib import closing\n with closing(multiprocessing.pool.ThreadPool(nb_threads)) as pool:\n return pool.map(loop_function, parameters)", "response": "Execute a function in parallel."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef clip(number, lowest=None, highest=None):\n if lowest is not None:\n number = max(number, lowest)\n if highest is not None:\n number = min(number, highest)\n return number", "response": "Clip a number to a given lowest / highest value."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef consistent_shuffle(*lists):\n perm = list(range(len(lists[0])))\n random.shuffle(perm)\n lists = tuple([sublist[index] for index in perm]\n for sublist in lists)\n return lists", "response": "Shuffle lists consistently.\n\n Parameters\n ----------\n *lists\n Variable length number of lists\n\n Returns\n -------\n shuffled_lists : tuple of lists\n All of the lists are shuffled consistently\n\n Examples\n --------\n >>> import mpu, random; random.seed(8)\n >>> mpu.consistent_shuffle([1,2,3], ['a', 'b', 'c'], ['A', 'B', 'C'])\n ([3, 2, 1], ['c', 'b', 'a'], ['C', 'B', 'A'])"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ncalculating the Haversine distance between two sets of points.", "response": "def haversine_distance(origin, destination):\n \"\"\"\n Calculate the Haversine distance.\n\n Parameters\n ----------\n origin : tuple of float\n (lat, long)\n destination : tuple of float\n (lat, long)\n\n Returns\n -------\n distance_in_km : float\n\n Examples\n --------\n >>> munich = (48.1372, 11.5756)\n >>> berlin = (52.5186, 13.4083)\n >>> round(haversine_distance(munich, berlin), 1)\n 504.2\n\n >>> new_york_city = (40.712777777778, -74.005833333333) # NYC\n >>> 
round(haversine_distance(berlin, new_york_city), 1)\n 6385.3\n \"\"\"\n lat1, lon1 = origin\n lat2, lon2 = destination\n if not (-90.0 <= lat1 <= 90):\n raise ValueError('lat1={:2.2f}, but must be in [-90,+90]'.format(lat1))\n if not (-90.0 <= lat2 <= 90):\n raise ValueError('lat2={:2.2f}, but must be in [-90,+90]'.format(lat2))\n if not (-180.0 <= lon1 <= 180):\n raise ValueError('lon1={:2.2f}, but must be in [-180,+180]'\n .format(lon1))\n if not (-180.0 <= lon2 <= 180):\n raise ValueError('lon2={:2.2f}, but must be in [-180,+180]'\n .format(lon2))\n radius = 6371 # km\n\n dlat = math_stl.radians(lat2 - lat1)\n dlon = math_stl.radians(lon2 - lon1)\n a = (math_stl.sin(dlat / 2) * math_stl.sin(dlat / 2) +\n math_stl.cos(math_stl.radians(lat1)) *\n math_stl.cos(math_stl.radians(lat2)) *\n math_stl.sin(dlon / 2) * math_stl.sin(dlon / 2))\n c = 2 * math_stl.atan2(math_stl.sqrt(a), math_stl.sqrt(1 - a))\n d = radius * c\n\n return d"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nraises an exception if value is not in an interval.", "response": "def is_in_intervall(value, min_value, max_value, name='variable'):\n \"\"\"\n Raise an exception if value is not in an interval.\n\n Parameters\n ----------\n value : orderable\n min_value : orderable\n max_value : orderable\n name : str\n Name of the variable to print in exception.\n \"\"\"\n if not (min_value <= value <= max_value):\n raise ValueError('{}={} is not in [{}, {}]'\n .format(name, value, min_value, max_value))"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef exception_logging(exctype, value, tb):\n write_val = {'exception_type': str(exctype),\n 'message': str(traceback.format_tb(tb, 10))}\n logging.exception(str(write_val))", "response": "Log exception by using the root logger."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef distance(self, there):\n
return haversine_distance((self.latitude, self.longitude),\n (there.latitude, there.longitude))", "response": "Calculate the distance from this location to there."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef main():\n parser = get_parser()\n args = parser.parse_args()\n if hasattr(args, 'func') and args.func:\n args.func(args)\n else:\n parser.print_help()", "response": "Command line interface of mpu."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nget parser for mpu.", "response": "def get_parser():\n \"\"\"Get parser for mpu.\"\"\"\n from argparse import ArgumentParser, ArgumentDefaultsHelpFormatter\n parser = ArgumentParser(description=__doc__,\n formatter_class=ArgumentDefaultsHelpFormatter)\n parser.add_argument('--version',\n action='version',\n version='mpu {}'.format(mpu.__version__))\n subparsers = parser.add_subparsers(help='Python package commands')\n package_parser = subparsers.add_parser('package')\n mpu.package.cli.get_parser(package_parser)\n return parser"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreading a single object from a file.", "response": "def read(filepath, **kwargs):\n \"\"\"\n Read a file.\n\n Supported formats:\n\n * CSV\n * JSON, JSONL\n * pickle\n\n Parameters\n ----------\n filepath : str\n Path to the file that should be read. This method's action depends\n mainly on the file extension.\n kwargs : dict\n Any keywords for the specific file format.
For CSV, this is\n 'delimiter', 'quotechar', 'skiprows', 'format'\n\n Returns\n -------\n data : str or bytes\n \"\"\"\n if filepath.lower().endswith('.csv'):\n return _read_csv(filepath, kwargs)\n elif filepath.lower().endswith('.json'):\n with open(filepath) as data_file:\n data = json.load(data_file, **kwargs)\n return data\n elif filepath.lower().endswith('.jsonl'):\n return _read_jsonl(filepath, kwargs)\n elif filepath.lower().endswith('.pickle'):\n with open(filepath, 'rb') as handle:\n data = pickle.load(handle)\n return data\n elif (filepath.lower().endswith('.yml') or\n filepath.lower().endswith('.yaml')):\n raise NotImplementedError('YAML is not supported, because you need '\n 'PyYAML in Python3. '\n 'See '\n 'https://stackoverflow.com/a/42054860/562769'\n ' as a guide how to use it.')\n elif (filepath.lower().endswith('.h5') or\n filepath.lower().endswith('.hdf5')):\n raise NotImplementedError('HDF5 is not supported. See '\n 'https://stackoverflow.com/a/41586571/562769'\n ' as a guide how to use it.')\n else:\n raise NotImplementedError('File \\'{}\\' is not known.'.format(filepath))"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreads a list of items from a CSV file.", "response": "def _read_csv(filepath, kwargs):\n \"\"\"See documentation of mpu.io.read.\"\"\"\n if 'delimiter' not in kwargs:\n kwargs['delimiter'] = ','\n if 'quotechar' not in kwargs:\n kwargs['quotechar'] = '\"'\n if 'skiprows' not in kwargs:\n kwargs['skiprows'] = []\n if isinstance(kwargs['skiprows'], int):\n kwargs['skiprows'] = [i for i in range(kwargs['skiprows'])]\n if 'format' in kwargs:\n format_ = kwargs['format']\n kwargs.pop('format', None)\n else:\n format_ = 'default'\n skiprows = kwargs['skiprows']\n kwargs.pop('skiprows', None)\n\n kwargs_open = {'newline': ''}\n mode = 'r'\n if sys.version_info < (3, 0):\n kwargs_open.pop('newline', None)\n mode = 'rb'\n with open(filepath, mode, **kwargs_open) as fp:\n if format_ == 
'default':\n reader = csv.reader(fp, **kwargs)\n data = EList([row for row in reader])\n data = data.remove_indices(skiprows)\n elif format_ == 'dicts':\n reader_list = csv.DictReader(fp, **kwargs)\n data = [row for row in reader_list]\n else:\n raise NotImplementedError('Format \\'{}\\' unknown'\n .format(format_))\n return data"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _read_jsonl(filepath, kwargs):\n with open(filepath) as data_file:\n data = [json.loads(line, **kwargs)\n for line in data_file\n if len(line) > 0]\n return data", "response": "Read a JSONL file and return a list of dicts."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nwrites a file. Supported formats: * CSV * JSON, JSONL * pickle Parameters ---------- filepath : str Path to the file that should be written. This method's action depends mainly on the file extension. data : dict or list Content that should be written kwargs : dict Any keywords for the specific file format. Returns ------- data : str or bytes", "response": "def write(filepath, data, **kwargs):\n \"\"\"\n Write a file.\n\n Supported formats:\n\n * CSV\n * JSON, JSONL\n * pickle\n\n Parameters\n ----------\n filepath : str\n Path to the file that should be written.
This method's action depends\n    mainly on the file extension.\n    data : dict or list\n        Content that should be written\n    kwargs : dict\n        Any keywords for the specific file format.\n\n    Returns\n    -------\n    data : str or bytes\n    \"\"\"\n    if filepath.lower().endswith('.csv'):\n        return _write_csv(filepath, data, kwargs)\n    elif filepath.lower().endswith('.json'):\n        return _write_json(filepath, data, kwargs)\n    elif filepath.lower().endswith('.jsonl'):\n        return _write_jsonl(filepath, data, kwargs)\n    elif filepath.lower().endswith('.pickle'):\n        return _write_pickle(filepath, data, kwargs)\n    elif (filepath.lower().endswith('.yml') or\n          filepath.lower().endswith('.yaml')):\n        raise NotImplementedError('YAML is not supported, because you need '\n                                  'PyYAML in Python3. '\n                                  'See '\n                                  'https://stackoverflow.com/a/42054860/562769'\n                                  ' as a guide how to use it.')\n    elif (filepath.lower().endswith('.h5') or\n          filepath.lower().endswith('.hdf5')):\n        raise NotImplementedError('HDF5 is not supported. See '\n                                  'https://stackoverflow.com/a/41586571/562769'\n                                  ' as a guide how to use it.')\n    else:\n        raise NotImplementedError('File \\'{}\\' is not known.'.format(filepath))"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nwrites data to a CSV file.", "response": "def _write_csv(filepath, data, kwargs):\n    \"\"\"See documentation of mpu.io.write.\"\"\"\n    if 'delimiter' not in kwargs:\n        kwargs['delimiter'] = ','\n    if 'quotechar' not in kwargs:\n        kwargs['quotechar'] = '\"'\n    kwargs_open = {'newline': ''}\n    mode = 'w'\n    if sys.version_info < (3, 0):\n        kwargs_open.pop('newline', None)\n        mode = 'wb'\n    # open the file once, with the version-appropriate mode and kwargs;\n    # a second plain open(filepath, 'w') here would discard them\n    with open(filepath, mode, **kwargs_open) as fp:\n        writer = csv.writer(fp, **kwargs)\n        writer.writerows(data)\n    return data"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _write_json(filepath, data, kwargs):\n    with io_stl.open(filepath, 'w', encoding='utf8') as outfile:\n
if 'indent' not in kwargs:\n kwargs['indent'] = 4\n if 'sort_keys' not in kwargs:\n kwargs['sort_keys'] = True\n if 'separators' not in kwargs:\n kwargs['separators'] = (',', ': ')\n if 'ensure_ascii' not in kwargs:\n kwargs['ensure_ascii'] = False\n str_ = json.dumps(data, **kwargs)\n outfile.write(to_unicode(str_))\n return data", "response": "Write a JSON object to a file."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _write_jsonl(filepath, data, kwargs):\n with io_stl.open(filepath, 'w', encoding='utf8') as outfile:\n kwargs['indent'] = None # JSON has to be on one line!\n if 'sort_keys' not in kwargs:\n kwargs['sort_keys'] = True\n if 'separators' not in kwargs:\n kwargs['separators'] = (',', ': ')\n if 'ensure_ascii' not in kwargs:\n kwargs['ensure_ascii'] = False\n for line in data:\n str_ = json.dumps(line, **kwargs)\n outfile.write(to_unicode(str_))\n outfile.write(u'\\n')\n return data", "response": "Write a JSON file."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nwrites data to a pickle file.", "response": "def _write_pickle(filepath, data, kwargs):\n \"\"\"See documentation of mpu.io.write.\"\"\"\n if 'protocol' not in kwargs:\n kwargs['protocol'] = pickle.HIGHEST_PROTOCOL\n with open(filepath, 'wb') as handle:\n pickle.dump(data, handle, **kwargs)\n return data"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreading the content of a URL.", "response": "def urlread(url, encoding='utf8'):\n \"\"\"\n Read the content of an URL.\n\n Parameters\n ----------\n url : str\n\n Returns\n -------\n content : str\n \"\"\"\n try:\n from urllib.request import urlopen\n except ImportError:\n from urllib2 import urlopen\n response = urlopen(url)\n content = response.read()\n content = content.decode(encoding)\n return content"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where 
it\ndownloads a file from a URL and store it in a local file system.", "response": "def download(source, sink=None):\n \"\"\"\n Download a file.\n\n Parameters\n ----------\n source : str\n Where the file comes from. Some URL.\n sink : str or None (default: same filename in current directory)\n Where the file gets stored. Some filepath in the local file system.\n \"\"\"\n try:\n from urllib.request import urlretrieve # Python 3\n except ImportError:\n from urllib import urlretrieve # Python 2\n if sink is None:\n sink = os.path.abspath(os.path.split(source)[1])\n urlretrieve(source, sink)\n return sink"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef hash(filepath, method='sha1', buffer_size=65536):\n if method == 'sha1':\n hash_function = hashlib.sha1()\n elif method == 'md5':\n hash_function = hashlib.md5()\n else:\n raise NotImplementedError('Only md5 and sha1 hashes are known, but '\n ' \\'{}\\' was specified.'.format(method))\n\n with open(filepath, 'rb') as fp:\n while True:\n data = fp.read(buffer_size)\n if not data:\n break\n hash_function.update(data)\n return hash_function.hexdigest()", "response": "Calculate a hash of a local file."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get_creation_datetime(filepath):\n if platform.system() == 'Windows':\n return datetime.fromtimestamp(os.path.getctime(filepath))\n else:\n stat = os.stat(filepath)\n try:\n return datetime.fromtimestamp(stat.st_birthtime)\n except AttributeError:\n # We're probably on Linux. 
No easy way to get creation dates here,\n            # so we return None.\n            return None", "response": "Get the creation datetime of a file."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nget the datetime that a file was last modified.", "response": "def get_modification_datetime(filepath):\n    \"\"\"\n    Get the datetime that a file was last modified.\n\n    Parameters\n    ----------\n    filepath : str\n\n    Returns\n    -------\n    modification_datetime : datetime.datetime\n    \"\"\"\n    import tzlocal\n    timezone = tzlocal.get_localzone()\n    mtime = datetime.fromtimestamp(os.path.getmtime(filepath))\n    return mtime.replace(tzinfo=timezone)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nget the last time filepath was accessed.", "response": "def get_access_datetime(filepath):\n    \"\"\"\n    Get the last time filepath was accessed.\n\n    Parameters\n    ----------\n    filepath : str\n\n    Returns\n    -------\n    access_datetime : datetime.datetime\n    \"\"\"\n    import tzlocal\n    tz = tzlocal.get_localzone()\n    mtime = datetime.fromtimestamp(os.path.getatime(filepath))\n    return mtime.replace(tzinfo=tz)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef get_file_meta(filepath):\n    meta = {}\n    meta['filepath'] = os.path.abspath(filepath)\n    meta['creation_datetime'] = get_creation_datetime(filepath)\n    meta['last_access_datetime'] = get_access_datetime(filepath)\n    meta['modification_datetime'] = get_modification_datetime(filepath)\n    try:\n        import magic\n        f_mime = magic.Magic(mime=True, uncompress=True)\n        f_other = magic.Magic(mime=False, uncompress=True)\n        meta['mime'] = f_mime.from_file(meta['filepath'])\n        meta['magic-type'] = f_other.from_file(meta['filepath'])\n    except ImportError:\n        pass\n    return meta", "response": "Get meta-information about a file."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ncreates a GZIP file
from a source file.", "response": "def gzip_file(source, sink):\n \"\"\"\n Create a GZIP file from a source file.\n\n Parameters\n ----------\n source : str\n Filepath\n sink : str\n Filepath\n \"\"\"\n import gzip\n with open(source, 'rb') as f_in, gzip.open(sink, 'wb') as f_out:\n f_out.writelines(f_in)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nstarts the patch for the current object.", "response": "def start(self):\n \"\"\"\n Start the patch\n \"\"\"\n self._patcher = mock.patch(target=self.target)\n MockClient = self._patcher.start()\n instance = MockClient.return_value\n instance.model.side_effect = mock.Mock(\n side_effect=self.model\n )"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nconfiguring client in session", "response": "def setup(cls, client_id, client_secret):\n \"\"\"Configure client in session\n \"\"\"\n cls.client_id = client_id\n cls.client_secret = client_secret"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nparsing a ris file and return a list of entries.", "response": "def read(filelines, mapping=None, wok=False):\n \"\"\"Parse a ris lines and return a list of entries.\n\n Entries are codified as dictionaries whose keys are the\n different tags. For single line and singly occurring tags,\n the content is codified as a string. 
In the case of multiline\n or multiple key occurrences, the content is returned as a list\n of strings.\n\n Keyword arguments:\n bibliography_file -- ris filehandle\n mapping -- custom RIS tags mapping\n wok -- flag, Web of Knowledge format is used if True, otherwise\n Refman's RIS specifications are used.\n\n \"\"\"\n\n if wok:\n if not mapping:\n mapping = WOK_TAG_KEY_MAPPING\n return Wok(filelines, mapping).parse()\n else:\n if not mapping:\n mapping = TAG_KEY_MAPPING\n return Ris(filelines, mapping).parse()"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nrefreshes the context of the user and save it", "response": "def refresh_context(self):\n \"\"\"\n Get the default context of the user and save it\n \"\"\"\n User = self.model('res.user')\n\n self.context = User.get_preferences(True)\n return self.context"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef login(self, login, password, set_auth=False):\n rv = self.session.post(\n self.host,\n dumps({\n \"method\": \"common.db.login\",\n \"params\": [login, password]\n }),\n )\n rv = loads(rv.content)['result']\n if set_auth:\n self.set_auth(\n SessionAuth(login, *rv)\n )\n return rv", "response": "Attempts a login to the remote server and returns the user id and session id."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning true if the auth is not expired else false", "response": "def is_auth_alive(self):\n \"Return true if the auth is not expired, else false\"\n model = self.model('ir.model')\n try:\n model.search([], None, 1, None)\n except ClientError as err:\n if err and err.message['code'] == 403:\n return False\n raise\n except Exception:\n raise\n else:\n return True"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef update(self, data=None, **kwargs):\n if data is None:\n data = {}\n data.update(kwargs)\n return 
self.model.write([self.id], data)", "response": "Update the record right away."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef search_read_all(self, domain, order, fields, batch_size=500,\n context=None, offset=0, limit=None):\n \"\"\"\n An endless iterator that iterates over records.\n\n :param domain: A search domain\n :param order: The order clause for search read\n :param fields: The fields argument for search_read\n :param batch_size: The optimal batch size when sending paginated\n requests\n \"\"\"\n if context is None:\n context = {}\n\n if limit is None:\n # When no limit is specified, all the records\n # should be fetched.\n record_count = self.search_count(domain, context=context)\n end = record_count + offset\n else:\n end = limit + offset\n\n for page_offset in range(offset, end, batch_size):\n if page_offset + batch_size > end:\n batch_size = end - page_offset\n for record in self.search_read(\n domain, page_offset, batch_size,\n order, fields, context=context):\n yield record", "response": "A generator that yields all records in a domain."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nfind records that match the filter.", "response": "def find(self, filter=None, page=1, per_page=10, fields=None, context=None):\n \"\"\"\n Find records that match the filter.\n\n Pro Tip: The fields could have nested fields names if the field is\n a relationship type. For example if you were looking up an order\n and also want to get the shipping address country then fields would be:\n\n `['shipment_address', 'shipment_address.country']`\n\n but country in this case is the ID of the country which is not very\n useful if you don't already have a map. 
You can fetch the country code\n        by adding `'shipment_address.country.code'` to the fields.\n\n        :param filter: A domain expression (Refer docs for domain syntax)\n        :param page: The page to fetch to get paginated results\n        :param per_page: The number of records to fetch per page\n        :param fields: A list of field names to fetch.\n        :param context: Any overrides to the context.\n        \"\"\"\n        if filter is None:\n            filter = []\n        rv = self.client.session.get(\n            self.path,\n            params={\n                'filter': dumps(filter or []),\n                'page': page,\n                'per_page': per_page,\n                'field': fields,\n                'context': dumps(context or self.client.context),\n            }\n        )\n        response_received.send(rv)\n        return rv"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nadding an attachment to a record from a url", "response": "def attach(self, id, filename, url):\n        \"\"\"Add an attachment to a record from a url\n\n        :param id: ID of record\n        :param filename: File name of attachment\n        :param url: Public url to download file from.\n        \"\"\"\n        Attachment = self.client.model('ir.attachment')\n        return Attachment.add_attachment_from_url(\n            filename, url, '%s,%s' % (self.model_name, id)\n        )"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef refresh_if_needed(self):\n        if self.state in (self.PENDING, self.STARTED):\n            try:\n                response, = self._fetch_result()['tasks']\n            except (KeyError, ValueError):\n                raise Exception(\n                    \"Unable to find results for task.\"\n                )\n\n            if 'error' in response:\n                self.state = self.FAILURE\n                raise ServerError(response['error'])\n\n            if 'state' in response:\n                self.state = response['state']\n                self.result = response['result']", "response": "Refresh the status of the task from the server if required."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nget the product inventory in each location.", "response": "def get_product_inventory(product_id, warehouse_ids):\n    \"\"\"\n    Return the product inventory in each location.
The returned response\n will look like::\n\n {\n 12: { // Product ID\n 4: { // Location ID\n 'quantity_on_hand': 12.0,\n 'quantity_available': 8.0\n },\n 5: { // Location ID\n 'quantity_on_hand': 8.0,\n 'quantity_available': 8.0\n },\n },\n 126: { // Product ID\n 4: { // Location ID\n 'quantity_on_hand': 16.0,\n 'quantity_available': 15.0\n },\n 5: { // Location ID\n 'quantity_on_hand': 9.0,\n 'quantity_available': 8.0\n },\n }\n }\n\n Read more:\n http://docs.fulfiliorestapi.apiary.io/#reference/product/product-inventory\n \"\"\"\n Product = client.model('product.product')\n\n return Product.get_product_inventory(\n [product_id], warehouse_ids\n )[product_id]"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nfetches a customer with the code.", "response": "def get_customer(code):\n \"\"\"\n Fetch a customer with the code.\n Returns None if the customer is not found.\n \"\"\"\n Party = client.model('party.party')\n results = Party.find([('code', '=', code)])\n if results:\n return results[0]['id']"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nget the address of a customer by its id.", "response": "def get_address(customer_id, data):\n \"\"\"\n Easier to fetch the addresses of customer and then check one by one.\n\n You can get fancy by using some validation mechanism too\n \"\"\"\n Address = client.model('party.address')\n\n addresses = Address.find(\n [('party', '=', customer_id)],\n fields=[\n 'name', 'street', 'street_bis', 'city', 'zip',\n 'subdivision.code', 'country.code'\n ]\n )\n for address in addresses:\n if (\n address['name'] == data['name'] and\n address['street'] == data['street'] and\n address['street_bis'] == data['street_bis'] and\n address['city'] == data['city'] and\n address['zip'] == data['zip'] and\n address['subdivision.code'].endswith(data['state']) and\n address['country.code'] == data['country']):\n return address['id']"} {"SOURCE": "codesearchnet", "instruction": 
"How would you explain what the following Python 3 function does\ndef create_address(customer_id, data):\n Address = client.model('party.address')\n Country = client.model('country.country')\n Subdivision = client.model('country.subdivision')\n\n country, = Country.find([('code', '=', data['country'])])\n state, = Subdivision.find([\n ('code', 'ilike', '%-' + data['state']), # state codes are US-CA, IN-KL\n ('country', '=', country['id'])\n ])\n\n address, = Address.create([{\n 'party': customer_id,\n 'name': data['name'],\n 'street': data['street'],\n 'street_bis': data['street_bis'],\n 'city': data['city'],\n 'zip': data['zip'],\n 'country': country['id'],\n 'subdivision': state['id'],\n }])\n return address['id']", "response": "Create an address and return the id"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ncreate a customer with the name email and phone.", "response": "def create_customer(name, email, phone):\n \"\"\"\n Create a customer with the name.\n Then attach the email and phone as contact methods\n \"\"\"\n Party = client.model('party.party')\n ContactMechanism = client.model('party.contact_mechanism')\n\n party, = Party.create([{'name': name}])\n\n # Bulk create the email and phone\n ContactMechanism.create([\n {'type': 'email', 'value': email, 'party': party},\n {'type': 'phone', 'value': phone, 'party': party},\n ])\n\n return party"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncreate an order on fulfil from order_details.", "response": "def create_order(order):\n \"\"\"\n Create an order on fulfil from order_details.\n See the calling function below for an example of the order_details\n \"\"\"\n SaleOrder = client.model('sale.sale')\n SaleOrderLine = client.model('sale.line')\n\n # Check if customer exists, if not create one\n customer_id = get_customer(order['customer']['code'])\n if not customer_id:\n customer_id = create_customer(\n order['customer']['name'],\n 
order['customer']['email'],\n            order['customer']['phone'],\n        )\n\n    # Now check if there is a matching address\n    invoice_address = get_address(\n        customer_id,\n        order['invoice_address']\n    )\n    if not invoice_address:\n        invoice_address = create_address(\n            customer_id,\n            order['invoice_address']\n        )\n\n    # See if the shipping address exists, if not create it\n    shipment_address = get_address(\n        customer_id,\n        order['shipment_address']\n    )\n    if not shipment_address:\n        shipment_address = create_address(\n            customer_id,\n            order['shipment_address']\n        )\n\n    sale_order_id, = SaleOrder.create([{\n        'reference': order['number'],\n        'sale_date': order['date'],\n        'party': customer_id,\n        'invoice_address': invoice_address,\n        'shipment_address': shipment_address,\n    }])\n\n    # fetch inventory of all the products before we create lines\n    warehouses = get_warehouses()\n    warehouse_ids = [warehouse['id'] for warehouse in warehouses]\n\n    lines = []\n    for item in order['items']:\n        # get the product. We assume it already exists.\n        product = get_product(item['product'])\n\n        # find the first location that has inventory\n        product_inventory = get_product_inventory(product, warehouse_ids)\n        for location, quantities in product_inventory.items():\n            if quantities['quantity_available'] >= item['quantity']:\n                break\n\n        lines.append({\n            'sale': sale_order_id,\n            'product': product,\n            'quantity': item['quantity'],\n            'unit_price': item['unit_price'],\n            'warehouse': location,\n        })\n\n    SaleOrderLine.create(lines)\n\n    SaleOrder.quote([sale_order_id])\n    SaleOrder.confirm([sale_order_id])"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef model_base(fulfil_client, cache_backend=None, cache_expire=10 * 60):\n    return type(\n        'BaseModel',\n        (Model,),\n        {\n            'fulfil_client': fulfil_client,\n            'cache_backend': cache_backend,\n            'cache_expire': cache_expire,\n            '__abstract__': True,\n            '__modelregistry__': {},\n        },\n    )", "response": "Returns a base model
class that binds to the fulfil client instance and the cache instance."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef all(self):\n return self.rpc_model.search_read_all(\n self.domain,\n self._order_by,\n self.fields,\n context=self.context,\n offset=self._offset or 0,\n limit=self._limit,\n )", "response": "Returns the results represented by this Query as a list."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef count(self):\n \"Return a count of rows this Query would return.\"\n return self.rpc_model.search_count(\n self.domain, context=self.context\n )", "response": "Return a count of rows this Query would return."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef exists(self):\n return self.rpc_model.search_count(\n self.domain, context=self.context\n ) > 0", "response": "Returns True if a record with the given ID exists otherwise False."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef show_active_only(self, state):\n query = self._copy()\n query.active_only = state\n return query", "response": "Returns a new QuerySet with active only = True or false on a copy of this QuerySet"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef filter_by(self, **kwargs):\n query = self._copy()\n for field, value in kwargs.items():\n query.domain.append(\n (field, '=', value)\n )\n return query", "response": "Returns a new Query object with only the specified fields and values applied to the current Query."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef filter_by_domain(self, domain):\n query = self._copy()\n query.domain = domain\n return query", "response": "Returns a new QuerySet with the given domain applied to the current QuerySet."} 
{"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn the first result of this Query or None if the result doesn t contain any row.", "response": "def first(self):\n \"\"\"\n Return the first result of this Query or None if the result\n doesn't contain any row.\n \"\"\"\n results = self.rpc_model.search_read(\n self.domain, None, 1, self._order_by, self.fields,\n context=self.context\n )\n return results and results[0] or None"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nget an instance based on the given primary key identifier.", "response": "def get(self, id):\n \"\"\"\n Return an instance based on the given primary key identifier,\n or None if not found.\n\n This returns a record whether active or not.\n \"\"\"\n ctx = self.context.copy()\n ctx['active_test'] = False\n results = self.rpc_model.search_read(\n [('id', '=', id)],\n None, None, None, self.fields,\n context=ctx\n )\n return results and results[0] or None"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\napplies a LIMIT to the query and return a new Query.", "response": "def limit(self, limit):\n \"\"\"\n Apply a LIMIT to the query and return the newly resulting Query.\n \"\"\"\n query = self._copy()\n query._limit = limit\n return query"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\napplying an OFFSET to the query and return a new Query object.", "response": "def offset(self, offset):\n \"\"\"\n Apply an OFFSET to the query and return the newly resulting Query.\n \"\"\"\n query = self._copy()\n query._offset = offset\n return query"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreturns exactly one result or raise an exception.", "response": "def one(self):\n \"\"\"\n Return exactly one result or raise an exception.\n\n Raises fulfil_client.exc.NoResultFound if the query selects no rows.\n Raises 
fulfil_client.exc.MultipleResultsFound if multiple rows are\n        found.\n        \"\"\"\n        # args follow the (domain, offset, limit, order, fields) convention\n        # used by first() and get(); limit=2 is enough to detect multiples\n        results = self.rpc_model.search_read(\n            self.domain, None, 2, self._order_by, self.fields,\n            context=self.context\n        )\n        if not results:\n            raise fulfil_client.exc.NoResultFound\n        if len(results) > 1:\n            raise fulfil_client.exc.MultipleResultsFound\n        return results[0]"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\napply one or more ORDER BY criteria to the query and return the newly resulting Query.", "response": "def order_by(self, *criterion):\n    \"\"\"\n    apply one or more ORDER BY criterion to the query and\n    return the newly resulting Query\n\n    All existing ORDER BY settings can be suppressed by passing None -\n    this will suppress any ORDER BY configured on mappers as well.\n    \"\"\"\n    query = self._copy()\n    query._order_by = criterion\n    return query"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ndelete all records matching the query.", "response": "def delete(self):\n    \"\"\"\n    Delete all records matching the query.\n\n    Warning: This is a destructive operation.\n\n    Not every model allows deletion of records and several models\n    even restrict based on status. For example, deleting products\n    that have been transacted is restricted.
Another example is sales\n orders which can be deleted only when they are draft.\n\n If deletion fails, a server error is thrown.\n \"\"\"\n ids = self.rpc_model.search(self.domain, context=self.context)\n if ids:\n self.rpc_model.delete(ids)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\narchives all the records matching the query.", "response": "def archive(self):\n \"\"\"\n Archives (soft delete) all the records matching the query.\n\n This assumes that the model allows archiving (not many do - especially\n transactional documents).\n\n Internal implementation sets the active field to False.\n \"\"\"\n ids = self.rpc_model.search(self.domain, context=self.context)\n if ids:\n self.rpc_model.write(ids, {'active': False})"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _logged_in_successful(data):\n if re.match(r'^:(testserver\\.local|tmi\\.twitch\\.tv)'\n r' NOTICE \\* :'\n r'(Login unsuccessful|Error logging in)*$',\n data.strip()):\n return False\n else:\n return True", "response": "Test the login status from the returned communication of the the\n server."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef connect(self):\n\n # Do not use non-blocking stream, they are not reliably\n # non-blocking\n # s.setblocking(False)\n # s.settimeout(1.0)\n\n s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)\n connect_host = \"irc.twitch.tv\"\n connect_port = 6667\n try:\n s.connect((connect_host, connect_port))\n except (Exception, IOError):\n print(\"Unable to create a socket to %s:%s\" % (\n connect_host,\n connect_port))\n raise # unexpected, because it is a blocking socket\n\n # Connected to twitch\n # Sending our details to twitch...\n s.send(('PASS %s\\r\\n' % self.oauth).encode('utf-8'))\n s.send(('NICK %s\\r\\n' % self.username).encode('utf-8'))\n if self.verbose:\n print('PASS %s\\r\\n' 
% self.oauth)\n print('NICK %s\\r\\n' % self.username)\n\n received = s.recv(1024).decode()\n if self.verbose:\n print(received)\n if not TwitchChatStream._logged_in_successful(received):\n # ... and they didn't accept our details\n raise IOError(\"Twitch did not accept the username-oauth \"\n \"combination\")\n else:\n # ... and they accepted our details\n # Connected to twitch.tv!\n # now make this socket non-blocking on the OS-level\n fcntl.fcntl(s, fcntl.F_SETFL, os.O_NONBLOCK)\n if self.s is not None:\n self.s.close() # close the previous socket\n self.s = s # store the new socket\n self.join_channel(self.username)\n\n # Wait until we have switched channels\n while self.current_channel != self.username:\n self.twitch_receive_messages()", "response": "Connect to Twitch and return a Twitch - level object."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _push_from_buffer(self):\n if len(self.buffer) > 0:\n if time.time() - self.last_sent_time > 5:\n try:\n message = self.buffer.pop(0)\n self.s.send(message.encode('utf-8'))\n if self.verbose:\n print(message)\n finally:\n self.last_sent_time = time.time()", "response": "Push a message on the stack to the IRC stream."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\njoins a different chat channel on Twitch.", "response": "def join_channel(self, channel):\n \"\"\"\n Join a different chat channel on Twitch.\n Note, this function returns immediately, but the switch might\n take a moment\n\n :param channel: name of the channel (without #)\n \"\"\"\n self.s.send(('JOIN #%s\\r\\n' % channel).encode('utf-8'))\n if self.verbose:\n print('JOIN #%s\\r\\n' % channel)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _parse_message(self, data):\n if TwitchChatStream._check_has_ping(data):\n self._send_pong()\n if TwitchChatStream._check_has_channel(data):\n 
self.current_channel = \\\n                TwitchChatStream._check_has_channel(data)[0]\n\n        if TwitchChatStream._check_has_message(data):\n            return {\n                'channel': re.findall(r'^:.+![a-zA-Z0-9_]+'\n                                      r'@[a-zA-Z0-9_]+'\n                                      r'.+ '\n                                      r'PRIVMSG (.*?) :',\n                                      data)[0],\n                'username': re.findall(r'^:([a-zA-Z0-9_]+)!', data)[0],\n                # data is already a decoded str (see twitch_receive_messages),\n                # so calling .decode('utf8') on the match would fail on Python 3\n                'message': re.findall(r'PRIVMSG #[a-zA-Z0-9_]+ :(.+)',\n                                      data)[0]\n            }\n        else:\n            return None", "response": "Parse a line received from the socket and return a dict of its fields."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef twitch_receive_messages(self):\n    self._push_from_buffer()\n    result = []\n    while True:\n        # process the complete buffer, until no data is left\n        try:\n            msg = self.s.recv(4096).decode()  # NON-BLOCKING RECEIVE!\n        except socket.error as e:\n            err = e.args[0]\n            if err == errno.EAGAIN or err == errno.EWOULDBLOCK:\n                # There is no more data available to read\n                return result\n            else:\n                # a \"real\" error occurred\n                # import traceback\n                # import sys\n                # print(traceback.format_exc())\n                # print(\"Trying to recover...\")\n                self.connect()\n                return result\n        else:\n            if self.verbose:\n                print(msg)\n            rec = [self._parse_message(line)\n                   for line in filter(None, msg.split('\\r\\n'))]\n            rec = [r for r in rec if r]  # remove Nones\n            result.extend(rec)", "response": "This function will process all messages received by the socket and return a list of dicts.
Each dict is a dict with the keys channel username message and the values are the message fields."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nresetting the videostream by restarting ffmpeg.", "response": "def reset(self):\n \"\"\"\n Reset the videostream by restarting ffmpeg\n \"\"\"\n\n if self.ffmpeg_process is not None:\n # Close the previous stream\n try:\n self.ffmpeg_process.send_signal(signal.SIGINT)\n except OSError:\n pass\n\n command = []\n command.extend([\n self.ffmpeg_binary,\n '-loglevel', 'verbose',\n '-y', # overwrite previous file/stream\n # '-re', # native frame-rate\n '-analyzeduration', '1',\n '-f', 'rawvideo',\n '-r', '%d' % self.fps, # set a fixed frame rate\n '-vcodec', 'rawvideo',\n # size of one frame\n '-s', '%dx%d' % (self.width, self.height),\n '-pix_fmt', 'rgb24', # The input are raw bytes\n '-thread_queue_size', '1024',\n '-i', '/tmp/videopipe', # The input comes from a pipe\n\n # Twitch needs to receive sound in their streams!\n # '-an', # Tells FFMPEG not to expect any audio\n ])\n if self.audio_enabled:\n command.extend([\n '-ar', '%d' % AUDIORATE,\n '-ac', '2',\n '-f', 's16le',\n '-thread_queue_size', '1024',\n '-i', '/tmp/audiopipe'\n ])\n else:\n command.extend([\n '-ar', '8000',\n '-ac', '1',\n '-f', 's16le',\n '-i', '/dev/zero', # silence alternative, works forever\n # '-i','http://stream1.radiostyle.ru:8001/tunguska',\n # '-filter_complex',\n # '[0:1][1:0]amix=inputs=2:duration=first[all_audio]'\n ])\n command.extend([\n # VIDEO CODEC PARAMETERS\n '-vcodec', 'libx264',\n '-r', '%d' % self.fps,\n '-b:v', '3000k',\n '-s', '%dx%d' % (self.width, self.height),\n '-preset', 'faster', '-tune', 'zerolatency',\n '-crf', '23',\n '-pix_fmt', 'yuv420p',\n # '-force_key_frames', r'expr:gte(t,n_forced*2)',\n '-minrate', '3000k', '-maxrate', '3000k',\n '-bufsize', '12000k',\n '-g', '60', # key frame distance\n '-keyint_min', '1',\n # '-filter:v \"setpts=0.25*PTS\"'\n # '-vsync','passthrough',\n\n # AUDIO 
CODEC PARAMETERS\n '-acodec', 'libmp3lame', '-ar', '44100', '-b:a', '160k',\n # '-bufsize', '8192k',\n '-ac', '1',\n # '-acodec', 'aac', '-strict', 'experimental',\n # '-ab', '128k', '-ar', '44100', '-ac', '1',\n # '-async','44100',\n # '-filter_complex', 'asplit', #for audio sync?\n\n # STORE THE VIDEO PARAMETERS\n # '-vcodec', 'libx264', '-s', '%dx%d'%(width, height),\n # '-preset', 'libx264-fast',\n # 'my_output_videofile2.avi'\n\n # MAP THE STREAMS\n # use only video from first input and only audio from second\n '-map', '0:v', '-map', '1:a',\n\n # NUMBER OF THREADS\n '-threads', '2',\n\n # STREAM TO TWITCH\n '-f', 'flv', 'rtmp://live-ams.twitch.tv/app/%s' %\n self.twitch_stream_key\n ])\n\n devnullpipe = open(\"/dev/null\", \"w\") # Throw away stream\n if self.verbose:\n devnullpipe = None\n self.ffmpeg_process = subprocess.Popen(\n command,\n stdin=subprocess.PIPE,\n stderr=devnullpipe,\n stdout=devnullpipe)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef send_video_frame(self, frame):\n if self.video_pipe is None:\n if not os.path.exists('/tmp/videopipe'):\n os.mkfifo('/tmp/videopipe')\n self.video_pipe = os.open('/tmp/videopipe', os.O_WRONLY)\n\n assert frame.shape == (self.height, self.width, 3)\n\n frame = np.clip(255*frame, 0, 255).astype('uint8')\n try:\n os.write(self.video_pipe, frame.tostring())\n except OSError:\n # The pipe has been closed. Reraise and handle it further\n # downstream\n raise", "response": "Send a frame to the video pipe."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nsend the audio samples to the audio pipe.", "response": "def send_audio(self, left_channel, right_channel):\n \"\"\"Add the audio samples to the stream. 
The left and the right\n channel should have the same shape.\n Raises an OSError when the stream is closed.\n\n :param left_channel: array containing the audio signal.\n :type left_channel: numpy array with shape (k, )\n containing values between -1.0 and 1.0. k can be any integer\n :param right_channel: array containing the audio signal.\n :type right_channel: numpy array with shape (k, )\n containing values between -1.0 and 1.0. k can be any integer\n \"\"\"\n if self.audio_pipe is None:\n if not os.path.exists('/tmp/audiopipe'):\n os.mkfifo('/tmp/audiopipe')\n self.audio_pipe = os.open('/tmp/audiopipe', os.O_WRONLY)\n\n assert len(left_channel.shape) == 1\n assert left_channel.shape == right_channel.shape\n\n frame = np.column_stack((left_channel, right_channel)).flatten()\n\n frame = np.clip(32767*frame, -32767, 32767).astype('int16')\n try:\n os.write(self.audio_pipe, frame.tostring())\n except OSError:\n # The pipe has been closed. Reraise and handle it further\n # downstream\n raise"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nadds the audio samples to the stream. The left and right channels should have the same shape.", "response": "def send_audio(self, left_channel, right_channel):\n \"\"\"Add the audio samples to the stream. The left and the right\n channel should have the same shape.\n\n :param left_channel: array containing the audio signal.\n :type left_channel: numpy array with shape (k, )\n containing values between -1.0 and 1.0. k can be any integer\n :param right_channel: array containing the audio signal.\n :type right_channel: numpy array with shape (k, )\n containing values between -1.0 and 1.0. 
k can be any integer\n \"\"\"\n self.lastaudioframe_left = left_channel\n self.lastaudioframe_right = right_channel"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nsends a frame to the video queue", "response": "def send_video_frame(self, frame, frame_counter=None):\n \"\"\"send frame of shape (height, width, 3)\n with values between 0 and 1\n\n :param frame: array containing the frame.\n :type frame: numpy array with shape (height, width, 3)\n containing values between 0.0 and 1.0\n :param frame_counter: frame position number within stream.\n Provide this when multi-threading to make sure frames don't\n switch position\n :type frame_counter: int\n \"\"\"\n if frame_counter is None:\n frame_counter = self.frame_counter\n self.frame_counter += 1\n\n self.q_video.put((frame_counter, frame))"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef send_audio(self,\n left_channel,\n right_channel,\n frame_counter=None):\n \"\"\"Add the audio samples to the stream. The left and the right\n channel should have the same shape.\n\n :param left_channel: array containing the audio signal.\n :type left_channel: numpy array with shape (k, )\n containing values between -1.0 and 1.0. k can be any integer\n :param right_channel: array containing the audio signal.\n :type right_channel: numpy array with shape (k, )\n containing values between -1.0 and 1.0. 
k can be any integer\n :param frame_counter: frame position number within stream.\n Provide this when multi-threading to make sure frames don't\n switch position\n :type frame_counter: int\n \"\"\"\n if frame_counter is None:\n frame_counter = self.audio_frame_counter\n self.audio_frame_counter += 1\n\n self.q_audio.put((frame_counter, left_channel, right_channel))", "response": "Add the audio samples to the queue."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nimport an attribute from a module.", "response": "def import_attribute(self, path):\n \"\"\"\n Import an attribute from a module.\n \"\"\"\n module = '.'.join(path.split('.')[:-1])\n function = path.split('.')[-1]\n\n module = importlib.import_module(module)\n return getattr(module, function)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning a url to redirect to after the login or signup.", "response": "def get_next(self, request):\n \"\"\"\n Returns a url to redirect to after the login / signup.\n \"\"\"\n if 'next' in request.session: \n next = request.session['next']\n del request.session['next']\n elif 'next' in request.GET:\n next = request.GET.get('next')\n elif 'next' in request.POST:\n next = request.POST.get('next')\n else:\n next = getattr(settings, 'LOGIN_REDIRECT_URL', '/')\n \n netloc = urlparse.urlparse(next)[1]\n \n if netloc and netloc != request.get_host():\n next = getattr(settings, 'LOGIN_REDIRECT_URL', '/')\n\n return next"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreturn an inactive message.", "response": "def inactive_response(self, request):\n \"\"\"\n Return an inactive message.\n \"\"\"\n inactive_url = getattr(settings, 'LOGIN_INACTIVE_REDIRECT_URL', '')\n if inactive_url:\n return HttpResponseRedirect(inactive_url)\n else:\n return self.error_to_response(request, {'error': _(\"This user account is marked as inactive.\")})"} {"SOURCE": "codesearchnet", "instruction": "Can you write 
a function in Python 3 where it\ncreates a profile model.", "response": "def create_profile(self, user, save=False, **kwargs):\n \"\"\"\n Create a profile model.\n\n :param user: A user object\n :param save: If this is set, the profile will\n be saved to DB straight away\n :type save: bool\n \"\"\"\n profile = self.get_model()(user=user, **kwargs)\n\n if save:\n profile.save()\n\n return profile"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_or_create_profile(self, user, save=False, **kwargs):\n try:\n profile = self.get_model().objects.get(user=user, **kwargs)\n return profile, False\n except self.get_model().DoesNotExist:\n profile = self.create_profile(user, save=save, **kwargs)\n return profile, True", "response": "Get a profile from DB or create one if it doesn't exist."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get_session_data(self, request):\n user = request.session['%suser' % SESSION_KEY]\n profile = request.session['%sprofile' % SESSION_KEY]\n client = request.session['%sclient' % SESSION_KEY]\n return user, profile, client", "response": "Return a tuple of (user, profile, client) from the session."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef send_login_signal(self, request, user, profile, client):\n signals.login.send(sender=profile.__class__, user=user,\n profile=profile, client=client, request=request)", "response": "Send a signal that a user logged in."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nsends a signal that a user connected a social profile to his Django account.", "response": "def send_connect_signal(self, request, user, profile, client):\n \"\"\"\n Send a signal that a user connected a social profile to his Django\n account. 
This signal should be sent *only* when a new social\n connection was created.\n \"\"\"\n signals.connect.send(sender=profile.__class__, user=user, profile=profile,\n client=client, request=request)"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nrequests an access token for the specified set of parameters.", "response": "def request_access_token(self, params):\n \"\"\" \n Foursquare does not accept POST requests to retrieve an access token,\n so we'll be doing a GET request instead.\n \"\"\"\n return self.request(self.access_token_url, method=\"GET\", params=params)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef openid_form(parser, token):\n\n bits = get_bits(token)\n\n if len(bits) > 1:\n return FormNode(bits[0], bits[1:])\n if len(bits) == 1:\n return FormNode(bits[0])\n return FormNode(None)", "response": "Render OpenID form. Allows to pre set the provider::\n\n \t{% openid_form \"https://www.google.com/accounts/o8/id\" %}\n\n Also creates custom button URLs by concatenating all arguments\n after the provider's URL\n\n \t{% openid_form \"https://www.google.com/accounts/o8/id\" STATIC_URL \"image/for/google.jpg\" %}"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_initial_data(self, request, user, profile, client):\n if INITAL_DATA_FUNCTION:\n func = self.import_attribute(INITAL_DATA_FUNCTION)\n return func(request, user, profile, client)\n return {}", "response": "Returns the initial data for the setup form."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreturn the additional context for the setup view.", "response": "def get_context(self, request, user, profile, client):\n \"\"\"\n Return additional context for the setup view. 
The function can\n be controlled with ``SOCIALREGISTRATION_SETUP_CONTEXT_FUNCTION``.\n\n :param request: The current request object\n :param user: The unsaved user object\n :param profile: The unsaved profile object\n :param client: The API client\n \"\"\"\n if CONTEXT_FUNCTION:\n func = self.import_attribute(CONTEXT_FUNCTION)\n return func(request, user, profile, client)\n return {}"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef generate_username_and_redirect(self, request, user, profile, client):\n func = self.get_username_function()\n \n user.username = func(user, profile, client)\n user.set_unusable_password()\n user.save()\n \n profile.user = user\n profile.save()\n \n user = profile.authenticate()\n \n self.send_connect_signal(request, user, profile, client)\n \n self.login(request, user)\n \n self.send_login_signal(request, user, profile, client)\n \n self.delete_session_data(request)\n \n return HttpResponseRedirect(self.get_next(request))", "response": "Generate a username and then redirect the user to the correct place."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nget the social profile.", "response": "def get(self, request):\n \"\"\"\n When signing a new user up - either display a setup form, or\n generate the username automatically.\n \"\"\"\n\n if request.user.is_authenticated():\n return HttpResponseRedirect(self.get_next(request))\n\n try:\n user, profile, client = self.get_session_data(request)\n except KeyError:\n return self.error_to_response(request, dict(\n error=_(\"Social profile is missing from your session.\")))\n \n if GENERATE_USERNAME:\n return self.generate_username_and_redirect(request, user, profile, client)\n \n form = self.get_form()(initial=self.get_initial_data(request, user, profile, client))\n \n additional_context = self.get_context(request, user, profile, client)\n return self.render_to_response(dict({'form': form}, 
**additional_context))"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef post(self, request):\n\n if request.user.is_authenticated():\n return self.error_to_response(request, dict(\n error=_(\"You are already logged in.\")))\n\n try:\n user, profile, client = self.get_session_data(request)\n except KeyError:\n return self.error_to_response(request, dict(\n error=_(\"A social profile is missing from your session.\")))\n \n form = self.get_form()(request.POST, request.FILES,\n initial=self.get_initial_data(request, user, profile, client))\n \n if not form.is_valid():\n additional_context = self.get_context(request, user, profile, client)\n return self.render_to_response(dict({'form': form}, **additional_context))\n \n user, profile = form.save(request, user, profile, client)\n \n user = profile.authenticate()\n \n self.send_connect_signal(request, user, profile, client)\n \n self.login(request, user)\n \n self.send_login_signal(request, user, profile, client)\n \n self.delete_session_data(request)\n \n return HttpResponseRedirect(self.get_next(request))", "response": "Save the user and profile login and send the right signals."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef post(self, request):\n request.session['next'] = self.get_next(request)\n client = self.get_client()()\n request.session[self.get_client().get_session_key()] = client\n url = client.get_redirect_url(request=request)\n logger.debug(\"Redirecting to %s\", url)\n try:\n return HttpResponseRedirect(url)\n except OAuthError, error:\n return self.error_to_response(request, {'error': error})\n except socket.timeout:\n return self.error_to_response(request, {'error': \n _('Could not connect to service (timed out)')})", "response": "Create a client and redirect to the API provider to authorize our app and permissions."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 
3, explain what it does\ndef get(self, request):\n try:\n client = request.session[self.get_client().get_session_key()]\n logger.debug(\"API returned: %s\", request.GET) \n client.complete(dict(request.GET.items()))\n request.session[self.get_client().get_session_key()] = client\n return HttpResponseRedirect(self.get_redirect())\n except KeyError:\n return self.error_to_response(request, {'error': \"Session expired.\"})\n except OAuthError, error:\n return self.error_to_response(request, {'error': error})\n except socket.timeout:\n return self.error_to_response(request, {'error':\n _('Could not connect to service (timed out)')})", "response": "Handles the GET request and returns a response object with the user s information."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nattempting to get a user s profile and create a profile object if it doesn t already exist.", "response": "def get(self, request):\n \"\"\"\n Called after authorization was granted and the OAuth flow \n successfully completed. \n \n Tries to:\n\n - Connect the remote account if the user is logged in already\n - Log the user in if a local profile of the remote account \n exists already\n - Create a user and profile object if none of the above succeed\n and redirect the user further to either capture some data via\n form or generate a username automatically\n \"\"\"\n \n try:\n client = request.session[self.get_client().get_session_key()]\n except KeyError:\n return self.error_to_response(request, {'error': \"Session expired.\"})\n \n # Get the lookup dictionary to find the user's profile\n lookup_kwargs = self.get_lookup_kwargs(request, client)\n\n # Logged in user (re-)connecting an account\n if request.user.is_authenticated():\n try:\n profile = self.get_profile(**lookup_kwargs)\n \n # Make sure that there is only *one* account per profile. 
\n if not profile.user == request.user:\n self.delete_session_data(request)\n return self.error_to_response(request, {\n 'error': _('This profile is already connected to another user account.')\n })\n \n except self.get_model().DoesNotExist: \n profile, created = self.get_or_create_profile(request.user,\n save=True, **lookup_kwargs) \n\n self.send_connect_signal(request, request.user, profile, client)\n\n return self.redirect(request)\n\n # Logged out user - let's see if we've got the identity saved already.\n # If so - just log the user in. If not, create profile and redirect\n # to the setup view \n \n user = self.authenticate(**lookup_kwargs)\n \n # No user existing - create a new one and redirect to the final setup view\n if user is None:\n if not ALLOW_OPENID_SIGNUPS and self.client is OpenIDClient:\n return self.error_to_response(request, {\n 'error': _('We are not currently accepting new OpenID signups.')\n })\n user = self.create_user()\n profile = self.create_profile(user, **lookup_kwargs)\n \n self.store_user(request, user)\n self.store_profile(request, profile)\n self.store_client(request, client)\n \n return HttpResponseRedirect(reverse('socialregistration:setup'))\n\n # Inactive user - displaying / redirect to the appropriate place.\n if not user.is_active:\n return self.inactive_response(request)\n \n # Active user with existing profile: login, send signal and redirect\n self.login(request, user)\n \n profile = self.get_profile(user=user, **lookup_kwargs)\n \n self.send_login_signal(request, user, profile, client)\n \n return self.redirect(request)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef client(self, verifier=None):\n # We're just starting out and don't have neither request nor access\n # token. 
Return the standard client\n if not self._request_token and not self._access_token:\n client = oauth.Client(self.consumer, timeout=TIMEOUT)\n \n # We're one step in, we've got the request token and can add that to \n # the client.\n if self._request_token and not self._access_token:\n if verifier is not None:\n self._request_token.set_verifier(verifier)\n client = oauth.Client(self.consumer, self._request_token, timeout=TIMEOUT)\n \n # Two steps in, we've got an access token and can now properly sign \n # our client requests with it.\n if self._access_token:\n client = oauth.Client(self.consumer, self._access_token, timeout=TIMEOUT)\n \n return client", "response": "Return the correct client depending on which stage of the OAuth process is in."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nfetch a request token from self. request_token_url.", "response": "def _get_request_token(self):\n \"\"\"\n Fetch a request token from `self.request_token_url`.\n \"\"\"\n \n params = {\n 'oauth_callback': self.get_callback_url()\n }\n \n \n response, content = self.client().request(self.request_token_url,\n \"POST\", body=urllib.urlencode(params))\n \n content = smart_unicode(content)\n\n if not response['status'] == '200':\n raise OAuthError(_(\n u\"Invalid status code %s while obtaining request token from %s: %s\") % (\n response['status'], self.request_token_url, content))\n \n token = dict(urlparse.parse_qsl(content))\n \n return oauth.Token(token['oauth_token'], token['oauth_token_secret'])"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nfetching an access token from self. 
access_token_url.", "response": "def _get_access_token(self, verifier=None):\n \"\"\"\n Fetch an access token from `self.access_token_url`.\n \"\"\"\n \n response, content = self.client(verifier).request(\n self.access_token_url, \"POST\")\n \n content = smart_unicode(content)\n\n if not response['status'] == '200':\n raise OAuthError(_(\n u\"Invalid status code %s while obtaining access token from %s: %s\") % \n (response['status'], self.access_token_url, content))\n \n token = dict(urlparse.parse_qsl(content))\n \n \n return (oauth.Token(token['oauth_token'], token['oauth_token_secret']),\n token)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nget the request token for this API.", "response": "def get_request_token(self):\n \"\"\"\n Return the request token for this API. If we've not fetched it yet,\n go out, request and memoize it.\n \"\"\"\n \n if self._request_token is None:\n self._request_token = self._get_request_token()\n return self._request_token"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef get_access_token(self, verifier=None):\n \n if self._access_token is None:\n self._access_token, self._access_token_dict = self._get_access_token(verifier)\n return self._access_token", "response": "Get the access token for this API."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_redirect_url(self, **kwargs):\n params = {\n 'oauth_token': self.get_request_token().key,\n }\n return '%s?%s' % (self.auth_url, urllib.urlencode(params))", "response": "Returns the authorization URL signed with the request \n token."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nattempting to complete the flow by requesting an access token.", "response": "def complete(self, GET):\n \"\"\"\n When redirect back to our application, try to complete the flow by\n requesting an access token. 
If the access token request fails, it'll\n throw an `OAuthError`.\n \n Tries to complete the flow by validating against the `GET` paramters \n received.\n \"\"\"\n token = self.get_access_token(verifier=GET.get('oauth_verifier', None))\n return token"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef request(self, url, method=\"GET\", params=None, headers=None):\n params = params or {}\n headers = headers or {}\n\n logger.debug(\"URL: %s\", url)\n logger.debug(\"Method: %s\", method)\n logger.debug(\"Headers: %s\", headers)\n logger.debug(\"Params: %s\", params)\n \n response, content = self.client().request(url, method, headers=headers,\n body=urllib.urlencode(params))\n \n content = smart_unicode(content)\n\n logger.debug(\"Status: %s\", response['status'])\n logger.debug(\"Content: %s\", content)\n \n if response['status'] != '200':\n raise OAuthError(_(\n u\"Invalid status code %s while requesting %s: %s\") % (\n response['status'], url, content))\n \n return content", "response": "Make signed requests against url."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef get_redirect_url(self, state='', **kwargs):\n params = {\n 'response_type': 'code',\n 'client_id': self.client_id,\n 'redirect_uri': self.get_callback_url(**kwargs),\n 'scope': self.scope or '',\n 'state': state,\n }\n \n return '%s?%s' % (self.auth_url, urllib.urlencode(params))", "response": "Assemble the URL to redirect the user to request the permissions."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nrequest the access token from the API.", "response": "def request_access_token(self, params):\n \"\"\"\n Request the access token from `self.access_token_url`. The default \n behaviour is to use a `POST` request, but some services use `GET` \n requests. 
Individual clients can override this method to use the \n correct HTTP method.\n \"\"\"\n return self.request(self.access_token_url, method=\"POST\", params=params,\n is_signed=False)"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nfetches an access token with the provided code.", "response": "def _get_access_token(self, code, **params):\n \"\"\"\n Fetch an access token with the provided `code`.\n \"\"\"\n params.update({\n 'code': code,\n 'client_id': self.client_id,\n 'client_secret': self.secret,\n 'redirect_uri': self.get_callback_url(),\n })\n\n logger.debug(\"Params: %s\", params)\n\n resp, content = self.request_access_token(params=params)\n\n content = smart_unicode(content)\n\n logger.debug(\"Status: %s\", resp['status'])\n logger.debug(\"Content: %s\", content)\n \n content = self.parse_access_token(content)\n \n if 'error' in content:\n raise OAuthError(_(\n u\"Received error while obtaining access token from %s: %s\") % (\n self.access_token_url, content['error']))\n\n return content"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ngets the memoized access token or go out and fetch one.", "response": "def get_access_token(self, code=None, **params):\n \"\"\"\n Return the memoized access token or go out and fetch one.\n \"\"\"\n if self._access_token is None:\n if code is None:\n raise ValueError(_('Invalid code.'))\n \n self.access_token_dict = self._get_access_token(code, **params)\n try:\n self._access_token = self.access_token_dict['access_token']\n except KeyError, e:\n raise OAuthError(\"Credentials could not be validated, the provider returned no access token.\")\n \n return self._access_token"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef complete(self, GET):\n if 'error' in GET:\n raise OAuthError(\n _(\"Received error while obtaining access token from %s: %s\") % (\n self.access_token_url, 
GET['error']))\n return self.get_access_token(code=GET.get('code'))", "response": "Complete the OAuth2 flow by fetching an access token with the provided code in the GET parameters."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef request(self, url, method=\"GET\", params=None, headers=None, is_signed=True):\n params = params or {}\n headers = headers or {}\n \n if is_signed:\n params.update(self.get_signing_params())\n \n if method.upper() == \"GET\":\n url = '%s?%s' % (url, urllib.urlencode(params))\n return self.client().request(url, method=method, headers=headers)\n return self.client().request(url, method, body=urllib.urlencode(params), headers=headers)", "response": "Make a request against url."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef request_access_token(self, params):\n return self.client().request(self.access_token_url, method=\"POST\", body=urllib.urlencode(params), headers={'Content-Type':'application/x-www-form-urlencoded'})", "response": "Request an access token for the specified set of parameters."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreturn internal temperature value in degrees celsius.", "response": "def readInternalC(self):\n \"\"\"Return internal temperature value in degrees celsius.\"\"\"\n v = self._read32()\n # Ignore bottom 4 bits of thermocouple data.\n v >>= 4\n # Grab bottom 11 bits as internal temperature data.\n internal = v & 0x7FF\n if v & 0x800:\n # Negative value, take 2's compliment. 
Compute this with subtraction\n # because python is a little odd about handling signed/unsigned.\n internal -= 4096\n # Scale by 0.0625 degrees C per bit and return value.\n return internal * 0.0625"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreturn the thermocouple temperature value in degrees celsius.", "response": "def readTempC(self):\n \"\"\"Return the thermocouple temperature value in degrees celsius.\"\"\"\n v = self._read32()\n # Check for error reading value.\n if v & 0x7:\n return float('NaN')\n # Check if signed bit is set.\n if v & 0x80000000:\n # Negative value, take 2's compliment. Compute this with subtraction\n # because python is a little odd about handling signed/unsigned.\n v >>= 18\n v -= 16384\n else:\n # Positive value, just shift the bits to get the value.\n v >>= 18\n # Scale by 0.25 degrees C per bit and return value.\n return v * 0.25"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef readLinearizedTempC(self):\n # MAX31855 thermocouple voltage reading in mV\n thermocoupleVoltage = (self.readTempC() - self.readInternalC()) * 0.041276\n # MAX31855 cold junction voltage reading in mV\n coldJunctionTemperature = self.readInternalC()\n coldJunctionVoltage = (-0.176004136860E-01 +\n 0.389212049750E-01 * coldJunctionTemperature +\n 0.185587700320E-04 * math.pow(coldJunctionTemperature, 2.0) +\n -0.994575928740E-07 * math.pow(coldJunctionTemperature, 3.0) +\n 0.318409457190E-09 * math.pow(coldJunctionTemperature, 4.0) +\n -0.560728448890E-12 * math.pow(coldJunctionTemperature, 5.0) +\n 0.560750590590E-15 * math.pow(coldJunctionTemperature, 6.0) +\n -0.320207200030E-18 * math.pow(coldJunctionTemperature, 7.0) +\n 0.971511471520E-22 * math.pow(coldJunctionTemperature, 8.0) +\n -0.121047212750E-25 * math.pow(coldJunctionTemperature, 9.0) +\n 0.118597600000E+00 * math.exp(-0.118343200000E-03 * 
math.pow((coldJunctionTemperature-0.126968600000E+03), 2.0)))\n # cold junction voltage + thermocouple voltage\n voltageSum = thermocoupleVoltage + coldJunctionVoltage\n # calculate corrected temperature reading based on coefficients for 3 different ranges\n # float b0, b1, b2, b3, b4, b5, b6, b7, b8, b9, b10;\n if thermocoupleVoltage < 0:\n b0 = 0.0000000E+00\n b1 = 2.5173462E+01\n b2 = -1.1662878E+00\n b3 = -1.0833638E+00\n b4 = -8.9773540E-01\n b5 = -3.7342377E-01\n b6 = -8.6632643E-02\n b7 = -1.0450598E-02\n b8 = -5.1920577E-04\n b9 = 0.0000000E+00\n elif thermocoupleVoltage < 20.644:\n b0 = 0.000000E+00\n b1 = 2.508355E+01\n b2 = 7.860106E-02\n b3 = -2.503131E-01\n b4 = 8.315270E-02\n b5 = -1.228034E-02\n b6 = 9.804036E-04\n b7 = -4.413030E-05\n b8 = 1.057734E-06\n b9 = -1.052755E-08\n elif thermocoupleVoltage < 54.886:\n b0 = -1.318058E+02\n b1 = 4.830222E+01\n b2 = -1.646031E+00\n b3 = 5.464731E-02\n b4 = -9.650715E-04\n b5 = 8.802193E-06\n b6 = -3.110810E-08\n b7 = 0.000000E+00\n b8 = 0.000000E+00\n b9 = 0.000000E+00\n else:\n # TODO: handle error - out of range\n return 0\n return (b0 +\n b1 * voltageSum +\n b2 * pow(voltageSum, 2.0) +\n b3 * pow(voltageSum, 3.0) +\n b4 * pow(voltageSum, 4.0) +\n b5 * pow(voltageSum, 5.0) +\n b6 * pow(voltageSum, 6.0) +\n b7 * pow(voltageSum, 7.0) +\n b8 * pow(voltageSum, 8.0) +\n b9 * pow(voltageSum, 9.0))", "response": "Return the NIST - linearized thermocouple temperature value in degrees celsius."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef escape_quotes(self, val):\n if self.is_string(val) and self._in_quotes(val, self.quote):\n # make sure any previously escaped quotes are not re-escaped\n middle = self.remove_quotes(val).replace(\"\\\\\" + self.quote, self.quote)\n middle = middle.replace(self.quote, \"\\\\\" + self.quote)\n val = self.add_quotes(middle)\n\n return val", "response": "Escape any quotes in a value."} {"SOURCE": "codesearchnet", "instruction": "Write 
a Python 3 script to\nstandardise the quotes used to wrap a value to the pprint default.", "response": "def standardise_quotes(self, val):\n \"\"\"\n Change the quotes used to wrap a value to the pprint default\n E.g. \"val\" to 'val' or 'val' to \"val\"\n \"\"\"\n if self._in_quotes(val, self.altquote):\n middle = self.remove_quotes(val)\n val = self.add_quotes(middle)\n\n return self.escape_quotes(val)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef process_key_dict(self, key, d, level):\n\n # add any composite level comments\n comments = d.get(\"__comments__\", {})\n lines = []\n self._add_type_comment(level, comments, lines)\n\n lines += [self.add_start_line(key, level)]\n lines += self.process_dict(d, level, comments)\n lines.append(self.add_end_line(level, 1, key))\n\n return lines", "response": "Process key value dicts e. g. METADATA key value"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef process_dict(self, d, level, comments):\n lines = []\n\n for k, v in d.items():\n if not self.__is_metadata(k):\n qk = self.quoter.add_quotes(k)\n qv = self.quoter.add_quotes(v)\n line = self.__format_line(self.whitespace(level, 2), qk, qv)\n line += self.process_attribute_comment(comments, k)\n lines.append(line)\n\n return lines", "response": "Processes keys and values within a block holding a specific level."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef process_config_dict(self, key, d, level):\n lines = []\n for k, v in d.items():\n k = \"CONFIG {}\".format(self.quoter.add_quotes(k.upper()))\n v = self.quoter.add_quotes(v)\n lines.append(self.__format_line(self.whitespace(level, 1), k, v))\n return lines", "response": "Process the CONFIG block"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nprocess repeated keys e. g. 
FORMATOPTION", "response": "def process_repeated_list(self, key, lst, level):\n \"\"\"\n Process blocks of repeated keys e.g. FORMATOPTION\n \"\"\"\n lines = []\n\n for v in lst:\n k = key.upper()\n v = self.quoter.add_quotes(v)\n lines.append(self.__format_line(self.whitespace(level, 1), k, v))\n\n return lines"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef format_pair_list(self, key, pair_list, level):\n\n lines = [self.add_start_line(key, level)]\n\n list_spacer = self.spacer * (level + 2)\n pairs = [\"{}{} {}\".format(list_spacer, p[0], p[1]) for p in pair_list]\n lines += pairs\n\n lines.append(self.add_end_line(level, 1, key))\n\n return lines", "response": "Format a list of pairs."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef format_repeated_pair_list(self, key, root_list, level):\n\n lines = []\n\n def depth(L):\n return isinstance(L, (tuple, list)) and max(map(depth, L)) + 1\n\n if depth(root_list) == 2:\n # single set of points only\n root_list = [root_list]\n\n for pair_list in root_list:\n lines += self.format_pair_list(key, pair_list, level)\n\n return lines", "response": "Process repeated lists of pairs e. g. 
POINTs blocks\n "} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning True if the key is a hidden container.", "response": "def is_hidden_container(self, key, val):\n \"\"\"\n The key is not one of the Mapfile keywords, and its\n values are a list\n \"\"\"\n\n if key in (\"layers\", \"classes\", \"styles\", \"symbols\", \"labels\",\n \"outputformats\", \"features\", \"scaletokens\",\n \"composites\") and isinstance(val, list):\n return True\n else:\n return False"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nprint out a nicely indented Mapfile.", "response": "def pprint(self, composites):\n \"\"\"\n Print out a nicely indented Mapfile\n \"\"\"\n\n # if only a single composite is used then cast to list\n # and allow for multiple root composites\n\n if composites and not isinstance(composites, list):\n composites = [composites]\n\n lines = []\n\n for composite in composites:\n type_ = composite[\"__type__\"]\n if type_ in (\"metadata\", \"validation\"):\n # types are being parsed directly, and not as an attr of a parent\n lines += self.process_key_dict(type_, composite, level=0)\n else:\n lines += self._format(composite)\n\n result = str(self.newlinechar.join(lines))\n return result"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef format_value(self, attr, attr_props, value):\n if isinstance(value, bool):\n return str(value).upper()\n\n if any(i in [\"enum\"] for i in attr_props):\n if not isinstance(value, numbers.Number):\n return value.upper() # value is from a set list, no need for quote\n else:\n return value\n\n if \"type\" in attr_props and attr_props[\"type\"] == \"string\": # and \"enum\" not in attr_props\n # check schemas for expressions and handle accordingly\n if self.is_expression(attr_props) and self.quoter.in_slashes(value):\n return value\n elif self.is_expression(attr_props) and 
(value.endswith(\"'i\") or value.endswith('\"i')):\n # for case insensitive regex\n return value\n else:\n return self.quoter.add_quotes(value)\n\n # expressions can be one of a string or an expression in brackets\n if any(i in [\"oneOf\", \"anyOf\"] for i in attr_props): # and check that type string is in list\n if \"oneOf\" in attr_props:\n options_list = attr_props[\"oneOf\"]\n else:\n options_list = attr_props[\"anyOf\"]\n if self.quoter.is_string(value):\n if self.quoter.in_parenthesis(value):\n pass\n elif attr == \"expression\" and self.quoter.in_braces(value):\n # don't add quotes to list expressions such as {val1, val2}\n pass\n elif attr != \"text\" and self.quoter.in_brackets(value):\n # TEXT expressions are often \"[field1]-[field2]\" so need to leave quotes for these\n pass\n elif value.startswith(\"NOT \") and self.quoter.in_parenthesis(value[4:]):\n value = \"NOT {}\".format(value[4:])\n else:\n value = self.check_options_list(options_list, value)\n\n if isinstance(value, list):\n new_values = []\n\n for v in value:\n if not isinstance(v, numbers.Number):\n v = self.quoter.add_quotes(v)\n new_values.append(v)\n\n value = \" \".join(list(map(str, new_values)))\n else:\n value = self.quoter.escape_quotes(value)\n\n return value", "response": "Format the value for the related attribute."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nprocess one of the main composite types", "response": "def process_attribute(self, type_, attr, value, level):\n \"\"\"\n Process one of the main composite types (see the type_ value)\n \"\"\"\n\n attr_props = self.get_attribute_properties(type_, attr)\n value = self.format_value(attr, attr_props, value)\n line = self.__format_line(self.whitespace(level, 1), attr.upper(), value)\n\n return line"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef process_composite_comment(self, level, comments, key):\n if key not in 
comments:\n comment = \"\"\n else:\n value = comments[key]\n spacer = self.whitespace(level, 0)\n\n if isinstance(value, list):\n comments = [self.format_comment(spacer, v) for v in value]\n comment = self.newlinechar.join(comments)\n else:\n comment = self.format_comment(spacer, value)\n\n return comment", "response": "Process comments for composites such as MAP LAYER etc."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncreate an animation from a list of image files.", "response": "def create_animation(img_files):\n \"\"\"\n See http://pillow.readthedocs.io/en/4.2.x/handbook/image-file-formats.html?highlight=append_images#saving\n \"\"\"\n\n open_images = []\n\n for fn in img_files:\n print(fn)\n im = Image.open(fn)\n open_images.append(im)\n\n im = open_images[0]\n im.save(r\"C:\\temp\\animation.gif\", save_all=True, append_images=open_images[1:], duration=120, loop=100, optimize=True)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nparse a MapServer Mapfile and returns a list of all possible Mapfiles or lists of composites", "response": "def start(self, children):\n \"\"\"\n Parses a MapServer Mapfile\n Parsing of partial Mapfiles or lists of composites is also possible\n \"\"\"\n\n composites = []\n\n for composite_dict in children:\n if False and self.include_position:\n key_token = composite_dict[1]\n key_name = key_token.value.lower()\n composites_position = self.get_position_dict(composite_dict)\n composites_position[key_name] = self.create_position_dict(key_token, None)\n\n composites.append(composite_dict)\n\n # only return a list when there are multiple root composites (e.g.\n # several CLASSes)\n if len(composites) == 1:\n return composites[0]\n else:\n return composites"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef composite(self, t):\n if len(t) == 1:\n return t[0] # metadata and values - already processed\n\n 
key_token = t[0][0]\n attribute_dicts = t[1]\n\n if not isinstance(attribute_dicts, list):\n # always handle a list of attributes\n attribute_dicts = [attribute_dicts]\n\n key_name = self.key_name(key_token)\n composite_dict = CaseInsensitiveOrderedDict(CaseInsensitiveOrderedDict)\n composite_dict[\"__type__\"] = key_name\n\n if self.include_position:\n pd = self.create_position_dict(key_token, None)\n composite_dict[\"__position__\"] = pd\n\n if self.include_comments:\n comments_dict = composite_dict[\"__comments__\"] = OrderedDict()\n\n for d in attribute_dicts:\n keys = d.keys()\n if \"__type__\" in keys:\n k = d[\"__type__\"]\n if k in SINGLETON_COMPOSITE_NAMES:\n composite_dict[k] = d\n else:\n plural_key = self.plural(k)\n if plural_key not in composite_dict:\n composite_dict[plural_key] = []\n\n composite_dict[plural_key].append(d)\n else:\n # simple attribute\n pos = d.pop(\"__position__\")\n d.pop(\"__tokens__\", None) # tokens are no longer needed now we have the positions\n comments = d.pop(\"__comments__\", None)\n\n key_name = self.get_single_key(d)\n\n if key_name == \"config\":\n # there may be several config dicts - one for each setting\n if key_name not in composite_dict:\n # create an initial OrderedDict\n composite_dict[key_name] = CaseInsensitiveOrderedDict(CaseInsensitiveOrderedDict)\n # populate the existing config dict\n cfg_dict = composite_dict[key_name]\n cfg_dict.update(d[key_name])\n\n if self.include_position:\n if key_name not in pd:\n pd[key_name] = OrderedDict()\n\n subkey_name = self.get_single_key(d[key_name])\n pd[key_name][subkey_name] = pos\n\n elif key_name == \"points\":\n if key_name not in composite_dict:\n composite_dict[key_name] = d[key_name]\n else:\n # if points are already in a feature then\n # allow for multipart features in a nested list\n existing_points = composite_dict[key_name]\n\n def depth(L):\n return isinstance(L, (tuple, list)) and max(map(depth, L)) + 1\n\n if depth(existing_points) == 2:\n 
composite_dict[key_name] = [existing_points]\n\n if key_name not in composite_dict:\n composite_dict[key_name] = []\n composite_dict[key_name].append(d[key_name])\n\n if self.include_position:\n if key_name not in pd:\n pd[key_name] = pos\n else:\n existing_pos = pd[key_name]\n if isinstance(existing_pos, dict):\n pd[key_name] = [existing_pos]\n pd[key_name].append(pos)\n\n elif key_name in REPEATED_KEYS:\n if key_name not in composite_dict:\n composite_dict[key_name] = []\n\n composite_dict[key_name].append(d[key_name])\n\n if self.include_position:\n if key_name not in pd:\n pd[key_name] = []\n pd[key_name].append(pos)\n\n else:\n assert len(d.items()) == 1\n if self.include_position:\n # hoist position details to composite\n pd[key_name] = pos\n if self.include_comments and comments:\n # hoist comments to composite\n comments_dict[key_name] = comments\n\n composite_dict[key_name] = d[key_name]\n\n return composite_dict", "response": "Handle the composite types e. g. CLASS.. END"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreturn the key and contents of a KEY..END block for PATTERN, POINTS, and PROJECTION", "response": "def check_composite_tokens(self, name, tokens):\n \"\"\"\n Return the key and contents of a KEY..END block\n for PATTERN, POINTS, and PROJECTION\n \"\"\"\n assert len(tokens) >= 2\n key = tokens[0]\n\n assert key.value.lower() == name\n assert tokens[-1].value.lower() == \"end\"\n\n if len(tokens) == 2:\n body = [] # empty TYPE..END block\n else:\n body = tokens[1:-1]\n\n body_tokens = []\n\n for t in body:\n if isinstance(t, dict):\n body_tokens.append(t[\"__tokens__\"])\n else:\n body_tokens.append(t)\n return key, body_tokens"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef process_value_pairs(self, tokens, type_):\n key, body = self.check_composite_tokens(type_, tokens)\n key_name = self.key_name(key)\n\n d = 
CaseInsensitiveOrderedDict(CaseInsensitiveOrderedDict)\n\n for t in body:\n k = self.clean_string(t[0].value).lower()\n v = self.clean_string(t[1].value)\n\n if k in d.keys():\n log.warning(\"A duplicate key ({}) was found in {}. Only the last value ({}) will be used. \".format(\n k, type_, v))\n\n d[k] = v\n\n if self.include_position:\n pd = self.create_position_dict(key, body)\n d[\"__position__\"] = pd\n\n d[\"__type__\"] = key_name\n\n # return the token as well as the processed dict so the\n # composites function works the same way\n return d", "response": "Process the value pairs of metadata values and Validation blocks."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef func_call(self, t):\n func, params = t\n func_name = func.value\n func.value = \"({}({}))\".format(func_name, params)\n return func", "response": "For function calls e. g. TEXT"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nadding comments to the dictionary d.", "response": "def add_metadata_comments(self, d, metadata):\n \"\"\"\n Any duplicate keys will be replaced with the last duplicate along with comments\n \"\"\"\n\n if len(metadata) > 2:\n string_pairs = metadata[1:-1] # get all metadata pairs\n for sp in string_pairs:\n # get the raw metadata key\n\n if isinstance(sp.children[0], Token):\n token = sp.children[0]\n assert token.type == \"UNQUOTED_STRING\"\n key = token.value\n else:\n # quoted string (double or single)\n token = sp.children[0].children[0]\n key = token.value\n\n # clean it to match the dict key\n key = self._mapfile_todict.clean_string(key).lower()\n assert key in d.keys()\n key_comments = self.get_comments(sp.meta)\n d[\"__comments__\"][key] = key_comments\n\n return d"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef assign_comments(self, tree, comments):\n comments = list(comments)\n comments.sort(key=lambda c: c.line)\n\n idx_by_line = {0: 
0} # {line_no: comment_idx}\n\n for i, c in enumerate(comments):\n if c.line not in idx_by_line:\n idx_by_line[c.line] = i\n\n idx = []\n\n # convert comment tokens to strings, and remove any line breaks\n self.comments = [c.value.strip() for c in comments]\n last_comment_line = max(idx_by_line.keys())\n\n # make a list with an entry for each line\n # number associated with a comment list index\n\n for i in range(last_comment_line, 0, -1):\n if i in idx_by_line:\n # associate line with new comment\n idx.append(idx_by_line[i])\n else:\n # associate line with following comment\n idx.append(idx[-1])\n\n idx.append(0) # line numbers start from 1\n idx.reverse()\n self.idx = idx\n self._assign_comments(tree, 0)", "response": "Assign comments to the tree."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef parse(self, text, fn=None):\n\n if PY2 and not isinstance(text, unicode):\n # specify Unicode for Python 2.7\n text = unicode(text, 'utf-8')\n\n if self.expand_includes:\n text = self.load_includes(text, fn=fn)\n\n try:\n self._comments[:] = [] # clear any comments from a previous parse\n tree = self.lalr.parse(text)\n if self.include_comments:\n self.assign_comments(tree, self._comments)\n return tree\n except (ParseError, UnexpectedInput) as ex:\n if fn:\n log.error(\"Parsing of {} unsuccessful\".format(fn))\n else:\n log.error(\"Parsing of Mapfile unsuccessful\")\n log.info(ex)\n raise", "response": "Parse the Mapfile containing the text and return a tree of the tree."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nadding a child to the graph.", "response": "def add_child(graph, child_id, child_label, parent_id, colour):\n \"\"\"\n http://www.graphviz.org/doc/info/shapes.html#polygon\n \"\"\"\n node = pydot.Node(child_id, style=\"filled\", fillcolor=colour, label=child_label, shape=\"polygon\", fontname=FONT)\n graph.add_node(node)\n graph.add_edge(pydot.Edge(parent_id, 
node))"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreturns a file protocol URI e.g. file:///D:/mappyfile/mappyfile/schemas/ on Windows and file:////home/user/mappyfile/mappyfile/schemas/ on Linux", "response": "def get_schema_path(self, schemas_folder):\n \"\"\"\n Return a file protocol URI e.g. file:///D:/mappyfile/mappyfile/schemas/ on Windows\n and file:////home/user/mappyfile/mappyfile/schemas/ on Linux\n \"\"\"\n\n # replace any Windows path back slashes with forward slashes\n schemas_folder = schemas_folder.replace(\"\\\\\", \"/\")\n\n # HACK Python 2.7 on Linux seems to remove the root slash\n # so add this back in\n if schemas_folder.startswith(\"/\"):\n schemas_folder = \"/\" + schemas_folder\n\n host = \"\"\n root_schema_path = \"file://{}/{}\".format(host, schemas_folder) + \"/\"\n\n return root_schema_path"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning a jsonschema validator for the given schema name.", "response": "def get_schema_validator(self, schema_name):\n \"\"\"\n Had to remove the id property from map.json or it uses URLs for validation\n See various issues at https://github.com/Julian/jsonschema/pull/306\n \"\"\"\n\n if schema_name not in self.schemas:\n schema_file = self.get_schema_file(schema_name)\n with open(schema_file) as f:\n try:\n jsn_schema = json.load(f)\n except ValueError as ex:\n log.error(\"Could not load %s\", schema_file)\n raise ex\n\n schemas_folder = self.get_schemas_folder()\n root_schema_path = self.get_schema_path(schemas_folder)\n resolver = jsonschema.RefResolver(root_schema_path, None)\n # cache the schema for future use\n self.schemas[schema_name] = (jsn_schema, resolver)\n else:\n jsn_schema, resolver = self.schemas[schema_name]\n\n validator = jsonschema.Draft4Validator(schema=jsn_schema, resolver=resolver)\n # validator.check_schema(jsn_schema) # check schema is valid\n\n return 
validator"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ncreating a message for the object that is not in the error message", "response": "def create_message(self, rootdict, path, error, add_comments):\n \"\"\"\n Add a validation comment to the dictionary\n path is the path to the error object, it can be empty if the error is in the root object\n http://python-jsonschema.readthedocs.io/en/latest/errors/#jsonschema.exceptions.ValidationError.absolute_path\n It can also reference an object in a list e.g. [u'layers', 0]\n\n Unfortunately it is not currently possible to get the name of the failing property from the\n JSONSchema error object, even though it is in the error message.\n See https://github.com/Julian/jsonschema/issues/119\n \"\"\"\n\n if not path:\n # error applies to the root type\n d = rootdict\n key = d[\"__type__\"]\n elif isinstance(path[-1], int):\n # the error is on an object in a list\n d = utils.findkey(rootdict, *path)\n key = d[\"__type__\"]\n else:\n key = path[-1]\n d = utils.findkey(rootdict, *path[:-1])\n\n error_message = \"ERROR: Invalid value in {}\".format(key.upper())\n\n # add a comment to the dict structure\n\n if add_comments:\n if \"__comments__\" not in d:\n d[\"__comments__\"] = OrderedDict()\n\n d[\"__comments__\"][key] = \"# {}\".format(error_message)\n\n error_message = {\"error\": error.message,\n \"message\": error_message}\n\n # add in details of the error line, when Mapfile was parsed to\n # include position details\n\n if \"__position__\" in d:\n if not path:\n # position for the root object is stored in the root of the dict\n pd = d[\"__position__\"]\n else:\n pd = d[\"__position__\"][key]\n\n error_message[\"line\"] = pd.get(\"line\")\n error_message[\"column\"] = pd.get(\"column\")\n\n return error_message"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef validate(self, value, add_comments=False, schema_name=\"map\"):\n 
validator = self.get_schema_validator(schema_name)\n\n error_messages = []\n\n if isinstance(value, list):\n for d in value:\n error_messages += self._validate(d, validator, add_comments, schema_name)\n else:\n error_messages = self._validate(value, validator, add_comments, schema_name)\n\n return error_messages", "response": "Validate the value against the schema."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreturn a schema file with all properties expanded", "response": "def get_expanded_schema(self, schema_name):\n \"\"\"\n Return a schema file with all $ref properties expanded\n \"\"\"\n if schema_name not in self.expanded_schemas:\n fn = self.get_schema_file(schema_name)\n schemas_folder = self.get_schemas_folder()\n base_uri = self.get_schema_path(schemas_folder)\n\n with open(fn) as f:\n jsn_schema = jsonref.load(f, base_uri=base_uri)\n\n # cache the schema for future use\n self.expanded_schemas[schema_name] = jsn_schema\n else:\n jsn_schema = self.expanded_schemas[schema_name]\n\n return jsn_schema"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef output(s):\n p = Parser()\n t = ExpressionsTransformer()\n\n ast = p.parse(s)\n logging.debug(ast.pretty())\n print(ast.pretty())\n d = t.transform(ast)\n print(json.dumps(d, indent=4))\n return d", "response": "Parse transform and pretty print\n the result"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef comparison(self, t):\n assert(len(t) == 3)\n\n d = {\"PropertyIsEqualTo\": [\n t[0], t[1], t[2]\n ]}\n\n #parts = [str(p.value) for p in t]\n #v = \" \".join(parts)\n\n #v = \"( {} )\".format(v)\n #t[0].value = v\n #return t[0]\n return d", "response": "Compare two sets of names and their respective values."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef main(ctx, verbose, quiet):\n verbosity = 
verbose - quiet\n configure_logging(verbosity)\n ctx.obj = {}\n ctx.obj['verbosity'] = verbosity", "response": "Execute the main mappyfile command\n "} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nformat a single Mapfile and save it as output-mapfile", "response": "def format(ctx, input_mapfile, output_mapfile, indent, spacer, quote, newlinechar, expand, comments):\n \"\"\"\n Format the input-mapfile and save as output-mapfile. Note output-mapfile will be\n overwritten if it already exists.\n\n Example of formatting a single Mapfile:\n\n mappyfile format C:/Temp/valid.map C:/Temp/valid_formatted.map\n\n Example of formatting a single Mapfile with single quotes and tabs for indentation:\n\n mappyfile format C:/Temp/valid.map C:/Temp/valid_formatted.map --quote=\\\\' --indent=1 --spacer=\\t\n\n Example of formatting a single Mapfile without expanding includes, but including comments:\n\n mappyfile format C:/Temp/valid.map C:/Temp/valid_formatted.map --no-expand --comments\n \"\"\"\n\n quote = codecs.decode(quote, 'unicode_escape') # ensure \\t is handled as a tab\n spacer = codecs.decode(spacer, 'unicode_escape') # ensure \\t is handled as a tab\n newlinechar = codecs.decode(newlinechar, 'unicode_escape') # ensure \\n is handled as a newline\n\n d = mappyfile.open(input_mapfile, expand_includes=expand, include_comments=comments, include_position=True)\n mappyfile.save(d, output_mapfile, indent=indent, spacer=spacer, quote=quote, newlinechar=newlinechar)\n sys.exit(0)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef validate(ctx, mapfiles, expand):\n\n all_mapfiles = get_mapfiles(mapfiles)\n\n if len(all_mapfiles) == 0:\n click.echo(\"No Mapfiles found at the following paths: {}\".format(\",\".join(mapfiles)))\n return\n\n validation_count = 0\n errors = 0\n\n for fn in all_mapfiles:\n fn = click.format_filename(fn)\n d = mappyfile.open(fn, expand_includes=expand, 
include_position=True)\n\n validation_messages = mappyfile.validate(d)\n if validation_messages:\n for v in validation_messages:\n v[\"fn\"] = fn\n msg = \"{fn} (Line: {line} Column: {column}) {message} - {error}\".format(**v)\n click.echo(msg)\n errors += 1\n else:\n click.echo(\"{} validated successfully\".format(fn))\n validation_count += 1\n\n click.echo(\"{} file(s) validated ({} successfully)\".format(len(all_mapfiles), validation_count))\n sys.exit(errors)", "response": "Validate a list of Mapfiles against the Mapfile schema."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_keyword(text):\n first_word = text.split(\" \")[0]\n\n if len(first_word) > 1 and first_word.isupper(): \n kwd = str(first_word.lower())\n else:\n kwd = None\n\n return kwd", "response": "Return the keyword for a given text."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns a list of values from a string such as BACKGROUNDCOLOR r g b.", "response": "def get_values(text):\n \"\"\"\n Accept a string such as BACKGROUNDCOLOR [r] [g] [b]\n and return ['r', 'g', 'b']\n \"\"\"\n\n res = re.findall(r\"\\[(.*?)\\]\", text)\n values = []\n\n for r in res:\n if \"|\" in r:\n params = r.split(\"|\")\n for p in params:\n values.append(p)\n else:\n values.append(r)\n\n values = [str(v.lower()) for v in values]\n\n return values"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nprocesses the given text and returns the kwd and values of the docutils node.", "response": "def process_doc(text):\n \"\"\"\n The :ref: role is supported by Sphinx but not by plain docutils\n \"\"\"\n # remove :ref: directives\n document = docutils.core.publish_doctree(text) # http://epydoc.sourceforge.net/docutils/private/docutils.nodes.document-class.html\n visitor = RefVisitor(document)\n document.walk(visitor)\n\n return visitor.kwd, visitor.values"} {"SOURCE": 
"codesearchnet", "instruction": "Can you create a Python 3 function that\nopens a Mapfile and return a dictionary.", "response": "def open(fn, expand_includes=True, include_comments=False, include_position=False, **kwargs):\n \"\"\"\n Load a Mapfile from the supplied filename into a Python dictionary.\n\n Parameters\n ----------\n\n fn: string\n The path to the Mapfile, or partial Mapfile\n expand_includes: boolean\n Load any ``INCLUDE`` files in the MapFile\n include_comments: boolean\n Include or discard comment strings from the Mapfile - *experimental*\n include_position: boolean\n Include the position of the Mapfile tokens in the output\n\n Returns\n -------\n\n dict\n A Python dictionary representing the Mapfile in the mappyfile format\n\n Example\n -------\n\n To open a Mapfile from a filename and return it as a dictionary object::\n\n d = mappyfile.open('mymap.map')\n\n Notes\n -----\n\n Partial Mapfiles can also be opened, for example a file containing a ``LAYER`` object.\n\n \"\"\"\n p = Parser(expand_includes=expand_includes,\n include_comments=include_comments, **kwargs)\n ast = p.parse_file(fn)\n m = MapfileToDict(include_position=include_position,\n include_comments=include_comments, **kwargs)\n d = m.transform(ast)\n return d"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nload a Mapfile from a file - like object.", "response": "def load(fp, expand_includes=True, include_position=False, include_comments=False, **kwargs):\n \"\"\"\n Load a Mapfile from an open file or file-like object.\n\n Parameters\n ----------\n\n fp: file\n A file-like object - as with all Mapfiles this should be encoded in \"utf-8\"\n expand_includes: boolean\n Load any ``INCLUDE`` files in the MapFile\n include_comments: boolean\n Include or discard comment strings from the Mapfile - *experimental*\n include_position: boolean\n Include the position of the Mapfile tokens in the output\n\n Returns\n -------\n\n dict\n A Python dictionary 
representing the Mapfile in the mappyfile format\n\n Example\n -------\n\n To open a Mapfile from a file and return it as a dictionary object::\n\n with open('mymap.map') as fp:\n d = mappyfile.load(fp)\n\n Notes\n -----\n\n Partial Mapfiles can also be opened, for example a file containing a ``LAYER`` object.\n \"\"\"\n p = Parser(expand_includes=expand_includes,\n include_comments=include_comments, **kwargs)\n ast = p.load(fp)\n m = MapfileToDict(include_position=include_position,\n include_comments=include_comments, **kwargs)\n d = m.transform(ast)\n return d"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef loads(s, expand_includes=True, include_position=False, include_comments=False, **kwargs):\n p = Parser(expand_includes=expand_includes,\n include_comments=include_comments, **kwargs)\n ast = p.parse(s)\n m = MapfileToDict(include_position=include_position,\n include_comments=include_comments, **kwargs)\n d = m.transform(ast)\n return d", "response": "Loads a Mapfile from a string and returns a dictionary representing the Mapfile."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef dump(d, fp, indent=4, spacer=\" \", quote='\"', newlinechar=\"\\n\", end_comment=False):\n map_string = _pprint(d, indent, spacer, quote, newlinechar, end_comment)\n fp.write(map_string)", "response": "Dump a dictionary to a file - like object"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nsave a dictionary to a Mapfile on disk", "response": "def save(d, output_file, indent=4, spacer=\" \", quote='\"', newlinechar=\"\\n\", end_comment=False, **kwargs):\n \"\"\"\n Write a dictionary to an output Mapfile on disk\n\n Parameters\n ----------\n\n d: dict\n A Python dictionary based on the the mappyfile schema\n output_file: string\n The output filename\n indent: int\n The number of ``spacer`` characters to indent 
structures in the Mapfile\n spacer: string\n The character to use for indenting structures in the Mapfile. Typically\n spaces or tab characters (``\\\\t``)\n quote: string\n The quote character to use in the Mapfile (double or single quotes)\n newlinechar: string\n The character used to insert newlines in the Mapfile\n end_comment: bool\n Add a comment with the block type at each closing END\n statement e.g. END # MAP\n\n Returns\n -------\n\n string\n The output_file passed into the function\n\n Example\n -------\n\n To open a Mapfile from a string, and then save it to a file::\n\n s = '''MAP NAME \"TEST\" END'''\n\n d = mappyfile.loads(s)\n fn = \"C:/Data/mymap.map\"\n mappyfile.save(d, fn)\n \"\"\"\n map_string = _pprint(d, indent, spacer, quote, newlinechar, end_comment)\n _save(output_file, map_string)\n return output_file"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef dumps(d, indent=4, spacer=\" \", quote='\"', newlinechar=\"\\n\", end_comment=False, **kwargs):\n return _pprint(d, indent, spacer, quote, newlinechar, end_comment, **kwargs)", "response": "Dump a dictionary to a string"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nfinds an item in a list of dicts using a key and a value.", "response": "def find(lst, key, value):\n \"\"\"\n Find an item in a list of dicts using a key and a value\n\n Parameters\n ----------\n\n list: list\n A list of composite dictionaries e.g. 
``layers``, ``classes``\n key: string\n The key name to search each dictionary in the list\n key: value\n The value to search for\n\n Returns\n -------\n\n dict\n The first composite dictionary object with a key that matches the value\n\n Example\n -------\n\n To find the ``LAYER`` in a list of layers with ``NAME`` set to ``Layer2``::\n\n s = '''\n MAP\n LAYER\n NAME \"Layer1\"\n TYPE POLYGON\n END\n LAYER\n NAME \"Layer2\"\n TYPE POLYGON\n CLASS\n NAME \"Class1\"\n COLOR 0 0 -8\n END\n END\n END\n '''\n\n d = mappyfile.loads(s)\n cmp = mappyfile.find(d[\"layers\"], \"name\", \"Layer2\")\n assert cmp[\"name\"] == \"Layer2\"\n \"\"\"\n return next((item for item in lst if item[key.lower()] == value), None)"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nfinding all items in lst where key matches value.", "response": "def findall(lst, key, value):\n \"\"\"\n Find all items in lst where key matches value.\n For example find all ``LAYER`` s in a ``MAP`` where ``GROUP`` equals ``VALUE``\n\n Parameters\n ----------\n\n list: list\n A list of composite dictionaries e.g. 
``layers``, ``classes``\n key: string\n The key name to search each dictionary in the list\n key: value\n The value to search for\n\n Returns\n -------\n\n list\n A Python list containing the matching composite dictionaries\n\n Example\n -------\n\n To find all ``LAYER`` s with ``GROUP`` set to ``test``::\n\n s = '''\n MAP\n LAYER\n NAME \"Layer1\"\n TYPE POLYGON\n GROUP \"test\"\n END\n LAYER\n NAME \"Layer2\"\n TYPE POLYGON\n GROUP \"test1\"\n END\n LAYER\n NAME \"Layer3\"\n TYPE POLYGON\n GROUP \"test2\"\n END\n LAYER\n NAME \"Layer4\"\n TYPE POLYGON\n GROUP \"test\"\n END\n END\n '''\n\n d = mappyfile.loads(s)\n layers = mappyfile.findall(d[\"layers\"], \"group\", \"test\")\n assert len(layers) == 2\n \"\"\"\n return [item for item in lst if item[key.lower()] in value]"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef findunique(lst, key):\n return sorted(set([item[key.lower()] for item in lst]))", "response": "Find all unique key values for items in lst."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef findkey(d, *keys):\n if keys:\n keys = list(keys)\n key = keys.pop(0)\n return findkey(d[key], *keys)\n else:\n return d", "response": "Returns a value from a dictionary based on a list of keys and list indexes."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef update(d1, d2):\n NoneType = type(None)\n\n if d2.get(\"__delete__\", False):\n return {}\n\n for k, v in d2.items():\n if isinstance(v, dict):\n if v.get(\"__delete__\", False):\n # allow a __delete__ property to be set to delete objects\n del d1[k]\n else:\n d1[k] = update(d1.get(k, {}), v)\n elif isinstance(v, (tuple, list)) and all(isinstance(li, (NoneType, dict)) for li in v):\n # a list of dicts and/or NoneType\n orig_list = d1.get(k, [])\n new_list = []\n pairs = list(zip_longest(orig_list, v, fillvalue=None))\n for 
orig_item, new_item in pairs:\n if orig_item is None:\n orig_item = {} # can't use {} for fillvalue as only one dict created/modified!\n if new_item is None:\n new_item = {}\n\n if new_item.get(\"__delete__\", False):\n d = None # orig_list.remove(orig_item) # remove the item to delete\n else:\n d = update(orig_item, new_item)\n\n if d is not None:\n new_list.append(d)\n d1[k] = new_list\n else:\n if k in d1 and v == \"__delete__\":\n del d1[k]\n else:\n d1[k] = v\n return d1", "response": "Update dict d1 with properties from d2"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _decode_response(response):\n content_type = response.headers.get('content-type', '')\n logger.debug(\"status[%s] content_type[%s] encoding[%s]\" %\n (response.status_code, content_type, response.encoding))\n response.raise_for_status()\n content = response.content.strip()\n if response.encoding:\n content = content.decode(response.encoding)\n if not content:\n logger.debug(\"no content in response\")\n return content\n if content_type.split(';')[0] != 'application/json':\n return content\n if content.startswith(GERRIT_MAGIC_JSON_PREFIX):\n content = content[len(GERRIT_MAGIC_JSON_PREFIX):]\n try:\n return json.loads(content)\n except ValueError:\n logger.error('Invalid json content: %s', content)\n raise", "response": "Strip off Gerrit s magic prefix and decode a response."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ntranslate kwargs replacing data with json if necessary.", "response": "def translate_kwargs(self, **kwargs):\n \"\"\"Translate kwargs replacing `data` with `json` if necessary.\"\"\"\n local_kwargs = self.kwargs.copy()\n local_kwargs.update(kwargs)\n\n if \"data\" in local_kwargs and \"json\" in local_kwargs:\n raise ValueError(\"Cannot use data and json together\")\n\n if \"data\" in local_kwargs and isinstance(local_kwargs[\"data\"], dict):\n 
local_kwargs.update({\"json\": local_kwargs[\"data\"]})\n del local_kwargs[\"data\"]\n\n headers = DEFAULT_HEADERS.copy()\n if \"headers\" in kwargs:\n headers.update(kwargs[\"headers\"])\n if \"json\" in local_kwargs:\n headers.update({\"Content-Type\": \"application/json;charset=UTF-8\"})\n local_kwargs.update({\"headers\": headers})\n\n return local_kwargs"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nsending HTTP POST to the endpoint.", "response": "def post(self, endpoint, return_response=False, **kwargs):\n \"\"\"Send HTTP POST to the endpoint.\n\n :arg str endpoint: The endpoint to send to.\n\n :returns:\n JSON decoded result.\n\n :raises:\n requests.RequestException on timeout or connection error.\n\n \"\"\"\n args = self.translate_kwargs(**kwargs)\n response = self.session.post(self.make_url(endpoint), **args)\n\n decoded_response = _decode_response(response)\n\n if return_response:\n return decoded_response, response\n return decoded_response"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nescape a string for use in Gerrit commands.", "response": "def escape_string(string):\n \"\"\"Escape a string for use in Gerrit commands.\n\n :arg str string: The string to escape.\n\n :returns: The string with necessary escapes and surrounding double quotes\n so that it can be passed to any of the Gerrit commands that require\n double-quoted strings.\n\n \"\"\"\n result = string\n result = result.replace('\\\\', '\\\\\\\\')\n result = result.replace('\"', '\\\\\"')\n return '\"' + result + '\"'"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef append(self, data):\n if not data:\n return\n\n if isinstance(data, list):\n # First we need to clean up the data.\n #\n # Gerrit creates new bullet items when it gets newline characters\n # within a bullet list paragraph, so unless we remove the newlines\n # from the texts the resulting 
bullet list will contain multiple\n # bullets and look crappy.\n #\n # We add the '*' character on the beginning of each bullet text in\n # the next step, so we strip off any existing leading '*' that the\n # caller has added, and then strip off any leading or trailing\n # whitespace.\n _items = [x.replace(\"\\n\", \" \").strip().lstrip('*').strip()\n for x in data]\n\n # Create the bullet list only with the items that still have any\n # text in them after cleaning up.\n _paragraph = \"\\n\".join([\"* %s\" % x for x in _items if x])\n if _paragraph:\n self.paragraphs.append(_paragraph)\n elif isinstance(data, str):\n _paragraph = data.strip()\n if _paragraph:\n self.paragraphs.append(_paragraph)\n else:\n raise ValueError('Data must be a list or a string')", "response": "Append the given data to the output."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef format(self):\n message = \"\"\n if self.paragraphs:\n if self.header:\n message += (self.header + '\\n\\n')\n message += \"\\n\\n\".join(self.paragraphs)\n if self.footer:\n message += ('\\n\\n' + self.footer)\n return message", "response": "Format the message parts to a string."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nsets header context parameters.", "response": "def set_context_params(self, params):\n\n \"\"\" Set header context parameters. 
Refer to the top of /docs/soap.txt about specifics.\n\n The -Parameter cannot be changed, because it is set by the\n implementing class.\n\n Should be called by implementing method to check for valid context\n params.\n\n :param params: A Dict containing context parameters.\n \"\"\"\n\n for key, value in params.items():\n\n if key not in self.valid_context_params:\n\n raise RequestHeaderContextException(\n \"%s is not a valid context parameter.\" % key\n )"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nenable batch request gathering.", "response": "def enable_batch(self, onerror=\"continue\"):\n\n \"\"\" Enables batch request gathering.\n\n Do this first and then consecutively call \"add_request\" to add more\n requests.\n\n :param onerror: \"continue\" (default) if one request fails (and\n response with soap Faults for the request) or \"stop\" processing.\n \"\"\"\n\n self.batch_request = True\n self.batch_request_id = 1\n\n self._create_batch_node(onerror)"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nchecks if this response has at least one fault response", "response": "def is_fault(self):\n\n \"\"\" Checks whether this response has at least one fault response (\n supports both batch and single responses)\n \"\"\"\n\n if self.is_batch():\n\n info = self.get_batch()\n\n return info['hasFault']\n\n else:\n\n my_response = self.get_response()\n\n if list(my_response.keys())[0] == \"Fault\":\n\n return True\n\n return False"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _filter_response(self, response_dict):\n\n \"\"\" Add additional filters to the response dictionary\n\n Currently the response dictionary is filtered like this:\n\n * If a list only has one item, the list is replaced by that item\n * Namespace-Keys (_jsns and xmlns) are removed\n\n :param response_dict: the pregenerated, but unfiltered response dict\n :type response_dict: dict\n 
:return: The filtered dictionary\n :rtype: dict\n \"\"\"\n\n filtered_dict = {}\n\n for key, value in response_dict.items():\n\n if key == \"_jsns\":\n\n continue\n\n if key == \"xmlns\":\n\n continue\n\n if type(value) == list and len(value) == 1:\n\n filtered_dict[key] = value[0]\n\n elif type(value) == dict and len(value.keys()) == 1 and \"_content\" \\\n in value.keys():\n\n filtered_dict[key] = value[\"_content\"]\n\n elif type(value) == dict:\n\n tmp_dict = self._filter_response(value)\n\n filtered_dict[key] = tmp_dict\n\n else:\n\n filtered_dict[key] = value\n\n return filtered_dict", "response": "Add additional filters to the response dictionary"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncreate a zimbra preauth value for the user in the order they are in.", "response": "def create_preauth(byval, key, by='name', expires=0, timestamp=None):\n\n \"\"\" Generates a zimbra preauth value\n\n :param byval: The value of the targeted user (according to the\n by-parameter). For example: The account name, if \"by\" is \"name\".\n :param key: The domain preauth key (you can retrieve that using zmprov gd)\n :param by: What type is the byval-parameter? Valid parameters are \"name\"\n (default), \"id\" and \"foreignPrincipal\"\n :param expires: Milliseconds when the auth token expires. 
Defaults to 0\n for default account expiration\n :param timestamp: Current timestamp (is calculated by default)\n :returns: The preauth value to be used in an AuthRequest\n :rtype: str\n \"\"\"\n\n if timestamp is None:\n timestamp = int(datetime.now().strftime(\"%s\")) * 1000\n\n pak = hmac.new(\n codecs.latin_1_encode(key)[0],\n ('%s|%s|%s|%s' % (\n byval,\n by,\n expires,\n timestamp\n )).encode(\"utf-8\"),\n hashlib.sha1\n ).hexdigest()\n\n return pak"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef zimbra_to_python(zimbra_dict, key_attribute=\"n\",\n content_attribute=\"_content\"):\n\n \"\"\"\n Converts single level Zimbra dicts to a standard python dict\n\n :param zimbra_dict: The dictionary in Zimbra-Format\n :return: A native python dict\n \"\"\"\n\n local_dict = {}\n\n for item in zimbra_dict:\n\n local_dict[item[key_attribute]] = item[content_attribute]\n\n return local_dict", "response": "Converts a Zimbra dict to a native python dict"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_value(haystack, needle, key_attribute=\"n\",\n content_attribute=\"_content\"):\n\n \"\"\" Fetch a value from a zimbra-like json dict (keys are \"n\", values are\n \"_content\"\n\n This function may be slightly faster than zimbra_to_python(haystack)[\n needle], because it doesn't necessarily iterate over the complete list.\n\n :param haystack: The list in zimbra-dict format\n :param needle: the key to search for\n :return: the value or None, if the key is not found\n \"\"\"\n\n for value in haystack:\n\n if value[key_attribute] == needle:\n\n return value[content_attribute]\n\n return None", "response": "Get a value from a zimbra - like json dict"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nconverting input string compatible between py2 and py3 Taxonomy", "response": "def convert_to_str(input_string):\n\n 
\"\"\" Returns a string of the input compatible between py2 and py3\n :param input_string:\n :return:\n \"\"\"\n\n if sys.version < '3':\n\n if isinstance(input_string, str) \\\n or isinstance(input_string, unicode): # pragma: no cover py3\n\n return input_string # pragma: no cover py3\n\n else:\n\n if isinstance(input_string, str): # pragma: no cover py3\n\n return input_string # pragma: no cover py3\n\n return str(input_string)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncreate a DOM node and optionally several subnodes from a dictionary.", "response": "def dict_to_dom(root_node, xml_dict):\n \"\"\" Create a DOM node and optionally several subnodes from a dictionary.\n\n :param root_node: DOM-Node set the dictionary is applied upon\n :type root_node: xml.dom.Element\n :param xml_dict: The dictionary containing the nodes to process\n :type xml_dict: dict\n \"\"\"\n\n if '_content' in list(xml_dict.keys()):\n\n root_node.appendChild(\n root_node.ownerDocument.createTextNode(\n convert_to_str(xml_dict['_content'])\n )\n )\n\n for key, value in xml_dict.items():\n\n if key == '_content':\n continue\n\n if type(value) == dict:\n\n # Root node\n\n tmp_node = root_node.ownerDocument.createElement(key)\n\n dict_to_dom(tmp_node, value)\n\n root_node.appendChild(tmp_node)\n\n elif type(value) == list:\n\n for multinode in value:\n\n tmp_node = root_node.ownerDocument.createElement(key)\n\n dict_to_dom(tmp_node, multinode)\n\n root_node.appendChild(tmp_node)\n\n else:\n\n # Attributes\n\n root_node.setAttribute(\n key,\n convert_to_str(value)\n )"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef dom_to_dict(root_node):\n\n \"\"\" Serializes the given node to the dictionary\n\n Serializes the given node to the documented dictionary format.\n\n :param root_node: Node to serialize\n :returns: The dictionary\n :rtype: dict\n \"\"\"\n\n # Remove namespaces from tagname\n\n tag = 
root_node.tagName\n\n if \":\" in tag:\n\n tag = tag.split(\":\")[1]\n\n root_dict = {\n tag: {}\n }\n\n node_dict = root_dict[tag]\n\n # Set attributes\n\n if root_node.hasAttributes():\n\n for key in list(root_node.attributes.keys()):\n\n node_dict[key] = root_node.getAttribute(key)\n\n # Check out child nodes\n\n for child in root_node.childNodes:\n\n if child.nodeType == root_node.TEXT_NODE:\n\n # This is the content\n\n node_dict['_content'] = child.data\n\n else:\n\n subnode_dict = dom_to_dict(child)\n\n child_tag = child.tagName\n\n if \":\" in child_tag:\n\n child_tag = child_tag.split(\":\")[1]\n\n new_val = subnode_dict[child_tag]\n\n # If we have several child with same name, put them in a list.\n\n if child_tag in node_dict:\n prev_val = node_dict[child_tag]\n\n if type(prev_val) != list:\n node_dict[child_tag] = [prev_val]\n\n node_dict[child_tag].append(new_val)\n\n else:\n node_dict[child_tag] = new_val\n\n return root_dict", "response": "Serializes the given node to the documented dictionary format."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\noverrides HTTPSConnection. 
connect to specify TLS version", "response": "def connect(self):\n \"\"\"Overrides HTTPSConnection.connect to specify TLS version\"\"\"\n # Standard implementation from HTTPSConnection, which is not\n # designed for extension, unfortunately\n sock = socket.create_connection((self.host, self.port),\n self.timeout, self.source_address)\n if getattr(self, '_tunnel_host', None):\n self.sock = sock # pragma: no cover\n self._tunnel() # pragma: no cover\n\n # This is the only difference; default wrap_socket uses SSLv23\n self.sock = ssl.wrap_socket(sock, self.key_file, self.cert_file,\n ssl_version=ssl.PROTOCOL_TLSv1_2)"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nsends the request and returns the results in a dict or a list.", "response": "def send_request(self, request, response=None):\n\n \"\"\" Send the request.\n\n Sends the request and retrieves the results, formats them and returns\n them in a dict or a list (when it's a batchresponse). If something\n goes wrong, raises a SoapFailure or a HTTPError on system-side\n failures. 
Note: AuthRequest raises an HTTPError on failed\n authentications!\n\n :param request: The request to send\n :type request: pythonzimbra.request.Request\n :param response: A prebuilt response object\n :type response: pythonzimbra.response.Response\n :raises: pythonzimbra.exceptions.communication.SoapFailure or\n urllib2.HTTPError\n \"\"\"\n\n local_response = None\n\n if response is None:\n\n if request.request_type == \"json\":\n\n local_response = ResponseJson()\n\n elif request.request_type == \"xml\":\n\n local_response = ResponseXml()\n\n else:\n\n raise UnknownRequestType()\n\n try:\n\n server_request = ur.urlopen(\n self.url,\n request.get_request().encode(\"utf-8\"),\n self.timeout\n )\n\n server_response = server_request.read()\n\n if isinstance(server_response, bytes):\n\n server_response = server_response.decode(\"utf-8\")\n\n if response is None:\n\n local_response.set_response(\n server_response\n )\n\n else:\n\n response.set_response(server_response)\n\n except ue.HTTPError as e:\n\n if e.code == 500:\n\n # 500 codes normally returns a SoapFault, that we can use\n\n server_response = e.fp.read()\n\n if isinstance(server_response, bytes):\n\n server_response = server_response.decode(\"utf-8\")\n\n if response is None:\n\n local_response.set_response(server_response)\n\n else:\n\n response.set_response(server_response)\n\n else:\n\n raise e\n\n if response is None:\n return local_response"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nauthenticating to the Zimbra server", "response": "def authenticate(url, account, key, by='name', expires=0, timestamp=None,\n timeout=None, request_type=\"xml\", admin_auth=False,\n use_password=False, raise_on_error=False):\n\n \"\"\" Authenticate to the Zimbra server\n\n :param url: URL of Zimbra SOAP service\n :param account: The account to be authenticated against\n :param key: The preauth key of the domain of the account or a password (if\n admin_auth or use_password is True)\n :param by: 
If the account is specified as a name, an ID or a\n ForeignPrincipal\n :param expires: When the token expires (or 0 for default expiration)\n :param timestamp: When the token was requested (None for \"now\")\n :param timeout: Timeout for the communication with the server. Defaults\n to the urllib2-default\n :param request_type: Which type of request to use (\"xml\" (default) or\n \"json\")\n :param admin_auth: This request should authenticate and generate an admin\n token. The \"key\"-parameter therefore holds the admin password (implies\n use_password)\n :param use_password: The \"key\"-parameter holds a password. Do a password-\n based user authentication.\n :param raise_on_error: Should I raise an exception when an authentication\n error occurs or just return None?\n :return: The authentication token or None\n :rtype: str or None or unicode\n \"\"\"\n\n if timestamp is None:\n timestamp = int(time.time()) * 1000\n\n pak = \"\"\n if not admin_auth:\n pak = preauth.create_preauth(account, key, by, expires, timestamp)\n\n if request_type == 'xml':\n\n auth_request = RequestXml()\n\n else:\n\n auth_request = RequestJson()\n\n request_data = {\n 'account': {\n 'by': by,\n '_content': account\n }\n }\n\n ns = \"urn:zimbraAccount\"\n\n if admin_auth:\n\n ns = \"urn:zimbraAdmin\"\n\n request_data['password'] = key\n\n elif use_password:\n\n request_data['password'] = {\n \"_content\": key\n }\n\n else:\n\n request_data['preauth'] = {\n 'timestamp': timestamp,\n 'expires': expires,\n '_content': pak\n }\n\n auth_request.add_request(\n 'AuthRequest',\n request_data,\n ns\n )\n\n server = Communication(url, timeout)\n\n if request_type == 'xml':\n\n response = ResponseXml()\n\n else:\n\n response = ResponseJson()\n\n server.send_request(auth_request, response)\n\n if response.is_fault():\n\n if raise_on_error:\n\n raise AuthenticationFailed(\n \"Cannot authenticate user: (%s) %s\" % (\n response.get_fault_code(),\n response.get_fault_message()\n )\n )\n\n return None\n\n 
return response.get_response()['AuthResponse']['authToken']"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef read_dbf(dbf_path, index = None, cols = False, incl_index = False):\n db = ps.open(dbf_path)\n if cols:\n if incl_index:\n cols.append(index)\n vars_to_read = cols\n else:\n vars_to_read = db.header\n data = dict([(var, db.by_col(var)) for var in vars_to_read])\n if index:\n index = db.by_col(index)\n db.close()\n return pd.DataFrame(data, index=index)\n else:\n db.close()\n return pd.DataFrame(data)", "response": "Reads a DBF file into a pandas. DataFrame"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef column_mask(self):\n margin = compress_pruned(\n self._slice.margin(\n axis=0,\n weighted=False,\n include_transforms_for_dims=self._hs_dims,\n prune=self._prune,\n )\n )\n mask = margin < self._size\n\n if margin.shape == self._shape:\n # If margin shape is the same as slice's (such as in a col margin for\n # MR x CAT), don't broadcast the mask to the array shape, since\n # they're already the same.\n return mask\n\n # If the row margin is a row vector - broadcast it's mask to the array shape\n return np.logical_or(np.zeros(self._shape, dtype=bool), mask)", "response": "ndarray where column margin < min_base_size same shape as slice."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef pairwise_indices(self):\n return np.array([sig.pairwise_indices for sig in self.values]).T", "response": "ndarray containing tuples of pairwise indices."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef summary_pairwise_indices(self):\n summary_pairwise_indices = np.empty(\n self.values[0].t_stats.shape[1], dtype=object\n )\n summary_pairwise_indices[:] = [\n sig.summary_pairwise_indices for sig in self.values\n ]\n return 
summary_pairwise_indices", "response": "ndarray containing tuples of pairwise indices for the column summary."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ncalculates and return a heuristic score for this Parser against the provided script source and path.", "response": "def score(self):\n \"\"\"\n Calculate and return a heuristic score for this Parser against the provided\n script source and path. This is used to order the ArgumentParsers as \"most likely to work\"\n against a given script/source file.\n\n Each parser has a calculate_score() function that returns a list of booleans representing\n the matches against conditions. This is converted into a % match and used to sort parse engines.\n\n :return: float\n \"\"\"\n if self._heuristic_score is None:\n matches = self.heuristic()\n self._heuristic_score = float(sum(matches)) / float(len(matches))\n return self._heuristic_score"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nreset the calibration to its initial state", "response": "def reset(self):\n \"\"\"\n Reset the calibration to it initial state\n \"\"\"\n simulation = self.survey_scenario.simulation\n holder = simulation.get_holder(self.weight_name)\n holder.array = numpy.array(self.initial_weight, dtype = holder.variable.dtype)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _set_survey_scenario(self, survey_scenario):\n self.survey_scenario = survey_scenario\n # TODO deal with baseline if reform is present\n if survey_scenario.simulation is None:\n survey_scenario.simulation = survey_scenario.new_simulation()\n period = self.period\n self.filter_by = filter_by = survey_scenario.calculate_variable(\n variable = self.filter_by_name, period = period)\n # TODO: shoud not be france specific\n self.weight_name = weight_name = self.survey_scenario.weight_column_name_by_entity['menage']\n self.initial_weight_name = 
weight_name + \"_ini\"\n self.initial_weight = initial_weight = survey_scenario.calculate_variable(\n variable = weight_name, period = period)\n self.initial_total_population = sum(initial_weight * filter_by)\n self.weight = survey_scenario.calculate_variable(variable = weight_name, period = period)", "response": "Set the survey scenario for the calibration."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nset the parameters of the current object", "response": "def set_parameters(self, parameter, value):\n \"\"\"\n Set parameters value\n\n :param parameter: the parameter to be set\n :param value: the value used to set the parameter\n \"\"\"\n if parameter == 'lo':\n self.parameters['lo'] = 1 / value\n else:\n self.parameters[parameter] = value"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nbuilding the data dictionary used as calmar input argument", "response": "def _build_calmar_data(self):\n \"\"\"\n Builds the data dictionary used as calmar input argument\n \"\"\"\n # Select only filtered entities\n assert self.initial_weight_name is not None\n data = pd.DataFrame()\n data[self.initial_weight_name] = self.initial_weight * self.filter_by\n for variable in self.margins_by_variable:\n if variable == 'total_population':\n continue\n assert variable in self.survey_scenario.tax_benefit_system.variables\n period = self.period\n data[variable] = self.survey_scenario.calculate_variable(variable = variable, period = period)\n\n return data"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nruns calmar, stores the new weights and returns the adjusted margins.", "response": "def _update_weights(self, margins, parameters = {}):\n \"\"\"\n Run calmar, stores new weights and returns adjusted margins\n \"\"\"\n data = self._build_calmar_data()\n assert self.initial_weight_name is not None\n parameters['initial_weight'] = self.initial_weight_name\n val_pondfin, lambdasol, 
updated_margins = calmar(\n data, margins, **parameters)\n # Updating only afetr filtering weights\n self.weight = val_pondfin * self.filter_by + self.weight * (logical_not(self.filter_by))\n return updated_margins"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef set_calibrated_weights(self):\n period = self.period\n survey_scenario = self.survey_scenario\n assert survey_scenario.simulation is not None\n for simulation in [survey_scenario.simulation, survey_scenario.baseline_simulation]:\n if simulation is None:\n continue\n simulation.set_input(self.weight_name, period, self.weight)", "response": "Modify the weights to use the calibrated weights"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef get_parameter_action(action):\n actions = set()\n if isinstance(action, argparse._AppendAction):\n actions.add(SPECIFY_EVERY_PARAM)\n return actions", "response": "Returns a set of actions that can be applied to a single parameter."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef nnd_hotdeck_using_feather(receiver = None, donor = None, matching_variables = None, z_variables = None):\n import feather\n\n assert receiver is not None and donor is not None\n assert matching_variables is not None\n\n temporary_directory_path = os.path.join(config_files_directory, 'tmp')\n assert os.path.exists(temporary_directory_path)\n receiver_path = os.path.join(temporary_directory_path, 'receiver.feather')\n donor_path = os.path.join(temporary_directory_path, 'donor.feather')\n feather.write_dataframe(receiver, receiver_path)\n feather.write_dataframe(donor, donor_path)\n if isinstance(matching_variables, str):\n match_vars = '\"{}\"'.format(matching_variables)\n elif len(matching_variables) == 1:\n match_vars = '\"{}\"'.format(matching_variables[0])\n else:\n match_vars = '\"{}\"'.format('todo')\n\n 
r_script = \"\"\"\nrm(list=ls())\ngc()\ndevtools::install_github(\"wesm/feather/R\")\nlibrary(feather)\nlibrary(StatMatch)\n\nreceiver <- read_feather({receiver_path})\ndonor <- read_feather({donor_path})\nsummary(receiver)\nsummary(donor)\n\n# variables\nreceiver = as.data.frame(receiver)\ndonor = as.data.frame(donor)\ngc()\nmatch_vars = {match_vars}\n# don_class = c(\"sexe\")\nout.nnd <- NND.hotdeck(\n data.rec = receiver, data.don = donor, match.vars = match_vars\n )\n\n# out.nndsummary(out.nnd$mtc.ids)\n# head(out.nnd$mtc.ids, 10)\n# head(receiver, 10)\n\nfused.nnd.m <- create.fused(\n data.rec = receiver, data.don = donor,\n mtc.ids = out.nnd$mtc.ids,\n z.vars = \"{z_variables}\"\n )\nsummary(fused.nnd.m)\n\"\"\".format(\n receiver_path = receiver_path,\n donor_path = donor_path,\n match_vars = match_vars,\n z_variables = z_variables,\n )\n print(r_script)", "response": "This function is used to create a new NND Hotdeck using FEather."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef other_ind(self):\n return np.full(self.n_min, self.size - 1, dtype=np.int)", "response": "last row or column of square A"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nnormalizing constant for wishart CDF.", "response": "def K(self):\n \"\"\"Normalizing constant for wishart CDF.\"\"\"\n K1 = np.float_power(pi, 0.5 * self.n_min * self.n_min)\n K1 /= (\n np.float_power(2, 0.5 * self.n_min * self._n_max)\n * self._mgamma(0.5 * self._n_max, self.n_min)\n * self._mgamma(0.5 * self.n_min, self.n_min)\n )\n\n K2 = np.float_power(\n 2, self.alpha * self.size + 0.5 * self.size * (self.size + 1)\n )\n for i in xrange(self.size):\n K2 *= gamma(self.alpha + i + 1)\n return K1 * K2"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef value(self):\n wishart = self._wishart_cdf\n\n # Prepare variables for integration algorithm\n A = self.A\n p = 
self._gammainc_a\n g = gamma(wishart.alpha_vec)\n q_ind = np.arange(2 * wishart.n_min - 2)\n q_vec = 2 * wishart.alpha + q_ind + 2\n q = np.float_power(0.5, q_vec) * gamma(q_vec) * gammainc(q_vec, self._chisq_val)\n\n # Perform integration (i.e. calculate Pfaffian CDF)\n for i in xrange(wishart.n_min):\n # TODO consider index tricks instead of iteration here\n b = 0.5 * p[i] * p[i]\n for j in xrange(i, wishart.n_min - 1):\n b -= q[i + j] / (g[i] * g[j + 1])\n A[j + 1, i] = p[i] * p[j + 1] - 2 * b\n A[i, j + 1] = -A[j + 1, i]\n\n if np.any(np.isnan(A)):\n return 0\n return np.sqrt(det(A))", "response": "Return float Cumulative Distribution Function value: the CDF of the largest eigenvalue of a Wishart(n, p) matrix evaluated at chisq_val."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef data(cls, cube, weighted, prune):\n return cls()._data(cube, weighted, prune)", "response": "Return ndarray representing table index by margin."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _data(self, cube, weighted, prune):\n result = []\n for slice_ in cube.slices:\n if cube.has_mr:\n return self._mr_index(cube, weighted, prune)\n num = slice_.margin(axis=0, weighted=weighted, prune=prune)\n den = slice_.margin(weighted=weighted, prune=prune)\n margin = num / den\n proportions = slice_.proportions(axis=1, weighted=weighted, prune=prune)\n result.append(proportions / margin)\n\n if len(result) == 1 and cube.ndim < 3:\n result = result[0]\n else:\n if prune:\n mask = np.array([slice_.mask for slice_ in result])\n result = np.ma.masked_array(result, mask)\n else:\n result = np.array(result)\n\n return result", "response": "ndarray representing table index by margin."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ncalculating the Gini coefficient of a set of 
observations.", "response": "def gini(values, weights = None, bin_size = None):\n \"\"\"\n Gini coefficient (normalized to 1)\n Using fastgini formula :\n\n\n i=N j=i\n SUM W_i*(SUM W_j*X_j - W_i*X_i/2)\n i=1 j=1\n G = 1 - 2* ----------------------------------\n i=N i=N\n SUM W_i*X_i * SUM W_i\n i=1 i=1\n\n\n where observations are sorted in ascending order of X.\n\n From http://fmwww.bc.edu/RePec/bocode/f/fastgini.html\n \"\"\"\n if weights is None:\n weights = ones(len(values))\n\n df = pd.DataFrame({'x': values, 'w': weights})\n df = df.sort_values(by='x')\n x = df['x']\n w = df['w']\n wx = w * x\n\n cdf = cumsum(wx) - 0.5 * wx\n numerator = (w * cdf).sum()\n denominator = ((wx).sum()) * (w.sum())\n gini = 1 - 2 * (numerator / denominator)\n\n return gini"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef kakwani(values, ineq_axis, weights = None):\n from scipy.integrate import simps\n\n if weights is None:\n weights = ones(len(values))\n\n# sign = -1\n# if tax == True:\n# sign = -1\n# else:\n# sign = 1\n\n PLCx, PLCy = pseudo_lorenz(values, ineq_axis, weights)\n LCx, LCy = lorenz(ineq_axis, weights)\n\n del PLCx\n\n return simps((LCy - PLCy), LCx)", "response": "Computes the Kakwani index for a set of values."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncompute the Lorenz curve coordinates for a set of values.", "response": "def lorenz(values, weights = None):\n \"\"\"\n Computes Lorenz Curve coordinates\n \"\"\"\n if weights is None:\n weights = ones(len(values))\n\n df = pd.DataFrame({'v': values, 'w': weights})\n df = df.sort_values(by = 'v')\n x = cumsum(df['w'])\n x = x / float(x[-1:])\n y = cumsum(df['v'] * df['w'])\n y = y / float(y[-1:])\n\n return x, y"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef pvals(cls, slice_, axis=0, weighted=True):\n return cls._factory(slice_, axis, 
weighted).pvals", "response": "Return pairwise p-values for the given slice columns as a square ndarray."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _chi_squared(self, proportions, margin, observed):\n n = self._element_count\n chi_squared = np.zeros([n, n])\n for i in xrange(1, n):\n for j in xrange(0, n - 1):\n denominator = 1 / margin[i] + 1 / margin[j]\n chi_squared[i, j] = chi_squared[j, i] = (\n np.sum(np.square(proportions[:, i] - proportions[:, j]) / observed)\n / denominator\n )\n return chi_squared", "response": "Calculates the chi-squared statistic from the given proportions and margin."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreturning statistical significance for props columns.", "response": "def _pvals_from_chi_squared(self, pairwise_chisq):\n \"\"\"return statistical significance for props' columns.\n\n *pairwise_chisq* (ndarray) Matrix of chi-squared values (bases for Wishart CDF)\n \"\"\"\n return self._intersperse_insertion_rows_and_columns(\n 1.0 - WishartCDF(pairwise_chisq, self._n_min, self._n_max).values\n )"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn subclass for PairwiseSignificance based on slice dimension types.", "response": "def _factory(slice_, axis, weighted):\n \"\"\"return subclass for PairwiseSignificance, based on slice dimension types.\"\"\"\n if slice_.dim_types[0] == DT.MR_SUBVAR:\n return _MrXCatPairwiseSignificance(slice_, axis, weighted)\n return _CatXCatPairwiseSignificance(slice_, axis, weighted)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn pvals matrix with inserted NaN rows and columns as numpy.ndarray.", "response": "def _intersperse_insertion_rows_and_columns(self, pairwise_pvals):\n \"\"\"Return pvals matrix with inserted NaN rows and columns, as numpy.ndarray.\n\n Each insertion (a header or a subtotal) creates an offset in the calculated\n pvals. 
These need to be taken into account when converting each pval to a\n corresponding column letter. For this reason, we need to insert an all-NaN\n row and a column at the right indices. These are the inserted indices of each\n insertion, along respective dimensions.\n \"\"\"\n for i in self._insertion_indices:\n pairwise_pvals = np.insert(pairwise_pvals, i, np.nan, axis=0)\n pairwise_pvals = np.insert(pairwise_pvals, i, np.nan, axis=1)\n return pairwise_pvals"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _opposite_axis_margin(self):\n off_axis = 1 - self._axis\n return self._slice.margin(axis=off_axis, include_mr_cat=self._include_mr_cat)", "response": "ndarray representing margin along the opposite of self._axis."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _proportions(self):\n return self._slice.proportions(\n axis=self._axis, include_mr_cat=self._include_mr_cat\n )", "response": "ndarray representing slice proportions along correct axis."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _pairwise_chisq(self):\n return self._chi_squared(self._proportions, self._margin, self._observed)", "response": "Pairwise comparisons (chi-squared) along axis, as numpy.ndarray.\n Returns a square matrix of test statistics for the null\n hypothesis that each vector along axis is equal to each other."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _pairwise_chisq(self):\n return [\n self._chi_squared(\n mr_subvar_proportions,\n self._margin[idx],\n self._opposite_axis_margin[idx]\n / np.sum(self._opposite_axis_margin[idx]),\n )\n for (idx, mr_subvar_proportions) in enumerate(self._proportions)\n ]", "response": "Pairwise comparisons (chi-squared) along axis, as numpy.
ndarray.\n Returns a list of square and symmetric matrices of test statistics for the null hypothesis that each vector along axis is equal to each other."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef process_parser(self):\n\n try:\n # Parse with --help to enforce exit\n usage_sections = docopt.docopt(self.parser, ['--help'])\n except SystemExit as e:\n parser = inspect.trace()[-2][0].f_locals\n\n '''\n docopt represents all values as strings and doesn't automatically cast, we probably want to do\n some testing to see if we can convert the default value (Option.value) to a particular type.\n '''\n\n def guess_type(s):\n # Default to str; try float first, then int, so '3' -> int and '3.5' -> float\n v = s\n try:\n v = float(s)\n v = int(s)\n except ValueError:\n pass\n\n return type(v)\n\n self.script_groups = ['Arguments']\n self.nodes = OrderedDict()\n self.containers = OrderedDict()\n self.containers['default'] = []\n\n for option in parser['options']:\n if option.long in ['--help', '--version']:\n continue\n\n option.type = guess_type(option.value)\n option_name = option.long.strip('-')\n node = DocOptNode(option_name, option=option)\n\n self.nodes[option_name] = node\n self.containers['default'].append(option_name)\n\n self.class_name = os.path.splitext(os.path.basename(self.script_path))[0]\n self.script_path = self.script_path\n self.script_description = self.parser", "response": "Parse the docopt usage string and process its options."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef build_dummies_dict(data):\n unique_val_list = unique(data)\n output = {}\n for val in unique_val_list:\n output[val] = (data == val)\n return output", "response": "Build a dict with unique values as keys and boolean indicator vectors as values."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ncalibrating weights to satisfy some margin constraints.", "response": "def calmar(data_in, margins, initial_weight = 'wprm_init', 
method = 'linear', lo = None, up = None, use_proportions = False,\n xtol = 1.49012e-08, maxfev = 256):\n \"\"\"\n Calibrate weights to satisfy some margin constraints\n\n :param dataframe data_in: The observations data\n :param str initial_weight: The initial weight variable name\n :param dict margins: margins is a dictionary mapping each variable name (key) to one of the following values\n - a scalar for numeric variables\n - a dictionary with categories as keys and populations as values\n - optionally a key named `total_population` whose value is the total population. If absent it is initialized to the actual total population\n\n :param str method: Should be 'linear', 'raking ratio' or 'logit'\n :param float lo: lower bound on weights ratio. Mandatory when using logit method. Should be < 1.\n :param float up: upper bound on weights ratio. Mandatory when using logit method. Should be > 1.\n :param bool use_proportions: defaults to False. If True, use proportions when the total population from margins doesn't match the actual total population\n :param xtol: relative precision on Lagrangian multipliers. By default xtol = 1.49012e-08 (default fsolve xtol)\n :param maxfev: maximum number of function evaluations, defaults to 256\n \"\"\"\n\n from scipy.optimize import fsolve\n\n # remove null weights and keep original data\n null_weight_observations = data_in[initial_weight].isnull().sum()\n if null_weight_observations > 0:\n log.info(\"{} observations have a NaN weight. Not used in the calibration.\".format(null_weight_observations))\n\n is_non_zero_weight = (data_in[initial_weight].fillna(0) > 0)\n if is_non_zero_weight.sum() > null_weight_observations:\n log.info(\"{} observations have a zero weight. 
Not used in the calibration.\".format(\n (data_in[initial_weight].fillna(0) <= 0).sum() - null_weight_observations))\n\n variables = set(margins.keys()).intersection(set(data_in.columns))\n for variable in variables:\n null_value_observations = data_in[variable].isnull().sum()\n if null_value_observations > 0:\n log.info(\"For variable {}, {} observations have a NaN value. Not used in the calibration.\".format(\n variable, null_value_observations))\n is_non_zero_weight = is_non_zero_weight & data_in[variable].notnull()\n\n if not is_non_zero_weight.all():\n log.info(\"We drop {} observations.\".format((~is_non_zero_weight).sum()))\n\n data = dict()\n for a in data_in.columns:\n data[a] = data_in.loc[is_non_zero_weight, a].copy()\n\n if not margins:\n raise Exception(\"Calmar requires a non-empty dict of margins\")\n\n # choose method\n assert method in ['linear', 'raking ratio', 'logit'], \"method should be 'linear', 'raking ratio' or 'logit'\"\n if method == 'linear':\n F = linear\n F_prime = linear_prime\n elif method == 'raking ratio':\n F = raking_ratio\n F_prime = raking_ratio_prime\n elif method == 'logit':\n assert up is not None, \"When method == 'logit', a value > 1 for up is mandatory\"\n assert up > 1, \"up should be > 1\"\n assert lo is not None, \"When method == 'logit', a value < 1 for lo is mandatory\"\n assert lo < 1, \"lo should be < 1\"\n\n def F(x):\n return logit(x, lo, up)\n\n def F_prime(x):\n return logit_prime(x, lo, up)\n\n # Build the observations matrix\n if 'total_population' in margins:\n total_population = margins.pop('total_population')\n else:\n total_population = data[initial_weight].fillna(0).sum()\n\n nk = len(data[initial_weight])\n # number of Lagrange parameters (at least total population)\n nj = 1\n\n margins_new = {}\n margins_new_dict = {}\n for var, val in margins.items():\n if isinstance(val, dict):\n dummies_dict = build_dummies_dict(data[var])\n k, pop = 0, 0\n for cat, nb in val.items():\n cat_varname = var + '_' + 
str(cat)\n data[cat_varname] = dummies_dict[cat]\n margins_new[cat_varname] = nb\n if var not in margins_new_dict:\n margins_new_dict[var] = {}\n margins_new_dict[var][cat] = nb\n pop += nb\n k += 1\n nj += 1\n # Check total population\n if pop != total_population:\n if use_proportions:\n log.info(\n 'calmar: categorical variable {} is inconsistent with population; using proportions'.format(\n var\n )\n )\n for cat, nb in val.items():\n cat_varname = var + '_' + str(cat)\n margins_new[cat_varname] = nb * total_population / pop\n margins_new_dict[var][cat] = nb * total_population / pop\n else:\n raise Exception('calmar: categorical variable {} weights sum up to {} != {}'.format(\n var, pop, total_population))\n else:\n margins_new[var] = val\n margins_new_dict[var] = val\n nj += 1\n\n # Always keep the total population as a constraint\n if hasattr(data, 'dummy_is_in_pop'):\n raise Exception('dummy_is_in_pop is not a valid variable name')\n\n data['dummy_is_in_pop'] = ones(nk)\n margins_new['dummy_is_in_pop'] = total_population\n\n # Lagrange parameters, initialized to zero\n lambda0 = zeros(nj)\n\n # initial weights\n d = data[initial_weight].values\n x = zeros((nk, nj)) # nb obs x nb constraints\n xmargins = zeros(nj)\n margins_dict = {}\n j = 0\n for var, val in margins_new.items():\n x[:, j] = data[var]\n xmargins[j] = val\n margins_dict[var] = val\n j += 1\n\n # Solve the first-order equations\n def constraint(l):\n return dot(d * F(dot(x, l)), x) - xmargins\n\n def constraint_prime(l):\n return dot(d * (x.T * F_prime(dot(x, l))), x)\n # the Jacobian above, in MATLAB notation, is constraintprime = @(l) x*(d.*Fprime(x'*l)*x');\n\n tries, ier = 0, 2\n err_max = 1\n conv = 1\n while (ier == 2 or ier == 5 or ier == 4) and not (tries >= 10 or (err_max < 1e-6 and conv < 1e-8)):\n lambdasol, infodict, ier, mesg = fsolve(\n constraint,\n lambda0,\n fprime = constraint_prime,\n maxfev = maxfev,\n xtol = xtol,\n full_output = 1,\n )\n lambda0 = 1 * 
lambdasol\n tries += 1\n\n pondfin = d * F(dot(x, lambdasol))\n rel_error = {}\n for var, val in margins_new.items():\n rel_error[var] = abs((data[var] * pondfin).sum() - margins_dict[var]) / margins_dict[var]\n sorted_err = sorted(rel_error.items(), key = operator.itemgetter(1), reverse = True)\n\n conv = abs(err_max - sorted_err[0][1])\n err_max = sorted_err[0][1]\n\n if (ier == 2 or ier == 5 or ier == 4):\n log.debug(\"optimization converged after {} tries\".format(tries))\n\n # rebuilding a weight vector with the same size as the initial one\n pondfin_out = array(data_in[initial_weight], dtype = float64)\n pondfin_out[is_non_zero_weight] = pondfin\n return pondfin_out, lambdasol, margins_new_dict"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef ca_main_axis(self):\n try:\n ca_ind = self.dim_types.index(DT.CA_SUBVAR)\n return 1 - ca_ind\n except ValueError:\n return None", "response": "For univariate CA the main axis is the categorical axis"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn bool indicating if slice can compute pairwise comparisons.", "response": "def can_compare_pairwise(self):\n \"\"\"Return bool indicating if slice can compute pairwise comparisons.\n\n Currently, only the CAT x CAT slice can compute pairwise comparisons. This also\n includes the categorical array categories dimension (CA_CAT).\n \"\"\"\n if self.ndim != 2:\n return False\n\n return all(dt in DT.ALLOWED_PAIRWISE_TYPES for dt in self.dim_types)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreturns the shape of the array.", "response": "def get_shape(self, prune=False, hs_dims=None):\n \"\"\"Tuple of array dimensions' lengths.\n\n It returns a tuple of ints, each representing the length of a cube\n dimension, in the order those dimensions appear in the cube.\n Pruning is supported. 
Dimensions that get reduced to a single element\n (e.g. due to pruning) are removed from the returned shape, thus\n allowing for the differentiation between true 2D cubes (over which\n statistical testing can be performed) and essentially\n 1D cubes (over which it can't).\n\n Usage:\n\n >>> shape = get_shape()\n >>> pruned_shape = get_shape(prune=True)\n \"\"\"\n if not prune:\n return self.as_array(include_transforms_for_dims=hs_dims).shape\n\n shape = compress_pruned(\n self.as_array(prune=True, include_transforms_for_dims=hs_dims)\n ).shape\n # Eliminate dimensions that get reduced to 1\n # (e.g. single element categoricals)\n return tuple(n for n in shape if n > 1)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef index_table(self, axis=None, baseline=None, prune=False):\n proportions = self.proportions(axis=axis)\n baseline = (\n baseline if baseline is not None else self._prepare_index_baseline(axis)\n )\n\n # Fix the shape to enable correct broadcasting\n if (\n axis == 0\n and len(baseline.shape) <= 1\n and self.ndim == len(self.get_shape())\n ):\n baseline = baseline[:, None]\n\n indexes = proportions / baseline * 100\n\n return self._apply_pruning_mask(indexes) if prune else indexes", "response": "Return index percentages for a given axis and baseline."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ngetting labels for the cube slice and perform pruning by slice.", "response": "def labels(self, hs_dims=None, prune=False):\n \"\"\"Get labels for the cube slice, and perform pruning by slice.\"\"\"\n if self.ca_as_0th:\n labels = self._cube.labels(include_transforms_for_dims=hs_dims)[1:]\n else:\n labels = self._cube.labels(include_transforms_for_dims=hs_dims)[-2:]\n\n if not prune:\n return labels\n\n def prune_dimension_labels(labels, prune_indices):\n \"\"\"Get pruned labels for single dimension, based on prune inds.\"\"\"\n labels = [label for label, prune 
in zip(labels, prune_indices) if not prune]\n return labels\n\n labels = [\n prune_dimension_labels(dim_labels, dim_prune_inds)\n for dim_labels, dim_prune_inds in zip(labels, self._prune_indices(hs_dims))\n ]\n return labels"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef margin(\n self,\n axis=None,\n weighted=True,\n include_missing=False,\n include_transforms_for_dims=None,\n prune=False,\n include_mr_cat=False,\n ):\n \"\"\"Return ndarray representing slice margin across selected axis.\n\n A margin (or basis) can be calculated for a contingency table, provided\n that the dimensions of the desired directions are marginable. The\n dimensions are marginable if they represent mutually exclusive data,\n such as true categorical data. For array types the items dimensions are\n not marginable. Requesting a margin across these dimensions\n (e.g. slice.margin(axis=0) for a categorical array cube slice) will\n produce an error. For multiple response slices, the implicit convention\n is that the provided direction scales to the selections dimension of the\n slice. These cases produce meaningful data, but of a slightly different\n shape (e.g. slice.margin(0) for an MR x CAT slice will produce 2D ndarray\n (variable dimensions are never collapsed!)).\n\n :param axis: Axis across which to sum. Can be 0 (columns margin),\n 1 (rows margin) and None (table margin). If requested across\n variables dimension (e.g. 
requesting 0 margin for CA array) it will\n produce an error.\n :param weighted: Weighted or unweighted counts.\n :param include_missing: Include missing categories or not.\n :param include_transforms_for_dims: Indices of dimensions for which to\n include transformations\n :param prune: Perform pruning based on unweighted counts.\n :returns: (weighted or unweighted counts) summed across provided axis.\n For multiple response types, items dimensions are not collapsed.\n \"\"\"\n\n axis = self._calculate_correct_axis_for_cube(axis)\n hs_dims = self._hs_dims_for_cube(include_transforms_for_dims)\n\n margin = self._cube.margin(\n axis=axis,\n weighted=weighted,\n include_missing=include_missing,\n include_transforms_for_dims=hs_dims,\n prune=prune,\n include_mr_cat=include_mr_cat,\n )\n\n return self._extract_slice_result_from_cube(margin)", "response": "Returns ndarray representing margin across selected axis."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef min_base_size_mask(self, size, hs_dims=None, prune=False):\n return MinBaseSizeMask(self, size, hs_dims=hs_dims, prune=prune)", "response": "Returns a MinBaseSizeMask object with correct row, col and table masks."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef mr_dim_ind(self):\n mr_dim_ind = self._cube.mr_dim_ind\n if self._cube.ndim == 3:\n if isinstance(mr_dim_ind, int):\n if mr_dim_ind == 0:\n # If only the 0th dimension of a 3D is an MR, the slices\n # don't actually have the MR... Thus return None.\n return None\n return mr_dim_ind - 1\n elif isinstance(mr_dim_ind, tuple):\n # If MR dimension index is a tuple, that means that the cube\n # (only a 3D one if it reached this path) has 2 MR dimensions.\n # If any of those is 0 ind dimension we don't need to include\n # in the slice dimension (because the slice doesn't see the tab\n # that it's on). 
If it's 1st and 2nd dimension, then subtract 1\n # from those, and present them as 0th and 1st dimension of the\n # slice. This can happen e.g. in a CAT x MR x MR cube (which\n # renders MR x MR slices).\n mr_dim_ind = tuple(i - 1 for i in mr_dim_ind if i)\n return mr_dim_ind if len(mr_dim_ind) > 1 else mr_dim_ind[0]\n\n return mr_dim_ind", "response": "Get the correct index of the MR dimension in the cube slice."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef scale_means(self, hs_dims=None, prune=False):\n scale_means = self._cube.scale_means(hs_dims, prune)\n\n if self.ca_as_0th:\n # If slice is used as 0th CA, then we need to observe the 1st dimension,\n # because the 0th dimension is CA items, which is only used for slicing\n # (and thus doesn't have numerical values, and also doesn't constitute any\n # dimension of the actual crosstabs that will be created in this case).\n scale_means = scale_means[0][-1]\n if scale_means is None:\n return [None]\n return [scale_means[self._index]]\n\n return scale_means[self._index]", "response": "Return list of column and row scaled means for this slice."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef table_name(self):\n if self._cube.ndim < 3 and not self.ca_as_0th:\n return None\n\n title = self._cube.name\n table_name = self._cube.labels()[0][self._index]\n return \"%s: %s\" % (title, table_name)", "response": "Get slice name.\n\n In case of 2D return cube name. 
In case of 3D, return the combination\n of the cube name with the label of the corresponding slice\n (nth label of the 0th dimension)."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef wishart_pairwise_pvals(self, axis=0):\n if axis != 0:\n raise NotImplementedError(\"Pairwise comparison only implemented for columns\")\n return WishartPairwiseSignificance.pvals(self, axis=axis)", "response": "Return square symmetric matrix of pairwise column-comparison p-values."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef pvals(self, weighted=True, prune=False, hs_dims=None):\n stats = self.zscore(weighted=weighted, prune=prune, hs_dims=hs_dims)\n pvals = 2 * (1 - norm.cdf(np.abs(stats)))\n\n return self._apply_pruning_mask(pvals, hs_dims) if prune else pvals", "response": "Return 2D ndarray with two-tailed p-values calculated from the slice's standardized residuals."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning ndarray with slice's standardized residuals.", "response": "def zscore(self, weighted=True, prune=False, hs_dims=None):\n \"\"\"Return ndarray with slice's standardized residuals (Z-scores).\n\n (Only applicable to 2D contingency tables.) The Z-score or\n standardized residual is the difference between observed and expected\n cell counts if row and column variables were independent divided\n by the residual cell variance. 
They are assumed to come from a N(0,1)\n or standard Normal distribution, and can show which cells deviate from\n the null hypothesis that the row and column variables are uncorrelated.\n\n See also *pairwise_chisq*, *pairwise_pvals* for a pairwise column-\n or row-based test of statistical significance.\n\n :param weighted: Use weighted counts for zscores\n :param prune: Prune based on unweighted counts\n :param hs_dims: Include headers and subtotals (as NaN values)\n :returns zscore: ndarray representing cell standardized residuals (Z)\n \"\"\"\n counts = self.as_array(weighted=weighted)\n total = self.margin(weighted=weighted)\n colsum = self.margin(axis=0, weighted=weighted)\n rowsum = self.margin(axis=1, weighted=weighted)\n zscore = self._calculate_std_res(counts, total, colsum, rowsum)\n\n if hs_dims:\n zscore = intersperse_hs_in_std_res(self, hs_dims, zscore)\n\n if prune:\n return self._apply_pruning_mask(zscore, hs_dims)\n\n return zscore"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef pairwise_indices(self, alpha=0.05, only_larger=True, hs_dims=None):\n return PairwiseSignificance(\n self, alpha=alpha, only_larger=only_larger, hs_dims=hs_dims\n ).pairwise_indices", "response": "Indices of columns where p < alpha for column-comparison t-tests, as an array of tuples.\n With only_larger=True, only indices where the test statistic is strictly larger are returned."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns ndarray containing standard residuals for array values.", "response": "def _array_type_std_res(self, counts, total, colsum, rowsum):\n \"\"\"Return ndarray containing standard residuals for array values.\n\n The shape of the return value is the same as that of *counts*.\n Array variables require special processing because of 
the\n underlying math. Essentially, it boils down to the fact that the\n variable dimensions are mutually independent, and standard residuals\n are calculated for each of them separately, and then stacked together\n in the resulting array.\n \"\"\"\n if self.mr_dim_ind == 0:\n # --This is a special case where broadcasting cannot be\n # --automatically done. We need to \"inflate\" the single dimensional\n # --ndarrays, to be able to treat them as \"columns\" (essentially a\n # --Nx1 ndarray). This is needed for subsequent multiplication\n # --that needs to happen column wise (rowsum * colsum) / total.\n total = total[:, np.newaxis]\n rowsum = rowsum[:, np.newaxis]\n\n expected_counts = rowsum * colsum / total\n variance = rowsum * colsum * (total - rowsum) * (total - colsum) / total ** 3\n return (counts - expected_counts) / np.sqrt(variance)"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ncalculating the standard residuals of the record types.", "response": "def _calculate_std_res(self, counts, total, colsum, rowsum):\n \"\"\"Return ndarray containing standard residuals.\n\n The shape of the return value is the same as that of *counts*.\n \"\"\"\n if set(self.dim_types) & DT.ARRAY_TYPES: # ---has-mr-or-ca---\n return self._array_type_std_res(counts, total, colsum, rowsum)\n return self._scalar_type_std_res(counts, total, colsum, rowsum)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn correct axis for cube, based on ndim. If cube has 3 dimensions, increase axis by 1. This will translate the default 0 (cols direction) and 1 (rows direction) to actual 1 (cols direction) and 2 (rows direction). This is needed because the 0th dimension of the 3D cube is only used to slice across. The actual margins need to be calculated for each slice separately, and since they're implemented as an ndarray, the direction needs to be increased by one. For the value of `None`, don't modify the axis parameter. 
:param axis: 0, 1, or None. Axis that will be passed to self._cube methods. If the cube is 3D, the axis is typically increased by 1, to represent correct measure direction. :returns: int or None, representing the updated axis to pass to cube", "response": "def _calculate_correct_axis_for_cube(self, axis):\n \"\"\"Return correct axis for cube, based on ndim.\n\n If cube has 3 dimensions, increase axis by 1. This will translate the\n default 0 (cols direction) and 1 (rows direction) to actual 1\n (cols direction) and 2 (rows direction). This is needed because the\n 0th dimension of the 3D cube is only used to slice across. The actual\n margins need to be calculated for each slice separately, and since\n they're implemented as an ndarray, the direction needs to be increased\n by one. For the value of `None`, don't modify the axis parameter.\n\n :param axis: 0, 1, or None. Axis that will be passed to self._cube\n methods. If the cube is 3D, the axis is typically\n increased by 1, to represent correct measure direction.\n :returns: int or None, representing the updated axis to pass to cube\n \"\"\"\n\n if self._cube.ndim < 3:\n if self.ca_as_0th and axis is None:\n # Special case for CA slices (in multitables). In this case,\n # we need to calculate a measurement across CA categories\n # dimension (and not across items, because it's not\n # allowed). The value for the axis parameter of None, would\n # imply both cat and items dimensions, and we don't want that.\n return 1\n return axis\n\n # Expected usage of the 'axis' parameter from CubeSlice is 0, 1, or\n # None. CrunchCube handles all other logic. The only 'smart' thing\n # about the handling here, is that the axes are increased for 3D cubes.\n # This way the 3Dness is hidden from the user and he still sees 2D\n # crosstabs, with col and row axes (0 and 1), which are transformed to\n # corresponding numbers in case of 3D cubes (namely 1 and 2). 
In the\n # case of None, we need to analyze across all valid dimensions, and the\n # CrunchCube takes care of that (no need to update axis if it's None).\n # If the user provides a tuple, it's considered that he \"knows\" what\n # he's doing, and the axis argument is not updated in this case.\n if isinstance(axis, int):\n axis += 1\n return axis"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _scalar_type_std_res(self, counts, total, colsum, rowsum):\n expected_counts = expected_freq(counts)\n residuals = counts - expected_counts\n variance = (\n np.outer(rowsum, colsum)\n * np.outer(total - rowsum, total - colsum)\n / total ** 3\n )\n return residuals / np.sqrt(variance)", "response": "Return ndarray containing standard residuals for category values."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef data(self):\n means = []\n table = self._slice.as_array()\n products = self._inner_prods(table, self.values)\n\n for axis, product in enumerate(products):\n if product is None:\n means.append(product)\n continue\n\n # Calculate means\n valid_indices = self._valid_indices(axis)\n num = np.sum(product[valid_indices], axis)\n den = np.sum(table[valid_indices], axis)\n mean = num / den\n if not isinstance(mean, np.ndarray):\n mean = np.array([mean])\n means.append(mean)\n return means", "response": "list of mean numeric values of categorical responses."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef margin(self, axis):\n if self._slice.ndim < 2:\n msg = (\n \"Scale Means marginal cannot be calculated on 1D cubes, as\"\n \"the scale means already get reduced to a scalar value.\"\n )\n raise ValueError(msg)\n\n dimension_index = 1 - axis\n margin = self._slice.margin(axis=axis)\n if len(margin.shape) > 1:\n index = [\n 0 if d.dimension_type == DT.MR else slice(None)\n for d in self._slice.dimensions\n ]\n margin = 
margin[index]\n total = np.sum(margin)\n values = self.values[dimension_index]\n\n if values is None:\n return None\n\n return np.sum(values * margin) / total", "response": "Return marginal value of the current slice scaled means."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef values(self):\n return [\n (\n np.array(dim.numeric_values)\n if (dim.numeric_values and any(~np.isnan(dim.numeric_values)))\n else None\n )\n for dim in self._slice.dimensions\n ]", "response": "list of ndarray value-ids for each dimension in slice."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef compress_pruned(table):\n if not isinstance(table, np.ma.core.MaskedArray):\n return table\n\n if table.ndim == 0:\n return table.data\n\n if table.ndim == 1:\n return np.ma.compressed(table)\n\n row_inds = ~table.mask.all(axis=1)\n col_inds = ~table.mask.all(axis=0)\n table = table[row_inds, :][:, col_inds]\n if table.dtype == float and table.mask.any():\n table[table.mask] = np.nan\n return table", "response": "Compress a table based on pruning mask."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nperform the insertions of place-holding rows and cols for insertions.", "response": "def intersperse_hs_in_std_res(slice_, hs_dims, res):\n \"\"\"Perform the insertions of place-holding rows and cols for insertions.\"\"\"\n for dim, inds in enumerate(slice_.inserted_hs_indices()):\n if dim not in hs_dims:\n continue\n for i in inds:\n res = np.insert(res, i, np.nan, axis=(dim - slice_.ndim))\n return res"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef inflate_parameter_leaf(sub_parameter, base_year, inflator, unit_type = 'unit'):\n\n if isinstance(sub_parameter, Scale):\n if unit_type == 'threshold_unit':\n for bracket in sub_parameter.brackets:\n threshold = 
bracket.children['threshold']\n inflate_parameter_leaf(threshold, base_year, inflator)\n return\n else:\n # Remove new values for year > base_year\n kept_instants_str = [\n parameter_at_instant.instant_str\n for parameter_at_instant in sub_parameter.values_list\n if periods.instant(parameter_at_instant.instant_str).year <= base_year\n ]\n if not kept_instants_str:\n return\n\n last_admissible_instant_str = max(kept_instants_str)\n sub_parameter.update(\n start = last_admissible_instant_str,\n value = sub_parameter(last_admissible_instant_str)\n )\n restricted_to_base_year_value_list = [\n parameter_at_instant for parameter_at_instant in sub_parameter.values_list\n if periods.instant(parameter_at_instant.instant_str).year == base_year\n ]\n # When value is changed in the base year\n if restricted_to_base_year_value_list:\n for parameter_at_instant in reversed(restricted_to_base_year_value_list):\n if parameter_at_instant.instant_str.startswith(str(base_year)):\n value = (\n parameter_at_instant.value * (1 + inflator)\n if parameter_at_instant.value is not None\n else None\n )\n sub_parameter.update(\n start = parameter_at_instant.instant_str.replace(\n str(base_year), str(base_year + 1)\n ),\n value = value,\n )\n # Or use the value at that instant even when it is defined earlier than the base year\n else:\n value = (\n sub_parameter(\"{}-12-31\".format(base_year)) * (1 + inflator)\n if sub_parameter(\"{}-12-31\".format(base_year)) is not None\n else None\n )\n sub_parameter.update(\n start = \"{}-01-01\".format(base_year + 1),\n value = value\n )", "response": "Inflate a Parameter leaf according to unit type"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ncomputing and returning the variable values for the given variable and period.", "response": "def calculate_variable(self, variable = None, period = None, use_baseline = False):\n \"\"\"\n Compute and return the variable values for period and baseline or reform tax_benefit_system\n 
\"\"\"\n if use_baseline:\n assert self.baseline_simulation is not None, \"self.baseline_simulation is None\"\n simulation = self.baseline_simulation\n else:\n assert self.simulation is not None\n simulation = self.simulation\n\n tax_benefit_system = simulation.tax_benefit_system\n\n assert period is not None\n if not isinstance(period, periods.Period):\n period = periods.period(period)\n assert simulation is not None\n assert tax_benefit_system is not None\n\n assert variable in tax_benefit_system.variables, \"{} is not a valid variable\".format(variable)\n period_size_independent = tax_benefit_system.get_variable(variable).is_period_size_independent\n definition_period = tax_benefit_system.get_variable(variable).definition_period\n\n if period_size_independent is False and definition_period != u'eternity':\n values = simulation.calculate_add(variable, period = period)\n elif period_size_independent is True and definition_period == u'month' and period.size_in_months > 1:\n values = simulation.calculate(variable, period = period.first_month)\n elif period_size_independent is True and definition_period == u'month' and period.size_in_months == 1:\n values = simulation.calculate(variable, period = period)\n elif period_size_independent is True and definition_period == u'year' and period.size_in_months > 12:\n values = simulation.calculate(variable, period = period.start.offset('first-of', 'year').period('year'))\n elif period_size_independent is True and definition_period == u'year' and period.size_in_months == 12:\n values = simulation.calculate(variable, period = period)\n elif period_size_independent is True and definition_period == u'year':\n values = simulation.calculate(variable, period = period.this_year)\n elif definition_period == u'eternity':\n values = simulation.calculate(variable, period = period)\n else:\n values = None\n assert values is not None, 'Unspecified calculation period for variable {}'.format(variable)\n\n return values"} {"SOURCE": 
"codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef filter_input_variables(self, input_data_frame = None, simulation = None):\n assert input_data_frame is not None\n assert simulation is not None\n id_variable_by_entity_key = self.id_variable_by_entity_key\n role_variable_by_entity_key = self.role_variable_by_entity_key\n used_as_input_variables = self.used_as_input_variables\n\n tax_benefit_system = simulation.tax_benefit_system\n variables = tax_benefit_system.variables\n\n id_variables = [\n id_variable_by_entity_key[_entity.key] for _entity in simulation.entities.values()\n if not _entity.is_person]\n role_variables = [\n role_variable_by_entity_key[_entity.key] for _entity in simulation.entities.values()\n if not _entity.is_person]\n\n log.debug('Variable used_as_input_variables in filter: \\n {}'.format(used_as_input_variables))\n\n unknown_columns = []\n for column_name in input_data_frame:\n if column_name in id_variables + role_variables:\n continue\n if column_name not in variables:\n unknown_columns.append(column_name)\n input_data_frame.drop(column_name, axis = 1, inplace = True)\n\n if unknown_columns:\n log.debug('The following unknown columns {}, are dropped from input table'.format(\n sorted(unknown_columns)))\n\n used_columns = []\n dropped_columns = []\n for column_name in input_data_frame:\n if column_name in id_variables + role_variables:\n continue\n variable = variables[column_name]\n # Keeping the calculated variables that are initialized by the input data\n if variable.formulas:\n if column_name in used_as_input_variables:\n used_columns.append(column_name)\n continue\n\n dropped_columns.append(column_name)\n input_data_frame.drop(column_name, axis = 1, inplace = True)\n #\n #\n #\n if used_columns:\n log.debug(\n 'These columns are not dropped because present in used_as_input_variables:\\n {}'.format(\n sorted(used_columns)))\n if dropped_columns:\n log.debug(\n 'These columns in survey 
are set to be calculated, we drop them from the input table:\\n {}'.format(\n sorted(dropped_columns)))\n\n log.info('Keeping the following variables in the input_data_frame:\\n {}'.format(\n sorted(list(input_data_frame.columns))))\n return input_data_frame", "response": "Filter the input data frame from variables that won t be used or are set to be computed."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef init_from_data(self, calibration_kwargs = None, inflation_kwargs = None,\n rebuild_input_data = False, rebuild_kwargs = None, data = None, memory_config = None):\n '''Initialises a survey scenario from data.\n\n :param rebuild_input_data: Whether or not to clean, format and save data.\n Take a look at :func:`build_input_data`\n\n :param data: Contains the data, or metadata needed to know where to find it.\n '''\n\n # When not ``None``, it'll try to get the data for *year*.\n if data is not None:\n data_year = data.get(\"data_year\", self.year)\n\n if calibration_kwargs is not None:\n assert set(calibration_kwargs.keys()).issubset(set(\n ['target_margins_by_variable', 'parameters', 'total_population']))\n\n if inflation_kwargs is not None:\n assert set(inflation_kwargs.keys()).issubset(set(['inflator_by_variable', 'target_by_variable']))\n\n self._set_id_variable_by_entity_key()\n self._set_role_variable_by_entity_key()\n self._set_used_as_input_variables_by_entity()\n\n # When ``True`` it'll assume it is raw data and do all that described supra.\n # When ``False``, it'll assume data is ready for consumption.\n if rebuild_input_data:\n if rebuild_kwargs is not None:\n self.build_input_data(year = data_year, **rebuild_kwargs)\n else:\n self.build_input_data(year = data_year)\n\n debug = self.debug\n trace = self.trace\n\n # Inverting reform and baseline because we are more likely\n # to use baseline input in reform than the other way around\n if self.baseline_tax_benefit_system is not None:\n 
self.new_simulation(debug = debug, data = data, trace = trace, memory_config = memory_config,\n use_baseline = True)\n\n # Note that I can pass a :class:`pd.DataFrame` directly, if I don't want to rebuild the data.\n self.new_simulation(debug = debug, data = data, trace = trace, memory_config = memory_config)\n\n if calibration_kwargs:\n self.calibrate(**calibration_kwargs)\n\n if inflation_kwargs:\n self.inflate(**inflation_kwargs)", "response": "Initialises a survey scenario from data."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ninitializes the entity with the given input data frame.", "response": "def init_entity(self, entity = None, input_data_frame = None, period = None, simulation = None):\n \"\"\"\n Initialize the simulation period with current input_data_frame\n \"\"\"\n assert entity is not None\n assert input_data_frame is not None\n assert period is not None\n assert simulation is not None\n used_as_input_variables = self.used_as_input_variables_by_entity[entity]\n\n variables_mismatch = set(used_as_input_variables).difference(set(input_data_frame.columns)) if used_as_input_variables else None\n if variables_mismatch:\n log.info(\n 'The following variables used as input variables are not present in the input data frame: \\n {}'.format(\n sorted(variables_mismatch)))\n if variables_mismatch:\n log.debug('The following variables are used as input variables: \\n {}'.format(\n sorted(used_as_input_variables)))\n log.debug('The input_data_frame contains the following variables: \\n {}'.format(\n sorted(list(input_data_frame.columns))))\n\n entity = simulation.entities[entity]\n id_variables = [\n self.id_variable_by_entity_key[_entity.key] for _entity in simulation.entities.values()\n if not _entity.is_person]\n role_variables = [\n self.role_variable_by_entity_key[_entity.key] for _entity in simulation.entities.values()\n if not _entity.is_person]\n\n if entity.is_person:\n for id_variable in id_variables 
+ role_variables:\n assert id_variable in input_data_frame.columns, \\\n \"Variable {} is not present in input dataframe\".format(id_variable)\n\n input_data_frame = self.filter_input_variables(input_data_frame = input_data_frame, simulation = simulation)\n\n if entity.is_person:\n entity.count = entity.step_size = len(input_data_frame)\n for collective_entity in simulation.entities.values():\n if collective_entity.is_person:\n continue\n _key = collective_entity.key\n _id_variable = self.id_variable_by_entity_key[_key]\n _role_variable = self.role_variable_by_entity_key[_key]\n\n collective_entity.count = len(input_data_frame[_id_variable].unique())\n collective_entity.members_entity_id = input_data_frame[_id_variable].astype('int').values\n # TODO remove legacy use\n collective_entity.members_legacy_role = input_data_frame[_role_variable].astype('int').values\n\n for (legacy_role, flattened_role) in enumerate(collective_entity.flattened_roles):\n if legacy_role < len(collective_entity.flattened_roles):\n collective_entity.members_role = np.where(\n collective_entity.members_legacy_role == legacy_role,\n flattened_role,\n collective_entity.members_role,\n )\n else:\n collective_entity.members_role = np.where(\n collective_entity.members_legacy_role >= len(collective_entity.flattened_roles),\n collective_entity.flattened_roles[-1],\n collective_entity.members_role,\n )\n\n else:\n entity.count = entity.step_size = len(input_data_frame)\n\n for column_name, column_serie in input_data_frame.iteritems():\n if column_name in (id_variables + role_variables):\n continue\n\n variable_instance = self.tax_benefit_system.variables.get(column_name)\n if variable_instance.entity.key != entity.key:\n log.info(\"Ignoring variable {} which is not part of entity {} but {}\".format(\n column_name, entity.key, variable_instance.entity.key))\n continue\n\n init_variable_in_entity(\n entity = entity,\n variable_name = column_name,\n series = column_serie,\n period = period,\n )"} 
{"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef init_simulation_with_data_frame(self, input_data_frame = None, period = None, simulation = None, entity = None):\n assert input_data_frame is not None\n assert period is not None\n assert simulation is not None\n used_as_input_variables = self.used_as_input_variables\n id_variable_by_entity_key = self.id_variable_by_entity_key\n role_variable_by_entity_key = self.role_variable_by_entity_key\n\n variables_mismatch = set(used_as_input_variables).difference(set(input_data_frame.columns))\n if variables_mismatch:\n log.info(\n 'The following variables used as input variables are not present in the input data frame: \\n {}'.format(\n sorted(variables_mismatch)))\n if variables_mismatch:\n log.debug('The following variables are used as input variables: \\n {}'.format(\n sorted(used_as_input_variables)))\n log.debug('The input_data_frame contains the following variables: \\n {}'.format(\n sorted(list(input_data_frame.columns))))\n\n id_variables = [\n id_variable_by_entity_key[_entity.key] for _entity in simulation.entities.values()\n if not _entity.is_person]\n role_variables = [\n role_variable_by_entity_key[_entity.key] for _entity in simulation.entities.values()\n if not _entity.is_person]\n\n for id_variable in id_variables + role_variables:\n entity_key = entity.key if entity is not None else None\n if (entity_key is not None) and (not simulation.entities[entity].is_person):\n assert id_variable in [id_variable_by_entity_key[entity], role_variable_by_entity_key[entity]], \\\n \"variable {} for entity {} is not valid (not {} nor {})\".format(\n id_variable,\n entity_key,\n id_variable_by_entity_key[entity_key],\n role_variable_by_entity_key[entity_key],\n )\n continue\n assert id_variable in input_data_frame.columns, \\\n \"Variable {} is not present in input dataframe\".format(id_variable)\n\n input_data_frame = 
self.filter_input_variables(input_data_frame = input_data_frame, simulation = simulation)\n\n index_by_entity_key = dict()\n\n for key, entity in simulation.entities.items():\n if entity.is_person:\n entity.count = entity.step_size = len(input_data_frame)\n else:\n entity.count = entity.step_size = \\\n (input_data_frame[role_variable_by_entity_key[key]] == 0).sum()\n unique_ids_count = len(input_data_frame[id_variable_by_entity_key[key]].unique())\n assert entity.count == unique_ids_count, \\\n \"There are {0} person of role 0 in {1} but {2} {1}\".format(\n entity.count, entity.key, unique_ids_count)\n\n entity.members_entity_id = input_data_frame[id_variable_by_entity_key[key]].astype('int').values\n entity.members_legacy_role = input_data_frame[role_variable_by_entity_key[key]].astype('int').values\n index_by_entity_key[entity.key] = input_data_frame.loc[\n input_data_frame[role_variable_by_entity_key[entity.key]] == 0,\n id_variable_by_entity_key[key]\n ].sort_values().index\n\n for column_name, column_serie in input_data_frame.iteritems():\n if role_variable_by_entity_key is not None:\n if column_name in role_variable_by_entity_key.values():\n continue\n\n if id_variable_by_entity_key is not None:\n if column_name in id_variable_by_entity_key.values():\n continue\n\n entity = simulation.get_variable_entity(column_name)\n if entity.is_person:\n init_variable_in_entity(entity, column_name, column_serie, period)\n else:\n init_variable_in_entity(entity, column_name, column_serie[index_by_entity_key[entity.key]], period)", "response": "Initialize the simulation with the input data frame."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nneutralize input variables not in input dataframe and keep some crucial variables", "response": "def neutralize_variables(self, tax_benefit_system):\n \"\"\"\n Neutralizing input variables not in input dataframe and keep some crucial variables\n \"\"\"\n for variable_name, variable in 
tax_benefit_system.variables.items():\n if variable.formulas:\n continue\n if self.used_as_input_variables and (variable_name in self.used_as_input_variables):\n continue\n if self.non_neutralizable_variables and (variable_name in self.non_neutralizable_variables):\n continue\n if self.weight_column_name_by_entity and (variable_name in self.weight_column_name_by_entity.values()):\n continue\n\n tax_benefit_system.neutralize_variable(variable_name)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nsets the tax and benefit systems and eventually the baseline tax and benefit system.", "response": "def set_tax_benefit_systems(self, tax_benefit_system = None, baseline_tax_benefit_system = None):\n \"\"\"\n Set the tax and benefit system and eventually the baseline tax and benefit system\n \"\"\"\n assert tax_benefit_system is not None\n self.tax_benefit_system = tax_benefit_system\n if self.cache_blacklist is not None:\n self.tax_benefit_system.cache_blacklist = self.cache_blacklist\n if baseline_tax_benefit_system is not None:\n self.baseline_tax_benefit_system = baseline_tax_benefit_system\n if self.cache_blacklist is not None:\n self.baseline_tax_benefit_system.cache_blacklist = self.cache_blacklist"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nprinting a summary of a variable including its memory usage.", "response": "def summarize_variable(self, variable = None, use_baseline = False, weighted = False, force_compute = False):\n \"\"\"\n Prints a summary of a variable including its memory usage.\n\n :param string variable: the variable being summarized\n :param bool use_baseline: the tax-benefit-system considered\n :param bool weighted: whether the produced statistics should be weighted or not\n :param bool force_compute: whether the computation of the variable should be forced\n\n Example:\n >>> from openfisca_survey_manager.tests.test_scenario import 
create_randomly_initialized_survey_scenario\n >>> survey_scenario = create_randomly_initialized_survey_scenario()\n >>> survey_scenario.summarize_variable(variable = \"housing_occupancy_status\", force_compute = True)\n \n housing_occupancy_status: 1 periods * 5 cells * item size 2 (, default = HousingOccupancyStatus.tenant) = 10B\n Details:\n 2017-01: owner = 0.00e+00 (0.0%), tenant = 5.00e+00 (100.0%), free_lodger = 0.00e+00 (0.0%), homeless = 0.00e+00 (0.0%).\n >>> survey_scenario.summarize_variable(variable = \"rent\", force_compute = True)\n \n rent: 2 periods * 5 cells * item size 4 (, default = 0) = 40B\n Details:\n 2017-01: mean = 562.385070801, min = 156.01864624, max = 950.714294434, mass = 2.81e+03, default = 0.0%, median = 598.658508301\n 2018-01: mean = 562.385070801, min = 156.01864624, max = 950.714294434, mass = 2.81e+03, default = 0.0%, median = 598.658508301\n \"\"\"\n\n if use_baseline:\n simulation = self.baseline_simulation\n else:\n simulation = self.simulation\n\n tax_benefit_system = simulation.tax_benefit_system\n assert variable in tax_benefit_system.variables, \"{} is not a valid variable\".format(variable)\n variable_instance = tax_benefit_system.variables[variable]\n\n default_value = variable_instance.default_value\n value_type = variable_instance.value_type\n\n if weighted:\n weight_variable = self.weight_column_name_by_entity[variable_instance.entity.key]\n weights = simulation.calculate(weight_variable, simulation.period)\n\n infos = simulation.get_memory_usage(variables = [variable])['by_variable'].get(variable)\n if not infos:\n if force_compute:\n self.calculate_variable(variable = variable, period = simulation.period, use_baseline = use_baseline)\n self.summarize_variable(variable = variable, use_baseline = use_baseline, weighted = weighted)\n return\n else:\n print(\"{} is not computed yet. 
Use keyword argument force_compute = True\".format(variable))\n return\n header_line = \"{}: {} periods * {} cells * item size {} ({}, default = {}) = {}\".format(\n variable,\n infos['nb_arrays'],\n infos['nb_cells_by_array'],\n infos['cell_size'],\n infos['dtype'],\n default_value,\n humanize.naturalsize(infos['total_nb_bytes'], gnu = True),\n )\n print(\"\")\n print(header_line)\n print(\"Details:\")\n holder = simulation.get_holder(variable)\n if holder is not None:\n if holder.variable.definition_period == ETERNITY:\n array = holder.get_array(ETERNITY)\n print(\"permanent: mean = {}, min = {}, max = {}, median = {}, default = {:.1%}\".format(\n array.mean() if not weighted else np.average(array, weights = weights),\n array.min(),\n array.max(),\n np.median(array),\n (\n (array == default_value).sum() / len(array)\n if not weighted\n else ((array == default_value) * weights).sum() / weights.sum()\n )\n ))\n else:\n for period in sorted(holder.get_known_periods()):\n array = holder.get_array(period)\n if array.shape == ():\n print(\"{}: always = {}\".format(period, array))\n continue\n\n if value_type == Enum:\n possible_values = variable_instance.possible_values\n categories_by_index = dict(zip(\n range(len(possible_values._member_names_)),\n possible_values._member_names_\n ))\n categories_type = pd.api.types.CategoricalDtype(categories = possible_values._member_names_, ordered = True)\n df = pd.DataFrame({variable: array}).replace(categories_by_index).astype(categories_type)\n df['weights'] = weights if weighted else 1\n groupby = df.groupby(variable)['weights'].sum()\n total = groupby.sum()\n expr = [\" {} = {:.2e} ({:.1%})\".format(index, row, row / total) for index, row in groupby.iteritems()]\n print(\"{}:{}.\".format(period, \",\".join(expr)))\n continue\n\n print(\"{}: mean = {}, min = {}, max = {}, mass = {:.2e}, default = {:.1%}, median = {}\".format(\n period,\n array.mean() if not weighted else np.average(array, weights = weights),\n array.min(),\n 
array.max(),\n array.sum() if not weighted else np.sum(array * weights),\n (\n (array == default_value).sum() / len(array)\n if not weighted\n else ((array == default_value) * weights).sum() / weights.sum()\n ),\n np.median(array),\n ))"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _set_id_variable_by_entity_key(self) -> Dict[str, str]:\n '''Identify and set the good ids for the different entities'''\n if self.id_variable_by_entity_key is None:\n self.id_variable_by_entity_key = dict(\n (entity.key, entity.key + '_id') for entity in self.tax_benefit_system.entities)\n log.debug(\"Use default id_variable names:\\n {}\".format(self.id_variable_by_entity_key))\n\n return self.id_variable_by_entity_key", "response": "Identify and set the good ids for the different entities"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _set_role_variable_by_entity_key(self) -> Dict[str, str]:\n '''Identify and set the good roles for the different entities'''\n if self.role_variable_by_entity_key is None:\n self.role_variable_by_entity_key = dict(\n (entity.key, entity.key + '_legacy_role') for entity in self.tax_benefit_system.entities)\n\n return self.role_variable_by_entity_key", "response": "Identify and set the good roles for the different entities"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nidentifying and set the good input variables for the different entities", "response": "def _set_used_as_input_variables_by_entity(self) -> Dict[str, List[str]]:\n '''Identify and set the good input variables for the different entities'''\n if self.used_as_input_variables_by_entity is not None:\n return\n\n tax_benefit_system = self.tax_benefit_system\n\n assert set(self.used_as_input_variables) <= set(tax_benefit_system.variables.keys()), \\\n \"Some variables used as input variables are not part of the tax benefit system:\\n 
{}\".format(\n set(self.used_as_input_variables).difference(set(tax_benefit_system.variables.keys()))\n )\n\n self.used_as_input_variables_by_entity = dict()\n\n for entity in tax_benefit_system.entities:\n self.used_as_input_variables_by_entity[entity.key] = [\n variable\n for variable in self.used_as_input_variables\n if tax_benefit_system.get_variable(variable).entity == entity\n ]\n\n return self.used_as_input_variables_by_entity"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _iter_dimensions(self):\n return (\n Dimension(raw_dimension.dimension_dict, raw_dimension.dimension_type)\n for raw_dimension in self._raw_dimensions\n )", "response": "Generate Dimension object for each dimension dict."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _raw_dimensions(self):\n return tuple(\n _RawDimension(dimension_dict, self._dimension_dicts)\n for dimension_dict in self._dimension_dicts\n )", "response": "Sequence of _RawDimension objects wrapping each dimension dict."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreturns member of DIMENSION_TYPE appropriate to dimension_dict.", "response": "def dimension_type(self):\n \"\"\"Return member of DIMENSION_TYPE appropriate to dimension_dict.\"\"\"\n base_type = self._base_type\n if base_type == \"categorical\":\n return self._resolve_categorical()\n if base_type == \"enum.variable\":\n return self._resolve_array_type()\n if base_type == \"enum.datetime\":\n return DT.DATETIME\n if base_type == \"enum.numeric\":\n return DT.BINNED_NUMERIC\n if base_type == \"enum.text\":\n return DT.TEXT\n raise NotImplementedError(\"unrecognized dimension type %s\" % base_type)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _base_type(self):\n type_class = self._dimension_dict[\"type\"][\"class\"]\n if type_class == 
\"categorical\":\n return \"categorical\"\n if type_class == \"enum\":\n subclass = self._dimension_dict[\"type\"][\"subtype\"][\"class\"]\n return \"enum.%s\" % subclass\n raise NotImplementedError(\"unexpected dimension type class '%s'\" % type_class)", "response": "Return str like enum. numeric representing dimension type."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreturning one of the ARRAY_TYPES members of DIMENSION_TYPE.", "response": "def _resolve_array_type(self):\n \"\"\"Return one of the ARRAY_TYPES members of DIMENSION_TYPE.\n\n This method distinguishes between CA and MR dimensions. The return\n value is only meaningful if the dimension is known to be of array\n type (i.e. either CA or MR, base-type 'enum.variable').\n \"\"\"\n next_raw_dimension = self._next_raw_dimension\n if next_raw_dimension is None:\n return DT.CA\n\n is_mr_subvar = (\n next_raw_dimension._base_type == \"categorical\"\n and next_raw_dimension._has_selected_category\n and next_raw_dimension._alias == self._alias\n )\n return DT.MR if is_mr_subvar else DT.CA"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _resolve_categorical(self):\n # ---an array categorical is either CA_CAT or MR_CAT---\n if self._is_array_cat:\n return DT.MR_CAT if self._has_selected_category else DT.CA_CAT\n\n # ---what's left is logical or plain-old categorical---\n return DT.LOGICAL if self._has_selected_category else DT.CAT", "response": "Return one of the categorical members of DIMENSION_TYPE."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef hs_indices(self):\n if self.dimension_type in {DT.MR_CAT, DT.LOGICAL}:\n return ()\n\n return tuple(\n (subtotal.anchor_idx, subtotal.addend_idxs) for subtotal in self._subtotals\n )", "response": "tuple of anchor_idx and addend_idxs pair for each subtotal."} {"SOURCE": "codesearchnet", "instruction": 
"Given the following Python 3 function, write the documentation\ndef inserted_hs_indices(self):\n # ---don't do H&S insertions for CA and MR subvar dimensions---\n if self.dimension_type in DT.ARRAY_TYPES:\n return []\n\n return [\n idx\n for idx, item in enumerate(\n self._iter_interleaved_items(self.valid_elements)\n )\n if item.is_insertion\n ]", "response": "list of int index of each inserted subtotal for the dimension."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef is_marginable(self):\n return self.dimension_type not in {DT.CA, DT.MR, DT.MR_CAT, DT.LOGICAL}", "response": "True if adding counts across this dimension axis is meaningful."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef labels(\n self, include_missing=False, include_transforms=False, include_cat_ids=False\n ):\n \"\"\"Return list of str labels for the elements of this dimension.\n\n Returns a list of (label, element_id) pairs if *include_cat_ids* is\n True. The `element_id` value in the second position of the pair is\n None for subtotal items (which don't have an element-id).\n \"\"\"\n # TODO: Having an alternate return type triggered by a flag-parameter\n # (`include_cat_ids` in this case) is poor practice. 
Using flags like\n # that effectively squashes what should be two methods into one.\n # Either get rid of the need for that alternate return value type or\n # create a separate method for it.\n elements = self.all_elements if include_missing else self.valid_elements\n\n include_subtotals = include_transforms and self.dimension_type != DT.CA_SUBVAR\n\n # ---items are elements or subtotals, interleaved in display order---\n interleaved_items = tuple(self._iter_interleaved_items(elements))\n\n labels = list(\n item.label\n for item in interleaved_items\n if include_subtotals or not item.is_insertion\n )\n\n if include_cat_ids:\n element_ids = tuple(\n None if item.is_insertion else item.element_id\n for item in interleaved_items\n if include_subtotals or not item.is_insertion\n )\n return list(zip(labels, element_ids))\n\n return labels", "response": "Return a list of str labels for the elements of this dimension."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _iter_interleaved_items(self, elements):\n subtotals = self._subtotals\n\n for subtotal in subtotals.iter_for_anchor(\"top\"):\n yield subtotal\n\n for element in elements:\n yield element\n for subtotal in subtotals.iter_for_anchor(element.element_id):\n yield subtotal\n\n for subtotal in subtotals.iter_for_anchor(\"bottom\"):\n yield subtotal", "response": "Iterate over the items in interleaved order."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _subtotals(self):\n view = self._dimension_dict.get(\"references\", {}).get(\"view\", {})\n # ---view can be both None and {}, thus the edge case.---\n insertion_dicts = (\n [] if view is None else view.get(\"transform\", {}).get(\"insertions\", [])\n )\n return _Subtotals(insertion_dicts, self.valid_elements)", "response": "_Subtotals sequence object for this dimension."} {"SOURCE": "codesearchnet", "instruction": "Create 
a Python 3 function to\nreturn the type and source dicts for this dimension s elements.", "response": "def _element_makings(self):\n \"\"\"(ElementCls, element_dicts) pair for this dimension's elements.\n\n All the elements of a given dimension are the same type. This method\n determines the type (class) and source dicts for the elements of this\n dimension and provides them for the element factory.\n \"\"\"\n if self._type_dict[\"class\"] == \"categorical\":\n return _Category, self._type_dict[\"categories\"]\n return _Element, self._type_dict[\"elements\"]"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ncomposes tuple storing actual sequence of element objects.", "response": "def _elements(self):\n \"\"\"Composed tuple storing actual sequence of element objects.\"\"\"\n ElementCls, element_dicts = self._element_makings\n return tuple(\n ElementCls(element_dict, idx, element_dicts)\n for idx, element_dict in enumerate(element_dicts)\n )"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef numeric_value(self):\n numeric_value = self._element_dict.get(\"numeric_value\")\n return np.nan if numeric_value is None else numeric_value", "response": "Numeric value assigned to element by user np. 
nan if absent."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef label(self):\n value = self._element_dict.get(\"value\")\n type_name = type(value).__name__\n\n if type_name == \"NoneType\":\n return \"\"\n\n if type_name == \"list\":\n # ---like '10-15' or 'A-F'---\n return \"-\".join([str(item) for item in value])\n\n if type_name in (\"float\", \"int\"):\n return str(value)\n\n if type_name in (\"str\", \"unicode\"):\n return value\n\n # ---For CA and MR subvar dimensions---\n name = value.get(\"references\", {}).get(\"name\")\n return name if name else \"\"", "response": "str display - name for this element"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef iter_for_anchor(self, anchor):\n return (subtotal for subtotal in self._subtotals if subtotal.anchor == anchor)", "response": "Generate each subtotal having matching anchor."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nyields each dict that represents a valid subtotal.", "response": "def _iter_valid_subtotal_dicts(self):\n \"\"\"Generate each insertion dict that represents a valid subtotal.\"\"\"\n for insertion_dict in self._insertion_dicts:\n # ---skip any non-dicts---\n if not isinstance(insertion_dict, dict):\n continue\n\n # ---skip any non-subtotal insertions---\n if insertion_dict.get(\"function\") != \"subtotal\":\n continue\n\n # ---skip any malformed subtotal-dicts---\n if not {\"anchor\", \"args\", \"name\"}.issubset(insertion_dict.keys()):\n continue\n\n # ---skip if doesn't reference at least one non-missing element---\n if not self._element_ids.intersection(insertion_dict[\"args\"]):\n continue\n\n # ---an insertion-dict that successfully runs this gauntlet\n # ---is a valid subtotal dict\n yield insertion_dict"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ncomposing tuple storing 
actual sequence of _Subtotal objects.", "response": "def _subtotals(self):\n \"\"\"Composed tuple storing actual sequence of _Subtotal objects.\"\"\"\n return tuple(\n _Subtotal(subtotal_dict, self.valid_elements)\n for subtotal_dict in self._iter_valid_subtotal_dicts()\n )"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef anchor(self):\n anchor = self._subtotal_dict[\"anchor\"]\n try:\n anchor = int(anchor)\n if anchor not in self.valid_elements.element_ids:\n return \"bottom\"\n return anchor\n except (TypeError, ValueError):\n return anchor.lower()", "response": "int or str indicating element anchor"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef anchor_idx(self):\n anchor = self.anchor\n if anchor in [\"top\", \"bottom\"]:\n return anchor\n return self.valid_elements.get_by_id(anchor).index_in_valids", "response": "int or str representing index of anchor element in dimension."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef addend_idxs(self):\n return tuple(\n self.valid_elements.get_by_id(addend_id).index_in_valids\n for addend_id in self.addend_ids\n )", "response": "tuple of int index of each addend element in this subtotal."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ncreates a dictionary of data files by format.", "response": "def create_data_file_by_format(directory_path = None):\n \"\"\"\n Browse subdirectories to extract stata and sas files\n \"\"\"\n stata_files = []\n sas_files = []\n\n for root, subdirs, files in os.walk(directory_path):\n for file_name in files:\n file_path = os.path.join(root, file_name)\n if os.path.basename(file_name).endswith(\".dta\"):\n log.info(\"Found stata file {}\".format(file_path))\n stata_files.append(file_path)\n if os.path.basename(file_name).endswith(\".sas7bdat\"):\n log.info(\"Found sas file 
{}\".format(file_path))\n sas_files.append(file_path)\n return {'stata': stata_files, 'sas': sas_files}"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreturns an array representation of the crunch cube.", "response": "def as_array(\n self,\n include_missing=False,\n weighted=True,\n include_transforms_for_dims=None,\n prune=False,\n ):\n \"\"\"Return `ndarray` representing cube values.\n\n Returns the tabular representation of the crunch cube. The returned\n array has the same number of dimensions as the cube. E.g. for\n a cross-tab representation of a categorical and numerical variable,\n the resulting cube will have two dimensions.\n\n *include_missing* (bool): Include rows/cols for missing values.\n\n Example 1 (Categorical x Categorical)::\n\n >>> cube = CrunchCube(response)\n >>> cube.as_array()\n np.array([\n [5, 2],\n [5, 3],\n ])\n\n Example 2 (Categorical x Categorical, include missing values)::\n\n >>> cube = CrunchCube(response)\n >>> cube.as_array(include_missing=True)\n np.array([\n [5, 3, 2, 0],\n [5, 2, 3, 0],\n [0, 0, 0, 0],\n ])\n \"\"\"\n array = self._as_array(\n include_missing=include_missing,\n weighted=weighted,\n include_transforms_for_dims=include_transforms_for_dims,\n )\n\n # ---prune array if pruning was requested---\n if prune:\n array = self._prune_body(array, transforms=include_transforms_for_dims)\n\n return self._drop_mr_cat_dims(array)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning numeric count of rows considered for cube response.", "response": "def count(self, weighted=True):\n \"\"\"Return numeric count of rows considered for cube response.\"\"\"\n return self._measures.weighted_n if weighted else self._measures.unweighted_n"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_slices(self, ca_as_0th=False):\n if self.ndim < 3 and not ca_as_0th:\n return [CubeSlice(self, 0)]\n\n 
return [CubeSlice(self, i, ca_as_0th) for i, _ in enumerate(self.labels()[0])]", "response": "Return list of slice objects."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef index(self, weighted=True, prune=False):\n warnings.warn(\n \"CrunchCube.index() is deprecated. Use CubeSlice.index_table().\",\n DeprecationWarning,\n )\n return Index.data(self, weighted, prune)", "response": "Return cube index measurement."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nget indices of the inserted H&S.", "response": "def inserted_hs_indices(self, prune=False):\n \"\"\"Get indices of the inserted H&S (for formatting purposes).\"\"\"\n if self.ndim == 2 and prune:\n # If pruning is applied, we need to subtract from the H&S index\n # the number of pruned rows (cols) that come before that index.\n pruning_bases = [self._pruning_base(axis=i, hs_dims=[0, 1]) for i in [1, 0]]\n pruning_bases = [\n base if base.ndim == 1 else np.sum(base, axis=(1 - i))\n for i, base in enumerate(pruning_bases)\n ]\n # Obtain prune indices as subscripts\n prune_indices_list = [\n np.arange(len(base))[np.logical_or(base == 0, np.isnan(base))]\n for base in pruning_bases\n ]\n inserted_indices_list = [dim.inserted_hs_indices for dim in self.dimensions]\n return self._adjust_inserted_indices(\n inserted_indices_list, prune_indices_list\n )\n\n return [dim.inserted_hs_indices for dim in self.dimensions]"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreturns True if cube only contains a CA dimension-pair, in either order.", "response": "def is_univariate_ca(self):\n \"\"\"True if cube only contains a CA dimension-pair, in either order.\"\"\"\n return self.ndim == 2 and set(self.dim_types) == {DT.CA_SUBVAR, DT.CA_CAT}"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef labels(self, include_missing=False, 
include_transforms_for_dims=False):\n return [\n dim.labels(include_missing, include_transforms_for_dims)\n for dim in self.dimensions\n ]", "response": "Returns a list of labels for each cube's dimension."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning the margin for the selected axis.", "response": "def margin(\n self,\n axis=None,\n weighted=True,\n include_missing=False,\n include_transforms_for_dims=None,\n prune=False,\n include_mr_cat=False,\n ):\n \"\"\"Get margin for the selected axis.\n\n The margin is calculated across\n the selected axis. For MR variables, this is the sum of the selected\n and non-selected slices.\n\n Args\n axis (int): Axis across which the margin is calculated. If no axis is\n provided the margin is calculated across all axes.\n For Categoricals, Num, Datetime, and Text, this\n translates to summation of all elements.\n Returns\n Calculated margin for the selected axis\n\n Example 1:\n >>> cube = CrunchCube(fixt_cat_x_cat)\n np.array([\n [5, 2],\n [5, 3],\n ])\n\n >>> cube.margin(axis=0)\n np.array([10, 5])\n\n Example 2:\n >>> cube = CrunchCube(fixt_cat_x_num_x_datetime)\n np.array([\n [[1, 1],\n [0, 0],\n [0, 0],\n [0, 0]],\n [[2, 1],\n [1, 1],\n [0, 0],\n [0, 0]],\n [[0, 0],\n [2, 3],\n [0, 0],\n [0, 0]],\n [[0, 0],\n [0, 0],\n [3, 2],\n [0, 0]],\n [[0, 0],\n [0, 0],\n [1, 1],\n [0, 1]]\n ])\n\n >>> cube.margin(axis=0)\n np.array([\n [3, 2],\n [3, 4],\n [4, 3],\n [0, 1],\n ])\n \"\"\"\n\n table = self._counts(weighted).raw_cube_array\n new_axis = self._adjust_axis(axis)\n index = tuple(\n None if i in new_axis else slice(None) for i, _ in enumerate(table.shape)\n )\n\n # Calculate denominator. Only include those H&S dimensions, across\n # which we DON'T sum. These H&S are needed because of the shape, when\n # dividing. 
Those dims that are summed across MUST NOT be\n # included, because they would change the result.\n hs_dims = self._hs_dims_for_den(include_transforms_for_dims, axis)\n den = self._apply_subtotals(\n self._apply_missings(table, include_missing=include_missing), hs_dims\n )\n\n # Apply correct mask (based on the as_array shape)\n arr = self._as_array(\n include_transforms_for_dims=hs_dims, include_missing=include_missing\n )\n\n # ---prune array if pruning was requested---\n if prune:\n arr = self._prune_body(arr, transforms=hs_dims)\n\n arr = self._drop_mr_cat_dims(arr, fix_valids=include_missing)\n\n if isinstance(arr, np.ma.core.MaskedArray):\n # Inflate the reduced version of the array, to match the\n # non-reduced version, for the purposes of creating the correct\n # mask. Create additional dimension (with no elements) where MR_CAT\n # dimensions should be. Don't inflate 0th dimension if it has only\n # a single element, because it's not being reduced\n # in self._drop_mr_cat_dims\n inflate_ind = tuple(\n (\n None\n if (\n d.dimension_type == DT.MR_CAT\n or i != 0\n and (n <= 1 or len(d.valid_elements) <= 1)\n )\n else slice(None)\n )\n for i, (d, n) in enumerate(zip(self._all_dimensions, table.shape))\n )\n mask = np.logical_or(np.zeros(den.shape, dtype=bool), arr.mask[inflate_ind])\n den = np.ma.masked_array(den, mask)\n\n if (\n self.ndim != 1\n or axis is None\n or axis == 0\n and len(self._all_dimensions) == 1\n ):\n # Special case for 1D cube with MR, for \"Table\" direction\n den = np.sum(den, axis=new_axis)[index]\n\n den = self._drop_mr_cat_dims(\n den, fix_valids=(include_missing or include_mr_cat)\n )\n if den.shape[0] == 1 and len(den.shape) > 1 and self.ndim < 3:\n den = den.reshape(den.shape[1:])\n return den"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreturns int, tuple of int, or None representing the index of each multiple-response (MR) dimension in this cube.", "response": "def mr_dim_ind(self):\n 
\"\"\"Return int, tuple of int, or None, representing MR indices.\n\n The return value represents the index of each multiple-response (MR)\n dimension in this cube. Return value is None if there are no MR\n dimensions, an int if there is one MR dimension, and a tuple of int\n when there is more than one. The index is the (zero-based) position\n of the MR dimensions in the _ApparentDimensions sequence returned by\n the :attr:`.dimensions` property.\n \"\"\"\n # TODO: rename to `mr_dim_idxs` or better yet get rid of need for\n # this as it's really a cube internal characteristic.\n # TODO: Make this return a tuple in all cases, like (), (1,), or (0, 2).\n indices = tuple(\n idx\n for idx, d in enumerate(self.dimensions)\n if d.dimension_type == DT.MR_SUBVAR\n )\n if indices == ():\n return None\n if len(indices) == 1:\n return indices[0]\n return indices"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreturn counts scaled in proportion to overall population.", "response": "def population_counts(\n self,\n population_size,\n weighted=True,\n include_missing=False,\n include_transforms_for_dims=None,\n prune=False,\n ):\n \"\"\"Return counts scaled in proportion to overall population.\n\n The return value is a numpy.ndarray object. Count values are scaled\n proportionally to approximate their value if the entire population\n had been sampled. This calculation is based on the estimated size of\n the population provided as *population_size*. 
The remaining arguments\n have the same meaning as they do for the `.proportions()` method.\n\n Example::\n\n >>> cube = CrunchCube(fixt_cat_x_cat)\n >>> cube.as_array()\n np.array([\n [5, 2],\n [5, 3],\n ])\n >>> cube.population_counts(9000)\n np.array([\n [3000, 1200],\n [3000, 1800],\n ])\n \"\"\"\n population_counts = [\n slice_.population_counts(\n population_size,\n weighted=weighted,\n include_missing=include_missing,\n include_transforms_for_dims=include_transforms_for_dims,\n prune=prune,\n )\n for slice_ in self.slices\n ]\n\n if len(population_counts) > 1:\n return np.array(population_counts)\n return population_counts[0]"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nreturn percentage values for cube as `numpy.ndarray`. This function calculates the proportions across the selected axis of a crunch cube. For most variable types, it means the value divided by the margin value. For a multiple-response variable, the value is divided by the sum of selected and non-selected slices. *axis* (int): base axis of proportions calculation. If no axis is provided, calculations are done across the entire table. *weighted* (bool): Specifies weighted or non-weighted proportions. *include_transforms_for_dims* (list): Also include headings and subtotals transformations for the provided dimensions. If the dimensions have the transformations, they'll be included in the resulting numpy array. If the dimensions don't have the transformations, nothing will happen (the result will be the same as if the argument weren't provided). *include_transforms_for_dims* (list): Include headers and subtotals (H&S) across various dimensions. The dimensions are provided as list elements. For example: \"include_transforms_for_dims=[0, 1]\" instructs the CrunchCube to return H&S for both rows and columns (if it's a 2D cube). *include_mr_cat* (bool): Include MR categories. *prune* (bool): Instructs the CrunchCube to prune empty rows/cols. 
Emptiness is determined by the state of the margin (if it's either 0 or nan at certain index). If it is, the corresponding row/col is not included in the result. Example 1:: >>> cube = CrunchCube(fixt_cat_x_cat) np.array([ [5, 2], [5, 3], ]) >>> cube.proportions() np.array([ [0.3333333, 0.1333333], [0.3333333, 0.2000000], ]) Example 2:: >>> cube = CrunchCube(fixt_cat_x_cat) np.array([ [5, 2], [5, 3], ]) >>> cube.proportions(axis=0) np.array([ [0.5, 0.4], [0.5, 0.6], ])", "response": "def proportions(\n self,\n axis=None,\n weighted=True,\n include_transforms_for_dims=None,\n include_mr_cat=False,\n prune=False,\n ):\n \"\"\"Return percentage values for cube as `numpy.ndarray`.\n\n This function calculates the proportions across the selected axis\n of a crunch cube. For most variable types, it means the value divided\n by the margin value. For a multiple-response variable, the value is\n divided by the sum of selected and non-selected slices.\n\n *axis* (int): base axis of proportions calculation. If no axis is\n provided, calculations are done across the entire table.\n\n *weighted* (bool): Specifies weighted or non-weighted proportions.\n\n *include_transforms_for_dims* (list): Also include headings and\n subtotals transformations for the provided dimensions. If the\n dimensions have the transformations, they'll be included in the\n resulting numpy array. If the dimensions don't have the\n transformations, nothing will happen (the result will be the same as\n if the argument weren't provided).\n\n *include_transforms_for_dims* (list): Include headers and subtotals\n (H&S) across various dimensions. The dimensions are provided as list\n elements. 
For example: \"include_transforms_for_dims=[0, 1]\" instructs\n the CrunchCube to return H&S for both rows and columns (if it's a 2D\n cube).\n\n *include_mr_cat* (bool): Include MR categories.\n\n *prune* (bool): Instructs the CrunchCube to prune empty rows/cols.\n Emptiness is determined by the state of the margin (if it's either\n 0 or nan at certain index). If it is, the corresponding row/col is\n not included in the result.\n\n Example 1::\n\n >>> cube = CrunchCube(fixt_cat_x_cat)\n np.array([\n [5, 2],\n [5, 3],\n ])\n\n >>> cube.proportions()\n np.array([\n [0.3333333, 0.1333333],\n [0.3333333, 0.2000000],\n ])\n\n Example 2::\n\n >>> cube = CrunchCube(fixt_cat_x_cat)\n np.array([\n [5, 2],\n [5, 3],\n ])\n\n >>> cube.proportions(axis=0)\n np.array([\n [0.5, 0.4],\n [0.5, 0.6],\n ])\n \"\"\"\n\n # Calculate numerator from table (include all H&S dimensions).\n table = self._measure(weighted).raw_cube_array\n num = self._apply_subtotals(\n self._apply_missings(table), include_transforms_for_dims\n )\n\n proportions = num / self._denominator(\n weighted, include_transforms_for_dims, axis\n )\n if not include_mr_cat:\n proportions = self._drop_mr_cat_dims(proportions)\n\n # Apply correct mask (based on the as_array shape)\n arr = self.as_array(\n prune=prune, include_transforms_for_dims=include_transforms_for_dims\n )\n if isinstance(arr, np.ma.core.MaskedArray):\n proportions = np.ma.masked_array(proportions, arr.mask)\n\n return proportions"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncalculate denominator for percentages. Only include those H&S dimensions across which we DON'T sum.", "response": "def _denominator(self, weighted, include_transforms_for_dims, axis):\n \"\"\"Calculate denominator for percentages.\n\n Only include those H&S dimensions, across which we DON'T sum. These H&S\n are needed because of the shape, when dividing. 
Those dims\n that are summed across MUST NOT be included, because they would\n change the result.\"\"\"\n\n table = self._measure(weighted).raw_cube_array\n new_axis = self._adjust_axis(axis)\n index = tuple(\n None if i in new_axis else slice(None) for i, _ in enumerate(table.shape)\n )\n hs_dims = self._hs_dims_for_den(include_transforms_for_dims, axis)\n den = self._apply_subtotals(self._apply_missings(table), hs_dims)\n return np.sum(den, axis=new_axis)[index]"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning ndarray with cube's zscore measurements.", "response": "def zscore(self, weighted=True, prune=False, hs_dims=None):\n \"\"\"Return ndarray with cube's zscore measurements.\n\n Zscore is a measure of statistical significance of observed vs.\n expected counts. It's only applicable to 2D contingency tables.\n For 3D cubes, the measures of separate slices are stacked together\n and returned as the result.\n\n :param weighted: Use weighted counts for zscores\n :param prune: Prune based on unweighted counts\n :param hs_dims: Include headers and subtotals (as NaN values)\n :returns zscore: ndarray representing zscore measurements\n \"\"\"\n res = [s.zscore(weighted, prune, hs_dims) for s in self.slices]\n return np.array(res) if self.ndim == 3 else res[0]"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning matrices of column-comparison p-values, one for each pair of columns.", "response": "def wishart_pairwise_pvals(self, axis=0):\n \"\"\"Return matrices of column-comparison p-values as list of numpy.ndarrays.\n\n Square, symmetric matrix along *axis* of pairwise p-values for the\n null hypothesis that col[i] = col[j] for each pair of columns.\n\n *axis* (int): axis along which to perform comparison. 
Only columns (0)\n are implemented currently.\n \"\"\"\n return [slice_.wishart_pairwise_pvals(axis=axis) for slice_ in self.slices]"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nadjusting the axis of the items in the internal array.", "response": "def _adjust_axis(self, axis):\n \"\"\"Return raw axis/axes corresponding to apparent axis/axes.\n\n This method adjusts the user-provided 'axis' parameter, for some of the\n cube operations, mainly 'margin'. The user never sees the MR selections\n dimension, and treats all MRs as single dimensions. Thus we need to\n adjust the values of axis (to sum across) to what the user would've\n specified if they were aware of the existence of the MR selections\n dimension. The reason for this adjustment is that all of the operations\n performed throughout the margin calculations will be carried out on an\n internal array, containing all the data (together with all selections).\n\n For more info on how it needs to operate, check the unit tests.\n \"\"\"\n if not self._is_axis_allowed(axis):\n ca_error_msg = \"Direction {} not allowed (items dimension)\"\n raise ValueError(ca_error_msg.format(axis))\n\n if isinstance(axis, int):\n # If single axis was provided, create a list out of it, so that\n # we can do the subsequent iteration.\n axis = list([axis])\n elif axis is None:\n # If axis was None, create what user would expect in terms of\n # finding out the Total(s). In case of 2D cube, this will be the\n # axis of all the dimensions that the user can see, that is (0, 1),\n # because the selections dimension is invisible to the user. In\n # case of 3D cube, this will be the \"total\" across each slice, so\n # we need to drop the 0th dimension, and only take last two (1, 2).\n axis = range(self.ndim)[-2:]\n else:\n # In case of a tuple, just keep it as a list.\n axis = list(axis)\n axis = np.array(axis)\n\n # Create new array for storing updated values of axis. 
It's necessary\n # because it's hard to update the values in place.\n new_axis = np.array(axis)\n\n # Iterate over user-visible dimensions, and update axis when MR is\n # detected. For each detected MR, we need to increment all subsequent\n # axis (that were provided by the user). But we don't need to update\n # the axis that are \"behind\" the current MR.\n for i, dim in enumerate(self.dimensions):\n if dim.dimension_type == DT.MR_SUBVAR:\n # This formula updates only the axis that come \"after\" the\n # current MR (items) dimension.\n new_axis[axis >= i] += 1\n\n return tuple(new_axis)"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nadjusts inserted indices if there are pruned elements.", "response": "def _adjust_inserted_indices(inserted_indices_list, prune_indices_list):\n \"\"\"Adjust inserted indices, if there are pruned elements.\"\"\"\n # Created a copy, to preserve cached property\n updated_inserted = [[i for i in dim_inds] for dim_inds in inserted_indices_list]\n pruned_and_inserted = zip(prune_indices_list, updated_inserted)\n for prune_inds, inserted_inds in pruned_and_inserted:\n # Only prune indices if they're not H&S (inserted)\n prune_inds = prune_inds[~np.in1d(prune_inds, inserted_inds)]\n for i, ind in enumerate(inserted_inds):\n ind -= np.sum(prune_inds < ind)\n inserted_inds[i] = ind\n return updated_inserted"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _apply_missings(self, res, include_missing=False):\n # --element idxs that satisfy `include_missing` arg. 
Note this\n # --includes MR_CAT elements so is essentially all-or-valid-elements\n element_idxs = tuple(\n (\n d.all_elements.element_idxs\n if include_missing\n else d.valid_elements.element_idxs\n )\n for d in self._all_dimensions\n )\n return res[np.ix_(*element_idxs)] if element_idxs else res", "response": "Return ndarray with missing and insertions as specified."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\napply subtotals to the result array.", "response": "def _apply_subtotals(self, res, include_transforms_for_dims):\n \"\"\"* Insert subtotals (and perhaps other insertions later) for\n dimensions having their apparent dimension-idx in\n *include_transforms_for_dims*.\n \"\"\"\n if not include_transforms_for_dims:\n return res\n\n suppressed_dim_count = 0\n for (dim_idx, dim) in enumerate(self._all_dimensions):\n if dim.dimension_type == DT.MR_CAT:\n suppressed_dim_count += 1\n # ---only marginable dimensions can be subtotaled---\n if not dim.is_marginable:\n continue\n apparent_dim_idx = dim_idx - suppressed_dim_count\n transform = (\n dim.has_transforms and apparent_dim_idx in include_transforms_for_dims\n )\n if not transform:\n continue\n # ---insert subtotals into result array---\n insertions = self._insertions(res, dim, dim_idx)\n res = self._update_result(res, insertions, dim_idx)\n\n return res"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreturn crunch cube as numpy array.", "response": "def _as_array(\n self,\n include_missing=False,\n get_non_selected=False,\n weighted=True,\n include_transforms_for_dims=False,\n ):\n \"\"\"Get crunch cube as ndarray.\n\n Args\n include_missing (bool): Include rows/cols for missing values.\n get_non_selected (bool): Get non-selected slices for MR vars.\n weighted (bool): Take weighted or unweighted counts.\n include_transforms_for_dims (list): For which dims to\n include headings & subtotals (H&S) transformations.\n Returns\n res (ndarray): 
Tabular representation of crunch cube\n \"\"\"\n return self._apply_subtotals(\n self._apply_missings(\n self._measure(weighted).raw_cube_array, include_missing=include_missing\n ),\n include_transforms_for_dims,\n )"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _calculate_constraints_sum(cls, prop_table, prop_margin, axis):\n if axis not in [0, 1]:\n raise ValueError(\"Unexpected value for `axis`: {}\".format(axis))\n\n V = prop_table * (1 - prop_table)\n if axis == 0:\n # If axis is 0, summation is performed across the 'i' index, which\n # requires the matrix to be multiplied from the right\n # (because of the inner matrix dimensions).\n return np.dot(V, prop_margin)\n elif axis == 1:\n # If axis is 1, summation is performed across the 'j' index, which\n # requires the matrix to be multiplied from the left\n # (because of the inner matrix dimensions).\n return np.dot(prop_margin, V)", "response": "Calculate the sum of the constraints along the specified axis."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _counts(self, weighted):\n return (\n self._measures.weighted_counts\n if weighted\n else self._measures.unweighted_counts\n )", "response": "Return _BaseMeasure subclass for weighted or unweighted counts."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn a dict containing raw cube response parsed from JSON payload.", "response": "def _cube_dict(self):\n \"\"\"dict containing raw cube response, parsed from JSON payload.\"\"\"\n try:\n cube_response = self._cube_response_arg\n # ---parse JSON to a dict when constructed with JSON---\n cube_dict = (\n cube_response\n if isinstance(cube_response, dict)\n else json.loads(cube_response)\n )\n # ---cube is 'value' item in a shoji response---\n return cube_dict.get(\"value\", cube_dict)\n except TypeError:\n raise TypeError(\n \"Unsupported type <%s> provided. 
Cube response must be JSON \"\n \"(str) or dict.\" % type(self._cube_response_arg).__name__\n )"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreturns ndarray reflecting array with MR_CAT dims dropped.", "response": "def _drop_mr_cat_dims(self, array, fix_valids=False):\n \"\"\"Return ndarray reflecting *array* with MR_CAT dims dropped.\n\n If any (except 1st) dimension has a single element, it is\n flattened in the resulting array (which is more convenient for the\n users of the CrunchCube).\n\n If the original shape of the cube is needed (e.g. to calculate the\n margins with correct axis arguments), this needs to happen before the\n call to this method '_drop_mr_cat_dims'.\n \"\"\"\n # TODO: We cannot arbitrarily drop any dimension simply because it\n # has a length (shape) of 1. We must target MR_CAT dimensions\n # specifically. Otherwise unexpected results can occur based on\n # accidents of cube category count etc. If \"user-friendly\" reshaping\n # needs to be done, it should be as a very last step and much safer to\n # leave that to the cr.cube client; software being \"helpful\" almost\n # never is.\n\n if not array.shape or len(array.shape) != len(self._all_dimensions):\n # This condition covers two cases:\n # 1. In case of no dimensions, the shape of the array is empty\n # 2. If the shape was already fixed, we don't need to fix it again.\n # This might happen while constructing the masked arrays. In case\n # of MR, we will have the selections dimension included throughout\n # the calculations, and will only remove it before returning the\n # result to the user.\n return array\n\n # We keep MR selections (MR_CAT) dimensions in the array, all the way\n # up to here. 
At this point, we need to remove the non-selected part of\n # selections dimension (and subsequently purge the dimension itself).\n\n display_ind = (\n tuple(\n 0 if dim.dimension_type == DT.MR_CAT else slice(None)\n for dim, n in zip(self._all_dimensions, array.shape)\n )\n if not fix_valids\n else np.ix_(\n *[\n dim.valid_elements.element_idxs if n > 1 else [0]\n for dim, n in zip(self._all_dimensions, array.shape)\n ]\n )\n )\n array = array[display_ind]\n\n # If a first dimension only has one element, we don't want to\n # remove it from the shape. Hence the i == 0 part. For other dimensions\n # that have one element, it means that these are the remnants of the MR\n # selections, which we don't need as separate dimensions.\n new_shape = [\n length for (i, length) in enumerate(array.shape) if length != 1 or i == 0\n ]\n return array.reshape(new_shape)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _fix_valid_indices(cls, valid_indices, insertion_index, dim):\n # TODO: make this accept an immutable sequence for valid_indices\n # (a tuple) and return an immutable sequence rather than mutating an\n # argument.\n indices = np.array(sorted(valid_indices[dim]))\n slice_index = np.sum(indices <= insertion_index)\n indices[slice_index:] += 1\n indices = np.insert(indices, slice_index, insertion_index + 1)\n valid_indices[dim] = indices.tolist()\n return valid_indices", "response": "Add indices for H&S inserted elements."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _is_axis_allowed(self, axis):\n if axis is None:\n # If table direction was requested, we must ensure that each slice\n # doesn't have the CA items dimension (thus the [-2:] part). 
It's\n # OK for the 0th dimension to be items, since no calculation is\n # performed over it.\n if DT.CA_SUBVAR in self.dim_types[-2:]:\n return False\n return True\n\n if isinstance(axis, int):\n if self.ndim == 1 and axis == 1:\n # Special allowed case of a 1D cube, where \"row\"\n # direction is requested.\n return True\n axis = [axis]\n\n # ---axis is a tuple---\n for dim_idx in axis:\n if self.dim_types[dim_idx] == DT.CA_SUBVAR:\n # If any of the explicitly requested directions directly\n # corresponds to the CA items dimension, the requested\n # calculation is not valid.\n return False\n\n return True", "response": "Check if the axis is allowed."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning the primary measure class representing this cube.", "response": "def _measure(self, weighted):\n \"\"\"_BaseMeasure subclass representing primary measure for this cube.\n\n If the cube response includes a means measure, the return value is\n means. Otherwise it is counts, with the choice between weighted or\n unweighted determined by *weighted*.\n\n Note that weighted counts are provided on an \"as-available\" basis.\n When *weighted* is True and the cube response is not weighted,\n unweighted counts are returned.\n \"\"\"\n return (\n self._measures.means\n if self._measures.means is not None\n else self._measures.weighted_counts\n if weighted\n else self._measures.unweighted_counts\n )"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _prune_3d_body(self, res, transforms):\n mask = np.zeros(res.shape)\n mr_dim_idxs = self.mr_dim_ind\n\n for i, prune_inds in enumerate(self.prune_indices(transforms)):\n rows_pruned = prune_inds[0]\n cols_pruned = prune_inds[1]\n rows_pruned = np.repeat(rows_pruned[:, None], len(cols_pruned), axis=1)\n cols_pruned = np.repeat(cols_pruned[None, :], len(rows_pruned), axis=0)\n slice_mask = np.logical_or(rows_pruned, 
cols_pruned)\n\n # In case of MRs we need to \"inflate\" mask\n if mr_dim_idxs == (1, 2):\n slice_mask = slice_mask[:, np.newaxis, :, np.newaxis]\n elif mr_dim_idxs == (0, 1):\n slice_mask = slice_mask[np.newaxis, :, np.newaxis, :]\n elif mr_dim_idxs == (0, 2):\n slice_mask = slice_mask[np.newaxis, :, :, np.newaxis]\n elif mr_dim_idxs == 1 and self.ndim == 3:\n slice_mask = slice_mask[:, np.newaxis, :]\n elif mr_dim_idxs == 2 and self.ndim == 3:\n slice_mask = slice_mask[:, :, np.newaxis]\n\n mask[i] = slice_mask\n\n res = np.ma.masked_array(res, mask=mask)\n return res", "response": "Prune the body of a 3D cube, inflating the mask for any MR dimensions."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _prune_body(self, res, transforms=None):\n if self.ndim > 2:\n return self._prune_3d_body(res, transforms)\n\n res = self._drop_mr_cat_dims(res)\n\n # ---determine which rows should be pruned---\n row_margin = self._pruning_base(\n hs_dims=transforms, axis=self.row_direction_axis\n )\n # ---adjust special-case row-margin values---\n item_types = (DT.MR, DT.CA_SUBVAR)\n if self.ndim > 1 and self.dim_types[1] in item_types and len(res.shape) > 1:\n # ---when row-dimension has only one category it gets squashed---\n axis = 1 if res.shape[0] > 1 else None\n # ---in CAT x MR case (or if it has CA subvars) we get\n # a 2D margin (denom really)---\n row_margin = np.sum(row_margin, axis=axis)\n row_prune_inds = self._margin_pruned_indices(\n row_margin, self._inserted_dim_inds(transforms, 0), 0\n )\n\n # ---a 1D only has rows, so mask only with row-prune-idxs---\n if self.ndim == 1 or len(res.shape) == 1:\n # For 1D, margin is calculated as the row margin.\n return np.ma.masked_array(res, mask=row_prune_inds)\n\n # ---determine which columns should be pruned---\n col_margin = self._pruning_base(\n hs_dims=transforms, axis=self._col_direction_axis\n )\n if col_margin.ndim > 1:\n # In case of MR x CAT, we have 2D margin\n col_margin = np.sum(col_margin, 
axis=0)\n col_prune_inds = self._margin_pruned_indices(\n col_margin, self._inserted_dim_inds(transforms, 1), 1\n )\n\n # ---create rows x cols mask and mask the result array---\n mask = self._create_mask(res, row_prune_inds, col_prune_inds)\n res = np.ma.masked_array(res, mask=mask)\n\n # ---return the masked array---\n return res", "response": "Prune empty rows and columns from the result array."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef prune_indices(self, transforms=None):\n if self.ndim >= 3:\n # In case of a 3D cube, return list of tuples\n # (of row and col pruned indices).\n return self._prune_3d_indices(transforms)\n\n def prune_non_3d_indices(transforms):\n row_margin = self._pruning_base(\n hs_dims=transforms, axis=self.row_direction_axis\n )\n row_indices = self._margin_pruned_indices(\n row_margin, self._inserted_dim_inds(transforms, 0), 0\n )\n\n if row_indices.ndim > 1:\n # In case of MR, we'd have 2D prune indices\n row_indices = row_indices.all(axis=1)\n\n if self.ndim == 1:\n return [row_indices]\n\n col_margin = self._pruning_base(\n hs_dims=transforms, axis=self._col_direction_axis\n )\n\n col_indices = self._margin_pruned_indices(\n col_margin, self._inserted_dim_inds(transforms, 1), 1\n )\n if col_indices.ndim > 1:\n # In case of MR, we'd have 2D prune indices\n col_indices = col_indices.all(axis=0)\n\n return [row_indices, col_indices]\n\n # In case of 1 or 2 D cubes, return a list of\n # row indices (or row and col indices)\n return prune_non_3d_indices(transforms)", "response": "Return indices of pruned rows and columns as list."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _pruning_base(self, axis=None, hs_dims=None):\n if not self._is_axis_allowed(axis):\n # In case we encountered axis that would go across items dimension,\n # we need to return at least some result, to prevent explicitly\n # checking for this condition, 
wherever self._margin is used\n return self.as_array(weighted=False, include_transforms_for_dims=hs_dims)\n\n # In case of allowed axis, just return the normal API margin. This call\n # would throw an exception when directly invoked with bad axis. This is\n # intended, because we want to be as explicit as possible. Margins\n # across items are not allowed.\n return self.margin(\n axis=axis, weighted=False, include_transforms_for_dims=hs_dims\n )", "response": "Returns the margin if across CAT dimension."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _update_result(self, result, insertions, dimension_index):\n for j, (ind_insertion, value) in enumerate(insertions):\n result = np.insert(\n result, ind_insertion + j + 1, value, axis=dimension_index\n )\n return result", "response": "Insert subtotals into resulting ndarray."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ntrues if weights have been applied to the measure(s ) for this cube.", "response": "def is_weighted(self):\n \"\"\"True if weights have been applied to the measure(s) for this cube.\n\n Unweighted counts are available for all cubes. 
Weighting applies to\n any other measures provided by the cube.\n \"\"\"\n cube_dict = self._cube_dict\n if cube_dict.get(\"query\", {}).get(\"weight\") is not None:\n return True\n if cube_dict.get(\"weight_var\") is not None:\n return True\n if cube_dict.get(\"weight_url\") is not None:\n return True\n unweighted_counts = cube_dict[\"result\"][\"counts\"]\n count_data = cube_dict[\"result\"][\"measures\"].get(\"count\", {}).get(\"data\")\n if unweighted_counts != count_data:\n return True\n return False"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef means(self):\n mean_measure_dict = (\n self._cube_dict.get(\"result\", {}).get(\"measures\", {}).get(\"mean\")\n )\n if mean_measure_dict is None:\n return None\n return _MeanMeasure(self._cube_dict, self._all_dimensions)", "response": "_MeanMeasure object providing access to means values."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef missing_count(self):\n if self.means:\n return self.means.missing_count\n return self._cube_dict[\"result\"].get(\"missing\", 0)", "response": "numeric representing count of missing rows in cube response."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef weighted_counts(self):\n if not self.is_weighted:\n return self.unweighted_counts\n return _WeightedCountMeasure(self._cube_dict, self._all_dimensions)", "response": "Return a _WeightedCountMeasure object for this cube."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef weighted_n(self):\n if not self.is_weighted:\n return float(self.unweighted_n)\n return float(sum(self._cube_dict[\"result\"][\"measures\"][\"count\"][\"data\"]))", "response": "float count of returned rows adjusted for weighting."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the 
following Python 3 function\ndef raw_cube_array(self):\n array = np.array(self._flat_values).reshape(self._all_dimensions.shape)\n # ---must be read-only to avoid hard-to-find bugs---\n array.flags.writeable = False\n return array", "response": "Return read - only ndarray of measure values from cube - response."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _flat_values(self):\n return tuple(\n np.nan if type(x) is dict else x\n for x in self._cube_dict[\"result\"][\"measures\"][\"mean\"][\"data\"]\n )", "response": "Return tuple of mean values as found in cube response."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ngenerating a dictionnary of dataframes containing nb_persons persons spread in nb_groups groups.", "response": "def make_input_dataframe_by_entity(tax_benefit_system, nb_persons, nb_groups):\n \"\"\"\n Generate a dictionnary of dataframes containing nb_persons persons spread in nb_groups groups.\n\n :param TaxBenefitSystem tax_benefit_system: the tax_benefit_system to use\n :param int nb_persons: the number of persons in the system\n :param int nb_groups: the number of collective entities in the system\n\n :returns: A dictionary whose keys are entities and values the corresponding data frames\n\n Example:\n\n >>> from openfisca_survey_manager.input_dataframe_generator import make_input_dataframe_by_entity\n >>> from openfisca_country_template import CountryTaxBenefitSystem\n >>> tbs = CountryTaxBenefitSystem()\n >>> input_dataframe_by_entity = make_input_dataframe_by_entity(tbs, 400, 100)\n >>> sorted(input_dataframe_by_entity['person'].columns.tolist())\n ['household_id', 'household_legacy_role', 'household_role', 'person_id']\n >>> sorted(input_dataframe_by_entity['household'].columns.tolist())\n []\n \"\"\"\n input_dataframe_by_entity = dict()\n person_entity = [entity for entity in tax_benefit_system.entities if entity.is_person][0]\n person_id = 
np.arange(nb_persons)\n input_dataframe_by_entity = dict()\n input_dataframe_by_entity[person_entity.key] = pd.DataFrame({\n person_entity.key + '_id': person_id,\n })\n input_dataframe_by_entity[person_entity.key].set_index('person_id')\n #\n adults = [0] + sorted(random.sample(range(1, nb_persons), nb_groups - 1))\n members_entity_id = np.empty(nb_persons, dtype = int)\n # A legacy role is an index that every person within an entity has.\n # For instance, the 'first_parent' has legacy role 0, the 'second_parent' 1, the first 'child' 2, the second 3, etc.\n members_legacy_role = np.empty(nb_persons, dtype = int)\n id_group = -1\n for id_person in range(nb_persons):\n if id_person in adults:\n id_group += 1\n legacy_role = 0\n else:\n legacy_role = 2 if legacy_role == 0 else legacy_role + 1\n members_legacy_role[id_person] = legacy_role\n members_entity_id[id_person] = id_group\n\n for entity in tax_benefit_system.entities:\n if entity.is_person:\n continue\n key = entity.key\n person_dataframe = input_dataframe_by_entity[person_entity.key]\n person_dataframe[key + '_id'] = members_entity_id\n person_dataframe[key + '_legacy_role'] = members_legacy_role\n person_dataframe[key + '_role'] = np.where(\n members_legacy_role == 0, entity.flattened_roles[0].key, entity.flattened_roles[-1].key)\n input_dataframe_by_entity[key] = pd.DataFrame({\n key + '_id': range(nb_groups)\n })\n input_dataframe_by_entity[key].set_index(key + '_id', inplace = True)\n\n return input_dataframe_by_entity"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ninitialising a variable with random values from 0 to max_value.", "response": "def randomly_init_variable(tax_benefit_system, input_dataframe_by_entity, variable_name, max_value, condition = None, seed = None):\n \"\"\"\n Initialise a variable with random values (from 0 to max_value).\n If a condition vector is provided, only set the value of persons or groups for which condition is True.\n\n Exemple:\n\n >>> 
from openfisca_survey_manager.input_dataframe_generator import make_input_dataframe_by_entity\n >>> from openfisca_country_template import CountryTaxBenefitSystem\n >>> tbs = CountryTaxBenefitSystem()\n >>> input_dataframe_by_entity = make_input_dataframe_by_entity(tbs, 400, 100)\n >>> randomly_init_variable(tbs, input_dataframe_by_entity, 'salary', max_value = 50000, condition = \"household_role == 'first_parent'\") # Randomly set a salaire_net for all persons between 0 and 50000?\n >>> sorted(input_dataframe_by_entity['person'].columns.tolist())\n ['household_id', 'household_legacy_role', 'household_role', 'person_id', 'salary']\n >>> input_dataframe_by_entity['person'].salary.max() <= 50000\n True\n >>> len(input_dataframe_by_entity['person'].salary)\n 400\n >>> randomly_init_variable(tbs, input_dataframe_by_entity, 'rent', max_value = 1000)\n >>> sorted(input_dataframe_by_entity['household'].columns.tolist())\n ['rent']\n >>> input_dataframe_by_entity['household'].rent.max() <= 1000\n True\n >>> input_dataframe_by_entity['household'].rent.max() >= 1\n True\n >>> len(input_dataframe_by_entity['household'].rent)\n 100\n \"\"\"\n\n variable = tax_benefit_system.variables[variable_name]\n entity = variable.entity\n\n if condition is None:\n condition = True\n else:\n condition = input_dataframe_by_entity[entity.key].eval(condition).values\n\n if seed is None:\n seed = 42\n\n np.random.seed(seed)\n count = len(input_dataframe_by_entity[entity.key])\n value = (np.random.rand(count) * max_value * condition).astype(variable.dtype)\n input_dataframe_by_entity[entity.key][variable_name] = value"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ngets the value of a variable in a table", "response": "def get_value(self, variable = None, table = None):\n \"\"\"\n Get value\n\n Parameters\n ----------\n variable : string\n name of the variable\n table : string, default None\n name of the table hosting the variable\n Returns\n 
-------\n df : DataFrame, default None\n A DataFrame containing the variable\n \"\"\"\n assert variable is not None, \"A variable is needed\"\n if table not in self.tables:\n log.error(\"Table {} is not found in survey tables\".format(table))\n df = self.get_values([variable], table)\n return df"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ngetting the values of the variables in the table.", "response": "def get_values(self, variables = None, table = None, lowercase = False, rename_ident = True):\n \"\"\"\n Get values\n\n Parameters\n ----------\n variables : list of strings, default None\n list of variables names, if None return the whole table\n table : string, default None\n name of the table hosting the variables\n lowercase : boolean, deflault True\n put variables of the table into lowercase\n rename_ident : boolean, deflault True\n rename variables ident+yr (e.g. ident08) into ident\n Returns\n -------\n df : DataFrame, default None\n A DataFrame containing the variables\n \"\"\"\n assert self.hdf5_file_path is not None\n assert os.path.exists(self.hdf5_file_path), '{} is not a valid path'.format(\n self.hdf5_file_path)\n store = pandas.HDFStore(self.hdf5_file_path)\n\n try:\n df = store.select(table)\n except KeyError:\n log.error('No table {} in the file {}'.format(table, self.hdf5_file_path))\n log.error('Table(s) available are: {}'.format(store.keys()))\n store.close()\n raise\n\n if lowercase:\n columns = dict((column_name, column_name.lower()) for column_name in df)\n df.rename(columns = columns, inplace = True)\n\n if rename_ident is True:\n for column_name in df:\n if ident_re.match(str(column_name)) is not None:\n df.rename(columns = {column_name: \"ident\"}, inplace = True)\n log.info(\"{} column have been replaced by ident\".format(column_name))\n break\n\n if variables is None:\n return df\n else:\n diff = set(variables) - set(df.columns)\n if diff:\n raise Exception(\"The following variable(s) {} are 
missing\".format(diff))\n variables = list(set(variables).intersection(df.columns))\n df = df[variables]\n return df"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ninserting a table in the Survey object.", "response": "def insert_table(self, label = None, name = None, **kwargs):\n \"\"\"\n Insert a table in the Survey object\n \"\"\"\n\n data_frame = kwargs.pop('data_frame', None)\n if data_frame is None:\n data_frame = kwargs.pop('dataframe', None)\n\n to_hdf_kwargs = kwargs.pop('to_hdf_kwargs', dict())\n if data_frame is not None:\n assert isinstance(data_frame, pandas.DataFrame)\n\n if data_frame is not None:\n if label is None:\n label = name\n table = Table(label = label, name = name, survey = self)\n assert table.survey.hdf5_file_path is not None\n log.debug(\"Saving table {} in {}\".format(name, table.survey.hdf5_file_path))\n table.save_data_frame(data_frame, **to_hdf_kwargs)\n\n if name not in self.tables:\n self.tables[name] = dict()\n for key, val in kwargs.items():\n self.tables[name][key] = val"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef quantile(q, variable, weight_variable = None, filter_variable = None):\n def formula(entity, period):\n value = entity(variable, period)\n if weight_variable is not None:\n weight = entity(weight_variable, period)\n weight = entity.filled_array(1)\n if filter_variable is not None:\n filter_value = entity(filter_variable, period)\n weight = filter_value * weight\n\n labels = arange(1, q + 1)\n quantile, _ = weightedcalcs_quantiles(\n value,\n labels,\n weight,\n return_quantiles = True,\n )\n if filter_variable is not None:\n quantile = where(weight > 0, quantile, -1)\n return quantile\n\n return formula", "response": "Return the quantile of a variable, weighted by a specific weight variable and potentially filtered."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it 
does\ndef _get_version():\n with open(\"../waliki/__init__.py\") as fh:\n for line in fh:\n if line.startswith(\"__version__ = \"):\n return line.split(\"=\")[-1].strip().strip(\"'\").strip('\"')", "response": "Get the version from package itself."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nremoving metada from the top of the rst file", "response": "def clean_meta(rst_content):\n \"\"\"remove moinmoin metada from the top of the file\"\"\"\n\n rst = rst_content.split('\\n')\n for i, line in enumerate(rst):\n if line.startswith('#'):\n continue\n break\n return '\\n'.join(rst[i:])"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nadding the entry point to the context.", "response": "def entry_point(context, block_name):\n \"\"\"include an snippet at the bottom of a block, if it exists\n\n For example, if the plugin with slug 'attachments' is registered\n\n waliki/attachments_edit_content.html will be included with\n\n {% entry_point 'edit_content' %}\n\n which is declared at the bottom of the block 'content' in edit.html\n \"\"\"\n from waliki.plugins import get_plugins\n includes = []\n for plugin in get_plugins():\n template_name = 'waliki/%s_%s.html' % (plugin.slug, block_name)\n try:\n # template exists\n template.loader.get_template(template_name)\n includes.append(template_name)\n except template.TemplateDoesNotExist:\n continue\n context.update({'includes': includes})\n return context"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ncheck permissions for a given object.", "response": "def check_perms(parser, token):\n \"\"\"\n Returns a list of permissions (as ``codename`` strings) for a given\n ``user``/``group`` and ``obj`` (Model instance).\n\n Parses ``check_perms`` tag which should be in format::\n\n {% check_perms \"perm1[, perm2, ...]\" for user in slug as \"context_var\" %}\n\n or\n\n {% check_perms \"perm1[, perm2, ...]\" for user in \"slug\" as 
\"context_var\" %}\n\n .. note::\n\n Make sure that you set and use those permissions in same template\n block (``{% block %}``).\n\n Example of usage (assuming ``page` objects are available from *context*)::\n\n {% check_perms \"delete_page\" for request.user in page.slug as \"can_delete\" %}\n {% if can_delete %}\n ...\n {% endif %}\n\n \"\"\"\n bits = token.split_contents()\n format = '{% check_perms \"perm1[, perm2, ...]\" for user in slug as \"context_var\" %}'\n if len(bits) != 8 or bits[2] != 'for' or bits[4] != \"in\" or bits[6] != 'as':\n raise template.TemplateSyntaxError(\"get_obj_perms tag should be in \"\n \"format: %s\" % format)\n perms = bits[1]\n user = bits[3]\n slug = bits[5]\n context_var = bits[7]\n if perms[0] != perms[-1] or perms[0] not in ('\"', \"'\"):\n raise template.TemplateSyntaxError(\"check_perms tag's perms \"\n \"argument should be in quotes\")\n\n if context_var[0] != context_var[-1] or context_var[0] not in ('\"', \"'\"):\n raise template.TemplateSyntaxError(\"check_perms tag's context_var \"\n \"argument should be in quotes\")\n context_var = context_var[1:-1]\n return CheckPermissionsNode(perms, user, slug, context_var)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_module(app, modname, verbose=False, failfast=False):\n module_name = '%s.%s' % (app, modname)\n try:\n module = import_module(module_name)\n except ImportError as e:\n if failfast:\n raise e\n elif verbose:\n print(\"Could not load %r from %r: %s\" % (modname, app, e))\n return None\n if verbose:\n print(\"Loaded %r from %r\" % (modname, app))\n return module", "response": "Internal function to load a module from a single app."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef load(modname, verbose=False, failfast=False):\n for app in settings.INSTALLED_APPS:\n get_module(app, modname, verbose, failfast)", "response": "Load all 
modules with name modname from all installed apps."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nregistering a plugin class.", "response": "def register(PluginClass):\n \"\"\"\n Register a plugin class. This function will call back your plugin's\n constructor.\n \"\"\"\n if PluginClass in _cache.keys():\n raise Exception(\"Plugin class already registered\")\n plugin = PluginClass()\n _cache[PluginClass] = plugin\n\n if getattr(PluginClass, 'extra_page_actions', False):\n for key in plugin.extra_page_actions:\n if key not in _extra_page_actions:\n _extra_page_actions[key] = []\n _extra_page_actions[key].extend(plugin.extra_page_actions[key])\n\n if getattr(PluginClass, 'extra_edit_actions', False):\n for key in plugin.extra_edit_actions:\n if key not in _extra_edit_actions:\n _extra_edit_actions[key] = []\n _extra_edit_actions[key].extend(plugin.extra_edit_actions[key])\n\n\n if getattr(PluginClass, 'navbar_links', False):\n _navbar_links.extend(list(plugin.navbar_links))"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef render_form(form):\n\n if 'crispy_forms' in settings.INSTALLED_APPS:\n from crispy_forms.templatetags.crispy_forms_filters import as_crispy_form\n return as_crispy_form(form)\n\n template = get_template(\"bootstrap/form.html\")\n form = _preprocess_fields(form)\n\n return template.render({\"form\": form})", "response": "Render a form, using crispy_forms if installed."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ninjecting a few of waliki's settings to the context to be used in templates", "response": "def settings(request):\n \"\"\"inject few waliki's settings to the context to be used in templates\"\"\"\n from waliki.settings import WALIKI_USE_MATHJAX # NOQA\n return {k: v for (k, v) in locals().items() if k.startswith('WALIKI')}"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ncreate a UTF-16 encoded PDF string 
literal for s.", "response": "def smart_encode_str(s):\n \"\"\"Create a UTF-16 encoded PDF string literal for `s`.\"\"\"\n try:\n utf16 = s.encode('utf_16_be')\n except AttributeError: # ints and floats\n utf16 = str(s).encode('utf_16_be')\n safe = utf16.replace(b'\\x00)', b'\\x00\\\\)').replace(b'\\x00(', b'\\x00\\\\(')\n return b''.join((codecs.BOM_UTF16_BE, safe))"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef forge_fdf(pdf_form_url=None, fdf_data_strings=[], fdf_data_names=[],\n fields_hidden=[], fields_readonly=[],\n checkbox_checked_name=b\"Yes\"):\n \"\"\"Generates fdf string from fields specified\n\n * pdf_form_url (default: None): just the url for the form.\n * fdf_data_strings (default: []): array of (string, value) tuples for the\n form fields (or dicts). Value is passed as a UTF-16 encoded string,\n unless True/False, in which case it is assumed to be a checkbox\n (and passes names, '/Yes' (by default) or '/Off').\n * fdf_data_names (default: []): array of (string, value) tuples for the\n form fields (or dicts). Value is passed to FDF as a name, '/value'\n * fields_hidden (default: []): list of field names that should be set\n hidden.\n * fields_readonly (default: []): list of field names that should be set\n readonly.\n * checkbox_checked_value (default: \"Yes\"): By default means a checked\n checkboxes gets passed the value \"/Yes\". 
You may find that the default\n does not work with your PDF, in which case you might want to try \"On\".\n\n The result is a string suitable for writing to a .fdf file.\n\n \"\"\"\n fdf = [b'%FDF-1.2\\x0a%\\xe2\\xe3\\xcf\\xd3\\x0d\\x0a']\n fdf.append(b'1 0 obj\\x0a<>\\x0a')\n fdf.append(b'>>\\x0aendobj\\x0a')\n fdf.append(b'trailer\\x0a\\x0a<<\\x0a/Root 1 0 R\\x0a>>\\x0a')\n fdf.append(b'%%EOF\\x0a\\x0a')\n return b''.join(fdf)", "response": "Generates a FDF string from the specified fields."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nbases operator for managing an asynchronous sequence of sequences. The sequences are awaited concurrently, although it's possible to limit the amount of running sequences using the `task_limit` argument. The ``switch`` argument enables the switch mecanism, which cause the previous subsequence to be discarded when a new one is created. The items can either be generated in order or as soon as they're received, depending on the ``ordered`` argument.", "response": "async def base_combine(source, switch=False, ordered=False, task_limit=None):\n \"\"\"Base operator for managing an asynchronous sequence of sequences.\n\n The sequences are awaited concurrently, although it's possible to limit\n the amount of running sequences using the `task_limit` argument.\n\n The ``switch`` argument enables the switch mecanism, which cause the\n previous subsequence to be discarded when a new one is created.\n\n The items can either be generated in order or as soon as they're received,\n depending on the ``ordered`` argument.\n \"\"\"\n\n # Task limit\n if task_limit is not None and not task_limit > 0:\n raise ValueError('The task limit must be None or greater than 0')\n\n # Safe context\n async with StreamerManager() as manager:\n\n main_streamer = await manager.enter_and_create_task(source)\n\n # Loop over events\n while manager.tasks:\n\n # Extract streamer groups\n substreamers = manager.streamers[1:]\n 
mainstreamers = [main_streamer] if main_streamer in manager.tasks else []\n\n # Switch - use the main streamer then the substreamer\n if switch:\n filters = mainstreamers + substreamers\n # Concat - use the first substreamer then the main streamer\n elif ordered:\n filters = substreamers[:1] + mainstreamers\n # Flat - use the substreamers then the main streamer\n else:\n filters = substreamers + mainstreamers\n\n # Wait for next event\n streamer, task = await manager.wait_single_event(filters)\n\n # Get result\n try:\n result = task.result()\n\n # End of stream\n except StopAsyncIteration:\n\n # Main streamer is finished\n if streamer is main_streamer:\n main_streamer = None\n\n # A substreamer is finished\n else:\n await manager.clean_streamer(streamer)\n\n # Re-schedule the main streamer if necessary\n if main_streamer is not None and main_streamer not in manager.tasks:\n manager.create_task(main_streamer)\n\n # Process result\n else:\n\n # Switch mecanism\n if switch and streamer is main_streamer:\n await manager.clean_streamers(substreamers)\n\n # Setup a new source\n if streamer is main_streamer:\n await manager.enter_and_create_task(result)\n\n # Re-schedule the main streamer if task limit allows it\n if task_limit is None or task_limit > len(manager.tasks):\n manager.create_task(streamer)\n\n # Yield the result\n else:\n yield result\n\n # Re-schedule the streamer\n manager.create_task(streamer)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef concat(source, task_limit=None):\n return base_combine.raw(\n source, task_limit=task_limit, switch=False, ordered=True)", "response": "A function that returns a sequence of elements from the source."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef flatten(source, task_limit=None):\n return base_combine.raw(\n source, task_limit=task_limit, switch=False, ordered=False)", "response": "Given an asynchronous 
sequence of sequences generate the elements\n of the sequences as soon as they are received."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\napplying a given function that creates a sequence from the elements of one or several asynchronous sequences and generate the elements of the sequences in order.", "response": "def concatmap(source, func, *more_sources, task_limit=None):\n \"\"\"Apply a given function that creates a sequence from the elements of one\n or several asynchronous sequences, and generate the elements of the created\n sequences in order.\n\n The function is applied as described in `map`, and must return an\n asynchronous sequence. The returned sequences are awaited concurrently,\n although it's possible to limit the amount of running sequences using\n the `task_limit` argument.\n \"\"\"\n return concat.raw(\n combine.smap.raw(source, func, *more_sources), task_limit=task_limit)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef flatmap(source, func, *more_sources, task_limit=None):\n return flatten.raw(\n combine.smap.raw(source, func, *more_sources), task_limit=task_limit)", "response": "Apply a given function that creates a sequence from the elements of one\n or several asynchronous sequences and generate the elements of the one\n or several asynchronous sequences as soon as they arrive."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\napply a given function that creates a sequence from the elements of one or several asynchronous sequences and generate the elements of the most .", "response": "def switchmap(source, func, *more_sources):\n \"\"\"Apply a given function that creates a sequence from the elements of one\n or several asynchronous sequences and generate the elements of the most\n recently created sequence.\n\n The function is applied as described in `map`, and must return an\n asynchronous sequence. 
Errors raised in a source or output sequence (that\n was not already closed) are propagated.\n \"\"\"\n return switch.raw(combine.smap.raw(source, func, *more_sources))"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\nasync def accumulate(source, func=op.add, initializer=None):\n iscorofunc = asyncio.iscoroutinefunction(func)\n async with streamcontext(source) as streamer:\n # Initialize\n if initializer is None:\n try:\n value = await anext(streamer)\n except StopAsyncIteration:\n return\n else:\n value = initializer\n # First value\n yield value\n # Iterate streamer\n async for item in streamer:\n value = func(value, item)\n if iscorofunc:\n value = await value\n yield value", "response": "Generate a series of accumulated sums or other binary function from an asynchronous sequence."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef reduce(source, func, initializer=None):\n acc = accumulate.raw(source, func, initializer)\n return select.item.raw(acc, -1)", "response": "Apply a function of two arguments cumulatively to the items\n of an asynchronous sequence."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\nasync def list(source):\n result = []\n async with streamcontext(source) as streamer:\n async for item in streamer:\n result.append(item)\n yield result", "response": "Generate a single list from an asynchronous sequence."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nwaits for an asynchronous iterable to finish and return the last item.", "response": "async def wait_stream(aiterable):\n \"\"\"Wait for an asynchronous iterable to finish and return the last item.\n\n The iterable is executed within a safe stream context.\n A StreamEmpty exception is raised if the sequence is empty.\n \"\"\"\n async with streamcontext(aiterable) as streamer:\n async for item in streamer:\n 
item\n try:\n return item\n except NameError:\n raise StreamEmpty()"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef operator(func=None, *, pipable=False):\n\n def decorator(func):\n \"\"\"Inner decorator for stream operator.\"\"\"\n\n # Gather data\n bases = (Stream,)\n name = func.__name__\n module = func.__module__\n extra_doc = func.__doc__\n doc = extra_doc or f'Regular {name} stream operator.'\n\n # Extract signature\n signature = inspect.signature(func)\n parameters = list(signature.parameters.values())\n if parameters and parameters[0].name in ('self', 'cls'):\n raise ValueError(\n 'An operator cannot be created from a method, '\n 'since the decorated function becomes an operator class')\n\n # Injected parameters\n self_parameter = inspect.Parameter(\n 'self', inspect.Parameter.POSITIONAL_OR_KEYWORD)\n cls_parameter = inspect.Parameter(\n 'cls', inspect.Parameter.POSITIONAL_OR_KEYWORD)\n\n # Wrapped static method\n original = func\n original.__qualname__ = name + '.original'\n\n # Raw static method\n raw = func\n raw.__qualname__ = name + '.raw'\n\n # Init method\n def init(self, *args, **kwargs):\n if pipable and args:\n assert_async_iterable(args[0])\n factory = functools.partial(self.raw, *args, **kwargs)\n return Stream.__init__(self, factory)\n\n # Customize init signature\n new_parameters = [self_parameter] + parameters\n init.__signature__ = signature.replace(parameters=new_parameters)\n\n # Customize init method\n init.__qualname__ = name + '.__init__'\n init.__name__ = '__init__'\n init.__module__ = module\n init.__doc__ = f'Initialize the {name} stream.'\n\n if pipable:\n\n # Raw static method\n def raw(*args, **kwargs):\n if args:\n assert_async_iterable(args[0])\n return func(*args, **kwargs)\n\n # Custonize raw method\n raw.__signature__ = signature\n raw.__qualname__ = name + '.raw'\n raw.__module__ = module\n raw.__doc__ = doc\n\n # Pipe class method\n def pipe(cls, *args, 
**kwargs):\n return lambda source: cls(source, *args, **kwargs)\n\n # Customize pipe signature\n if parameters and parameters[0].kind in (\n inspect.Parameter.POSITIONAL_ONLY,\n inspect.Parameter.POSITIONAL_OR_KEYWORD):\n new_parameters = [cls_parameter] + parameters[1:]\n else:\n new_parameters = [cls_parameter] + parameters\n pipe.__signature__ = signature.replace(parameters=new_parameters)\n\n # Customize pipe method\n pipe.__qualname__ = name + '.pipe'\n pipe.__module__ = module\n pipe.__doc__ = f'Pipable \"{name}\" stream operator.'\n if extra_doc:\n pipe.__doc__ += \"\\n\\n \" + extra_doc\n\n # Gather attributes\n attrs = {\n '__init__': init,\n '__module__': module,\n '__doc__': doc,\n 'raw': staticmethod(raw),\n 'original': staticmethod(original),\n 'pipe': classmethod(pipe) if pipable else None}\n\n # Create operator class\n return type(name, bases, attrs)\n\n return decorator if func is None else decorator(func)", "response": "Create an asynchronous stream operator from an asynchronous generator or a function returning an asynchronous iterable."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef action(source, func):\n if asyncio.iscoroutinefunction(func):\n async def innerfunc(arg):\n await func(arg)\n return arg\n else:\n def innerfunc(arg):\n func(arg)\n return arg\n return map.raw(source, innerfunc)", "response": "Perform an action for each element of an asynchronous sequence\n without modifying it."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nprint each element of an asynchronous sequence without modifying it.", "response": "def print(source, template=None, **kwargs):\n \"\"\"Print each element of an asynchronous sequence without modifying it.\n\n An optional template can be provided to be formatted with the elements.\n All the keyword arguments are forwarded to the builtin function print.\n \"\"\"\n def func(value):\n if template:\n value = template.format(value)\n 
builtins.print(value, **kwargs)\n return action.raw(source, func)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nwrap the given function into a coroutine function.", "response": "def async_(fn):\n \"\"\"Wrap the given function into a coroutine function.\"\"\"\n @functools.wraps(fn)\n async def wrapper(*args, **kwargs):\n return await fn(*args, **kwargs)\n return wrapper"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef aitercontext(aiterable, *, cls=AsyncIteratorContext):\n assert issubclass(cls, AsyncIteratorContext)\n aiterator = aiter(aiterable)\n if isinstance(aiterator, cls):\n return aiterator\n return cls(aiterator)", "response": "Return an asynchronous context manager from an asynchronous iterable."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nyield the last n elements from an asynchronous sequence.", "response": "async def takelast(source, n):\n \"\"\"Forward the last ``n`` elements from an asynchronous sequence.\n\n If ``n`` is negative, it simply terminates after iterating the source.\n\n Note: it is required to reach the end of the source before the first\n element is generated.\n \"\"\"\n queue = collections.deque(maxlen=n if n > 0 else 0)\n async with streamcontext(source) as streamer:\n async for item in streamer:\n queue.append(item)\n for item in queue:\n yield item"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nforwards an asynchronous sequence skipping the first n elements.", "response": "async def skip(source, n):\n \"\"\"Forward an asynchronous sequence, skipping the first ``n`` elements.\n\n If ``n`` is negative, no elements are skipped.\n \"\"\"\n source = transform.enumerate.raw(source)\n async with streamcontext(source) as streamer:\n async for i, item in streamer:\n if i >= n:\n yield item"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 
function for\nforwarding an asynchronous sequence skipping the last n elements.", "response": "async def skiplast(source, n):\n \"\"\"Forward an asynchronous sequence, skipping the last ``n`` elements.\n\n If ``n`` is negative, no elements are skipped.\n\n Note: it is required to reach the ``n+1`` th element of the source\n before the first element is generated.\n \"\"\"\n queue = collections.deque(maxlen=n if n > 0 else 0)\n async with streamcontext(source) as streamer:\n async for item in streamer:\n if n <= 0:\n yield item\n continue\n if len(queue) == n:\n yield queue[0]\n queue.append(item)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\nasync def filterindex(source, func):\n source = transform.enumerate.raw(source)\n async with streamcontext(source) as streamer:\n async for i, item in streamer:\n if func(i):\n yield item", "response": "Filter an asynchronous sequence using the index of the elements."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nslice an asynchronous sequence.", "response": "def slice(source, *args):\n \"\"\"Slice an asynchronous sequence.\n\n The arguments are the same as the builtin type slice.\n\n There are two limitations compared to regular slices:\n - Positive stop index with negative start index is not supported\n - Negative step is not supported\n \"\"\"\n s = builtins.slice(*args)\n start, stop, step = s.start or 0, s.stop, s.step or 1\n # Filter the first items\n if start < 0:\n source = takelast.raw(source, abs(start))\n elif start > 0:\n source = skip.raw(source, start)\n # Filter the last items\n if stop is not None:\n if stop >= 0 and start < 0:\n raise ValueError(\n \"Positive stop with negative start is not supported\")\n elif stop >= 0:\n source = take.raw(source, stop - start)\n else:\n source = skiplast.raw(source, abs(stop))\n # Filter step items\n if step is not None:\n if step > 1:\n source = filterindex.raw(source, lambda i:
i % step == 0)\n elif step < 0:\n raise ValueError(\"Negative step not supported\")\n # Return\n return source"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nyield the n th element of an asynchronous sequence.", "response": "async def item(source, index):\n \"\"\"Forward the ``n``th element of an asynchronous sequence.\n\n The index can be negative and works like regular indexing.\n If the index is out of range, an ``IndexError`` is raised.\n \"\"\"\n # Prepare\n if index >= 0:\n source = skip.raw(source, index)\n else:\n source = takelast(source, abs(index))\n async with streamcontext(source) as streamer:\n # Get first item\n try:\n result = await anext(streamer)\n except StopAsyncIteration:\n raise IndexError(\"Index out of range\")\n # Check length\n if index < 0:\n count = 1\n async for _ in streamer:\n count += 1\n if count != abs(index):\n raise IndexError(\"Index out of range\")\n # Yield result\n yield result"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nforwarding one or several items from an asynchronous sequence.", "response": "def getitem(source, index):\n \"\"\"Forward one or several items from an asynchronous sequence.\n\n The argument can either be a slice or an integer.\n See the slice and item operators for more information.\n \"\"\"\n if isinstance(index, builtins.slice):\n return slice.raw(source, index.start, index.stop, index.step)\n if isinstance(index, int):\n return item.raw(source, index)\n raise TypeError(\"Not a valid index (int or slice)\")"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\nasync def takewhile(source, func):\n iscorofunc = asyncio.iscoroutinefunction(func)\n async with streamcontext(source) as streamer:\n async for item in streamer:\n result = func(item)\n if iscorofunc:\n result = await result\n if not result:\n return\n yield item", "response": "Forward an asynchronous sequence while a condition is
met."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\npopulating the pipe module dynamically.", "response": "def update_pipe_module():\n \"\"\"Populate the pipe module dynamically.\"\"\"\n module_dir = __all__\n operators = stream.__dict__\n for key, value in operators.items():\n if getattr(value, 'pipe', None):\n globals()[key] = value.pipe\n if key not in module_dir:\n module_dir.append(key)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\nasync def enumerate(source, start=0, step=1):\n count = itertools.count(start, step)\n async with streamcontext(source) as streamer:\n async for item in streamer:\n yield next(count), item", "response": "Iterate over the items in an asynchronous sequence."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef starmap(source, func, ordered=True, task_limit=None):\n if asyncio.iscoroutinefunction(func):\n async def starfunc(args):\n return await func(*args)\n else:\n def starfunc(args):\n return func(*args)\n return map.raw(source, starfunc, ordered=ordered, task_limit=task_limit)", "response": "Apply a given function to the unpacked elements of a sequence of items and return the results."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\niterates indefinitely over an asynchronous sequence.", "response": "async def cycle(source):\n \"\"\"Iterate indefinitely over an asynchronous sequence.\n\n Note: it does not perform any buffering, but re-iterate over\n the same given sequence instead. 
If the sequence is not\n re-iterable, the generator might end up looping indefinitely\n without yielding any item.\n \"\"\"\n while True:\n async with streamcontext(source) as streamer:\n async for item in streamer:\n yield item\n # Prevent blocking while loop if the stream is empty\n await asyncio.sleep(0)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ngenerates n - sized chunks from an asynchronous sequence.", "response": "async def chunks(source, n):\n \"\"\"Generate chunks of size ``n`` from an asynchronous sequence.\n\n The chunks are lists, and the last chunk might contain less than ``n``\n elements.\n \"\"\"\n async with streamcontext(source) as streamer:\n async for first in streamer:\n xs = select.take(create.preserve(streamer), n-1)\n yield [first] + await aggregate.list(xs)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\nasync def random(offset=0., width=1., interval=0.1):\n while True:\n await asyncio.sleep(interval)\n yield offset + width * random_module.random()", "response": "Generate a stream of random numbers."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\nasync def power(source, exponent):\n async with streamcontext(source) as streamer:\n async for item in streamer:\n yield item ** exponent", "response": "Raise the elements of an asynchronous sequence to the given power."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nyields items from an asynchronous sequence in time by the given interval.", "response": "async def spaceout(source, interval):\n \"\"\"Make sure the elements of an asynchronous sequence are separated\n in time by the given interval.\n \"\"\"\n timeout = 0\n loop = asyncio.get_event_loop()\n async with streamcontext(source) as streamer:\n async for item in streamer:\n delta = timeout - loop.time()\n delay = delta if delta > 0 else 0\n 
await asyncio.sleep(delay)\n yield item\n timeout = loop.time() + interval"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\nasync def timeout(source, timeout):\n async with streamcontext(source) as streamer:\n while True:\n try:\n item = await wait_for(anext(streamer), timeout)\n except StopAsyncIteration:\n break\n else:\n yield item", "response": "Raise a time-out if an element of the asynchronous sequence takes too long to arrive."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ndelaying the iteration of an asynchronous sequence.", "response": "async def delay(source, delay):\n \"\"\"Delay the iteration of an asynchronous sequence.\"\"\"\n await asyncio.sleep(delay)\n async with streamcontext(source) as streamer:\n async for item in streamer:\n yield item"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\nasync def chain(*sources):\n for source in sources:\n async with streamcontext(source) as streamer:\n async for item in streamer:\n yield item", "response": "Chain asynchronous sequences together, in the order they are given."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\nasync def zip(*sources):\n async with AsyncExitStack() as stack:\n # Handle resources\n streamers = [await stack.enter_async_context(streamcontext(source))\n for source in sources]\n # Loop over items\n while True:\n try:\n coros = builtins.map(anext, streamers)\n items = await asyncio.gather(*coros)\n except StopAsyncIteration:\n break\n else:\n yield tuple(items)", "response": "Yields the elements of several asynchronous sequences in a single asynchronous sequence."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\nasync def smap(source, func, *more_sources):\n if more_sources:\n source = zip(source, *more_sources)\n async with streamcontext(source) as streamer:\n
async for item in streamer:\n yield func(*item) if more_sources else func(item)", "response": "Apply a given function to the elements of one or several asynchronous sequences."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef amap(source, corofn, *more_sources, ordered=True, task_limit=None):\n\n def func(*args):\n return create.just(corofn(*args))\n\n if ordered:\n return advanced.concatmap.raw(\n source, func, *more_sources, task_limit=task_limit)\n return advanced.flatmap.raw(\n source, func, *more_sources, task_limit=task_limit)", "response": "Apply a given coroutine function to the elements of one or several asynchronous sequences."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef map(source, func, *more_sources, ordered=True, task_limit=None):\n if asyncio.iscoroutinefunction(func):\n return amap.raw(\n source, func, *more_sources,\n ordered=ordered, task_limit=task_limit)\n return smap.raw(source, func, *more_sources)", "response": "Apply a given function to the elements of one or several asynchronous sequences."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef iterate(it):\n if is_async_iterable(it):\n return from_async_iterable.raw(it)\n if isinstance(it, Iterable):\n return from_iterable.raw(it)\n raise TypeError(\n f\"{type(it).__name__!r} object is not (async) iterable\")", "response": "Generate values from a synchronous or asynchronous iterable."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef repeat(value, times=None, *, interval=0):\n args = () if times is None else (times,)\n it = itertools.repeat(value, *args)\n agen = from_iterable.raw(it)\n return time.spaceout.raw(agen, interval) if interval else agen", "response": "Generate the same value a given number of times."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that
can\ngenerate a given range of numbers.", "response": "def range(*args, interval=0):\n \"\"\"Generate a given range of numbers.\n\n It supports the same arguments as the builtin function.\n An optional interval can be given to space the values out.\n \"\"\"\n agen = from_iterable.raw(builtins.range(*args))\n return time.spaceout.raw(agen, interval) if interval else agen"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef count(start=0, step=1, *, interval=0):\n agen = from_iterable.raw(itertools.count(start, step))\n return time.spaceout.raw(agen, interval) if interval else agen", "response": "Generate consecutive numbers indefinitely."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef say(self, text):\n\n params = {\n \"input\": text,\n \"key\": self.key,\n \"cs\": self.cs,\n \"conversation_id\": self.convo_id,\n \"wrapper\": \"CleverWrap.py\"\n }\n\n reply = self._send(params)\n self._process_reply(reply)\n return self.output", "response": "Say something to www.cleverbot.com\n "} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nsending a request to www.cleverbot.
com", "response": "def _send(self, params):\n \"\"\"\n Make the request to www.cleverbot.com\n :type params: dict\n Returns: dict\n \"\"\"\n # Get a response\n try:\n r = requests.get(self.url, params=params)\n # catch errors, print then exit.\n except requests.exceptions.RequestException as e:\n print(e)\n return r.json(strict=False)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _process_reply(self, reply):\n self.cs = reply.get(\"cs\", None)\n self.count = int(reply.get(\"interaction_count\", None))\n self.output = reply.get(\"output\", None)\n self.convo_id = reply.get(\"conversation_id\", None)\n self.history = {key:value for key, value in reply.items() if key.startswith(\"interaction\")}\n self.time_taken = int(reply.get(\"time_taken\", None))\n self.time_elapsed = int(reply.get(\"time_elapsed\", None))", "response": "Take the www.cleverbot.com response and populate properties."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef end(self):\n '''Ends the tracer.\n\n May be called in any state. Transitions the state to ended and releases\n any SDK resources owned by this tracer (this includes only internal\n resources, things like passed-in\n :class:`oneagent.common.DbInfoHandle` need to be released manually).\n\n Prefer using the tracer as a context manager (i.e., with a\n :code:`with`-block) instead of manually calling this method.\n '''\n if self.handle is not None:\n self.nsdk.tracer_end(self.handle)\n self.handle = None", "response": "Ends the tracer.\n\n May be called in any state.
Transitions the state to ended and releases\n any SDK resources owned by this tracer (this includes only internal\n resources, things like passed-in\n :class:`oneagent.common.DbInfoHandle` need to be released manually).\n\n Prefer using the tracer as a context manager (i.e., with a\n :code:`with`-block) instead of manually calling this method."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef mark_failed(self, clsname, msg):\n '''Marks the tracer as failed with the given exception class name\n :code:`clsname` and message :code:`msg`.\n\n May only be called in the started state and only if the tracer is not\n already marked as failed. Note that this does not end the tracer! Once a\n tracer is marked as failed, attempts to do it again are forbidden.\n\n If possible, using the tracer as a context manager (i.e., with a\n :code:`with`-block) or :meth:`.mark_failed_exc` is more convenient than\n this method.\n\n :param str clsname: Fully qualified name of the exception type that\n caused the failure.\n :param str msg: Exception message that caused the failure.\n '''\n self.nsdk.tracer_error(self.handle, clsname, msg)", "response": "Marks the tracer as failed with the given exception class name clsname and message msg."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nmarks the tracer as failed with the given exception.", "response": "def mark_failed_exc(self, e_val=None, e_ty=None):\n '''Marks the tracer as failed with the given exception :code:`e_val` of\n type :code:`e_ty` (defaults to the current exception).\n\n May only be called in the started state and only if the tracer is not\n already marked as failed. Note that this does not end the tracer! 
Once a\n tracer is marked as failed, attempts to do it again are forbidden.\n\n If possible, using the tracer as a context manager (i.e., with a\n :code:`with`-block) is more convenient than this method.\n\n If :code:`e_val` and :code:`e_ty` are both none, the current exception\n (as retured by :func:`sys.exc_info`) is used.\n\n :param BaseException e_val: The exception object that caused the\n failure. If :code:`None`, the current exception value\n (:code:`sys.exc_info()[1]`) is used.\n :param type e_ty: The type of the exception that caused the failure. If\n :code:`None` the type of :code:`e_val` is used. If that is also\n :code:`None`, the current exception type (:code:`sys.exc_info()[0]`)\n is used.\n '''\n _error_from_exc(self.nsdk, self.handle, e_val, e_ty)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _get_kvc(kv_arg):\n '''Returns a tuple keys, values, count for kv_arg (which can be a dict or a\n tuple containing keys, values and optinally count.'''\n if isinstance(kv_arg, Mapping):\n return six.iterkeys(kv_arg), six.itervalues(kv_arg), len(kv_arg)\n assert 2 <= len(kv_arg) <= 3, \\\n 'Argument must be a mapping or a sequence (keys, values, [len])'\n return (\n kv_arg[0],\n kv_arg[1],\n kv_arg[2] if len(kv_arg) == 3 else len(kv_arg[0]))", "response": "Returns a tuple keys values count for kv_arg which can be a dict or a\n tuple containing keys values and optinally count for kv_arg."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef create_database_info(\n self,\n name,\n vendor,\n channel):\n '''Creates a database info with the given information for use with\n :meth:`trace_sql_database_request`.\n\n :param str name: The name (e.g., connection string) of the database.\n :param str vendor: The type of the database (e.g., sqlite, PostgreSQL,\n MySQL).\n :param Channel channel: The channel used to communicate with the\n database.\n :returns: A new handle, holding 
the given database information.\n :rtype: DbInfoHandle\n '''\n return DbInfoHandle(self._nsdk, self._nsdk.databaseinfo_create(\n name, vendor, channel.type_, channel.endpoint))", "response": "Creates a new database info handle with the given information for use with the\n database."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef create_web_application_info(\n self, virtual_host, application_id, context_root):\n '''Creates a web application info for use with\n :meth:`trace_incoming_web_request`.\n\n See\n \n for more information about the meaning of the parameters.\n\n :param str virtual_host: The logical name of the web server that hosts\n the application.\n :param str application_id: A unique ID for the web application. This\n will also be used as the display name.\n :param str context_root: The context root of the web application. This\n is the common path prefix for requests which will be routed to the\n web application.\n\n If all requests to this server are routed to this application, use\n a slash :code:`'/'`.\n :rtype: WebapplicationInfoHandle\n '''\n return WebapplicationInfoHandle(\n self._nsdk, self._nsdk.webapplicationinfo_create(\n virtual_host, application_id, context_root))", "response": "Creates a web application info for use with :meth:`trace_incoming_web_request`."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef trace_sql_database_request(self, database, sql):\n '''Create a tracer for the given database info and SQL statement.\n\n :param DbInfoHandle database: Database information (see\n :meth:`create_database_info`).\n :param str sql: The SQL statement to trace.\n :rtype: tracers.DatabaseRequestTracer\n '''\n assert isinstance(database, DbInfoHandle)\n return tracers.DatabaseRequestTracer(\n self._nsdk,\n self._nsdk.databaserequesttracer_create_sql(database.handle, sql))", "response": "Create a tracer for the given database info and SQL statement."}
{"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef trace_incoming_web_request(\n self,\n webapp_info,\n url,\n method,\n headers=None,\n remote_address=None,\n str_tag=None,\n byte_tag=None):\n '''Create a tracer for an incoming webrequest.\n\n :param WebapplicationInfoHandle webapp_info: Web application\n information (see :meth:`create_web_application_info`).\n :param str url: The requested URL (including scheme, hostname/port,\n path and query).\n :param str method: The HTTP method of the request (e.g., GET or\n POST).\n :param headers: The HTTP headers of the request. Can be either a\n dictionary mapping header name to value (:class:`str` to\n :class:`str`) or a tuple containing a sequence of string header\n names as first element, an equally long sequence of\n corresponding values as second element and optionally a count as\n third element (this will default to the :func:`len` of the\n header names).\n\n Some headers can appear multiple times in an HTTP request. To\n capture all the values, either use the tuple-form and provide\n the name and corresponding values for each, or if possible for\n that particular header, set the value to an appropriately\n concatenated string.\n\n .. warning:: If you use Python 2, be sure to use the UTF-8 encoding\n or the :class:`unicode` type! 
See :ref:`here\n ` for more information.\n :type headers: \\\n dict[str, str] or \\\n tuple[~typing.Collection[str], ~typing.Collection[str]] or \\\n tuple[~typing.Iterable[str], ~typing.Iterable[str], int]]\n :param str remote_address: The remote (client) IP address (of the\n peer of the socket connection via which the request was\n received).\n\n The remote address is useful to gain information about load\n balancers, proxies and ultimately the end user that is sending\n the request.\n\n For the other parameters, see :ref:`tagging`.\n\n :rtype: tracers.IncomingWebRequestTracer\n '''\n assert isinstance(webapp_info, WebapplicationInfoHandle)\n result = tracers.IncomingWebRequestTracer(\n self._nsdk,\n self._nsdk.incomingwebrequesttracer_create(\n webapp_info.handle, url, method))\n if not result:\n return result\n try:\n if headers:\n self._nsdk.incomingwebrequesttracer_add_request_headers(\n result.handle, *_get_kvc(headers))\n if remote_address:\n self._nsdk.incomingwebrequesttracer_set_remote_address(\n result.handle, remote_address)\n self._applytag(result, str_tag, byte_tag)\n except:\n result.end()\n raise\n return result", "response": "Create a tracer for an incoming webrequest."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef trace_outgoing_web_request(self, url, method, headers=None):\n '''Create a tracer for an outgoing webrequest.\n\n :param str url: The request URL (including scheme, hostname/port, path and query).\n :param str method: The HTTP method of the request (e.g., GET or POST).\n :param headers: The HTTP headers of the request. 
Can be either a\n dictionary mapping header name to value (:class:`str` to\n :class:`str`) or a tuple containing a sequence of string header\n names as first element, an equally long sequence of\n corresponding values as second element and optionally a count as\n third element (this will default to the :func:`len` of the\n header names).\n\n Some headers can appear multiple times in an HTTP request. To\n capture all the values, either use the tuple-form and provide\n the name and corresponding values for each, or if possible for\n that particular header, set the value to an appropriately\n concatenated string.\n\n .. warning:: If you use Python 2, be sure to use the UTF-8 encoding\n or the :class:`unicode` type! See :ref:`here\n ` for more information.\n :type headers: \\\n dict[str, str] or \\\n tuple[~typing.Collection[str], ~typing.Collection[str]] or \\\n tuple[~typing.Iterable[str], ~typing.Iterable[str], int]]\n\n :rtype: tracers.OutgoingWebRequestTracer\n\n .. versionadded:: 1.1.0\n '''\n result = tracers.OutgoingWebRequestTracer(\n self._nsdk, self._nsdk.outgoingwebrequesttracer_create(url, method))\n\n if not result:\n return result\n\n try:\n if headers:\n self._nsdk.outgoingwebrequesttracer_add_request_headers(result.handle,\n *_get_kvc(headers))\n except:\n result.end()\n raise\n\n return result", "response": "Create a tracer for an outgoing webrequest."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ncreates a tracer for outgoing remote calls.", "response": "def trace_outgoing_remote_call(\n self,\n method,\n service,\n endpoint,\n channel,\n protocol_name=None):\n '''Creates a tracer for outgoing remote calls.\n\n :param str method: The name of the service method/operation.\n :param str service: The name of the service class/type.\n :param str endpoint: A string identifying the \"instance\" of the
See also `the general documentation on service\n endpoints`__.\n :param Channel channel: The channel used to communicate with the\n service.\n :param str protocol_name: The name of the remoting protocol (on top of\n the communication protocol specified in :code:`channel.type_`.) that\n is used to to communicate with the service (e.g., RMI, Protobuf,\n ...).\n\n __ \\\n https://github.com/Dynatrace/OneAgent-SDK#common-concepts-service-endpoints-and-communication-endpoints\n\n :rtype: tracers.OutgoingRemoteCallTracer\n '''\n result = tracers.OutgoingRemoteCallTracer(\n self._nsdk,\n self._nsdk.outgoingremotecalltracer_create(\n method,\n service,\n endpoint,\n channel.type_,\n channel.endpoint))\n if protocol_name is not None:\n self._nsdk.outgoingremotecalltracer_set_protocol_name(\n result.handle, protocol_name)\n return result"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ncreates a tracer for incoming remote calls.", "response": "def trace_incoming_remote_call(\n self,\n method,\n name,\n endpoint,\n protocol_name=None,\n str_tag=None,\n byte_tag=None):\n '''Creates a tracer for incoming remote calls.\n\n For the parameters, see :ref:`tagging` (:code:`str_tag` and\n :code:`byte_tag`) and :meth:`trace_outgoing_remote_call` (all others).\n\n :rtype: tracers.IncomingRemoteCallTracer\n '''\n result = tracers.IncomingRemoteCallTracer(\n self._nsdk,\n self._nsdk.incomingremotecalltracer_create(method, name, endpoint))\n if protocol_name is not None:\n self._nsdk.incomingremotecalltracer_set_protocol_name(\n result.handle, protocol_name)\n self._applytag(result, str_tag, byte_tag)\n return result"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef trace_in_process_link(self, link_bytes):\n '''Creates a tracer for tracing asynchronous related processing in the same process.\n\n For more information see :meth:`create_in_process_link`.\n\n :param bytes 
link_bytes: An in-process link created using :meth:`create_in_process_link`.\n\n :rtype: tracers.InProcessLinkTracer\n\n .. versionadded:: 1.1.0\n '''\n return tracers.InProcessLinkTracer(self._nsdk,\n self._nsdk.trace_in_process_link(link_bytes))", "response": "Creates a tracer for tracing asynchronous related processing in the same process."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nadd a custom request attribute to the current active tracer.", "response": "def add_custom_request_attribute(self, key, value):\n '''Adds a custom request attribute to the current active tracer.\n\n :param str key: The name of the custom request attribute, the name is mandatory and\n may not be None.\n :param value: The value of the custom request attribute. Currently supported types\n are integer, float and string values. The value is mandatory and may\n not be None.\n :type value: str or int or float\n\n .. versionadded:: 1.1.0\n '''\n\n if isinstance(value, int):\n self._nsdk.customrequestattribute_add_integer(key, value)\n elif isinstance(value, float):\n self._nsdk.customrequestattribute_add_float(key, value)\n elif isinstance(value, six.string_types):\n self._nsdk.customrequestattribute_add_string(key, value)\n else:\n warn = self._nsdk.agent_get_logging_callback()\n if warn:\n warn('Can\\'t add custom request attribute \\'{0}\\' '\n 'because the value type \\'{1}\\' is not supported!'.format(key, type(value)))"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef close(self):\n '''Closes the handle, if it is still open.\n\n Usually, you should prefer using the handle as a context manager to\n calling :meth:`close` manually.'''\n if self.handle is not None:\n self.close_handle(self.nsdk, self.handle)\n self.handle = None", "response": "Closes the handle if it is still open."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nyielding all direct and indirect 
children with LINK_CHILD.", "response": "def all_original_children(self):\n '''Yields all (direct and indirect) children with LINK_CHILD.'''\n return chain.from_iterable(\n c.all_nodes_in_subtree()\n for lnk, c in self.children\n if lnk == self.LINK_CHILD)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ncreates a list of SDK options for use with the SDK opts parameter of .", "response": "def sdkopts_from_commandline(argv=None, remove=False, prefix='--dt_'):\n '''Creates a SDK option list for use with the :code:`sdkopts` parameter of\n :func:`.initialize` from a list :code:`argv` of command line parameters.\n\n An element in :code:`argv` is treated as an SDK option if starts with\n :code:`prefix`. The return value of this function will then contain the\n remainder of that parameter (without the prefix). If :code:`remove` is\n :data:`True`, these arguments will be removed from :code:`argv`.\n\n :param argv: An iterable of command line parameter\n strings. Defaults to :data:`sys.argv`. Must be a\n :obj:`~typing.MutableSequence` if :code:`remove` is :data:`True`.\n :type argv: ~typing.Iterable[str] or ~typing.MutableSequence[str]\n :param bool remove: Whether to remove a command line parameter that was\n recognized as an SDK option from :code:`argv` (if :data:`True`) or leave\n :code:`argv` unmodified (if :data:`False`). 
If :data:`True`,\n :code:`argv` must be a :obj:`~typing.MutableSequence`.\n :param str prefix: The prefix string by which SDK options are recognized and\n which is removed from the copy of the command line parameter that is\n added to the return value.\n\n :rtype: list[str]\n '''\n\n if argv is None:\n argv = sys.argv\n\n if not remove:\n return [param[len(prefix):] for param in argv\n if param.startswith(prefix)]\n result = []\n for i in range(len(argv) - 1, -1, -1):\n if argv[i].startswith(prefix):\n result.append(argv[i][len(prefix):])\n del argv[i]\n result.reverse()\n return result"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nattempt to initialize the SDK with the specified options.", "response": "def initialize(sdkopts=(), sdklibname=None):\n '''Attempts to initialize the SDK with the specified options.\n\n Even if initialization fails, a dummy SDK will be available so that SDK\n functions can be called but will do nothing.\n\n If you call this function multiple times, you must call :func:`shutdown`\n just as many times. The options from all but the first :code:`initialize` call\n will be ignored (the return value will have the\n :data:`InitResult.STATUS_ALREADY_INITIALIZED` status code in that case).\n\n :param sdkopts: A sequence of strings of the form\n :samp:`{NAME}={VALUE}` that set the given SDK options. Ignored in all but\n the first :code:`initialize` call.\n :type sdkopts: ~typing.Iterable[str]\n :param str sdklibname: The file or directory name of the native C SDK\n DLL. If None, the shared library packaged directly with the agent is\n used. 
Using a value other than None is only acceptable for debugging.\n You are responsible for providing a native SDK version that matches the\n Python SDK version.\n\n :rtype: .InitResult\n '''\n\n global _sdk_ref_count #pylint:disable=global-statement\n global _sdk_instance #pylint:disable=global-statement\n\n with _sdk_ref_lk:\n logger.debug(\"initialize: ref count = %d\", _sdk_ref_count)\n result = _try_init_noref(sdkopts, sdklibname)\n if _sdk_instance is None:\n _sdk_instance = SDK(try_get_sdk())\n _sdk_ref_count += 1\n return result"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nshuts down the SDK.", "response": "def shutdown():\n '''Shut down the SDK.\n\n :returns: An exception object if an error occurred, a falsy value otherwise.\n\n :rtype: Exception\n '''\n global _sdk_ref_count #pylint:disable=global-statement\n global _sdk_instance #pylint:disable=global-statement\n global _should_shutdown #pylint:disable=global-statement\n\n with _sdk_ref_lk:\n logger.debug(\"shutdown: ref count = %d, should_shutdown = %s\", \\\n _sdk_ref_count, _should_shutdown)\n nsdk = nativeagent.try_get_sdk()\n if not nsdk:\n logger.warning('shutdown: SDK not initialized or already shut down')\n _sdk_ref_count = 0\n return None\n if _sdk_ref_count > 1:\n logger.debug('shutdown: reference count is now %d', _sdk_ref_count)\n _sdk_ref_count -= 1\n return None\n logger.info('shutdown: Shutting down SDK.')\n try:\n if _should_shutdown:\n _rc = nsdk.shutdown()\n if _rc == ErrorCode.NOT_INITIALIZED:\n logger.warning('shutdown: native SDK was not initialized')\n else:\n nativeagent.checkresult(nsdk, _rc, 'shutdown')\n _should_shutdown = False\n except SDKError as e:\n logger.warning('shutdown failed', exc_info=sys.exc_info())\n return e\n _sdk_ref_count = 0\n _sdk_instance = None\n nativeagent._force_initialize(None) #pylint:disable=protected-access\n logger.debug('shutdown: completed')\n return None"} {"SOURCE": "codesearchnet", 
"instruction": "How would you code a function in Python 3 to\nattach appropriate error information to the current exception.", "response": "def error_from_exc(nsdk, tracer_h, e_val=None, e_ty=None):\n \"\"\"Attach appropriate error information to tracer_h.\n\n If e_val and e_ty are None, the current exception is used.\"\"\"\n\n if not tracer_h:\n return\n\n if e_ty is None and e_val is None:\n e_ty, e_val = sys.exc_info()[:2]\n if e_ty is None and e_val is not None:\n e_ty = type(e_val)\n nsdk.tracer_error(tracer_h, getfullname(e_ty), str(e_val))"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef waitinglist_entry_form(context):\n initial = {}\n if \"request\" in context:\n initial.update({\n \"referrer\": context[\"request\"].META.get(\"HTTP_REFERER\", \"\"),\n \"campaign\": context[\"request\"].GET.get(\"wlc\", \"\")\n })\n return WaitingListEntryForm(initial=initial)", "response": "Return a new WaitingListEntryForm object to post a new comment."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _host():\n host_and_port = request.urlparts[1]\n try:\n host, _ = host_and_port.split(':')\n except ValueError:\n # No port yet. 
Host defaults to '127.0.0.1' in bottle.request.\n return DEFAULT_BIND\n return host or DEFAULT_BIND", "response": "Get the Host from the most recent HTTP request."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef wait_for(port_num, timeout):\n logger.debug(\"wait for {port_num}\".format(**locals()))\n t_start = time.time()\n sleeps = 0.1\n while time.time() - t_start < timeout:\n try:\n s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)\n try:\n s.connect((_host(), port_num))\n return True\n except (IOError, socket.error):\n time.sleep(sleeps)\n finally:\n s.close()\n return False", "response": "waits for a process to start."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nrepair mongodb after unsafe shutdown", "response": "def repair_mongo(name, dbpath):\n \"\"\"repair mongodb after unsafe shutdown\"\"\"\n log_file = os.path.join(dbpath, 'mongod.log')\n cmd = [name, \"--dbpath\", dbpath, \"--logpath\", log_file, \"--logappend\",\n \"--repair\"]\n proc = subprocess.Popen(\n cmd, universal_newlines=True,\n stdout=subprocess.PIPE, stderr=subprocess.STDOUT)\n timeout = 45\n t_start = time.time()\n while time.time() - t_start < timeout:\n line = str(proc.stdout.readline())\n logger.info(\"repair output: %s\" % (line,))\n return_code = proc.poll()\n if return_code is not None:\n if return_code:\n raise Exception(\"mongod --repair failed with exit code %s, \"\n \"check log file: %s\" % (return_code, log_file))\n # Success when poll() returns 0\n return\n time.sleep(1)\n proc.terminate()\n raise Exception(\"mongod --repair failed to exit after %s seconds, \"\n \"check log file: %s\" % (timeout, log_file))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nstarting a new process with params from config_path.", "response": "def mprocess(name, config_path, port=None, timeout=180, silence_stdout=True):\n \"\"\"start 'name' process with params from config_path.\n Args:\n name - 
process name or path\n config_path - path to file where should be stored configuration\n port - process's port\n timeout - specify how long, in seconds, a command can take before it times out.\n if timeout <=0 - doesn't wait for the process to start completely\n silence_stdout - if True (default), redirect stdout to /dev/null\n return tuple (Popen object, host) if process started, return (None, None) if not\n \"\"\"\n\n logger.debug(\n "mprocess(name={name!r}, config_path={config_path!r}, port={port!r}, \"\n \"timeout={timeout!r})\".format(**locals()))\n if not (config_path and isinstance(config_path, str) and os.path.exists(config_path)):\n raise OSError(\"can't find config file {config_path}\".format(**locals()))\n\n cfg = read_config(config_path)\n cmd = [name, \"--config\", config_path]\n\n if cfg.get('port', None) is None or port:\n port = port or PortPool().port(check=True)\n cmd.extend(['--port', str(port)])\n host = \"{host}:{port}\".format(host=_host(), port=port)\n try:\n logger.debug(\"execute process: %s\", ' '.join(cmd))\n proc = subprocess.Popen(\n cmd,\n stdout=DEVNULL if silence_stdout else None,\n stderr=subprocess.STDOUT)\n\n if proc.poll() is not None:\n logger.debug(\"process is not alive\")\n raise OSError(\"Process started, but died immediately.\")\n except (OSError, TypeError) as err:\n message = \"exception while executing process: {err}\".format(err=err)\n logger.debug(message)\n raise OSError(message)\n if timeout > 0 and wait_for(port, timeout):\n logger.debug(\"process '{name}' has started: pid={proc.pid}, host={host}\".format(**locals()))\n return (proc, host)\n elif timeout > 0:\n logger.debug(\"hasn't connected to pid={proc.pid} with host={host} during timeout {timeout} \".format(**locals()))\n logger.debug(\"terminate process with pid={proc.pid}\".format(**locals()))\n kill_mprocess(proc)\n proc_alive(proc) and time.sleep(3) # wait until the process has stopped\n message = (\"Could not connect to process during \"\n \"{timeout} 
seconds\".format(timeout=timeout))\n raise TimeoutError(message, errno.ETIMEDOUT)\n return (proc, host)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef wait_mprocess(process, timeout):\n if PY3:\n try:\n return process.wait(timeout=timeout)\n except subprocess.TimeoutExpired as exc:\n raise TimeoutError(str(exc))\n\n # On Python 2, simulate the timeout parameter and raise TimeoutError.\n start = time.time()\n while True:\n exit_code = process.poll()\n if exit_code is not None:\n return exit_code\n if time.time() - start > timeout:\n raise TimeoutError(\"Process %s timed out after %s seconds\" %\n (process.pid, timeout))\n time.sleep(0.05)", "response": "Compatibility function for waiting on a process with a timeout."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nkills a process. Returns True if the process was killed, False otherwise", "response": "def kill_mprocess(process):\n \"\"\"kill process\n Args:\n process - Popen object for process\n \"\"\"\n if process and proc_alive(process):\n process.terminate()\n process.communicate()\n return not proc_alive(process)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nremove all process's stuff", "response": "def cleanup_mprocess(config_path, cfg):\n \"\"\"remove all process's stuff\n Args:\n config_path - process's options file\n cfg - process's config\n \"\"\"\n for key in ('keyFile', 'logPath', 'dbpath'):\n remove_path(cfg.get(key, None))\n isinstance(config_path, str) and os.path.exists(config_path) and remove_path(config_path)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nremove path from file system", "response": "def remove_path(path):\n \"\"\"remove path from file system\n If path is None - do nothing\"\"\"\n if path is None or not os.path.exists(path):\n return\n if platform.system() == 'Windows':\n # Need to have write permission 
before deleting the file.\n os.chmod(path, stat.S_IWRITE)\n try:\n if os.path.isdir(path):\n shutil.rmtree(path)\n elif os.path.isfile(path):\n shutil.os.remove(path)\n except OSError:\n logger.exception(\"Could not remove path: %s\" % path)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef write_config(params, config_path=None):\n if config_path is None:\n config_path = tempfile.mktemp(prefix=\"mongo-\")\n\n cfg = params.copy()\n if 'setParameter' in cfg:\n set_parameters = cfg.pop('setParameter')\n try:\n for key, value in set_parameters.items():\n cfg['setParameter = ' + key] = value\n except AttributeError:\n reraise(RequestError,\n 'Not a valid value for setParameter: %r '\n 'Expected \"setParameter\": { : value, ...}'\n % set_parameters)\n\n # fix boolean value\n for key, value in cfg.items():\n if isinstance(value, bool):\n cfg[key] = json.dumps(value)\n\n with open(config_path, 'w') as fd:\n data = '\\n'.join('%s=%s' % (key, item) for key, item in cfg.items())\n fd.write(data)\n return config_path", "response": "write a config file to a file"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreading config_path and return options as dictionary", "response": "def read_config(config_path):\n \"\"\"read config_path and return options as dictionary\"\"\"\n result = {}\n with open(config_path, 'r') as fd:\n for line in fd.readlines():\n if '=' in line:\n key, value = line.split('=', 1)\n try:\n result[key] = json.loads(value)\n except ValueError:\n result[key] = value.rstrip('\\n')\n return result"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef __check_port(self, port):\n s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)\n try:\n s.bind((_host(), port))\n return True\n except socket.error:\n return False\n finally:\n s.close()", "response": "check port status return True if port is free False 
otherwise"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef release_port(self, port):\n if port in self.__closed:\n self.__closed.remove(port)\n self.__ports.add(port)", "response": "release a port from the cache"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef port(self, check=False):\n if not self.__ports: # refresh ports if sequence is empty\n self.refresh()\n\n try:\n port = self.__ports.pop()\n if check:\n while not self.__check_port(port):\n self.release_port(port)\n port = self.__ports.pop()\n except (IndexError, KeyError):\n raise IndexError(\"Could not find a free port,\\nclosed ports: {closed}\".format(closed=self.__closed))\n self.__closed.add(port)\n return port", "response": "Returns next opened port in the sequence."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef refresh(self, only_closed=False):\n if only_closed:\n opened = filter(self.__check_port, self.__closed)\n self.__closed = self.__closed.difference(opened)\n self.__ports = self.__ports.union(opened)\n else:\n ports = self.__closed.union(self.__ports)\n self.__ports = set(filter(self.__check_port, ports))\n self.__closed = ports.difference(self.__ports)", "response": "refresh ports status\n Args:\n only_closed - check status only for closed ports"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef change_range(self, min_port=1025, max_port=2000, port_sequence=None):\n self.__init_range(min_port, max_port, port_sequence)", "response": "change Pool port range"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef setup_versioned_routes(routes, version=None):\n prefix = '/' + version if version else \"\"\n for r in routes:\n path, method = r\n route(prefix + 
path, method, routes[r])", "response": "Set up routes with a version prefix."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef daemonize_posix(self):\n logger.info('daemonize_posix')\n try:\n pid = os.fork()\n if pid > 0:\n logger.debug('forked first child, pid = %d' % (pid,))\n return pid\n logger.debug('in child after first fork, pid = %d' % (pid, ))\n except OSError as error:\n logger.exception('fork #1')\n sys.stderr.write(\"fork #1 failed: %d (%s)\\n\" % (error.errno, error.strerror))\n sys.exit(1)\n\n # decouple from parent environment\n os.chdir(\"/\")\n os.setsid()\n os.umask(0)\n\n # do second fork\n try:\n pid = os.fork()\n if pid > 0:\n # exit from second parent\n logger.debug('forked second child, pid = %d, exiting' % (pid,))\n sys.exit(0)\n except OSError as error:\n logger.exception('fork #2')\n sys.stderr.write(\"fork #2 failed: %d (%s)\\n\" % (error.errno, error.strerror))\n sys.exit(1)\n\n # redirect standard file descriptors\n logger.info('daemonized, pid = %d' % (pid, ))\n sys.stdin.flush()\n sys.stdout.flush()\n sys.stderr.flush()\n\n os.dup2(self.stdin.fileno(), sys.stdin.fileno())\n os.dup2(self.stdout.fileno(), sys.stdout.fileno())\n os.dup2(self.stderr.fileno(), sys.stderr.fileno())\n\n # write pidfile\n atexit.register(self.delpid)\n pid = str(os.getpid())\n with open(self.pidfile, 'w+') as fd:\n fd.write(\"%s\\n\" % pid)", "response": "Daemonize the current process."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nstop the daemon process", "response": "def stop(self):\n \"\"\"\n Stop the daemon\n \"\"\"\n # Get the pid from the pidfile\n logger.debug(\"reading %s\" % (self.pidfile,))\n try:\n with open(self.pidfile, 'r') as fd:\n pid = int(fd.read().strip())\n except IOError:\n logger.exception(\"reading %s\" % (self.pidfile, ))\n pid = None\n\n if not pid:\n message = \"pidfile %s does not exist. 
Daemon not running?\\n\"\n sys.stderr.write(message % self.pidfile)\n return # not an error in a restart\n\n if os.name == \"nt\":\n subprocess.call([\"taskkill\", \"/f\", \"/t\", \"/pid\", str(pid)])\n\n if os.path.exists(self.pidfile):\n os.remove(self.pidfile)\n else:\n # Try killing the daemon process\n try:\n os.kill(pid, SIGTERM)\n except OSError as err:\n err = str(err)\n if err.find(\"No such process\") > 0:\n if os.path.exists(self.pidfile):\n os.remove(self.pidfile)\n else:\n raise"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nremoving all members without reconfig", "response": "def cleanup(self):\n \"\"\"remove all members without reconfig\"\"\"\n for item in self.server_map:\n self.member_del(item, reconfig=False)\n self.server_map.clear()"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning member id by hostname", "response": "def host2id(self, hostname):\n \"\"\"return member id by hostname\"\"\"\n for key, value in self.server_map.items():\n if value == hostname:\n return key"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nupdates server_map with config", "response": "def update_server_map(self, config):\n \"\"\"update server_map ({member_id:hostname})\"\"\"\n self.server_map = dict([(member['_id'], member['host']) for member in config['members']])"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef repl_init(self, config):\n self.update_server_map(config)\n # init_server - server which can init replica set\n init_server = [member['host'] for member in config['members']\n if not (member.get('arbiterOnly', False)\n or member.get('priority', 1) == 0)][0]\n\n servers = [member['host'] for member in config['members']]\n if not self.wait_while_reachable(servers):\n logger.error(\"all servers must be reachable\")\n self.cleanup()\n return False\n try:\n result = 
self.connection(init_server).admin.command(\"replSetInitiate\", config)\n logger.debug(\"replica init result: {result}\".format(**locals()))\n except pymongo.errors.PyMongoError:\n raise\n if int(result.get('ok', 0)) == 1:\n # Wait while members come up\n return self.waiting_member_state()\n else:\n self.cleanup()\n return False", "response": "create replica set by config\nreturn True if replica set created successfully else False"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef reset(self):\n # Need to use self.server_map, in case no Servers are left running.\n for member_id in self.server_map:\n host = self.member_id_to_host(member_id)\n server_id = self._servers.host_to_server_id(host)\n # Reset each member.\n self._servers.command(server_id, 'reset')\n # Wait for all members to have a state of 1, 2, or 7.\n # Note that this also waits for a primary to become available.\n self.waiting_member_state()\n # Wait for Server states to match the config from the primary.\n self.waiting_config_state()\n return self.info()", "response": "Reset all members and servers."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning information about replica set", "response": "def info(self):\n \"\"\"return information about replica set\"\"\"\n hosts = ','.join(x['host'] for x in self.members())\n mongodb_uri = 'mongodb://' + hosts + '/?replicaSet=' + self.repl_id\n result = {\"id\": self.repl_id,\n \"auth_key\": self.auth_key,\n \"members\": self.members(),\n \"mongodb_uri\": mongodb_uri,\n \"orchestration\": 'replica_sets'}\n if self.login:\n # Add replicaSet URI parameter.\n uri = ('%s&replicaSet=%s'\n % (self.mongodb_auth_uri(hosts), self.repl_id))\n result['mongodb_auth_uri'] = uri\n return result"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ncreates a new mongod instance and adds it to the replica set. 
Returns True if operation success otherwise False.", "response": "def repl_member_add(self, params):\n \"\"\"create new mongod instances and add it to the replica set.\n Args:\n params - mongod params\n return True if operation success otherwise False\n \"\"\"\n repl_config = self.config\n member_id = max([member['_id'] for member in repl_config['members']]) + 1\n member_config = self.member_create(params, member_id)\n repl_config['members'].append(member_config)\n if not self.repl_update(repl_config):\n self.member_del(member_id, reconfig=True)\n raise ReplicaSetError(\"Could not add member to ReplicaSet.\")\n return member_id"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nruns command on replica set", "response": "def run_command(self, command, arg=None, is_eval=False, member_id=None):\n \"\"\"run command on replica set\n if member_id is specified command will be execute on this server\n if member_id is not specified command will be execute on the primary\n\n Args:\n command - command string\n arg - command argument\n is_eval - if True execute command as eval\n member_id - member id\n\n return command's result\n \"\"\"\n logger.debug(\"run_command({command}, {arg}, {is_eval}, {member_id})\".format(**locals()))\n mode = is_eval and 'eval' or 'command'\n hostname = None\n if isinstance(member_id, int):\n hostname = self.member_id_to_host(member_id)\n result = getattr(self.connection(hostname=hostname).admin, mode)(command, arg)\n logger.debug(\"command result: {result}\".format(result=result))\n return result"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef config(self):\n try:\n admin = self.connection().admin\n config = admin.command('replSetGetConfig')['config']\n except pymongo.errors.OperationFailure:\n # replSetGetConfig was introduced in 2.7.5.\n config = self.connection().local.system.replset.find_one()\n return config", "response": "return replica set config use rs. 
conf command"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nstarts new mongod instances as part of replica set", "response": "def member_create(self, params, member_id):\n \"\"\"start new mongod instances as part of replica set\n Args:\n params - member params\n member_id - member index\n\n return member config\n \"\"\"\n member_config = params.get('rsParams', {})\n server_id = params.pop('server_id', None)\n version = params.pop('version', self._version)\n proc_params = {'replSet': self.repl_id}\n proc_params.update(params.get('procParams', {}))\n if self.enable_ipv6:\n enable_ipv6_single(proc_params)\n # Make sure that auth isn't set the first time we start the servers.\n proc_params = self._strip_auth(proc_params)\n\n # Don't pass in auth_key the first time we start the servers.\n server_id = self._servers.create(\n name='mongod',\n procParams=proc_params,\n sslParams=self.sslParams,\n version=version,\n server_id=server_id\n )\n member_config.update({\"_id\": member_id,\n \"host\": self._servers.hostname(server_id)})\n return member_config"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nremove a member from the replica set.", "response": "def member_del(self, member_id, reconfig=True):\n \"\"\"remove member from replica set\n Args:\n member_id - member index\n reconfig - is need reconfig replica\n\n return True if operation success otherwise False\n \"\"\"\n server_id = self._servers.host_to_server_id(\n self.member_id_to_host(member_id))\n if reconfig and member_id in [member['_id'] for member in self.members()]:\n config = self.config\n config['members'].pop(member_id)\n self.repl_update(config)\n self._servers.remove(server_id)\n return True"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nupdate member s values with reconfig replica", "response": "def member_update(self, member_id, params):\n \"\"\"update member's values with reconfig replica\n 
Args:\n member_id - member index\n params - updates member params\n\n return True if operation success otherwise False\n \"\"\"\n config = self.config\n config['members'][member_id].update(params.get(\"rsParams\", {}))\n return self.repl_update(config)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn information about a member", "response": "def member_info(self, member_id):\n \"\"\"return information about member\"\"\"\n server_id = self._servers.host_to_server_id(\n self.member_id_to_host(member_id))\n server_info = self._servers.info(server_id)\n result = {'_id': member_id, 'server_id': server_id,\n 'mongodb_uri': server_info['mongodb_uri'],\n 'procInfo': server_info['procInfo'],\n 'statuses': server_info['statuses']}\n if self.login:\n result['mongodb_auth_uri'] = self.mongodb_auth_uri(\n self._servers.hostname(server_id))\n result['rsInfo'] = {}\n if server_info['procInfo']['alive']:\n # Can't call serverStatus on arbiter when running with auth enabled.\n # (SERVER-5479)\n if self.login or self.auth_key:\n arbiter_ids = [member['_id'] for member in self.arbiters()]\n if member_id in arbiter_ids:\n result['rsInfo'] = {\n 'arbiterOnly': True, 'secondary': False, 'primary': False}\n return result\n repl = self.run_command('serverStatus', arg=None, is_eval=False, member_id=member_id)['repl']\n logger.debug(\"member {member_id} repl info: {repl}\".format(**locals()))\n for key in ('votes', 'tags', 'arbiterOnly', 'buildIndexes', 'hidden', 'priority', 'slaveDelay', 'votes', 'secondary'):\n if key in repl:\n result['rsInfo'][key] = repl[key]\n result['rsInfo']['primary'] = repl.get('ismaster', False)\n\n return result"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef member_command(self, member_id, command):\n server_id = self._servers.host_to_server_id(\n self.member_id_to_host(member_id))\n return self._servers.command(server_id, command)", "response": "apply command ( start stop 
restart to member instance of replica set\n "} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns list of members information", "response": "def members(self):\n \"\"\"return list of members information\"\"\"\n result = list()\n for member in self.run_command(command=\"replSetGetStatus\", is_eval=False)['members']:\n result.append({\n \"_id\": member['_id'],\n \"host\": member[\"name\"],\n \"server_id\": self._servers.host_to_server_id(member[\"name\"]),\n \"state\": member['state']\n })\n return result"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreturns all members of replica set in specific state", "response": "def get_members_in_state(self, state):\n \"\"\"return all members of replica set in specific state\"\"\"\n members = self.run_command(command='replSetGetStatus', is_eval=False)['members']\n return [member['name'] for member in members if member['state'] == state]"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nauthenticates the client if necessary.", "response": "def _authenticate_client(self, client):\n \"\"\"Authenticate the client if necessary.\"\"\"\n if self.login and not self.restart_required:\n try:\n db = client[self.auth_source]\n if self.x509_extra_user:\n db.authenticate(\n DEFAULT_SUBJECT,\n mechanism='MONGODB-X509'\n )\n else:\n db.authenticate(\n self.login, self.password)\n except Exception:\n logger.exception(\n \"Could not authenticate to %r as %s/%s\"\n % (client, self.login, self.password))\n raise"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nconnect to the specified server.", "response": "def connection(self, hostname=None, read_preference=pymongo.ReadPreference.PRIMARY, timeout=300):\n \"\"\"return MongoReplicaSetClient object if hostname specified\n return MongoClient object if hostname doesn't specified\n Args:\n hostname - connection uri\n read_preference 
- default PRIMARY\n timeout - specify how long, in seconds, a command can take before server times out.\n \"\"\"\n logger.debug(\"connection({hostname}, {read_preference}, {timeout})\".format(**locals()))\n t_start = time.time()\n servers = hostname or \",\".join(self.server_map.values())\n while True:\n try:\n if hostname is None:\n c = pymongo.MongoReplicaSetClient(\n servers, replicaSet=self.repl_id,\n read_preference=read_preference,\n socketTimeoutMS=self.socket_timeout,\n w=self._write_concern, fsync=True, **self.kwargs)\n connected(c)\n if c.primary:\n self._authenticate_client(c)\n return c\n raise pymongo.errors.AutoReconnect(\"No replica set primary available\")\n else:\n logger.debug(\"connection to the {servers}\".format(**locals()))\n c = pymongo.MongoClient(\n servers, socketTimeoutMS=self.socket_timeout,\n w=self._write_concern, fsync=True, **self.kwargs)\n connected(c)\n self._authenticate_client(c)\n return c\n except (pymongo.errors.PyMongoError):\n exc_type, exc_value, exc_tb = sys.exc_info()\n err_message = traceback.format_exception(exc_type, exc_value, exc_tb)\n logger.error(\"Exception {exc_type} {exc_value}\".format(**locals()))\n logger.error(err_message)\n if time.time() - t_start > timeout:\n raise pymongo.errors.AutoReconnect(\"Couldn't connect while timeout {timeout} second\".format(**locals()))\n time.sleep(1)"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreturns list of secondaries members", "response": "def secondaries(self):\n \"\"\"return list of secondaries members\"\"\"\n return [\n {\n \"_id\": self.host2id(member),\n \"host\": member,\n \"server_id\": self._servers.host_to_server_id(member)\n }\n for member in self.get_members_in_state(2)\n ]"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreturns list of arbiters", "response": "def arbiters(self):\n \"\"\"return list of arbiters\"\"\"\n return [\n {\n \"_id\": self.host2id(member),\n 
\"host\": member,\n \"server_id\": self._servers.host_to_server_id(member)\n }\n for member in self.get_members_in_state(7)\n ]"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns list of hidden members", "response": "def hidden(self):\n \"\"\"return list of hidden members\"\"\"\n members = [self.member_info(item[\"_id\"]) for item in self.members()]\n result = []\n for member in members:\n if member['rsInfo'].get('hidden'):\n server_id = member['server_id']\n result.append({\n '_id': member['_id'],\n 'host': self._servers.hostname(server_id),\n 'server_id': server_id})\n return result"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef passives(self):\n servers = self.run_command('ismaster').get('passives', [])\n return [member for member in self.members() if member['host'] in servers]", "response": "return list of passive servers"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nwaiting until all servers are reachable", "response": "def wait_while_reachable(self, servers, timeout=60):\n \"\"\"wait until all servers are reachable\n Args:\n servers - list of servers\n \"\"\"\n t_start = time.time()\n while True:\n try:\n for server in servers:\n # TODO: use state code to check if server is reachable\n server_info = self.connection(\n hostname=server, timeout=5).admin.command('ismaster')\n logger.debug(\"server_info: {server_info}\".format(server_info=server_info))\n if int(server_info['ok']) != 1:\n raise pymongo.errors.OperationFailure(\"{server} is not reachable\".format(**locals()))\n return True\n except (KeyError, AttributeError, pymongo.errors.AutoReconnect, pymongo.errors.OperationFailure):\n if time.time() - t_start > timeout:\n return False\n time.sleep(0.1)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef waiting_member_state(self, timeout=300):\n t_start = time.time()\n 
while not self.check_member_state():\n if time.time() - t_start > timeout:\n return False\n time.sleep(0.1)\n return True", "response": "Wait for all RS members to be in an acceptable state."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef waiting_config_state(self, timeout=300):\n t_start = time.time()\n while not self.check_config_state():\n if time.time() - t_start > timeout:\n return False\n time.sleep(0.1)\n return True", "response": "wait until the actual state matches the config state"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nverifies that all RS members have an acceptable state.", "response": "def check_member_state(self):\n \"\"\"Verify that all RS members have an acceptable state.\"\"\"\n bad_states = (0, 3, 4, 5, 6, 9)\n try:\n rs_status = self.run_command('replSetGetStatus')\n bad_members = [member for member in rs_status['members']\n if member['state'] in bad_states]\n if bad_members:\n return False\n except pymongo.errors.AutoReconnect:\n # catch 'No replica set primary available' Exception\n return False\n logger.debug(\"all members in correct state\")\n return True"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns True if the real state equals the config state, otherwise False.", "response": "def check_config_state(self):\n \"\"\"Return True if the real state equals the config state, otherwise False.\"\"\"\n config = self.config\n self.update_server_map(config)\n for member in config['members']:\n cfg_member_info = self.default_params.copy()\n cfg_member_info.update(member)\n # Remove attributes we can't check.\n for attr in ('priority', 'votes', 'tags', 'buildIndexes'):\n cfg_member_info.pop(attr, None)\n cfg_member_info['host'] = cfg_member_info['host'].lower()\n\n real_member_info = self.default_params.copy()\n info = self.member_info(member[\"_id\"])\n real_member_info[\"_id\"] = info['_id']\n 
member_hostname = self._servers.hostname(info['server_id'])\n real_member_info[\"host\"] = member_hostname.lower()\n real_member_info.update(info['rsInfo'])\n logger.debug(\"real_member_info({member_id}): {info}\".format(member_id=member['_id'], info=info))\n for key in cfg_member_info:\n if cfg_member_info[key] != real_member_info.get(key, None):\n logger.debug(\"{key}: {value1} != {value2}\".format(key=key, value1=cfg_member_info[key], value2=real_member_info.get(key, None)))\n return False\n return True"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef restart(self, timeout=300, config_callback=None):\n for member_id in self.server_map:\n host = self.server_map[member_id]\n server_id = self._servers.host_to_server_id(host)\n server = self._servers._storage[server_id]\n server.restart(timeout, config_callback)\n self.waiting_member_state()", "response": "Restart each member of the replica set."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nset path to storage", "response": "def set_settings(self, releases=None, default_release=None):\n \"\"\"set path to storage\"\"\"\n super(ReplicaSets, self).set_settings(releases, default_release)\n Servers().set_settings(releases, default_release)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ncreates a new replica set with the given configuration", "response": "def create(self, rs_params):\n \"\"\"create new replica set\n Args:\n rs_params - replica set configuration\n Return repl_id which can be used to look up the replica set\n \"\"\"\n repl_id = rs_params.get('id', None)\n if repl_id is not None and repl_id in self:\n raise ReplicaSetError(\n \"replica set with id={id} already exists\".format(id=repl_id))\n repl = ReplicaSet(rs_params)\n self[repl.repl_id] = repl\n return repl.repl_id"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nfinding and returning the primary hostname", 
"response": "def primary(self, repl_id):\n \"\"\"find and return primary hostname\n Args:\n repl_id - replica set identity\n \"\"\"\n repl = self[repl_id]\n primary = repl.primary()\n return repl.member_info(repl.host2id(primary))"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef remove(self, repl_id):\n repl = self._storage.pop(repl_id)\n repl.cleanup()\n del repl", "response": "Remove the specified replica set and kill its members.\n "} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef command(self, rs_id, command, *args):\n rs = self._storage[rs_id]\n try:\n return getattr(rs, command)(*args)\n except AttributeError:\n raise ValueError(\"Cannot issue the command %r to ReplicaSet %s\"\n % (command, rs_id))", "response": "Call a ReplicaSet method."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef member_del(self, repl_id, member_id):\n repl = self[repl_id]\n result = repl.member_del(member_id)\n self[repl_id] = repl\n return result", "response": "remove member from replica set"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncreate instance and add it to existing replica set", "response": "def member_add(self, repl_id, params):\n \"\"\"create instance and add it to existing replica set\n Args:\n repl_id - replica set identity\n params - member params\n\n return member_id of the added member\n \"\"\"\n repl = self[repl_id]\n member_id = repl.repl_member_add(params)\n self[repl_id] = repl\n return member_id"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef member_command(self, repl_id, member_id, command):\n repl = self[repl_id]\n result = repl.member_command(member_id, command)\n self[repl_id] = repl\n return result", "response": "apply command to the member of the replica set"} {"SOURCE": "codesearchnet", "instruction": 
"Can you tell what is the following Python 3 function doing\ndef member_update(self, repl_id, member_id, params):\n repl = self[repl_id]\n result = repl.member_update(member_id, params)\n self[repl_id] = repl\n return result", "response": "update params of a member in the replica set"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef key_file(self):\n if self.auth_key:\n key_file_path = os.path.join(orchestration_mkdtemp(), 'key')\n with open(key_file_path, 'w') as fd:\n fd.write(self.auth_key)\n os.chmod(key_file_path, stat.S_IRUSR)\n return key_file_path", "response": "Get the path to the key file containing our auth key or None."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _strip_auth(self, proc_params):\n params = proc_params.copy()\n params.pop(\"auth\", None)\n params.pop(\"clusterAuthMode\", None)\n return params", "response": "Remove options from parameters that cause auth to be enabled."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef mongodb_auth_uri(self, hosts):\n parts = ['mongodb://']\n if self.login:\n parts.append(self.login)\n if self.password:\n parts.append(':' + self.password)\n parts.append('@')\n parts.append(hosts + '/')\n if self.login:\n parts.append('?authSource=' + self.auth_source)\n if self.x509_extra_user:\n parts.append('&authMechanism=MONGODB-X509')\n return ''.join(parts)", "response": "Get a connection string with all info necessary to authenticate."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nadd given user and extra x509 user if necessary.", "response": "def _add_users(self, db, mongo_version):\n \"\"\"Add given user, and extra x509 user if necessary.\"\"\"\n if self.x509_extra_user:\n # Build dict of kwargs to pass to add_user.\n auth_dict = {\n 'name': DEFAULT_SUBJECT,\n 'roles': self._user_roles(db.client)\n }\n 
db.add_user(**auth_dict)\n # Fix kwargs to MongoClient.\n self.kwargs['ssl_certfile'] = DEFAULT_CLIENT_CERT\n\n # Add the secondary user given in the request.\n\n secondary_login = {\n 'name': self.login,\n 'roles': self._user_roles(db.client)\n }\n if self.password:\n secondary_login['password'] = self.password\n if mongo_version >= (3, 7, 2):\n # Use SCRAM-SHA-1 so that pymongo < 3.7 can authenticate.\n secondary_login['mechanisms'] = ['SCRAM-SHA-1']\n db.add_user(**secondary_login)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef base_link(rel, self_rel=False):\n link = _BASE_LINKS[rel].copy()\n link['rel'] = 'self' if self_rel else rel\n return link", "response": "Helper for getting a link document under the API root given a rel."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ngets a list of all links to be included with base API requests.", "response": "def all_base_links(rel_to=None):\n \"\"\"Get a list of all links to be included with base (/) API requests.\"\"\"\n links = [\n base_link('get-releases'),\n base_link('service'),\n server_link('get-servers'),\n server_link('add-server'),\n replica_set_link('add-replica-set'),\n replica_set_link('get-replica-sets'),\n sharded_cluster_link('add-sharded-cluster'),\n sharded_cluster_link('get-sharded-clusters')\n ]\n for link in links:\n if link['rel'] == rel_to:\n link['rel'] = 'self'\n return links"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef server_link(rel, server_id=None, self_rel=False):\n servers_href = '/v1/servers'\n link = _SERVER_LINKS[rel].copy()\n link['href'] = link['href'].format(**locals())\n link['rel'] = 'self' if self_rel else rel\n return link", "response": "Helper for getting a Server link document given a rel."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ngetting a list of all links 
to be included with Servers.", "response": "def all_server_links(server_id, rel_to=None):\n \"\"\"Get a list of all links to be included with Servers.\"\"\"\n return [\n server_link(rel, server_id, self_rel=(rel == rel_to))\n for rel in ('delete-server', 'get-server-info', 'server-command')\n ]"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef replica_set_link(rel, repl_id=None, member_id=None, self_rel=False):\n repls_href = '/v1/replica_sets'\n link = _REPLICA_SET_LINKS[rel].copy()\n link['href'] = link['href'].format(**locals())\n link['rel'] = 'self' if self_rel else rel\n return link", "response": "Helper for getting a ReplicaSet link document given a rel."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ngetting a list of all links to be included with replica sets.", "response": "def all_replica_set_links(rs_id, rel_to=None):\n \"\"\"Get a list of all links to be included with replica sets.\"\"\"\n return [\n replica_set_link(rel, rs_id, self_rel=(rel == rel_to))\n for rel in (\n 'get-replica-set-info',\n 'delete-replica-set', 'replica-set-command',\n 'get-replica-set-members', 'add-replica-set-member',\n 'get-replica-set-secondaries', 'get-replica-set-primary',\n 'get-replica-set-arbiters', 'get-replica-set-hidden-members',\n 'get-replica-set-passive-members', 'get-replica-set-servers'\n )\n ]"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef all_sharded_cluster_links(cluster_id, shard_id=None,\n router_id=None, rel_to=None):\n \"\"\"Get a list of all links to be included with ShardedClusters.\"\"\"\n return [\n sharded_cluster_link(rel, cluster_id, shard_id, router_id,\n self_rel=(rel == rel_to))\n for rel in (\n 'get-sharded-clusters', 'get-sharded-cluster-info',\n 'sharded-cluster-command', 'delete-sharded-cluster',\n 'add-shard', 'get-shards', 'get-configsvrs',\n 'get-routers', 'add-router'\n )\n ]", "response": 
"Get a list of all links to be included with ShardedClusters."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef cleanup_storage(*args):\n ShardedClusters().cleanup()\n ReplicaSets().cleanup()\n Servers().cleanup()\n sys.exit(0)", "response": "Clean up processes after SIGTERM or SIGINT is received."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef read_env():\n parser = argparse.ArgumentParser(description='mongo-orchestration server')\n parser.add_argument('-f', '--config',\n action='store', default=None, type=str, dest='config')\n parser.add_argument('-e', '--env',\n action='store', type=str, dest='env', default=None)\n parser.add_argument(action='store', type=str, dest='command',\n default='start', choices=('start', 'stop', 'restart'))\n parser.add_argument('--no-fork',\n action='store_true', dest='no_fork', default=False)\n parser.add_argument('-b', '--bind',\n action='store', dest='bind', type=str,\n default=DEFAULT_BIND)\n parser.add_argument('-p', '--port',\n action='store', dest='port', type=int,\n default=DEFAULT_PORT)\n parser.add_argument('--enable-majority-read-concern', action='store_true',\n default=False)\n parser.add_argument('-s', '--server',\n action='store', dest='server', type=str,\n default=DEFAULT_SERVER, choices=('cherrypy', 'wsgiref'))\n parser.add_argument('--version', action='version',\n version='Mongo Orchestration v' + __version__)\n parser.add_argument('--socket-timeout-ms', action='store',\n dest='socket_timeout',\n type=int, default=DEFAULT_SOCKET_TIMEOUT)\n parser.add_argument('--pidfile', action='store', type=str, dest='pidfile',\n default=PID_FILE)\n\n cli_args = parser.parse_args()\n\n if cli_args.env and not cli_args.config:\n print(\"Specified release '%s' without a config file\" % cli_args.env)\n sys.exit(1)\n if cli_args.command == 'stop' or not cli_args.config:\n return cli_args\n try:\n 
# read config\n with open(cli_args.config, 'r') as fd:\n config = json.loads(fd.read(), object_pairs_hook=SON)\n if 'releases' not in config:\n print(\"No releases defined in %s\" % cli_args.config)\n sys.exit(1)\n releases = config['releases']\n if cli_args.env is not None and cli_args.env not in releases:\n print(\"Release '%s' is not defined in %s\"\n % (cli_args.env, cli_args.config))\n sys.exit(1)\n cli_args.releases = releases\n return cli_args\n except IOError:\n print(\"config file not found\")\n sys.exit(1)\n except ValueError:\n print(\"config file is corrupted\")\n sys.exit(1)", "response": "return command-line arguments for a given environment"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn bottle app that includes all sub-apps", "response": "def get_app():\n \"\"\"return bottle app that includes all sub-apps\"\"\"\n from bottle import default_app\n default_app.push()\n for module in (\"mongo_orchestration.apps.servers\",\n \"mongo_orchestration.apps.replica_sets\",\n \"mongo_orchestration.apps.sharded_clusters\"):\n __import__(module)\n app = default_app.pop()\n return app"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nwaiting for the mongo-orchestration server to accept connections.", "response": "def await_connection(host, port):\n \"\"\"Wait for the mongo-orchestration server to accept connections.\"\"\"\n for i in range(CONNECT_ATTEMPTS):\n try:\n conn = socket.create_connection((host, port), CONNECT_TIMEOUT)\n conn.close()\n return True\n except (IOError, socket.error):\n time.sleep(1)\n return False"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef __init_config_params(self, config):\n if self.version >= (2, 4):\n params = config.get('setParameter', {})\n # Set enableTestCommands by default but allow enableTestCommands:0.\n params.setdefault('enableTestCommands', 1)\n # Reduce 
transactionLifetimeLimitSeconds for faster driver testing.\n if self.version >= (4, 1) and not self.is_mongos:\n params.setdefault('transactionLifetimeLimitSeconds', 3)\n # Increase transaction lock timeout to reduce the chance that tests\n # fail with LockTimeout: \"Unable to acquire lock {...} within 5ms\".\n if self.version >= (4, 0) and not self.is_mongos:\n params.setdefault('maxTransactionLockRequestTimeoutMillis', 25)\n config['setParameter'] = params\n\n compressors = config.get('networkMessageCompressors')\n if compressors is None:\n if self.version >= (4, 1, 7):\n # SERVER-38168 added zstd support in 4.1.7.\n config['networkMessageCompressors'] = 'zstd,zlib,snappy,noop'\n elif self.version >= (3, 5, 9):\n # SERVER-27310 added zlib support in 3.5.9.\n config['networkMessageCompressors'] = 'zlib,snappy,noop'\n elif self.version >= (3, 4):\n config['networkMessageCompressors'] = 'snappy,noop'", "response": "Conditionally enable options in the Server's config file."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef version(self):\n if not self.__version:\n command = (self.name, '--version')\n logger.debug(command)\n stdout, _ = subprocess.Popen(\n command, stdout=subprocess.PIPE).communicate()\n version_output = str(stdout)\n match = re.search(self.version_patt, version_output)\n if match is None:\n raise ServersError(\n 'Could not determine version of %s from string: %s'\n % (self.name, version_output))\n version_string = match.group('version')\n self.__version = tuple(map(int, version_string.split('.')))\n return self.__version", "response": "Get the version of MongoDB that this Server runs as a tuple."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef run_command(self, command, arg=None, is_eval=False):\n mode = 'eval' if is_eval else 'command'\n\n if isinstance(arg, tuple):\n name, d = arg\n else:\n name, d = arg, {}\n\n result = 
getattr(self.connection.admin, mode)(command, name, **d)\n return result", "response": "run command on the server"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef info(self):\n proc_info = {\"name\": self.name,\n \"params\": self.cfg,\n \"alive\": self.is_alive,\n \"optfile\": self.config_path}\n if self.is_alive:\n proc_info['pid'] = self.proc.pid\n logger.debug(\"proc_info: {proc_info}\".format(**locals()))\n mongodb_uri = ''\n server_info = {}\n status_info = {}\n if self.hostname and self.cfg.get('port', None):\n try:\n c = self.connection\n server_info = c.server_info()\n logger.debug(\"server_info: {server_info}\".format(**locals()))\n mongodb_uri = 'mongodb://' + self.hostname\n status_info = {\"primary\": c.is_primary, \"mongos\": c.is_mongos}\n logger.debug(\"status_info: {status_info}\".format(**locals()))\n except (pymongo.errors.AutoReconnect, pymongo.errors.OperationFailure, pymongo.errors.ConnectionFailure):\n server_info = {}\n status_info = {}\n\n result = {\"mongodb_uri\": mongodb_uri, \"statuses\": status_info,\n \"serverInfo\": server_info, \"procInfo\": proc_info,\n \"orchestration\": 'servers'}\n if self.login:\n result['mongodb_auth_uri'] = self.mongodb_auth_uri(self.hostname)\n logger.debug(\"return {result}\".format(result=result))\n return result", "response": "return info about server as dict object"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef start(self, timeout=300):\n if self.is_alive:\n return True\n try:\n dbpath = self.cfg.get('dbpath')\n if dbpath and self._is_locked:\n # repair if needed\n logger.info(\"Performing repair on locked dbpath %s\", dbpath)\n process.repair_mongo(self.name, self.cfg['dbpath'])\n\n self.proc, self.hostname = process.mprocess(\n self.name, self.config_path, self.cfg.get('port', None),\n timeout, self.silence_stdout)\n self.pid = self.proc.pid\n logger.debug(\"pid={pid}, 
hostname={hostname}\".format(pid=self.pid, hostname=self.hostname))\n self.host = self.hostname.split(':')[0]\n self.port = int(self.hostname.split(':')[1])\n\n # Wait for Server to respond to isMaster.\n # Only try 6 times, each ConnectionFailure is 30 seconds.\n max_attempts = 6\n for i in range(max_attempts):\n try:\n self.run_command('isMaster')\n break\n except pymongo.errors.ConnectionFailure:\n logger.exception('isMaster command failed:')\n else:\n raise TimeoutError(\n \"Server did not respond to 'isMaster' after %d attempts.\"\n % max_attempts)\n except (OSError, TimeoutError):\n logpath = self.cfg.get('logpath')\n if logpath:\n # Copy the server logs into the mongo-orchestration logs.\n logger.error(\n \"Could not start Server. Please find server log below.\\n\"\n \"=====================================================\")\n with open(logpath) as lp:\n logger.error(lp.read())\n else:\n logger.exception(\n 'Could not start Server, and no logpath was provided!')\n reraise(TimeoutError,\n 'Could not start Server. 
'\n 'Please check server log located in ' +\n self.cfg.get('logpath', '') +\n ' or the mongo-orchestration log in ' +\n LOG_FILE + ' for more details.')\n if self.restart_required:\n if self.login:\n # Add users to the appropriate database.\n self._add_users()\n self.stop()\n\n # Restart with keyfile and auth.\n if self.is_mongos:\n self.config_path, self.cfg = self.__init_mongos(self.cfg)\n else:\n # Add auth options to this Server's config file.\n self.config_path, self.cfg = self.__init_mongod(\n self.cfg, add_auth=True)\n self.restart_required = False\n self.start()\n\n return True", "response": "start the server\n return True or False"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nsend shutdown command and wait for the process to exit.", "response": "def shutdown(self):\n \"\"\"Send shutdown command and wait for the process to exit.\"\"\"\n # Return early if this server has already exited.\n if not process.proc_alive(self.proc):\n return\n logger.info(\"Attempting to connect to %s\", self.hostname)\n client = self.connection\n # Attempt the shutdown command twice, the first attempt might fail due\n # to an election.\n attempts = 2\n for i in range(attempts):\n logger.info(\"Attempting to send shutdown command to %s\",\n self.hostname)\n try:\n client.admin.command(\"shutdown\", force=True)\n except ConnectionFailure:\n # A shutdown succeeds by closing the connection but a\n # connection error does not necessarily mean that the shutdown\n # has succeeded.\n pass\n # Wait for the server to exit otherwise rerun the shutdown command.\n try:\n return process.wait_mprocess(self.proc, 5)\n except TimeoutError as exc:\n logger.info(\"Timed out waiting on process: %s\", exc)\n continue\n raise ServersError(\"Server %s failed to shutdown after %s attempts\" %\n (self.hostname, attempts))"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef restart(self, 
timeout=300, config_callback=None):\n self.stop()\n if config_callback:\n self.cfg = config_callback(self.cfg.copy())\n self.config_path = process.write_config(self.cfg)\n return self.start(timeout)", "response": "restart server: stop() and start()\n return status of start command"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ncreates new server with specific parameters", "response": "def create(self, name, procParams, sslParams={},\n auth_key=None, login=None, password=None,\n auth_source='admin', timeout=300, autostart=True,\n server_id=None, version=None):\n \"\"\"create new server\n Args:\n name - process name or path\n procParams - dictionary with specific params for instance\n auth_key - authorization key\n login - username for the admin collection\n password - password\n timeout - specify how long, in seconds, a command can take before it times out.\n autostart - (default: True), autostart instance\n Return server_id\n where server_id - id which can be used to look up the server in the servers collection\n \"\"\"\n name = os.path.split(name)[1]\n if server_id is None:\n server_id = str(uuid4())\n if server_id in self:\n raise ServersError(\"Server with id %s already exists.\" % server_id)\n\n bin_path = self.bin_path(version)\n server = Server(os.path.join(bin_path, name), procParams, sslParams,\n auth_key, login, password, auth_source)\n if autostart:\n server.start(timeout)\n self[server_id] = server\n return server_id"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef remove(self, server_id):\n server = self._storage.pop(server_id)\n server.stop()\n server.cleanup()", "response": "remove the server and its data\n "} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef command(self, server_id, command, *args):\n server = self._storage[server_id]\n try:\n if args:\n result = getattr(server, command)(*args)\n else:\n 
result = getattr(server, command)()\n except AttributeError:\n raise ValueError(\"Cannot issue the command %r to server %s\"\n % (command, server_id))\n self._storage[server_id] = server\n return result", "response": "run a command on a stored server"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns dictionary object with info about server", "response": "def info(self, server_id):\n \"\"\"return dictionary object with info about server\n Args:\n server_id - server identity\n \"\"\"\n result = self._storage[server_id].info()\n result['id'] = server_id\n return result"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nsets path to storage", "response": "def set_settings(self, releases=None, default_release=None):\n \"\"\"set path to storage\"\"\"\n if (self._storage is None or\n getattr(self, 'releases', {}) != releases or\n getattr(self, 'default_release', '') != default_release):\n self._storage = {}\n self.releases = releases or {}\n self.default_release = default_release"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef bin_path(self, release=None):\n if release:\n for r in self.releases:\n if release in r:\n return self.releases[r]\n raise MongoOrchestrationError(\"No such release '%s' in %r\"\n % (release, self.releases))\n if self.default_release:\n return self.releases[self.default_release]\n if self.releases:\n return list(self.releases.values())[0]\n return ''", "response": "Get the bin path for a particular release."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ncreating and starting a config replica set.", "response": "def __init_configrs(self, rs_cfg):\n \"\"\"Create and start a config replica set.\"\"\"\n # Use 'rs_id' to set the id for consistency, but need to rename\n # to 'id' to use with ReplicaSets.create()\n rs_cfg['id'] = rs_cfg.pop('rs_id', None)\n for member in 
rs_cfg.setdefault('members', [{}]):\n member['procParams'] = self._strip_auth(\n member.get('procParams', {}))\n member['procParams']['configsvr'] = True\n if self.enable_ipv6:\n common.enable_ipv6_single(member['procParams'])\n rs_cfg['sslParams'] = self.sslParams\n self._configsvrs.append(ReplicaSets().create(rs_cfg))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncreate and start config servers", "response": "def __init_configsvrs(self, params):\n \"\"\"create and start config servers\"\"\"\n self._configsvrs = []\n for cfg in params:\n # Remove flags that turn on auth.\n cfg = self._strip_auth(cfg)\n server_id = cfg.pop('server_id', None)\n version = cfg.pop('version', self._version)\n cfg.update({'configsvr': True})\n if self.enable_ipv6:\n common.enable_ipv6_single(cfg)\n self._configsvrs.append(Servers().create(\n 'mongod', cfg, sslParams=self.sslParams, autostart=True,\n version=version, server_id=server_id))"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreturns list of config servers", "response": "def configsvrs(self):\n \"\"\"return list of config servers\"\"\"\n if self.uses_rs_configdb:\n rs_id = self._configsvrs[0]\n mongodb_uri = ReplicaSets().info(rs_id)['mongodb_uri']\n return [{'id': rs_id, 'mongodb_uri': mongodb_uri}]\n return [{'id': h_id, 'hostname': Servers().hostname(h_id)}\n for h_id in self._configsvrs]"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreturning first available router", "response": "def router(self):\n \"\"\"return first available router\"\"\"\n for server in self._routers:\n info = Servers().info(server)\n if info['procInfo'].get('alive', False):\n return {'id': server, 'hostname': Servers().hostname(server)}"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nadding new router into existing configuration", "response": "def router_add(self, params):\n \"\"\"add new router (mongos) into existing 
configuration\"\"\"\n if self.uses_rs_configdb:\n # Replica set configdb.\n rs_id = self._configsvrs[0]\n config_members = ReplicaSets().members(rs_id)\n configdb = '%s/%s' % (\n rs_id, ','.join(m['host'] for m in config_members))\n else:\n configdb = ','.join(Servers().hostname(item)\n for item in self._configsvrs)\n server_id = params.pop('server_id', None)\n version = params.pop('version', self._version)\n params.update({'configdb': configdb})\n\n if self.enable_ipv6:\n common.enable_ipv6_single(params)\n # Remove flags that turn auth on.\n params = self._strip_auth(params)\n\n self._routers.append(Servers().create(\n 'mongos', params, sslParams=self.sslParams, autostart=True,\n version=version, server_id=server_id))\n return {'id': self._routers[-1], 'hostname': Servers().hostname(self._routers[-1])}"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef router_connections(self):\n clients = []\n for server in self._routers:\n if Servers().is_alive(server):\n client = self.create_connection(Servers().hostname(server))\n clients.append(client)\n return clients", "response": "Return a list of MongoClients one for each mongos."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nadd new member into existing configuration", "response": "def member_add(self, member_id=None, params=None):\n \"\"\"add new member into existing configuration\"\"\"\n member_id = member_id or str(uuid4())\n if self.enable_ipv6:\n common.enable_ipv6_repl(params)\n if 'members' in params:\n # is replica set\n for member in params['members']:\n if not member.get('rsParams', {}).get('arbiterOnly', False):\n member.setdefault('procParams', {})['shardsvr'] = True\n rs_params = params.copy()\n # Turn 'rs_id' -> 'id', to be consistent with 'server_id' below.\n rs_params['id'] = rs_params.pop('rs_id', None)\n rs_params.update({'sslParams': self.sslParams})\n\n rs_params['version'] = params.pop('version', 
self._version)\n rs_params['members'] = [\n self._strip_auth(params) for params in rs_params['members']]\n rs_id = ReplicaSets().create(rs_params)\n members = ReplicaSets().members(rs_id)\n cfgs = rs_id + r\"/\" + ','.join([item['host'] for item in members])\n result = self._add(cfgs, member_id)\n if result.get('ok', 0) == 1:\n self._shards[result['shardAdded']] = {'isReplicaSet': True, '_id': rs_id}\n # return self._shards[result['shardAdded']].copy()\n return self.member_info(member_id)\n\n else:\n # is single server\n params.setdefault('procParams', {})['shardsvr'] = True\n params.update({'autostart': True, 'sslParams': self.sslParams})\n params = params.copy()\n params['procParams'] = self._strip_auth(\n params.get('procParams', {}))\n params.setdefault('version', self._version)\n logger.debug(\"servers create params: {params}\".format(**locals()))\n server_id = Servers().create('mongod', **params)\n result = self._add(Servers().hostname(server_id), member_id)\n if result.get('ok', 0) == 1:\n self._shards[result['shardAdded']] = {'isServer': True, '_id': server_id}\n return self.member_info(member_id)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef member_info(self, member_id):\n info = self._shards[member_id].copy()\n info['id'] = member_id\n info['tags'] = self.tags.get(member_id, list())\n return info", "response": "return info about member"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nremoves member from configuration", "response": "def _remove(self, shard_name):\n \"\"\"remove member from configuration\"\"\"\n result = self.router_command(\"removeShard\", shard_name, is_eval=False)\n if result['ok'] == 1 and result['state'] == 'completed':\n shard = self._shards.pop(shard_name)\n if shard.get('isServer', False):\n Servers().remove(shard['_id'])\n if shard.get('isReplicaSet', False):\n ReplicaSets().remove(shard['_id'])\n return result"} 
{"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nresetting all the configuration servers and routers.", "response": "def reset(self):\n \"\"\"Ensure all shards, configs, and routers are running and available.\"\"\"\n # Ensure all shards by calling \"reset\" on each.\n for shard_id in self._shards:\n if self._shards[shard_id].get('isReplicaSet'):\n singleton = ReplicaSets()\n elif self._shards[shard_id].get('isServer'):\n singleton = Servers()\n singleton.command(self._shards[shard_id]['_id'], 'reset')\n # Ensure all config servers by calling \"reset\" on each.\n for config_id in self._configsvrs:\n self.configdb_singleton.command(config_id, 'reset')\n # Ensure all routers by calling \"reset\" on each.\n for router_id in self._routers:\n Servers().command(router_id, 'reset')\n return self.info()"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn info about configuration", "response": "def info(self):\n \"\"\"return info about configuration\"\"\"\n uri = ','.join(x['hostname'] for x in self.routers)\n mongodb_uri = 'mongodb://' + uri\n result = {'id': self.id,\n 'shards': self.members,\n 'configsvrs': self.configsvrs,\n 'routers': self.routers,\n 'mongodb_uri': mongodb_uri,\n 'orchestration': 'sharded_clusters'}\n if self.login:\n result['mongodb_auth_uri'] = self.mongodb_auth_uri(uri)\n return result"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef cleanup(self):\n for _id, shard in self._shards.items():\n if shard.get('isServer', False):\n Servers().remove(shard['_id'])\n if shard.get('isReplicaSet', False):\n ReplicaSets().remove(shard['_id'])\n\n for mongos in self._routers:\n Servers().remove(mongos)\n\n for config_id in self._configsvrs:\n self.configdb_singleton.remove(config_id)\n\n self._configsvrs = []\n self._routers = []\n self._shards = {}", "response": "cleanup configuration stop and remove all servers and replicas"} {"SOURCE": "codesearchnet", 
"instruction": "Make a summary of the following Python 3 code\ndef set_settings(self, releases=None, default_release=None):\n super(ShardedClusters, self).set_settings(releases, default_release)\n ReplicaSets().set_settings(releases, default_release)", "response": "set path to storage"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef create(self, params):\n sh_id = params.get('id', str(uuid4()))\n if sh_id in self:\n raise ShardedClusterError(\n \"Sharded cluster with id %s already exists.\" % sh_id)\n params['id'] = sh_id\n cluster = ShardedCluster(params)\n self[cluster.id] = cluster\n return cluster.id", "response": "create new ShardedCluster object"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nremove cluster and data stuff", "response": "def remove(self, cluster_id):\n \"\"\"remove cluster and data stuff\n Args:\n cluster_id - cluster identity\n \"\"\"\n cluster = self._storage.pop(cluster_id)\n cluster.cleanup()"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nremoving router from the ShardedCluster", "response": "def router_del(self, cluster_id, router_id):\n \"\"\"remove router from the ShardedCluster\"\"\"\n cluster = self._storage[cluster_id]\n result = cluster.router_remove(router_id)\n self._storage[cluster_id] = cluster\n return result"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef member_info(self, cluster_id, member_id):\n cluster = self._storage[cluster_id]\n return cluster.member_info(member_id)", "response": "return info about a member"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef command(self, cluster_id, command, *args):\n cluster = self._storage[cluster_id]\n try:\n return getattr(cluster, command)(*args)\n except AttributeError:\n raise ValueError(\"Cannot issue the command %r to 
ShardedCluster %s\"\n % (command, cluster_id))", "response": "Call a ShardedCluster method."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nremove member from cluster", "response": "def member_del(self, cluster_id, member_id):\n \"\"\"remove member from cluster cluster\"\"\"\n cluster = self._storage[cluster_id]\n result = cluster.member_remove(member_id)\n self._storage[cluster_id] = cluster\n return result"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nadding new member into configuration", "response": "def member_add(self, cluster_id, params):\n \"\"\"add new member into configuration\"\"\"\n cluster = self._storage[cluster_id]\n result = cluster.member_add(params.get('id', None), params.get('shardParams', {}))\n self._storage[cluster_id] = cluster\n return result"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef expand_dir(_dir, cwd=os.getcwd()):\n _dir = os.path.expanduser(os.path.expandvars(_dir))\n if not os.path.isabs(_dir):\n _dir = os.path.normpath(os.path.join(cwd, _dir))\n return _dir", "response": "Return path with environmental variables and tilde ~ expanded."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreturning expanded configuration. end-user configuration permit inline configuration shortcuts, expand to identical format for parsing. :param config: the repo config in :py:class:`dict` format. 
:type config: dict :param cwd: current working dir (for deciphering relative paths) :type cwd: str :rtype: list", "response": "def extract_repos(config, cwd=os.getcwd()):\n \"\"\"Return expanded configuration.\n\n end-user configuration permit inline configuration shortcuts, expand to\n identical format for parsing.\n\n :param config: the repo config in :py:class:`dict` format.\n :type config: dict\n :param cwd: current working dir (for deciphering relative paths)\n :type cwd: str\n :rtype: list\n\n \"\"\"\n configs = []\n for directory, repos in config.items():\n for repo, repo_data in repos.items():\n\n conf = {}\n\n '''\n repo_name: http://myrepo.com/repo.git\n\n to\n\n repo_name: { url: 'http://myrepo.com/repo.git' }\n\n also assures the repo is a :py:class:`dict`.\n '''\n\n if isinstance(repo_data, string_types):\n conf['url'] = repo_data\n else:\n conf = update_dict(conf, repo_data)\n\n if 'repo' in conf:\n if 'url' not in conf:\n conf['url'] = conf.pop('repo')\n else:\n conf.pop('repo', None)\n\n '''\n ``shell_command_after``: if str, turn to list.\n '''\n if 'shell_command_after' in conf:\n if isinstance(conf['shell_command_after'], string_types):\n conf['shell_command_after'] = [conf['shell_command_after']]\n\n if 'name' not in conf:\n conf['name'] = repo\n if 'parent_dir' not in conf:\n conf['parent_dir'] = expand_dir(directory, cwd)\n\n if 'repo_dir' not in conf:\n conf['repo_dir'] = expand_dir(\n os.path.join(conf['parent_dir'], conf['name']), cwd\n )\n if 'remotes' in conf:\n remotes = []\n for remote_name, url in conf['remotes'].items():\n remotes.append({'remote_name': remote_name, 'url': url})\n conf['remotes'] = sorted(\n remotes, key=lambda x: sorted(x.get('remote_name'))\n )\n configs.append(conf)\n\n return configs"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef find_home_config_files(filetype=['json', 'yaml']):\n configs = []\n\n yaml_config = 
os.path.expanduser('~/.vcspull.yaml')\n has_yaml_config = os.path.exists(yaml_config)\n json_config = os.path.expanduser('~/.vcspull.json')\n has_json_config = os.path.exists(json_config)\n\n if not has_yaml_config and not has_json_config:\n log.debug(\n 'No config file found. Create a .vcspull.yaml or .vcspull.json'\n ' in your $HOME directory. http://vcspull.git-pull.com for a'\n ' quickstart.'\n )\n else:\n if sum(filter(None, [has_json_config, has_yaml_config])) > int(1):\n raise exc.MultipleConfigWarning()\n if has_yaml_config:\n configs.append(yaml_config)\n if has_json_config:\n configs.append(json_config)\n\n return configs", "response": "Return configs of. vcspull. json in user s home directory."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef find_config_files(\n path=['~/.vcspull'], match=['*'], filetype=['json', 'yaml'], include_home=False\n):\n \"\"\"Return repos from a directory and match. Not recursive.\n\n :param path: list of paths to search\n :type path: list\n :param match: list of globs to search against\n :type match: list\n :param filetype: list of filetypes to search against\n :type filetype: list\n :param include_home: Include home configuration files\n :type include_home: bool\n :raises:\n - LoadConfigRepoConflict: There are two configs that have same path\n and name with different repo urls.\n :returns: list of absolute paths to config files.\n :rtype: list\n\n \"\"\"\n configs = []\n\n if include_home is True:\n configs.extend(find_home_config_files())\n\n if isinstance(path, list):\n for p in path:\n configs.extend(find_config_files(p, match, filetype))\n return configs\n else:\n path = os.path.expanduser(path)\n if isinstance(match, list):\n for m in match:\n configs.extend(find_config_files(path, m, filetype))\n else:\n if isinstance(filetype, list):\n for f in filetype:\n configs.extend(find_config_files(path, match, f))\n else:\n match = os.path.join(path, 
match)\n match += \".{filetype}\".format(filetype=filetype)\n\n configs = glob.glob(match)\n\n return configs", "response": "Return repos from a directory and match. Not recursive."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef load_configs(files, cwd=os.getcwd()):\n repos = []\n for f in files:\n _, ext = os.path.splitext(f)\n conf = kaptan.Kaptan(handler=ext.lstrip('.')).import_config(f)\n\n newrepos = extract_repos(conf.export('dict'), cwd)\n\n if not repos:\n repos.extend(newrepos)\n continue\n\n dupes = detect_duplicate_repos(repos, newrepos)\n\n if dupes:\n msg = ('repos with same path + different VCS detected!', dupes)\n raise exc.VCSPullException(msg)\n repos.extend(newrepos)\n\n return repos", "response": "Load a list of files into a list of repos."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef detect_duplicate_repos(repos1, repos2):\n dupes = []\n path_dupe_repos = []\n\n curpaths = [r['repo_dir'] for r in repos1]\n newpaths = [r['repo_dir'] for r in repos2]\n path_duplicates = list(set(curpaths).intersection(newpaths))\n\n if not path_duplicates:\n return None\n\n path_dupe_repos.extend(\n [r for r in repos2 if any(r['repo_dir'] == p for p in path_duplicates)]\n )\n\n if not path_dupe_repos:\n return None\n\n for n in path_dupe_repos:\n currepo = next((r for r in repos1 if r['repo_dir'] == n['repo_dir']), None)\n if n['url'] != currepo['url']:\n dupes += (n, currepo)\n return dupes", "response": "Return duplicate repos dict if repo_dir same and vcs different."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef in_dir(config_dir=CONFIG_DIR, extensions=['.yml', '.yaml', '.json']):\n configs = []\n\n for filename in os.listdir(config_dir):\n if is_config_file(filename, extensions) and not filename.startswith('.'):\n configs.append(filename)\n\n return 
configs", "response": "Return a list of configs in config_dir."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef filter_repos(config, repo_dir=None, vcs_url=None, name=None):\n repo_list = []\n\n if repo_dir:\n repo_list.extend(\n [r for r in config if fnmatch.fnmatch(r['parent_dir'], repo_dir)]\n )\n\n if vcs_url:\n repo_list.extend(\n r for r in config if fnmatch.fnmatch(r.get('url', r.get('repo')), vcs_url)\n )\n\n if name:\n repo_list.extend([r for r in config if fnmatch.fnmatch(r.get('name'), name)])\n\n return repo_list", "response": "Return a list of repos from a config file."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncopy information from src to dest.", "response": "def copy_node_info(src, dest):\n \"\"\"Copy information from src to dest\n\n Every node in the AST has to have line number information. Get\n the information from the old stmt.\"\"\"\n for attr in ['lineno', 'fromlineno', 'tolineno',\n 'col_offset', 'parent']:\n if hasattr(src, attr):\n setattr(dest, attr, getattr(src, attr))"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef make_non_magical_flask_import(flask_ext_name):\n '''Convert a flask.ext.admin into flask_admin.'''\n match = re.match(r'flask\\.ext\\.(.*)', flask_ext_name)\n if match is None:\n raise LookupError(\"Module name `{}` doesn't match\"\n \"`flask.ext` style import.\")\n from_name = match.group(1)\n actual_module_name = 'flask_{}'.format(from_name)\n return actual_module_name", "response": "Convert a flask. ext. 
admin into flask_admin."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef transform_flask_from_import(node):\n '''Translates a flask.ext from-style import into a non-magical import.\n\n Translates:\n from flask.ext import wtf, bcrypt as fcrypt\n Into:\n import flask_wtf as wtf, flask_bcrypt as fcrypt\n\n '''\n new_names = []\n # node.names is a list of 2-tuples. Each tuple consists of (name, as_name).\n # So, the import would be represented as:\n #\n # from flask.ext import wtf as ftw, admin\n #\n # node.names = [('wtf', 'ftw'), ('admin', None)]\n for (name, as_name) in node.names:\n actual_module_name = 'flask_{}'.format(name)\n new_names.append((actual_module_name, as_name or name))\n\n new_node = nodes.Import()\n copy_node_info(node, new_node)\n new_node.names = new_names\n mark_transformed(new_node)\n return new_node", "response": "Translates a flask. ext from - style import into a non - magical import."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef transform_flask_from_long(node):\n '''Translates a flask.ext.wtf from-style import into a non-magical import.\n\n Translates:\n from flask.ext.wtf import Form\n from flask.ext.admin.model import InlineFormAdmin\n Into:\n from flask_wtf import Form\n from flask_admin.model import InlineFormAdmin\n\n '''\n actual_module_name = make_non_magical_flask_import(node.modname)\n new_node = nodes.ImportFrom(actual_module_name, node.names, node.level)\n copy_node_info(node, new_node)\n mark_transformed(new_node)\n return new_node", "response": "Translates a flask. ext. 
wtf from - style import into a non - magical import."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef transform_flask_bare_import(node):\n '''Translates a flask.ext.wtf bare import into a non-magical import.\n\n Translates:\n import flask.ext.admin as admin\n Into:\n import flask_admin as admin\n '''\n\n new_names = []\n for (name, as_name) in node.names:\n match = re.match(r'flask\\.ext\\.(.*)', name)\n from_name = match.group(1)\n actual_module_name = 'flask_{}'.format(from_name)\n new_names.append((actual_module_name, as_name))\n\n new_node = nodes.Import()\n copy_node_info(node, new_node)\n new_node.names = new_names\n mark_transformed(new_node)\n return new_node", "response": "Translates a flask. ext. wtf bare import into a non - magical import."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\npack the given binary data into an API frame and writes the result to the serial port.", "response": "def _write(self, data):\n \"\"\"\n _write: binary data -> None\n\n Packages the given binary data in an API frame and writes the\n result to the serial port\n \"\"\"\n frame = APIFrame(data, self._escaped).output()\n self.serial.write(frame)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nbuilding a command packet according to the specified command.", "response": "def _build_command(self, cmd, **kwargs):\n \"\"\"\n _build_command: string (binary data) ... -> binary data\n\n _build_command will construct a command packet according to the\n specified command's specification in api_commands. 
It will expect\n named arguments for all fields other than those with a default\n value or a length of 'None'.\n\n Each field will be written out in the order they are defined\n in the command definition.\n \"\"\"\n try:\n cmd_spec = self.api_commands[cmd]\n except AttributeError:\n raise NotImplementedError(\"API command specifications could not be \"\n \"found; use a derived class which defines\"\n \" 'api_commands'.\")\n\n packet = b''\n\n for field in cmd_spec:\n try:\n # Read this field's name from the function arguments dict\n data = kwargs[field['name']]\n if isinstance(data, str):\n data = stringToBytes(data)\n\n except KeyError:\n # Data wasn't given\n # Only a problem if the field has a specific length\n if field['len'] is not None:\n # Was a default value specified?\n default_value = field['default']\n if default_value:\n # If so, use it\n data = default_value\n else:\n # Otherwise, fail\n raise KeyError(\n \"The expected field {} of length {} \"\n \"was not provided\".format(\n field['name'], field['len']\n )\n )\n else:\n # No specific length, ignore it\n data = None\n\n # Ensure that the proper number of elements will be written\n if field['len'] and len(data) != field['len']:\n raise ValueError(\n \"The data provided for '{}' was not {} \"\n \"bytes long\".format(field['name'], field['len'])\n )\n\n # Add the data to the packet, if it has been specified.\n # Otherwise, the parameter was of variable length, and not given.\n if data:\n packet += data\n\n return packet"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _parse_samples_header(self, io_bytes):\n header_size = 3\n\n # number of samples (always 1?) 
is the first byte\n sample_count = byteToInt(io_bytes[0])\n\n # part of byte 1 and byte 2 are the DIO mask ( 9 bits )\n dio_mask = (byteToInt(io_bytes[1]) << 8 | byteToInt(io_bytes[2])) \\\n & 0x01FF\n\n # upper 7 bits of byte 1 is the AIO mask\n aio_mask = (byteToInt(io_bytes[1]) & 0xFE) >> 1\n\n # sorted lists of enabled channels; value is position of bit in mask\n dio_chans = []\n aio_chans = []\n\n for i in range(0, 9):\n if dio_mask & (1 << i):\n dio_chans.append(i)\n\n dio_chans.sort()\n\n for i in range(0, 7):\n if aio_mask & (1 << i):\n aio_chans.append(i)\n\n aio_chans.sort()\n\n return (sample_count, dio_chans, aio_chans, dio_mask, header_size)", "response": "_parse_samples_header parses the samples header from the XBee IO data format and returns the number of samples which follow the header."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nsend an API command to the device.", "response": "def send(self, cmd, **kwargs):\n \"\"\"\n send: string param=binary data ... -> None\n\n When send is called with the proper arguments, an API command\n will be written to the serial port for this XBee device\n containing the proper instructions and data.\n\n This method must be called with named arguments in accordance\n with the api_command specification. Arguments matching all\n field names other than those in reserved_names (like 'id' and\n 'order') should be given, unless they are of variable length\n (of 'None' in the specification. 
Those are optional).\n \"\"\"\n # Pass through the keyword arguments\n self._write(self._build_command(cmd, **kwargs))"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef main():\n alarm = XBeeAlarm('/dev/ttyUSB0', '\\x56\\x78')\n routine = SimpleWakeupRoutine(alarm)\n\n from time import sleep\n while True:\n \"\"\"\n Run the routine with 10 second delays\n \"\"\"\n try:\n print \"Waiting 5 seconds...\"\n sleep(5)\n print \"Firing\"\n routine.trigger()\n except KeyboardInterrupt:\n break", "response": "Main function for the main loop"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _reset(self):\n # Analog pin 0\n self.hw.remote_at(\n dest_addr=self.remote_addr,\n command='D0',\n parameter='\\x02')\n\n # Disengage remote LED, buzzer\n self.deactivate()\n self._set_send_samples(False)", "response": "Reset the remote XBee device to a standard configuration."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _set_LED(self, status):\n # DIO pin 1 (LED), active low\n self.hw.remote_at(\n dest_addr=self.remote_addr,\n command='D1',\n parameter='\\x04' if status else '\\x05')", "response": "Private method to set the status of the remote LED."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreturn True if the BED is currently occupied by requesting data from the remote device.", "response": "def bed_occupied(self):\n \"\"\"\n bed_occupied: None -> boolean\n\n Determines whether the bed is currently occupied by requesting\n data from the remote XBee and comparing the analog value with\n a threshold.\n \"\"\"\n\n # Receive samples from the remote device\n self._set_send_samples(True)\n\n while True:\n packet = self.hw.wait_read_frame()\n\n if 'adc-0' in packet['samples'][0]:\n # Stop receiving samples from the remote device\n self._set_send_samples(False)\n return 
packet['samples'][0]['adc-0'] > XBeeAlarm.DETECT_THRESH"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn the checksum of the current entry in the sequence.", "response": "def checksum(self):\n \"\"\"\n checksum: None -> single checksum byte\n\n checksum adds all bytes of the binary, unescaped data in the\n frame, saves the last byte of the result, and subtracts it from\n 0xFF. The final result is the checksum\n \"\"\"\n total = 0\n\n # Add together all bytes\n for byte in self.data:\n total += byteToInt(byte)\n\n # Only keep the last byte\n total = total & 0xFF\n\n return intToByte(0xFF - total)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef verify(self, chksum):\n total = 0\n\n # Add together all bytes\n for byte in self.data:\n total += byteToInt(byte)\n\n # Add checksum too\n total += byteToInt(chksum)\n\n # Only keep low bits\n total &= 0xFF\n\n # Check result\n return total == 0xFF", "response": "1 byte -> boolean\n verify checksums the frame adds the expected checksum and returns True if the result is correct."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\noutputs the data of the current entry point into a valid API frame.", "response": "def output(self):\n \"\"\"\n output: None -> valid API frame (binary data)\n\n output will produce a valid API frame for transmission to an\n XBee module.\n \"\"\"\n # start is one byte long, length is two bytes\n # data is n bytes long (indicated by length)\n # chksum is one byte long\n data = self.len_bytes() + self.data + self.checksum()\n\n # Only run the escaoe process if it hasn't been already\n if self.escaped and len(self.raw_data) < 1:\n self.raw_data = APIFrame.escape(data)\n\n if self.escaped:\n data = self.raw_data\n\n # Never escape start byte\n return APIFrame.START_BYTE + data"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nescape the given data string for the 
internal use of the API.", "response": "def escape(data):\n \"\"\"\n escape: byte string -> byte string\n\n When a 'special' byte is encountered in the given data string,\n it is preceded by an escape byte and XORed with 0x20.\n \"\"\"\n\n escaped_data = b\"\"\n for byte in data:\n if intToByte(byteToInt(byte)) in APIFrame.ESCAPE_BYTES:\n escaped_data += APIFrame.ESCAPE_BYTE\n escaped_data += intToByte(0x20 ^ byteToInt(byte))\n else:\n escaped_data += intToByte(byteToInt(byte))\n\n return escaped_data"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef fill(self, byte):\n\n if self._unescape_next_byte:\n byte = intToByte(byteToInt(byte) ^ 0x20)\n self._unescape_next_byte = False\n elif self.escaped and byte == APIFrame.ESCAPE_BYTE:\n self._unescape_next_byte = True\n return\n\n self.raw_data += intToByte(byteToInt(byte))", "response": "Fill the internal data of the internal data structure with the given byte."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nparsing the data contained in the API frame and verifies it against its checksum.", "response": "def parse(self):\n \"\"\"\n parse: None -> None\n\n Given a valid API frame, parse extracts the data contained\n inside it and verifies it against its checksum\n \"\"\"\n if len(self.raw_data) < 3:\n ValueError(\"parse() may only be called on a frame containing at \"\n \"least 3 bytes of raw data (see fill())\")\n\n # First two bytes are the length of the data\n raw_len = self.raw_data[1:3]\n\n # Unpack it\n data_len = struct.unpack(\"> h\", raw_len)[0]\n\n # Read the data\n data = self.raw_data[3:3 + data_len]\n chksum = self.raw_data[-1]\n\n # Checksum check\n self.data = data\n if not self.verify(chksum):\n raise ValueError(\"Invalid checksum\")"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _parse_IS_at_response(self, packet_info):\n if packet_info['id'] in 
('at_response', 'remote_at_response') and \\\n packet_info['command'].lower() == b'is' and \\\n packet_info['status'] == b'\\x00':\n return self._parse_samples(packet_info['parameter'])\n else:\n return packet_info['parameter']", "response": "Parse the IS at response packet."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nparses the ND AT response packet and returns a dictionary of the ND data structures.", "response": "def _parse_ND_at_response(self, packet_info):\n \"\"\"\n If the given packet is a successful AT response for an ND\n command, parse the parameter field.\n \"\"\"\n if packet_info['id'] == 'at_response' and \\\n packet_info['command'].lower() == b'nd' and \\\n packet_info['status'] == b'\\x00':\n result = {}\n\n # Parse each field directly\n result['source_addr'] = packet_info['parameter'][0:2]\n result['source_addr_long'] = packet_info['parameter'][2:10]\n\n # Parse the null-terminated node identifier field\n null_terminator_index = 10\n while packet_info['parameter'][null_terminator_index:\n null_terminator_index+1] != b'\\x00':\n null_terminator_index += 1\n\n # Parse each field thereafter directly\n result['node_identifier'] = \\\n packet_info['parameter'][10:null_terminator_index]\n result['parent_address'] = \\\n packet_info['parameter'][null_terminator_index+1:\n null_terminator_index+3]\n result['device_type'] = \\\n packet_info['parameter'][null_terminator_index+3:\n null_terminator_index+4]\n result['status'] = \\\n packet_info['parameter'][null_terminator_index+4:\n null_terminator_index+5]\n result['profile_id'] = \\\n packet_info['parameter'][null_terminator_index+5:\n null_terminator_index+7]\n result['manufacturer'] = \\\n packet_info['parameter'][null_terminator_index+7:\n null_terminator_index+9]\n\n # Simple check to ensure a good parse\n if null_terminator_index+9 != len(packet_info['parameter']):\n raise ValueError(\"Improper ND response length: expected {0}, \"\n \"read {1} bytes\".format(\n 
len(packet_info['parameter']),\n null_terminator_index+9)\n )\n\n return result\n else:\n return packet_info['parameter']"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nhalt the current thread.", "response": "def halt(self):\n \"\"\"\n halt: None -> None\n\n If this instance has a separate thread running, it will be\n halted. This method will wait until the thread has cleaned\n up before returning.\n \"\"\"\n if self._callback:\n self._thread_continue = False\n self._thread.join()"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef wait_read_frame(self, timeout=None):\n frame = self._wait_for_frame(timeout)\n return self._split_response(frame.data)", "response": "wait_read_frame waits until a valid frame is received and returns the dictionary containing the data contained within it."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef main():\n try:\n\n # Open serial port\n ser = serial.Serial('/dev/ttyUSB0', 9600)\n\n # Create XBee Series 1 object\n xbee = XBee(ser)\n\n\n # Send AT packet\n xbee.send('at', frame_id='A', command='DH')\n\n # Wait for response\n response = xbee.wait_read_frame()\n print response\n\n # Send AT packet\n xbee.send('at', frame_id='B', command='DL')\n\n # Wait for response\n response = xbee.wait_read_frame()\n print response\n\n # Send AT packet\n xbee.send('at', frame_id='C', command='MY')\n\n # Wait for response\n response = xbee.wait_read_frame()\n print response\n\n # Send AT packet\n xbee.send('at', frame_id='D', command='CE')\n\n # Wait for response\n response = xbee.wait_read_frame()\n print response\n except KeyboardInterrupt:\n pass\n finally:\n ser.close()", "response": "This function is called by the main function of the main function. 
It sends an API AT command to read the lower - order address bits from\n an XBee Series 1 object and looks for a response."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef halt(self):\n if self._callback:\n self._running.clear()\n self._ioloop.remove_handler(self.serial.fd)\n\n if self._frame_future is not None:\n self._frame_future.set_result(None)\n self._frame_future = None", "response": "Stop the event loop and remove the FD from the loop handler\n "} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nprocess the next available frame and process it.", "response": "def process_frames(self):\n \"\"\"\n process_frames: None -> None\n\n Wait for a frame to become available, when resolved call the callback\n \"\"\"\n while self._running.is_set():\n try:\n frame = yield self._get_frame()\n info = self._split_response(frame.data)\n if info is not None:\n self._callback(info)\n except Exception as e:\n # Unexpected quit.\n if self._error_callback:\n self._error_callback(e)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nopening the serial port", "response": "def do_serial(self, p):\n\t\t\"\"\"Set the serial port, e.g.: /dev/tty.usbserial-A4001ib8\"\"\"\n\t\ttry:\n\t\t\tself.serial.port = p\n\t\t\tself.serial.open()\n\t\t\tprint 'Opening serial port: %s' % p\n\t\texcept Exception, e:\n\t\t\tprint 'Unable to open serial port: %s' % p"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef register(self, name, callback, filter):\n if name in self.names:\n raise ValueError(\"A callback has already been registered with \\\n the name '%s'\" % name)\n\n self.handlers.append({\n 'name': name,\n 'callback': callback,\n 'filter': filter\n })\n\n self.names.add(name)", "response": "Register a new entry in the cache."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 
to\nread and dispatch any packet which arrives from the XBee device.", "response": "def run(self, oneshot=False):\n \"\"\"\n run: boolean -> None\n\n run will read and dispatch any packet which arrives from the\n XBee device\n \"\"\"\n if not self.xbee:\n raise ValueError(\"Either a serial port or an XBee must be provided \\\n to __init__ to execute run()\")\n\n while True:\n self.dispatch(self.xbee.wait_read_frame())\n\n if oneshot:\n break"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ndispatching the given packet against each registered callback method.", "response": "def dispatch(self, packet):\n \"\"\"\n dispatch: XBee data dict -> None\n\n When called, dispatch checks the given packet against each\n registered callback method and calls each callback whose filter\n function returns true.\n \"\"\"\n for handler in self.handlers:\n if handler['filter'](packet):\n # Call the handler method with its associated\n # name and the packet which passed its filter check\n handler['callback'](handler['name'], packet)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef byteToInt(byte):\n if hasattr(byte, 'bit_length'):\n # This is already an int\n return byte\n return ord(byte) if hasattr(byte, 'encode') else byte[0]", "response": "Converts a byte to an integer."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_data(search_string, search_by='ip'):\n r_details = requests.get('https://talosintelligence.com/sb_api/query_lookup',\n headers={\n 'Referer':'https://talosintelligence.com/reputation_center/lookup?search=%s'%search_string,\n 'User-Agent':'Mozilla/5.0 (Windows NT 6.1; WOW64; rv:64.0) Gecko/20100101 Firefox/64.0'\n },\n params = {\n 'query':'/api/v2/details/ip/',\n 'query_entry':search_string\n }).json()\n\n r_wscore = 
requests.get('https://talosintelligence.com/sb_api/remote_lookup',\n headers={\n 'Referer':'https://talosintelligence.com/reputation_center/lookup?search=%s'%search_string,\n 'User-Agent':'Mozilla/5.0 (Windows NT 6.1; WOW64; rv:64.0) Gecko/20100101 Firefox/64.0'\n },\n params = {'hostname':'SDS', 'query_string':'/score/wbrs/json?url=%s' % search_string}).json()\n\n r_talos_blacklist = requests.get('https://www.talosintelligence.com/sb_api/blacklist_lookup',\n headers={\n 'Referer':'https://talosintelligence.com/reputation_center/lookup?search=%s'%search_string,\n 'User-Agent':'Mozilla/5.0 (Windows NT 6.1; WOW64; rv:64.0) Gecko/20100101 Firefox/64.0'\n },\n params = {'query_type':'ipaddr', 'query_entry':search_string}).json()\n\n # would be nice to plot these values\n #r_volume = requests.get('https://talosintelligence.com/sb_api/query_lookup',\n # params = {\n # 'query':'/api/v2/volume/ip/',\n # 'query_entry':search_string\n # }).json()\n\n # Not used for now\n #r_related_ips = requests.get('https://talosintelligence.com/sb_api/query_lookup',\n # params = {\n # 'query':'/api/v2/related_ips/ip/',\n # 'query_entry':search_string\n # }).json()\n\n \n talos_blacklisted = {'status':False}\n if 'classifications' in r_talos_blacklist['entry']:\n talos_blacklisted['status'] = True\n talos_blacklisted['classifications'] = \", \".join(r_talos_blacklist['entry']['classifications'])\n talos_blacklisted['first_seen'] = r_talos_blacklist['entry']['first_seen'] + \"UTC\"\n talos_blacklisted['expiration'] = r_talos_blacklist['entry']['expiration'] + \"UTC\"\n\n data = {\n 'address':search_string,\n 'hostname':r_details['hostname'] if 'hostname' in r_details else \"nodata\",\n 'volume_change':r_details['daychange'] if 'daychange' in r_details else \"nodata\",\n 'lastday_volume':r_details['daily_mag'] if 'daily_mag' in r_details else \"nodata\",\n 'month_volume':r_details['monthly_mag'] if 'monthly_mag' in r_details else \"nodata\",\n 'email_reputation':r_details['email_score_name'] 
if 'email_score_name' in r_details else \"nodata\",\n 'web_reputation':r_details['web_score_name'] if 'web_score_name' in r_details else \"nodata\",\n 'weighted_reputation_score':r_wscore['response'],\n 'talos_blacklisted':\"Yes\" if talos_blacklisted['status'] else \"No\"\n #'weighted_reputation_score':r_wscore[0]['response']['wbrs']['score'],\n #'volumes':zip(*r_volume['data'])\n }\n\n return data", "response": "Download reputation data from talosintelligence.com for the given IP and return it as a dict."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef insert_into_channel(api_key, api_secret, channel_key, video_key, **kwargs):\n jwplatform_client = jwplatform.Client(api_key, api_secret)\n logging.info(\"Inserting video into channel\")\n try:\n response = jwplatform_client.channels.videos.create(\n channel_key=channel_key,\n video_key=video_key,\n **kwargs)\n except jwplatform.errors.JWPlatformError as e:\n logging.error(\"Encountered an error inserting {} into channel {}.\n{}\".format(video_key, channel_key, e))\n sys.exit(e.message)\n return response", "response": "Function which inserts video into a channel."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef list_conversions(api_key, api_secret, video_key, **kwargs):\n jwplatform_client = jwplatform.Client(api_key, api_secret)\n logging.info(\"Querying for video conversions.\")\n try:\n response = jwplatform_client.videos.conversions.list(video_key=video_key, **kwargs)\n except jwplatform.errors.JWPlatformError as e:\n logging.error(\"Encountered an error querying for video conversions.\n{}\".format(e))\n sys.exit(e.message)\n return response", "response": "Function which retrieves a list of a video object's conversions."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nbuilding the request string and parameters for the API call", "response": "def _build_request(self, path, 
params=None):\n \"\"\"Build API request\"\"\"\n\n _url = '{scheme}://{host}{port}/{version}{path}'.format(\n scheme=self._scheme,\n host=self._host,\n port=':{}'.format(self._port) if self._port != 80 else '',\n version=self._api_version,\n path=path)\n\n if params is not None:\n _params = params.copy()\n else:\n _params = dict()\n\n # Add required API parameters\n _params['api_nonce'] = str(random.randint(0, 999999999)).zfill(9)\n _params['api_timestamp'] = int(time.time())\n _params['api_key'] = self.__key\n _params['api_format'] = 'json'\n _params['api_kit'] = 'py-{}{}'.format(\n __version__, '-{}'.format(self._agent) if self._agent else '')\n\n # Construct Signature Base String\n sbs = '&'.join(['{}={}'.format(\n quote((unicode(key).encode('utf-8')), safe='~'),\n quote((unicode(value).encode('utf-8')), safe='~')\n ) for key, value in sorted(_params.items())])\n\n # Add signature to the _params dict\n _params['api_signature'] = hashlib.sha1(\n '{}{}'.format(sbs, self.__secret).encode('utf-8')).hexdigest()\n\n return _url, _params"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef create_video(api_key, api_secret, local_video_path, api_format='json', **kwargs):\n # Setup API client\n jwplatform_client = jwplatform.Client(api_key, api_secret)\n\n # Make /videos/create API call\n logging.info(\"Registering new Video-Object\")\n try:\n response = jwplatform_client.videos.create(upload_method='single', **kwargs)\n except jwplatform.errors.JWPlatformError as e:\n logging.error(\"Encountered an error creating a video\n{}\".format(e))\n sys.exit(e.message)\n logging.info(response)\n\n # Construct base url for upload\n upload_url = '{}://{}{}'.format(\n response['link']['protocol'],\n response['link']['address'],\n response['link']['path']\n )\n\n # Query parameters for the upload\n query_parameters = response['link']['query']\n query_parameters['api_format'] = api_format\n\n with open(local_video_path, 'rb') as f:\n files = 
{'file': f}\n r = requests.post(upload_url,\n params=query_parameters,\n files=files)\n logging.info('uploading file {} to url {}'.format(local_video_path, r.url))\n logging.info('upload response: {}'.format(r.text))\n logging.info(r)", "response": "Function which creates a new video object via single-file upload method."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nreplace the content of an EXISTING video object.", "response": "def replace_video(api_key, api_secret, local_video_path, video_key, **kwargs):\n \"\"\"\n Function which allows to replace the content of an EXISTING video object.\n\n :param api_key: JWPlatform api-key\n :param api_secret: JWPlatform shared-secret\n :param local_video_path: Path to media on local machine.\n :param video_key: Video's object ID. Can be found within JWPlayer Dashboard.\n :param kwargs: Arguments conforming to standards found @ https://developer.jwplayer.com/jw-platform/reference/v1/methods/videos/create.html\n :return:\n \"\"\"\n filename = os.path.basename(local_video_path)\n\n # Setup API client\n jwplatform_client = jwplatform.Client(api_key, api_secret)\n logging.info(\"Updating Video\")\n try:\n response = jwplatform_client.videos.update(\n video_key=video_key,\n upload_method='s3',\n update_file='True',\n **kwargs)\n except jwplatform.errors.JWPlatformError as e:\n logging.error(\"Encountered an error updating the video\n{}\".format(e))\n sys.exit(e.message)\n logging.info(response)\n\n # Construct base url for upload\n upload_url = '{}://{}{}'.format(\n response['link']['protocol'],\n response['link']['address'],\n response['link']['path']\n )\n\n # Query parameters for the upload\n query_parameters = response['link']['query']\n\n # HTTP PUT upload using requests\n headers = {'Content-Disposition': 'attachment; filename=\"{}\"'.format(filename)}\n with open(local_video_path, 'rb') as f:\n r = requests.put(upload_url, params=query_parameters, headers=headers, 
data=f)\n logging.info('uploading file {} to url {}'.format(local_video_path, r.url))\n logging.info('upload response: {}'.format(r.text))\n logging.info(r)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nupdates the thumbnail for an EXISTING video utilizing the position parameter.", "response": "def update_thumbnail(api_key, api_secret, video_key, position=7.0, **kwargs):\n \"\"\"\n Function which updates the thumbnail for an EXISTING video utilizing position parameter.\n This function is useful for selecting a new thumbnail from with the already existing video content.\n Instead of position parameter, user may opt to utilize thumbnail_index parameter.\n Please see the documentation for further information.\n\n :param api_key: JWPlatform api-key\n :param api_secret: JWPlatform shared-secret\n :param video_key: Video's object ID. Can be found within JWPlayer Dashboard.\n :param position: Represents seconds into the duration of a video, for thumbnail extraction.\n :param kwargs: Arguments conforming to standards found @ https://developer.jwplayer.com/jw-platform/reference/v1/methods/videos/thumbnails/update.html\n :return: Dict which represents the JSON response.\n \"\"\"\n jwplatform_client = jwplatform.Client(api_key, api_secret)\n logging.info(\"Updating video thumbnail.\")\n try:\n response = jwplatform_client.videos.thumbnails.update(\n video_key=video_key,\n position=position, # Parameter which specifies seconds into video to extract thumbnail from.\n **kwargs)\n except jwplatform.errors.JWPlatformError as e:\n logging.error(\"Encountered an error updating thumbnail.\n{}\".format(e))\n sys.exit(e.message)\n return response"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nupdating the thumbnail for a particular video object with a locally saved image.", "response": "def update_thumbnail_via_upload(api_key, api_secret, video_key, local_video_image_path='', 
api_format='json',\n **kwargs):\n \"\"\"\n Function which updates the thumbnail for a particular video object with a locally saved image.\n\n :param api_key: JWPlatform api-key\n :param api_secret: JWPlatform shared-secret\n :param video_key: Video's object ID. Can be found within JWPlayer Dashboard.\n :param local_video_image_path: Local system path to an image.\n :param api_format: REQUIRED Acceptable values include 'py','xml','json',and 'php'\n :param kwargs: Arguments conforming to standards found @ https://developer.jwplayer.com/jw-platform/reference/v1/methods/videos/thumbnails/update.html\n :return: Dict which represents the JSON response.\n \"\"\"\n jwplatform_client = jwplatform.Client(api_key, api_secret)\n logging.info(\"Updating video thumbnail.\")\n try:\n response = jwplatform_client.videos.thumbnails.update(\n video_key=video_key,\n **kwargs)\n except jwplatform.errors.JWPlatformError as e:\n logging.error(\"Encountered an error updating thumbnail.\\n{}\".format(e))\n sys.exit(e.message)\n logging.info(response)\n\n # Construct base url for upload\n upload_url = '{}://{}{}'.format(\n response['link']['protocol'],\n response['link']['address'],\n response['link']['path']\n )\n\n # Query parameters for the upload\n query_parameters = response['link']['query']\n query_parameters['api_format'] = api_format\n\n with open(local_video_image_path, 'rb') as f:\n files = {'file': f}\n r = requests.post(upload_url, params=query_parameters, files=files)\n logging.info('uploading file {} to url {}'.format(local_video_image_path, r.url))\n logging.info('upload response: {}'.format(r.text))"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef run_upload(video_file_path):\n\n upload_parameters = {\n 'file_path': video_file_path,\n 'file_size': os.stat(video_file_path).st_size,\n 'file_name': os.path.basename(video_file_path)\n }\n\n try:\n # Setup API client\n jwplatform_client = Client(JW_API_KEY, 
JW_API_SECRET)\n\n # Make /videos/create API call with multipart parameter specified\n jwplatform_video_create_response = jwplatform_client.videos.create(\n upload_method='multipart',\n title=upload_parameters['file_name']\n )\n\n except JWPlatformError:\n logging.exception('An error occurred during the uploader setup. Check that your API keys are properly '\n 'set up in your environment, and ensure that the video file path exists.')\n return\n\n # Construct base url for upload\n upload_parameters['upload_url'] = '{protocol}://{address}{path}'.format(**jwplatform_video_create_response['link'])\n\n logging.info('Upload URL to be used: {}'.format(upload_parameters['upload_url']))\n\n upload_parameters['query_parameters'] = jwplatform_video_create_response['link']['query']\n upload_parameters['query_parameters']['api_format'] = 'json'\n upload_parameters['headers'] = {'X-Session-ID': jwplatform_video_create_response['session_id']}\n # The chunk offset will be updated several times during the course of the upload\n upload_parameters['chunk_offset'] = 0\n\n # Perform the multipart upload\n with open(upload_parameters['file_path'], 'rb') as file_to_upload:\n while True:\n chunk = file_to_upload.read(BYTES_TO_BUFFER)\n if len(chunk) <= 0:\n break\n\n try:\n upload_chunk(chunk, upload_parameters)\n\n # Log any exceptions that bubbled up\n except requests.exceptions.RequestException:\n logging.exception('Error posting data, stopping upload...')\n break", "response": "Runs a multipart upload of the given video file, sending it to the upload URL chunk by chunk."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nuploading a single chunk of the video file.", "response": "def upload_chunk(chunk, upload_parameters):\n \"\"\"\n Handles the POST request needed to upload a single portion of the video file.\n Serves as a helper method for upload_by_multipart().\n The offset used to determine where a chunk begins and ends is updated in the 
course of\n this method's execution.\n\n :param chunk: the raw bytes of data from the video file\n :param upload_parameters: a collection of all pieces of info needed to upload the video\n \"\"\"\n begin_chunk = upload_parameters['chunk_offset']\n\n # The next chunk will begin at (begin_chunk + len(chunk)), so the -1 ensures that the ranges do not overlap\n end_chunk = begin_chunk + len(chunk) - 1\n file_size = upload_parameters['file_size']\n filename = upload_parameters['file_name']\n logging.info(\"begin_chunk / end_chunk = {} / {}\".format(begin_chunk, end_chunk))\n\n upload_parameters['headers'].update(\n {\n 'X-Content-Range': 'bytes {}-{}/{}'.format(begin_chunk, end_chunk, file_size),\n 'Content-Disposition': 'attachment; filename=\"{}\"'.format(filename),\n 'Content-Type': 'application/octet-stream',\n 'Content-Length': str((end_chunk - begin_chunk) + 1)\n }\n )\n\n response = requests.post(\n upload_parameters['upload_url'],\n params=upload_parameters['query_parameters'],\n headers=upload_parameters['headers'],\n data=chunk\n )\n response.raise_for_status()\n\n # As noted before, the next chunk begins at (begin_chunk + len(chunk))\n upload_parameters['chunk_offset'] = begin_chunk + len(chunk)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nfetch a video library and write each video object's metadata to CSV. Useful for CMS systems.", "response": "def make_csv(api_key, api_secret, path_to_csv=None, result_limit=1000, **kwargs):\n \"\"\"\n Function which fetches a video library and writes each video object's metadata to CSV. Useful for CMS systems.\n\n :param api_key: JWPlatform api-key\n :param api_secret: JWPlatform shared-secret\n :param path_to_csv: Local system path to desired CSV. Default will be within current working directory.\n :param result_limit: Number of video results returned in response. 
(Suggested to leave at default of 1000)\n :param kwargs: Arguments conforming to standards found @ https://developer.jwplayer.com/jw-platform/reference/v1/methods/videos/list.html\n :return: Dict which represents the JSON response.\n \"\"\"\n\n path_to_csv = path_to_csv or os.path.join(os.getcwd(), 'video_list.csv')\n timeout_in_seconds = 2\n max_retries = 3\n retries = 0\n offset = 0\n videos = list()\n\n jwplatform_client = jwplatform.Client(api_key, api_secret)\n logging.info(\"Querying for video list.\")\n\n while True:\n try:\n response = jwplatform_client.videos.list(result_limit=result_limit,\n result_offset=offset,\n **kwargs)\n except jwplatform.errors.JWPlatformRateLimitExceededError:\n logging.error(\"Encountered rate limiting error. Backing off on request time.\")\n if retries == max_retries:\n raise jwplatform.errors.JWPlatformRateLimitExceededError()\n timeout_in_seconds *= 2 # Exponential back off for timeout in seconds. 2->4->8->etc.\n retries += 1\n time.sleep(timeout_in_seconds)\n continue\n except jwplatform.errors.JWPlatformError as e:\n logging.error(\"Encountered an error querying for videos list.\n{}\".format(e))\n raise e\n\n # Reset retry flow-control variables upon a successful query (AKA not rate limited)\n retries = 0\n timeout_in_seconds = 2\n\n # Add all fetched video objects to our videos list.\n next_videos = response.get('videos', [])\n last_query_total = response.get('total', 0)\n videos.extend(next_videos)\n offset += len(next_videos)\n logging.info(\"Accumulated {} videos.\".format(offset))\n if offset >= last_query_total: # Condition which defines you've reached the end of the library\n break\n\n # Section for writing video library to csv\n desired_fields = ['key', 'title', 'description', 'tags', 'date', 'link']\n should_write_header = not os.path.isfile(path_to_csv)\n with open(path_to_csv, 'a+') as path_to_csv:\n # Only write columns to the csv which are specified above. 
Columns not specified are ignored.\n writer = csv.DictWriter(path_to_csv, fieldnames=desired_fields, extrasaction='ignore')\n if should_write_header:\n writer.writeheader()\n writer.writerows(videos)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef create_channel(api_key, api_secret, channel_type='manual', **kwargs):\n jwplatform_client = jwplatform.Client(api_key, api_secret)\n logging.info(\"Creating new channel with keyword args.\")\n try:\n response = jwplatform_client.channels.create(type=channel_type, **kwargs)\n except jwplatform.errors.JWPlatformError as e:\n logging.error(\"Encountered an error creating new channel.\n{}\".format(e))\n sys.exit(e.message)\n return response", "response": "Function which creates a new channel. Channels serve as containers of video media objects."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef update_custom_params(api_key, api_secret, video_key, params):\n formatted_params = {'custom.{}'.format(k): v for k,v in params.items()}\n\n # Setup API client\n jwplatform_client = jwplatform.Client(api_key, api_secret)\n logging.info(\"Updating Video\")\n try:\n response = jwplatform_client.videos.update(\n video_key=video_key,\n **formatted_params)\n except jwplatform.errors.JWPlatformError as e:\n logging.error(\"Encountered an error updating the video\n{}\".format(e))\n sys.exit(e.message)\n logging.info(response)", "response": "Function which allows you to update a video's custom params."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef write(self, file):\n text, image = self.image\n image.save(file, format=self.format)\n return (text, file)", "response": "Write CAPTCHA image to file."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef text(self):\n if isinstance(self.source, 
str):\n return self.source\n else:\n return self.source()", "response": "Text received from self.source."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _writeText(self, image, text, pos):\n offset = 0\n x, y = pos\n\n for c in text:\n # Write letter\n c_size = self.font.getsize(c)\n c_image = Image.new('RGBA', c_size, (0, 0, 0, 0))\n c_draw = ImageDraw.Draw(c_image)\n c_draw.text((0, 0), c, font=self.font, fill=(0, 0, 0, 255))\n\n # Transform\n c_image = self._rndLetterTransform(c_image)\n\n # Paste onto image\n image.paste(c_image, (x+offset, y), c_image)\n offset += c_size[0]", "response": "Write morphed text in Image object."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _drawLine(self, image):\n w, h = image.size\n w *= 5\n h *= 5\n\n l_image = Image.new('RGBA', (w, h), (0, 0, 0, 0))\n l_draw = ImageDraw.Draw(l_image)\n\n x1 = int(w * random.uniform(0, 0.1))\n y1 = int(h * random.uniform(0, 1))\n x2 = int(w * random.uniform(0.9, 1))\n y2 = int(h * random.uniform(0, 1))\n\n # Line width modifier was chosen as an educated guess\n # based on default image area.\n l_width = round((w * h)**0.5 * 2.284e-2)\n\n # Draw\n l_draw.line(((x1, y1), (x2, y2)), fill=(0, 0, 0, 255), width=l_width)\n\n # Transform\n l_image = self._rndLineTransform(l_image)\n l_image = l_image.resize(image.size, resample=self.resample)\n\n # Paste onto image\n image.paste(l_image, (0, 0), l_image)", "response": "Draw morphed line in Image object."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ngenerate white noise and merge it with given Image object.", "response": "def _whiteNoise(self, size):\n \"\"\"Generate white noise and merge it with given Image object.\"\"\"\n if self.noise > 0.003921569: # 1./255.\n w, h = size\n\n pixel = (lambda noise: round(255 * random.uniform(1-noise, 1)))\n\n n_image = Image.new('RGB', size, (0, 0, 0, 0))\n rnd_grid = map(lambda _: 
tuple([pixel(self.noise)]) * 3,\n [0] * w * h)\n n_image.putdata(list(rnd_grid))\n return n_image\n else:\n return None"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _rndLetterTransform(self, image):\n w, h = image.size\n\n dx = w * random.uniform(0.2, 0.7)\n dy = h * random.uniform(0.2, 0.7)\n\n x1, y1 = self.__class__._rndPointDisposition(dx, dy)\n x2, y2 = self.__class__._rndPointDisposition(dx, dy)\n\n w += abs(x1) + abs(x2)\n h += abs(y1) + abs(y2)\n\n quad = self.__class__._quadPoints((w, h), (x1, y1), (x2, y2))\n\n return image.transform(image.size, Image.QUAD,\n data=quad, resample=self.resample)", "response": "Randomly morph a single character."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _rndPointDisposition(dx, dy):\n x = int(random.uniform(-dx, dx))\n y = int(random.uniform(-dy, dy))\n return (x, y)", "response": "Return random disposition point."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _quadPoints(size, disp1, disp2):\n w, h = size\n x1, y1 = disp1\n x2, y2 = disp2\n\n return (\n x1, -y1,\n -x1, h + y2,\n w + x2, h - y2,\n w - x2, y1\n )", "response": "Return points for QUAD transformation."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nconvert Tornado URLSpec to OpenAPI - compliant path.", "response": "def tornadopath2openapi(urlspec, method):\n \"\"\"Convert Tornado URLSpec to OpenAPI-compliant path.\n\n :param urlspec:\n :type urlspec: URLSpec\n :param method: Handler http method\n :type method: function\n \"\"\"\n if sys.version_info >= (3, 3):\n args = list(inspect.signature(method).parameters.keys())[1:]\n else:\n if getattr(method, '__tornado_coroutine__', False):\n method = method.__wrapped__\n args = inspect.getargspec(method).args[1:]\n params = tuple('{{{}}}'.format(arg) for arg in args)\n try:\n path_tpl = 
urlspec.matcher._path\n except AttributeError: # tornado<4.5\n path_tpl = urlspec._path\n path = (path_tpl % params)\n if path.count('/') > 1:\n path = path.rstrip('/?*')\n return path"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ngenerate a temporary MFA token for a user.", "response": "def generate_mfa_token(self, user_id, expires_in=259200, reusable=False):\n \"\"\"\n Use to generate a temporary MFA token that can be used in place of other MFA tokens for a set time period.\n For example, use this token for account recovery.\n\n :param user_id: Id of the user\n :type user_id: int\n\n :param expires_in: Set the duration of the token in seconds. \n (default: 259200 seconds = 72h) 72 hours is the max value.\n :type expires_in: int\n\n :param reusable: Defines if the token is reusable. (default: false) If set to true, token can be used for multiple apps, until it expires.\n :type reusable: bool\n\n Returns an MFA token\n :return: return the object if success\n :rtype: MFAToken\n\n See https://developers.onelogin.com/api-docs/1/multi-factor-authentication/generate-mfa-token Generate MFA Token documentation\n\n \"\"\"\n self.clean_error()\n\n try:\n url = self.get_url(Constants.GENERATE_MFA_TOKEN_URL, user_id)\n\n data = {\n 'expires_in': expires_in,\n 'reusable': reusable\n }\n\n response = self.execute_call('post', url, json=data)\n if response.status_code == 201:\n json_data = response.json()\n if json_data:\n return MFAToken(json_data)\n else:\n self.error = self.extract_status_code_from_response(response)\n self.error_description = self.extract_error_message_from_response(response)\n except Exception as e:\n self.error = 500\n self.error_description = e.args[0]"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nconvert camelCase to snake_case.", "response": "def camel_to_snake(camel):\n \"\"\"Convert camelCase to snake_case.\"\"\"\n ret = []\n last_lower = False\n for char in camel:\n current_upper = 
char.upper() == char\n if current_upper and last_lower:\n ret.append(\"_\")\n ret.append(char.lower())\n else:\n ret.append(char.lower())\n last_lower = not current_upper\n return \"\".join(ret)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef register_id(self, id_string):\n try:\n prefix, count = id_string.rsplit(\"_\", 1)\n count = int(count)\n except ValueError:\n # We don't need to worry about ids that don't match our pattern\n pass\n else:\n if prefix == self.prefix:\n self.counter = max(count, self.counter)", "response": "Register a manually assigned id as used to avoid collisions."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ncreates a new AMDSec by parsing the root.", "response": "def parse(cls, root):\n \"\"\"\n Create a new AMDSec by parsing root.\n\n :param root: Element or ElementTree to be parsed into an object.\n \"\"\"\n if root.tag != utils.lxmlns(\"mets\") + \"amdSec\":\n raise exceptions.ParseError(\n \"AMDSec can only parse amdSec elements with METS namespace.\"\n )\n section_id = root.get(\"ID\")\n subsections = []\n for child in root:\n subsection = SubSection.parse(child)\n subsections.append(subsection)\n return cls(section_id, subsections)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef serialize(self, now=None):\n if self._tree is not None:\n return self._tree\n el = etree.Element(utils.lxmlns(\"mets\") + self.tag, ID=self.id_string)\n self.subsections.sort()\n for child in self.subsections:\n el.append(child.serialize(now))\n return el", "response": "Serialize this amdSec and all children to lxml Element and return it."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef parse(cls, element):\n if element.tag != cls.ALT_RECORD_ID_TAG:\n raise exceptions.ParseError(\n u\"AltRecordID got unexpected tag 
{}; expected {}\".format(\n element.tag, cls.ALT_RECORD_ID_TAG\n )\n )\n\n return cls(element.text, id=element.get(u\"ID\"), type=element.get(u\"TYPE\"))", "response": "Parse an XML element into an AltRecordID object."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nparses an XML element into an Agent object.", "response": "def parse(cls, element):\n \"\"\"\n Create a new Agent by parsing root.\n\n :param element: Element to be parsed into an Agent.\n :raises exceptions.ParseError: If element is not a valid agent.\n \"\"\"\n if element.tag != cls.AGENT_TAG:\n raise exceptions.ParseError(\n u\"Agent got unexpected tag {}; expected {}\".format(\n element.tag, cls.AGENT_TAG\n )\n )\n\n role = element.get(u\"ROLE\")\n if not role:\n raise exceptions.ParseError(u\"Agent must have a ROLE attribute.\")\n if role == u\"OTHER\":\n role = element.get(u\"OTHERROLE\") or role\n agent_type = element.get(u\"TYPE\")\n if agent_type == u\"OTHER\":\n agent_type = element.get(u\"OTHERTYPE\") or agent_type\n agent_id = element.get(u\"ID\")\n try:\n name = element.find(cls.NAME_TAG).text\n except AttributeError:\n name = None\n notes = [note.text for note in element.findall(cls.NOTE_TAG)]\n\n return cls(role, id=agent_id, type=agent_type, name=name, notes=notes)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns the STATUS when serializing.", "response": "def get_status(self):\n \"\"\"\n Returns the STATUS when serializing.\n\n Calculates based on the subsection type and if it's replacing anything.\n\n :returns: None or the STATUS string.\n \"\"\"\n if self.status is not None:\n return self.status\n if self.subsection == \"dmdSec\":\n if self.older is None:\n return \"original\"\n else:\n return \"updated\"\n if self.subsection in (\"techMD\", \"rightsMD\"):\n # TODO how to handle ones where newer has been deleted?\n if self.newer is None:\n return \"current\"\n else:\n return 
\"superseded\"\n return None"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef replace_with(self, new_subsection):\n if self.subsection != new_subsection.subsection:\n raise exceptions.MetsError(\n \"Must replace a SubSection with one of the same type.\"\n )\n # TODO convert this to a DB so have bidirectonal foreign keys??\n self.newer = new_subsection\n new_subsection.older = self\n self.status = None", "response": "Replace this SubSection with new_subsection."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nparse a subsection element or ElementTree into a new SubSection object.", "response": "def parse(cls, root):\n \"\"\"\n Create a new SubSection by parsing root.\n\n :param root: Element or ElementTree to be parsed into an object.\n :raises exceptions.ParseError: If root's tag is not in :const:`SubSection.ALLOWED_SUBSECTIONS`.\n :raises exceptions.ParseError: If the first child of root is not mdRef or mdWrap.\n \"\"\"\n subsection = root.tag.replace(utils.lxmlns(\"mets\"), \"\", 1)\n if subsection not in cls.ALLOWED_SUBSECTIONS:\n raise exceptions.ParseError(\n \"SubSection can only parse elements with tag in %s with METS namespace\"\n % (cls.ALLOWED_SUBSECTIONS,)\n )\n section_id = root.get(\"ID\")\n created = root.get(\"CREATED\", \"\")\n status = root.get(\"STATUS\", \"\")\n child = root[0]\n if child.tag == utils.lxmlns(\"mets\") + \"mdWrap\":\n mdwrap = MDWrap.parse(child)\n obj = cls(subsection, mdwrap, section_id)\n elif child.tag == utils.lxmlns(\"mets\") + \"mdRef\":\n mdref = MDRef.parse(child)\n obj = cls(subsection, mdref, section_id)\n else:\n raise exceptions.ParseError(\n \"Child of %s must be mdWrap or mdRef\" % subsection\n )\n obj.created = created\n obj.status = status\n return obj"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef serialize(self, now=None):\n created = self.created if 
self.created is not None else now\n el = etree.Element(utils.lxmlns(\"mets\") + self.subsection, ID=self.id_string)\n if created: # Don't add CREATED if none was parsed\n el.set(\"CREATED\", created)\n status = self.get_status()\n if status:\n el.set(\"STATUS\", status)\n if self.contents:\n el.append(self.contents.serialize())\n return el", "response": "Serialize this SubSection and all children to lxml Element and return it."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nparse a MDWrap element or ElementTree into a new MDWrap object.", "response": "def parse(cls, root):\n \"\"\"\n Create a new MDWrap by parsing root.\n\n :param root: Element or ElementTree to be parsed into a MDWrap.\n \"\"\"\n if root.tag != utils.lxmlns(\"mets\") + \"mdRef\":\n raise exceptions.ParseError(\n \"MDRef can only parse mdRef elements with METS namespace.\"\n )\n # Required attributes\n mdtype = root.get(\"MDTYPE\")\n if not mdtype:\n raise exceptions.ParseError(\"mdRef must have a MDTYPE\")\n target = root.get(utils.lxmlns(\"xlink\") + \"href\")\n if not target:\n raise exceptions.ParseError(\"mdRef must have an xlink:href.\")\n try:\n target = utils.urldecode(target)\n except ValueError:\n raise exceptions.ParseError(\n 'Value \"{}\" (of attribute xlink:href) is not a valid'\n \" URL.\".format(target)\n )\n loctype = root.get(\"LOCTYPE\")\n if not loctype:\n raise exceptions.ParseError(\"mdRef must have a LOCTYPE\")\n # Optional attributes\n label = root.get(\"LABEL\")\n otherloctype = root.get(\"OTHERLOCTYPE\")\n\n return cls(target, mdtype, loctype, label, otherloctype)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef parse(cls, root):\n if root.tag != utils.lxmlns(\"mets\") + \"mdWrap\":\n raise exceptions.ParseError(\n \"MDWrap can only parse mdWrap elements with METS namespace.\"\n )\n mdtype = root.get(\"MDTYPE\")\n if not mdtype:\n raise exceptions.ParseError(\"mdWrap 
must have a MDTYPE\")\n othermdtype = root.get(\"OTHERMDTYPE\")\n document = root.xpath(\"mets:xmlData/*\", namespaces=utils.NAMESPACES)\n if len(document) == 0:\n raise exceptions.ParseError(\n \"All mdWrap/xmlData elements must have at least one child; this\"\n \" one has none\"\n )\n elif len(document) == 1:\n document = document[0]\n\n # Create a copy, so that the element is not moved by duplicate references.\n document = copy.deepcopy(document)\n\n return cls(document, mdtype, othermdtype)", "response": "Parse a MDWrap element."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _data_to_lxml_el(data, ns, nsmap, element_maker=None, snake=True):\n if not element_maker:\n element_maker = ElementMaker(namespace=nsmap[ns], nsmap=nsmap)\n tag = data[0]\n if snake:\n camel_tag = utils.snake_to_camel(tag)\n func = getattr(element_maker, camel_tag)\n args = []\n attributes = {}\n for element in data[1:]:\n if isinstance(element, dict):\n for key, val in element.items():\n attributes[key] = val\n elif isinstance(element, (tuple, list)):\n args.append(\n _data_to_lxml_el(\n element, ns, nsmap, element_maker=element_maker, snake=snake\n )\n )\n elif isinstance(element, six.text_type):\n args.append(element)\n elif isinstance(element, etree._Element):\n args.append(element)\n else:\n args.append(six.binary_type(element))\n ret = func(*args)\n for attr, val in attributes.items():\n try:\n ns, attr = attr.split(\":\")\n except ValueError:\n ns = None\n if snake:\n attr = utils.snake_to_camel(attr)\n if ns:\n attr = \"{\" + nsmap[ns] + \"}\" + attr\n ret.attrib[attr] = val\n else:\n ret.attrib[attr] = val\n return ret", "response": "Convert a tuple or list of tuples to an lxml. etree. 
_Element instance."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _to_colon_ns(bracket_ns, default_ns=None, nsmap=None, snake=True):\n parts = [x.strip(\"{\") for x in bracket_ns.split(\"}\")]\n if len(parts) != 2:\n return bracket_ns\n ns, var = parts\n if default_ns and nsmap:\n try:\n ns = [k for k, v in nsmap.items() if v == ns][0]\n if ns == default_ns:\n if snake:\n return utils.camel_to_snake(var)\n return var\n except IndexError:\n pass\n if snake:\n return \":\".join([ns, utils.camel_to_snake(var)])\n return \":\".join([ns, var])", "response": "Convert a namespaced tag or attribute name from explicit XML \"bracket\" - style notation using a more succinct Pythonic colon - separated notation using\n snake_case e. g. :: object_identifier schema_location"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreturns the XML attributes of lxml_el as a dict where namespaced attributes are represented via colon - delimited and using snake case.", "response": "def _get_el_attributes(lxml_el, ns=None, nsmap=None):\n \"\"\"Return the XML attributes of lxml ``Element`` instance lxml_el as a dict\n where namespaced attributes are represented via colon-delimiting and using\n snake case.\n \"\"\"\n attrs = {}\n for attr, val in lxml_el.items():\n attr = _to_colon_ns(attr, default_ns=ns, nsmap=nsmap)\n attrs[attr] = val\n return attrs"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nconverting an lxml. 
_Element instance to a Python tuple.", "response": "def _lxml_el_to_data(lxml_el, ns, nsmap, snake=True):\n \"\"\"Convert an ``lxml._Element`` instance to a Python tuple.\"\"\"\n tag_name = _to_colon_ns(lxml_el.tag, default_ns=ns, nsmap=nsmap)\n ret = [tag_name]\n attributes = _get_el_attributes(lxml_el, ns=ns, nsmap=nsmap)\n if attributes:\n ret.append(attributes)\n for sub_el in lxml_el:\n ret.append(_lxml_el_to_data(sub_el, ns, nsmap, snake=snake))\n text = lxml_el.text\n if text:\n ret.append(text)\n return tuple(ret)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef data_to_premis(data, premis_version=utils.PREMIS_VERSION):\n nsmap = utils.PREMIS_VERSIONS_MAP[premis_version][\"namespaces\"]\n return _data_to_lxml_el(data, \"premis\", nsmap)", "response": "Given tuple data representing a PREMIS entity return an lxml. etree. _Element instance. Eg., ::\nInviteXML"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ntransforming a PREMIS lxml. 
_Element instance to a Python tuple.", "response": "def premis_to_data(premis_lxml_el):\n \"\"\"Transform a PREMIS ``lxml._Element`` instance to a Python tuple.\"\"\"\n premis_version = premis_lxml_el.get(\"version\", utils.PREMIS_VERSION)\n nsmap = utils.PREMIS_VERSIONS_MAP[premis_version][\"namespaces\"]\n return _lxml_el_to_data(premis_lxml_el, \"premis\", nsmap)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef data_find(data, path):\n path_parts = path.split(\"/\")\n try:\n sub_elm = [\n el\n for el in data\n if isinstance(el, (tuple, list)) and el[0] == path_parts[0]\n ][0]\n except IndexError:\n return None\n else:\n if len(path_parts) > 1:\n return data_find(sub_elm, \"/\".join(path_parts[1:]))\n return sub_elm", "response": "Find and return the first element - as - tuple in tuple data using simplified\n XPath path."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef tuple_to_schema(tuple_):\n schema = []\n for element in tuple_:\n if isinstance(element, (tuple, list)):\n try:\n if isinstance(element[1], six.string_types):\n schema.append((element[0],))\n else:\n schema.append(tuple_to_schema(element))\n except IndexError:\n schema.append((element[0],))\n else:\n schema.append(element)\n return tuple(schema)", "response": "Convert a tuple representing an XML data structure into a schema tuple that can be used in the PREMISElement. 
schema property of a sub - class of\n PREMISElement."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef generate_element_class(tuple_instance):\n schema = tuple_to_schema(tuple_instance)\n\n def defaults(self):\n return {}\n\n def schema_getter(self):\n return schema\n\n new_class_name = \"PREMIS{}Element\".format(schema[0].capitalize())\n return type(\n new_class_name,\n (PREMISElement,),\n {\"defaults\": property(defaults), \"schema\": property(schema_getter)},\n )", "response": "Dynamically create a sub - class of PREMISElement given tuple_instance which is a tuple representing an XML data structure."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef data_find_all(data, path, dyn_cls=False):\n path_parts = path.split(\"/\")\n try:\n sub_elms = tuple(\n el\n for el in data\n if isinstance(el, (tuple, list)) and el[0] == path_parts[0]\n )\n except IndexError:\n return None\n if len(path_parts) > 1:\n ret = []\n for sub_elm in sub_elms:\n for x in data_find_all(sub_elm, \"/\".join(path_parts[1:])):\n ret.append(x)\n ret = tuple(ret)\n else:\n ret = sub_elms\n if ret and dyn_cls:\n cls = generate_element_class(ret[0])\n return tuple(cls(data=tuple_) for tuple_ in ret)\n return ret", "response": "Find and return all element - as - tuples in tuple data using simplified XPath path."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nfinds the text value of the element - as - tuple in tuple data using simplified XPath path.", "response": "def data_find_text(data, path):\n \"\"\"Return the text value of the element-as-tuple in tuple ``data`` using\n simplified XPath ``path``.\n \"\"\"\n el = data_find(data, path)\n if not isinstance(el, (list, tuple)):\n return None\n texts = [child for child in el[1:] if not isinstance(child, (tuple, list, dict))]\n if not texts:\n return None\n return \" \".join(\n [\n # How should we 
deal with decoding errors when `x` is binary?\n # For now, we're using the ``strict`` mode. Other options here:\n # https://docs.python.org/3/library/functions.html#open.\n six.ensure_text(x, encoding=\"utf-8\", errors=\"strict\")\n for x in texts\n ]\n )"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ngenerate a tree - as - tuple representing a PREMIS XML element.", "response": "def _generate_data(schema, elements, attributes=None, path=None):\n \"\"\"Using tree-as-tuple ``schema`` as guide, return a tree-as-tuple ``data``\n representing a PREMIS XML element, where the values in dict ``elements`` and\n the values in dict ``attributes`` are located in the appropriate locations\n in the ``data`` tree structure.\n \"\"\"\n path = path or []\n attributes = attributes or {}\n tag_name = schema[0]\n data = [tag_name]\n if attributes:\n data.append(attributes)\n new_path = path[:]\n new_path.append(tag_name)\n root = new_path[0]\n possible_paths = [\"__\".join(new_path), tag_name]\n if root != tag_name and tag_name.startswith(root):\n possible_paths.append(tag_name.lstrip(root)[1:])\n for possible_path in possible_paths:\n val = elements.get(possible_path)\n if val:\n if isinstance(val, (tuple, list)):\n data = tuple(val)\n else:\n if attributes:\n data = (tag_name, attributes, val)\n else:\n data = (tag_name, val)\n return tuple(data)\n for subschema in schema[1:]:\n subel = _generate_data(subschema, elements, path=new_path)\n if (not subel) or (subel == subschema):\n continue\n if all(map(lambda x: isinstance(x, tuple), subel)):\n for subsubel in subel:\n data.append(subsubel)\n elif not el_is_empty(subel):\n data.append(subel)\n return tuple(data)"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning True if tuple el represents an empty XML element.", "response": "def el_is_empty(el):\n \"\"\"Return ``True`` if tuple ``el`` represents an empty XML element.\"\"\"\n if len(el) == 1 and not 
isinstance(el[0], (list, tuple)):\n return True\n subels_are_empty = []\n for subel in el:\n if isinstance(subel, (list, tuple)):\n subels_are_empty.append(el_is_empty(subel))\n else:\n subels_are_empty.append(not bool(subel))\n return all(subels_are_empty)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nanalyzing PREMIS - element - as - tuple schema and return a dict that maps attribute names to the simplified XPaths needed to retrieve them.", "response": "def get_attrs_to_paths(schema, attrs_to_paths=None, path=None):\n \"\"\"Analyze PREMIS-element-as-tuple ``schema`` and return a dict that maps\n attribute names to the simplified XPaths needed to retrieve them, e.g.,::\n\n >>> {'object_identifier_type':\n 'object_identifier/object_identifier_type',\n 'object_identifier_value':\n 'object_identifier/object_identifier_value'}\n \"\"\"\n attrs_to_paths = attrs_to_paths or {}\n tag = schema[0]\n if len(schema) == 1:\n _insert_attr_path(attrs_to_paths, tag, \"/\".join(path + [tag]))\n else:\n for elem in schema[1:]:\n if isinstance(elem, dict):\n continue\n new_path = [] if path is None else path + [tag]\n if isinstance(elem, (list, tuple)):\n attrs_to_paths.update(\n get_attrs_to_paths(\n elem, attrs_to_paths=attrs_to_paths, path=new_path\n )\n )\n else:\n _insert_attr_path(attrs_to_paths, tag, \"/\".join(new_path))\n return attrs_to_paths"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ngiving a tuple of data returns the PREMIS version it is using. If none can be found return the default PREMIS version.", "response": "def _premis_version_from_data(data):\n \"\"\"Given tuple ``data`` encoding a PREMIS element, attempt to return the\n PREMIS version it is using. 
If none can be found, return the default PREMIS\n version.\n \"\"\"\n for child in data:\n if isinstance(child, dict):\n version = child.get(\"version\")\n if version:\n return version\n return utils.PREMIS_VERSION"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef parsed_event_detail(self):\n attr = (\n \"event_detail_information__event_detail\"\n if self.premis_version == utils.PREMIS_3_0_VERSION\n else \"event_detail\"\n )\n return dict(\n [\n tuple([x.strip(' \"') for x in kv.strip().split(\"=\", 1)])\n for kv in getattr(self, attr).split(\";\")\n ]\n )", "response": "Parse and return our PREMIS eventDetail string value like ::\n\n program = 7z version = 9. 20 algorithm = bzip2 version = 9. 20 program = 7z"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreturn as a 3 - tuple this PREMIS compression event s program version and algorithm used to perform the compression.", "response": "def compression_details(self):\n \"\"\"Return as a 3-tuple, this PREMIS compression event's program,\n version, and algorithm used to perform the compression.\n \"\"\"\n event_type = self.findtext(\"event_type\")\n if event_type != \"compression\":\n raise AttributeError(\n 'PREMIS events of type \"{}\" have no compression'\n \" details\".format(event_type)\n )\n parsed_compression_event_detail = self.parsed_event_detail\n compression_program = _get_event_detail_attr(\n \"program\", parsed_compression_event_detail\n )\n compression_algorithm = _get_event_detail_attr(\n \"algorithm\", parsed_compression_event_detail\n )\n compression_program_version = _get_event_detail_attr(\n \"version\", parsed_compression_event_detail\n )\n archive_tool = {\"7z\": \"7-Zip\"}.get(compression_program, compression_program)\n return compression_algorithm, compression_program_version, archive_tool"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreturn a list of dicts 
representing objects with TRANSFORMTYPE = decompression.", "response": "def get_decompression_transform_files(self, offset=0):\n \"\"\"Returns a list of dicts representing ````\n elements with ``TRANSFORMTYPE=\"decompression\"`` given\n ``compression_algorithm`` which is a comma-separated string of\n algorithms that must be used in the order provided to decompress\n the package, e.g., 'bzip2,tar' or 'lzma'.\n \"\"\"\n compression_algorithm, _, _ = self.compression_details\n return [\n {\n \"algorithm\": algorithm,\n \"order\": str(index + offset + 1),\n \"type\": \"decompression\",\n }\n for index, algorithm in enumerate(compression_algorithm.split(\",\"))\n ]"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef encryption_details(self):\n event_type = self.findtext(\"event_type\")\n if event_type != \"encryption\":\n raise AttributeError(\n 'PREMIS events of type \"{}\" have no encryption'\n \" details\".format(event_type)\n )\n parsed_encryption_event_detail = self.parsed_event_detail\n encryption_program = _get_event_detail_attr(\n \"program\", parsed_encryption_event_detail\n )\n encryption_program_version = _get_event_detail_attr(\n \"version\", parsed_encryption_event_detail\n )\n encryption_key = _get_event_detail_attr(\"key\", parsed_encryption_event_detail)\n return encryption_program, encryption_program_version, encryption_key", "response": "Return as a 3 - tuple this PREMIS encryption event s program version and key used to perform the encryption."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns a test function that returns True if the object has all of the class methods in class_method_names.", "response": "def has_class_methods(*class_method_names):\n \"\"\"Return a test function that, when given a class, returns ``True`` if that\n class has all of the class methods in ``class_method_names``. 
If an object\n is passed to the test function, check for the class methods on its\n class.\n \"\"\"\n\n def test(cls):\n if not isinstance(cls, type):\n cls = type(cls)\n for class_method_name in class_method_names:\n try:\n class_method = getattr(cls, class_method_name)\n if class_method.__self__ is not cls:\n return False\n except AttributeError:\n return False\n return True\n\n return test"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning a test function that returns True if the object has all of the methods in method_names.", "response": "def has_methods(*method_names):\n \"\"\"Return a test function that, when given an object (class or an\n instance), returns ``True`` if that object has all of the (regular) methods\n in ``method_names``. Note: this is testing for regular methods only and the\n test function will correctly return ``False`` if an instance has one of the\n specified methods as a classmethod or a staticmethod. However, it will\n incorrectly return ``True`` (false positives) for classmethods and\n staticmethods on a *class*.\n \"\"\"\n\n def test(obj):\n for method_name in method_names:\n try:\n method = getattr(obj, method_name)\n except AttributeError:\n return False\n else:\n if not callable(method):\n return False\n if not isinstance(obj, type):\n try:\n # An instance method is a method type with a __self__\n # attribute that references the instance.\n if method.__self__ is not obj:\n return False\n except AttributeError:\n return False\n return True\n\n return test"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef provide(self, feature_name, provider, *args, **kwargs):\n if not self.allow_replace:\n assert feature_name not in self.providers, \"Duplicate feature: {!r}\".format(\n feature_name\n )\n if callable(provider) and not isinstance(provider, type):\n self.providers[feature_name] = lambda: provider(*args, **kwargs)\n else:\n 
self.providers[feature_name] = lambda: provider", "response": "Provide a feature named feature_name using the provider object provided by the provider."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nencodes or decode url by applying func to all of its URL - encodable parts.", "response": "def _urlendecode(url, func):\n \"\"\"Encode or decode ``url`` by applying ``func`` to all of its\n URL-encodable parts.\n \"\"\"\n parsed = urlparse(url)\n for attr in URL_ENCODABLE_PARTS:\n parsed = parsed._replace(**{attr: func(getattr(parsed, attr))})\n return urlunparse(parsed)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef register(self, backbone_view_class):\n if backbone_view_class not in self._registry:\n self._registry.append(backbone_view_class)", "response": "Registers the given backbone view class."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreturns the queryset of the object s in the default manager.", "response": "def queryset(self, request, **kwargs):\n \"\"\"\n Returns the queryset (along with ordering) to be used when retrieving object(s).\n \"\"\"\n qs = self.model._default_manager.all()\n if self.ordering:\n qs = qs.order_by(*self.ordering)\n return qs"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nhandling get requests for either the collection or an object detail.", "response": "def get(self, request, id=None, **kwargs):\n \"\"\"\n Handles get requests for either the collection or an object detail.\n \"\"\"\n if not self.has_get_permission(request):\n return HttpResponseForbidden(_('You do not have permission to perform this action.'))\n\n if id:\n obj = get_object_or_404(self.queryset(request, **kwargs), id=id)\n return self.get_object_detail(request, obj)\n else:\n return self.get_collection(request, **kwargs)"} {"SOURCE": "codesearchnet", "instruction": "Can you 
generate a brief explanation for the following Python 3 code\ndef get_object_detail(self, request, obj):\n if self.display_detail_fields:\n display_fields = self.display_detail_fields\n else:\n display_fields = self.display_fields\n\n data = self.serialize(obj, ['id'] + list(display_fields))\n return HttpResponse(self.json_dumps(data), content_type='application/json')", "response": "Handles get requests for the details of the given object."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nhandle get requests for the list of objects.", "response": "def get_collection(self, request, **kwargs):\n \"\"\"\n Handles get requests for the list of objects.\n \"\"\"\n qs = self.queryset(request, **kwargs)\n\n if self.display_collection_fields:\n display_fields = self.display_collection_fields\n else:\n display_fields = self.display_fields\n\n if self.paginate_by is not None:\n page = request.GET.get('page', 1)\n paginator = Paginator(qs, self.paginate_by)\n try:\n qs = paginator.page(page).object_list\n except PageNotAnInteger:\n data = _('Invalid `page` parameter: Not a valid integer.')\n return HttpResponseBadRequest(data)\n except EmptyPage:\n data = _('Invalid `page` parameter: Out of range.')\n return HttpResponseBadRequest(data)\n data = [\n self.serialize(obj, ['id'] + list(display_fields)) for obj in qs\n ]\n return HttpResponse(self.json_dumps(data), content_type='application/json')"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef add_object(self, request):\n try:\n # backbone sends data in the body in json format\n # Conditional statement is for backwards compatibility with Django <= 1.3\n data = json.loads(request.body if hasattr(request, 'body') else request.raw_post_data)\n except ValueError:\n return HttpResponseBadRequest(_('Unable to parse JSON request body.'))\n\n form = self.get_form_instance(request, data=data)\n if form.is_valid():\n if not self.has_add_permission_for_data(request, 
form.cleaned_data):\n return HttpResponseForbidden(_('You do not have permission to perform this action.'))\n\n obj = form.save()\n\n # We return the newly created object's details and a Location header with it's url\n response = self.get_object_detail(request, obj)\n response.status_code = 201\n\n opts = self.model._meta\n url_slug = self.url_slug or (\n opts.model_name if hasattr(opts, 'model_name') else opts.module_name\n )\n url_name = 'backbone:%s_%s_detail' % (self.model._meta.app_label, url_slug)\n response['Location'] = reverse(url_name, args=[obj.id])\n return response\n else:\n return HttpResponseBadRequest(self.json_dumps(form.errors), content_type='application/json')", "response": "Adds an object to the database."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreturns an instantiated form for adding or editing an object.", "response": "def get_form_instance(self, request, data=None, instance=None):\n \"\"\"\n Returns an instantiated form to be used for adding or editing an object.\n\n The `instance` argument is the model instance (passed only if this form\n is going to be used for editing an existing object).\n \"\"\"\n defaults = {}\n if self.form:\n defaults['form'] = self.form\n if self.fields:\n defaults['fields'] = self.fields\n return modelform_factory(self.model, **defaults)(data=data, instance=instance)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef has_add_permission(self, request):\n perm_string = '%s.add_%s' % (self.model._meta.app_label,\n self.model._meta.object_name.lower()\n )\n return request.user.has_perm(perm_string)", "response": "Returns True if the requesting user has permission to add an object False otherwise."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef has_update_permission(self, request, obj):\n perm_string = '%s.change_%s' % (self.model._meta.app_label,\n 
self.model._meta.object_name.lower()\n )\n return request.user.has_perm(perm_string)", "response": "Returns True if the requesting user has permission to update the given object False otherwise."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef serialize(self, obj, fields):\n\n data = {}\n remaining_fields = []\n for field in fields:\n if callable(field): # Callable\n data[field.__name__] = field(obj)\n elif hasattr(self, field) and callable(getattr(self, field)): # Method on the view\n data[field] = getattr(self, field)(obj)\n elif hasattr(obj, field): # Callable/property/field on the model\n attr = getattr(obj, field)\n if isinstance(attr, Model):\n data[field] = attr.pk\n elif isinstance(attr, Manager):\n data[field] = [item['pk'] for item in attr.values('pk')]\n elif callable(attr): # Callable on the model\n data[field] = attr()\n else:\n remaining_fields.append(field)\n else:\n raise AttributeError('Invalid field: %s' % field)\n\n # Add on db fields\n serializer = Serializer()\n serializer.serialize([obj], fields=list(remaining_fields))\n data.update(serializer.getvalue()[0]['fields'])\n\n # Any remaining fields should be properties on the model\n remaining_fields = set(remaining_fields) - set(data.keys())\n\n for field in remaining_fields:\n data[field] = getattr(obj, field)\n\n return data", "response": "Serializes a single model instance to a Python dict based on the specified list of fields."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef json_dumps(self, data, **options):\n params = {'sort_keys': True, 'indent': 2}\n params.update(options)\n # This code is based off django's built in JSON serializer\n if json.__version__.split('.') >= ['2', '1', '3']:\n # Use JS strings to represent Python Decimal instances (ticket #16850)\n params.update({'use_decimal': False})\n return json.dumps(data, cls=DjangoJSONEncoder, **params)", 
"response": "Wrapper around json. dumps that uses a special JSON encoder."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef dir(cls, label, children):\n return FSEntry(label=label, children=children, type=u\"Directory\", use=None)", "response": "Return a new FSEntry directory object."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn an instance of the class object from a single entry.", "response": "def from_fptr(cls, label, type_, fptr):\n \"\"\"Return ``FSEntry`` object.\"\"\"\n return FSEntry(\n label=label,\n type=type_,\n path=fptr.path,\n use=fptr.use,\n file_uuid=fptr.file_uuid,\n derived_from=fptr.derived_from,\n checksum=fptr.checksum,\n checksumtype=fptr.checksumtype,\n )"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreturns the fptr @FILEID of the current file.", "response": "def file_id(self):\n \"\"\" Returns the fptr @FILEID if this is not a Directory. 
\"\"\"\n if self.type.lower() == \"directory\":\n return None\n if self.file_uuid is None:\n raise exceptions.MetsError(\n \"No FILEID: File %s does not have file_uuid set\" % self.path\n )\n if self.is_aip:\n return os.path.splitext(os.path.basename(self.path))[0]\n return utils.FILE_ID_PREFIX + self.file_uuid"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef group_id(self):\n if self.derived_from is not None:\n return self.derived_from.group_id()\n if self.file_uuid is None:\n return None\n return utils.GROUP_ID_PREFIX + self.file_uuid", "response": "Returns the group_id of the current object."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _add_metadata_element(self, md, subsection, mdtype, mode=\"mdwrap\", **kwargs):\n # HELP how handle multiple amdSecs?\n # When adding *MD which amdSec to add to?\n if mode.lower() == \"mdwrap\":\n othermdtype = kwargs.get(\"othermdtype\")\n mdsec = MDWrap(md, mdtype, othermdtype)\n elif mode.lower() == \"mdref\":\n loctype = kwargs.get(\"loctype\")\n label = kwargs.get(\"label\")\n otherloctype = kwargs.get(\"otherloctype\")\n mdsec = MDRef(md, mdtype, loctype, label, otherloctype)\n subsection = SubSection(subsection, mdsec)\n if subsection.subsection == \"dmdSec\":\n self.dmdsecs.append(subsection)\n else:\n try:\n amdsec = self.amdsecs[0]\n except IndexError:\n amdsec = AMDSec()\n self.amdsecs.append(amdsec)\n amdsec.subsections.append(subsection)\n return subsection", "response": "Add metadata element to the metadata section."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nserializes object md_inst by transforming it into an an . If it already is such return it.", "response": "def serialize_md_inst(self, md_inst, md_class):\n \"\"\"Serialize object ``md_inst`` by transforming it into an\n ``lxml.etree._ElementTree``. If it already is such, return it. 
If not,\n make sure it is the correct type and return the output of calling\n ``seriaize()`` on it.\n \"\"\"\n valid_insts = tuple(\n chain((etree._ElementTree, etree._Element), six.string_types)\n )\n if isinstance(md_inst, valid_insts):\n return md_inst\n if not isinstance(md_inst, md_class):\n raise TypeError(\n \"Instance {!r} must be instance of {!r}\".format(md_inst, md_class)\n )\n return md_inst.serialize()"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nadd a child FSEntry to this FSEntry.", "response": "def add_child(self, child):\n \"\"\"Add a child FSEntry to this FSEntry.\n\n Only FSEntrys with a type of 'directory' can have children.\n\n This does not detect cyclic parent/child relationships, but that will\n cause problems.\n\n :param metsrw.fsentry.FSEntry child: FSEntry to add as a child\n :return: The newly added child\n :raises ValueError: If this FSEntry cannot have children.\n :raises ValueError: If the child and the parent are the same\n \"\"\"\n if self.type.lower() != \"directory\":\n raise ValueError(\"Only directory objects can have children\")\n if child is self:\n raise ValueError(\"Cannot be a child of itself!\")\n if child not in self._children:\n self._children.append(child)\n child.parent = self\n return child"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nremoves a child from this FSEntry.", "response": "def remove_child(self, child):\n \"\"\"\n Remove a child from this FSEntry\n\n If `child` is not actually a child of this entry, nothing happens.\n\n :param child: Child to remove\n \"\"\"\n try:\n self._children.remove(child)\n except ValueError: # Child may not be in list\n pass\n else:\n child.parent = None"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning the file element for this FSEntry and appropriate for use in a fileSec.", "response": "def serialize_filesec(self):\n \"\"\"\n Return the file Element for this 
file, appropriate for use in a fileSec.\n\n If this is not an Item or has no use, return None.\n\n :return: fileSec element for this FSEntry\n \"\"\"\n if (\n self.type.lower() not in (\"item\", \"archival information package\")\n or self.use is None\n ):\n return None\n el = etree.Element(utils.lxmlns(\"mets\") + \"file\", ID=self.file_id())\n if self.group_id():\n el.attrib[\"GROUPID\"] = self.group_id()\n if self.admids:\n el.set(\"ADMID\", \" \".join(self.admids))\n if self.checksum and self.checksumtype:\n el.attrib[\"CHECKSUM\"] = self.checksum\n el.attrib[\"CHECKSUMTYPE\"] = self.checksumtype\n if self.path:\n flocat = etree.SubElement(el, utils.lxmlns(\"mets\") + \"FLocat\")\n # Setting manually so order is correct\n try:\n flocat.set(utils.lxmlns(\"xlink\") + \"href\", utils.urlencode(self.path))\n except ValueError:\n raise exceptions.SerializeError(\n 'Value \"{}\" (for attribute xlink:href) is not a valid'\n \" URL.\".format(self.path)\n )\n flocat.set(\"LOCTYPE\", \"OTHER\")\n flocat.set(\"OTHERLOCTYPE\", \"SYSTEM\")\n for transform_file in self.transform_files:\n transform_file_el = etree.SubElement(\n el, utils.lxmlns(\"mets\") + \"transformFile\"\n )\n for key, val in transform_file.items():\n attribute = \"transform{}\".format(key).upper()\n transform_file_el.attrib[attribute] = str(val)\n return el"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning True if this fs item is a directory with no children or only other empty directories as children.", "response": "def is_empty_dir(self):\n \"\"\"Returns ``True`` if this fs item is a directory with no children or\n a directory with only other empty directories as children.\n \"\"\"\n if self.mets_div_type == \"Directory\":\n children = self._children\n if children:\n if all(child.is_empty_dir for child in children):\n return True\n else:\n return False\n else:\n return True\n else:\n return False"} {"SOURCE": "codesearchnet", "instruction": "Here you have a 
function in Python 3, explain what it does\ndef serialize_structmap(self, recurse=True, normative=False):\n if not self.label:\n return None\n # Empty directories are not included in the physical structmap.\n if self.is_empty_dir and not normative:\n return None\n el = etree.Element(utils.lxmlns(\"mets\") + \"div\", TYPE=self.mets_div_type)\n el.attrib[\"LABEL\"] = self.label\n if (not normative) and self.file_id():\n etree.SubElement(el, utils.lxmlns(\"mets\") + \"fptr\", FILEID=self.file_id())\n if self.dmdids:\n if (not normative) or (normative and self.is_empty_dir):\n el.set(\"DMDID\", \" \".join(self.dmdids))\n if recurse and self._children:\n for child in self._children:\n child_el = child.serialize_structmap(\n recurse=recurse, normative=normative\n )\n if child_el is not None:\n el.append(child_el)\n return el", "response": "Return the div element for this FSEntry and all its children."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nchecks if the given secret string matches the known secret string.", "response": "def check_secret(self, secret):\n \"\"\"Checks if the secret string used in the authentication attempt\n matches the \"known\" secret string. 
Some mechanisms will override this\n method to control how this comparison is made.\n\n Args:\n secret: The secret string to compare against what was used in the\n authentication attempt.\n\n Returns:\n True if the given secret matches the authentication attempt.\n\n \"\"\"\n try:\n return hmac.compare_digest(secret, self.secret)\n except AttributeError: # pragma: no cover\n return secret == self.secret"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nreturn a new SASLAuth object with only authentication mechanisms that are secure for use in non - encrypted sessions.", "response": "def secure(cls):\n \"\"\"Uses only authentication mechanisms that are secure for use in\n non-encrypted sessions.\n\n Returns:\n A new :class:`SASLAuth` object.\n\n \"\"\"\n builtin_mechs = cls._get_builtin_mechanisms()\n secure_mechs = [mech for _, mech in builtin_mechs.items()\n if not mech.insecure and mech.priority is not None]\n return SASLAuth(secure_mechs)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nuse only authentication mechanisms that provide the credentials in un - hashed form typically meaning is True.", "response": "def plaintext(cls):\n \"\"\"Uses only authentication mechanisms that provide the credentials in\n un-hashed form, typically meaning\n :attr:`~pysasl.AuthenticationCredentials.has_secret` is True.\n\n Returns:\n A new :class:`SASLAuth` object.\n\n \"\"\"\n builtin_mechs = cls._get_builtin_mechanisms()\n plaintext_mechs = [mech for _, mech in builtin_mechs.items()\n if mech.insecure and mech.priority is not None]\n return SASLAuth(plaintext_mechs)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nlist of server mechanisms available for this session.", "response": "def server_mechanisms(self):\n \"\"\"List of available :class:`ServerMechanism` objects.\"\"\"\n return [mech for mech in self.mechs.values()\n if isinstance(mech, ServerMechanism)]"} {"SOURCE": "codesearchnet", 
"instruction": "Explain what the following Python 3 code does\ndef client_mechanisms(self):\n return [mech for mech in self.mechs.values()\n if isinstance(mech, ClientMechanism)]", "response": "List of available ClientMechanisms."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get_server(self, name):\n mech = self.get(name)\n return mech if isinstance(mech, ServerMechanism) else None", "response": "Like get but only mechanisms inheriting\n will be returned."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nlikes get but only mechanisms inheriting will be returned.", "response": "def get_client(self, name):\n \"\"\"Like :meth:`.get`, but only mechanisms inheriting\n :class:`ClientMechanism` will be returned.\n\n Args:\n name: The SASL mechanism name.\n\n Returns:\n The mechanism object or ``None``\n\n \"\"\"\n mech = self.get(name)\n return mech if isinstance(mech, ClientMechanism) else None"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef read(cls, source):\n if hasattr(source, \"read\"):\n return cls.fromfile(source)\n if os.path.exists(source):\n return cls.fromfile(source)\n if isinstance(source, six.string_types):\n source = source.encode(\"utf8\")\n return cls.fromstring(source)", "response": "Read a METS file into a METSDocument instance."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncollect all files into a set including all descendants.", "response": "def _collect_all_files(self, files=None):\n \"\"\"\n Collect all FSEntrys into a set, including all descendants.\n\n :param list files: List of :class:`FSEntry` to traverse.\n :returns: Set of FSEntry\n \"\"\"\n if files is None:\n files = self._root_elements\n collected = set()\n for entry in files:\n collected.add(entry)\n collected.update(self._collect_all_files(entry.children))\n 
return collected"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns the file that matches the given parameters.", "response": "def get_file(self, **kwargs):\n \"\"\"\n Return the FSEntry that matches parameters.\n\n :param str file_uuid: UUID of the target FSEntry.\n :param str label: structMap LABEL of the target FSEntry.\n :param str type: structMap TYPE of the target FSEntry.\n :returns: :class:`FSEntry` that matches parameters, or None.\n \"\"\"\n # TODO put this in a sqlite DB so it can be queried efficiently\n # TODO handle multiple matches (with DB?)\n # TODO check that kwargs are actual attrs\n for entry in self.all_files():\n if all(value == getattr(entry, key) for key, value in kwargs.items()):\n return entry\n return None"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nadds an FSEntry object to this METS document s tree.", "response": "def append_file(self, fs_entry):\n \"\"\"\n Adds an FSEntry object to this METS document's tree. Any of the\n represented object's children will also be added to the document.\n\n A given FSEntry object can only be included in a document once,\n and any attempt to add an object the second time will be ignored.\n\n :param metsrw.mets.FSEntry fs_entry: FSEntry to add to the METS document\n \"\"\"\n\n if fs_entry in self._root_elements:\n return\n self._root_elements.append(fs_entry)\n # Reset file lists so they get regenerated with the new files(s)\n self._all_files = None"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nremoves an entry from this METS document.", "response": "def remove_entry(self, fs_entry):\n \"\"\"Removes an FSEntry object from this METS document.\n\n Any children of this FSEntry will also be removed. 
This will be removed\n as a child of its parent, if any.\n\n :param metsrw.mets.FSEntry fs_entry: FSEntry to remove from the METS\n \"\"\"\n try:\n self._root_elements.remove(fs_entry)\n except ValueError: # fs_entry may not be in the root elements\n pass\n if fs_entry.parent:\n fs_entry.parent.remove_child(fs_entry)\n # Reset file lists so they get regenerated without the removed file(s)\n self._all_files = None"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _document_root(self, fully_qualified=True):\n nsmap = {\"xsi\": utils.NAMESPACES[\"xsi\"], \"xlink\": utils.NAMESPACES[\"xlink\"]}\n if fully_qualified:\n nsmap[\"mets\"] = utils.NAMESPACES[\"mets\"]\n else:\n nsmap[None] = utils.NAMESPACES[\"mets\"]\n attrib = {\n \"{}schemaLocation\".format(utils.lxmlns(\"xsi\")): utils.SCHEMA_LOCATIONS\n }\n if self.objid:\n attrib[\"OBJID\"] = self.objid\n return etree.Element(utils.lxmlns(\"mets\") + \"mets\", nsmap=nsmap, attrib=attrib)", "response": "Return the mets Element for the document root."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nreturn the metsHdr Element.", "response": "def _mets_header(self, now):\n \"\"\"\n Return the metsHdr Element.\n \"\"\"\n header_tag = etree.QName(utils.NAMESPACES[u\"mets\"], u\"metsHdr\")\n header_attrs = {}\n\n if self.createdate is None:\n header_attrs[u\"CREATEDATE\"] = now\n else:\n header_attrs[u\"CREATEDATE\"] = self.createdate\n header_attrs[u\"LASTMODDATE\"] = now\n\n header_element = etree.Element(header_tag, **header_attrs)\n for agent in self.agents:\n header_element.append(agent.serialize())\n for alternate_id in self.alternate_ids:\n header_element.append(alternate_id.serialize())\n\n return header_element"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ncollecting all dmdSec and amdSec classes associated with the files.", "response": "def _collect_mdsec_elements(files):\n \"\"\"\n 
Return all dmdSec and amdSec classes associated with the files.\n\n Returns all dmdSecs, then all amdSecs, so they only need to be\n serialized before being appended to the METS document.\n\n :param List files: List of :class:`FSEntry` to collect MDSecs for.\n :returns: List of AMDSecs and SubSections\n \"\"\"\n dmdsecs = []\n amdsecs = []\n for f in files:\n for d in f.dmdsecs:\n dmdsecs.append(d)\n for a in f.amdsecs:\n amdsecs.append(a)\n\n dmdsecs.sort(key=lambda x: x.id_string)\n amdsecs.sort(key=lambda x: x.id_string)\n return dmdsecs + amdsecs"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _structmap(self):\n structmap = etree.Element(\n utils.lxmlns(\"mets\") + \"structMap\",\n TYPE=\"physical\",\n # TODO Add ability for multiple structMaps\n ID=\"structMap_1\",\n # TODO don't hardcode this\n LABEL=\"Archivematica default\",\n )\n for item in self._root_elements:\n child = item.serialize_structmap(recurse=True)\n if child is not None:\n structmap.append(child)\n\n return structmap", "response": "Returns the structMap element for all files."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning fileSec Element containing all files grouped by use.", "response": "def _filesec(self, files=None):\n \"\"\"\n Returns fileSec Element containing all files grouped by use.\n \"\"\"\n if files is None:\n files = self.all_files()\n\n filesec = etree.Element(utils.lxmlns(\"mets\") + \"fileSec\")\n filegrps = {}\n for file_ in files:\n if file_.type.lower() not in (\"item\", AIP_ENTRY_TYPE):\n continue\n # Get fileGrp, or create if not exist\n filegrp = filegrps.get(file_.use)\n if filegrp is None:\n filegrp = etree.SubElement(\n filesec, utils.lxmlns(\"mets\") + \"fileGrp\", USE=file_.use\n )\n filegrps[file_.use] = filegrp\n\n file_el = file_.serialize_filesec()\n if file_el is not None:\n filegrp.append(file_el)\n\n return filesec"} {"SOURCE": "codesearchnet", "instruction": "Can you 
implement a function in Python 3 that\nreturns this document serialized to an xml Element.", "response": "def serialize(self, fully_qualified=True):\n \"\"\"\n Returns this document serialized to an xml Element.\n\n :return: Element for this document\n \"\"\"\n now = datetime.utcnow().replace(microsecond=0).isoformat(\"T\")\n files = self.all_files()\n mdsecs = self._collect_mdsec_elements(files)\n root = self._document_root(fully_qualified=fully_qualified)\n root.append(self._mets_header(now=now))\n for section in mdsecs:\n root.append(section.serialize(now=now))\n root.append(self._filesec(files))\n root.append(self._structmap())\n root.append(self._normative_structmap())\n return root"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef tostring(self, fully_qualified=True, pretty_print=True, encoding=\"UTF-8\"):\n root = self.serialize(fully_qualified=fully_qualified)\n kwargs = {\"pretty_print\": pretty_print, \"encoding\": encoding}\n if encoding != \"unicode\":\n kwargs[\"xml_declaration\"] = True\n return etree.tostring(root, **kwargs)", "response": "Serialize and return a string of this METS document."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nserialize and write this METS document to filepath.", "response": "def write(\n self, filepath, fully_qualified=True, pretty_print=False, encoding=\"UTF-8\"\n ):\n \"\"\"Serialize and write this METS document to `filepath`.\n\n The default encoding is ``UTF-8``. 
This method will return a unicode\n string when ``encoding`` is set to ``unicode``.\n\n :param str filepath: Path to write the METS document to\n \"\"\"\n root = self.serialize(fully_qualified=fully_qualified)\n tree = root.getroottree()\n kwargs = {\"pretty_print\": pretty_print, \"encoding\": encoding}\n if encoding != \"unicode\":\n kwargs[\"xml_declaration\"] = True\n tree.write(filepath, **kwargs)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _parse_tree_structmap(self, tree, parent_elem, normative_parent_elem=None):\n siblings = []\n el_to_normative = self._get_el_to_normative(parent_elem, normative_parent_elem)\n for elem, normative_elem in el_to_normative.items():\n if elem.tag != utils.lxmlns(\"mets\") + \"div\":\n continue # Only handle divs, not fptrs\n entry_type = elem.get(\"TYPE\")\n label = elem.get(\"LABEL\")\n fptr_elems = elem.findall(\"mets:fptr\", namespaces=utils.NAMESPACES)\n # Directories are walked recursively. 
Additionally, they may\n # contain direct fptrs.\n if entry_type.lower() == \"directory\":\n children = self._parse_tree_structmap(\n tree, elem, normative_parent_elem=normative_elem\n )\n fs_entry = fsentry.FSEntry.dir(label, children)\n self._add_dmdsecs_to_fs_entry(elem, fs_entry, tree)\n siblings.append(fs_entry)\n for fptr_elem in fptr_elems:\n fptr = self._analyze_fptr(fptr_elem, tree, entry_type)\n fs_entry = fsentry.FSEntry.from_fptr(\n label=None, type_=u\"Item\", fptr=fptr\n )\n self._add_amdsecs_to_fs_entry(fptr.amdids, fs_entry, tree)\n siblings.append(fs_entry)\n continue\n # Other types, e.g.: items, aips...\n if not len(fptr_elems):\n continue\n fptr = self._analyze_fptr(fptr_elems[0], tree, entry_type)\n fs_entry = fsentry.FSEntry.from_fptr(label, entry_type, fptr)\n self._add_dmdsecs_to_fs_entry(elem, fs_entry, tree)\n self._add_amdsecs_to_fs_entry(fptr.amdids, fs_entry, tree)\n siblings.append(fs_entry)\n return siblings", "response": "Recursively parse all the children of parent_elem including amdSecs\n and dmdSecs."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns ordered dict el_to_normative which maps children of parent_elem to their normative counterparts in the children of normative_parent_elem or to None.", "response": "def _get_el_to_normative(parent_elem, normative_parent_elem):\n \"\"\"Return ordered dict ``el_to_normative``, which maps children of\n ``parent_elem`` to their normative counterparts in the children of\n ``normative_parent_elem`` or to ``None`` if there is no normative\n parent. 
If there is a normative div element with no non-normative\n counterpart, that element is treated as a key with value ``None``.\n This allows us to create ``FSEntry`` instances for empty directory div\n elements, which are only documented in a normative logical structmap.\n \"\"\"\n el_to_normative = OrderedDict()\n if normative_parent_elem is None:\n for el in parent_elem:\n el_to_normative[el] = None\n else:\n for norm_el in normative_parent_elem:\n matches = [\n el\n for el in parent_elem\n if el.get(\"TYPE\") == norm_el.get(\"TYPE\")\n and el.get(\"LABEL\") == norm_el.get(\"LABEL\")\n ]\n if matches:\n el_to_normative[matches[0]] = norm_el\n else:\n el_to_normative[norm_el] = None\n return el_to_normative"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef fromfile(cls, path):\n parser = etree.XMLParser(remove_blank_text=True)\n\n return cls.fromtree(etree.parse(path, parser=parser))", "response": "Creates a METS object from a METS file."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef fromstring(cls, string):\n parser = etree.XMLParser(remove_blank_text=True)\n root = etree.fromstring(string, parser)\n tree = root.getroottree()\n\n return cls.fromtree(tree)", "response": "Create a METS object by parsing a string containing a METS document."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncreate a METS object from an ElementTree or Element.", "response": "def fromtree(cls, tree):\n \"\"\"\n Create a METS from an ElementTree or Element.\n\n :param ElementTree tree: ElementTree to build a METS document from.\n \"\"\"\n mets = cls()\n mets.tree = tree\n mets._parse_tree(tree)\n\n return mets"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get_schematron(sct_path):\n sct_path = _get_file_path(sct_path)\n parser = etree.XMLParser(remove_blank_text=True)\n sct_doc = 
etree.parse(sct_path, parser=parser)\n return isoschematron.Schematron(sct_doc, store_report=True)", "response": "Return an isoschematron. Schematron instance using the schematron file at sct_path."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef validate(mets_doc, xmlschema=METS_XSD_PATH, schematron=AM_SCT_PATH):\n is_xsd_valid, xsd_error_log = xsd_validate(mets_doc, xmlschema=xmlschema)\n is_sct_valid, sct_report = schematron_validate(mets_doc, schematron=schematron)\n valid = is_xsd_valid and is_sct_valid\n report = {\n \"is_xsd_valid\": is_xsd_valid,\n \"is_sct_valid\": is_sct_valid,\n \"xsd_error_log\": xsd_error_log,\n \"sct_report\": sct_report,\n }\n report[\"report\"] = report_string(report)\n return valid, report", "response": "Validate a METS file using a METS XML schema and a METS SCT schema."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef get_xmlschema(xmlschema, mets_doc):\n xsd_path = _get_file_path(xmlschema)\n xmlschema = etree.parse(xsd_path)\n schema_locations = set(\n mets_doc.xpath(\"//*/@xsi:schemaLocation\", namespaces=NAMESPACES)\n )\n for schema_location in schema_locations:\n namespaces_locations = schema_location.strip().split()\n for namespace, location in zip(*[iter(namespaces_locations)] * 2):\n if namespace == NAMESPACES[\"mets\"]:\n continue\n xs_import = etree.Element(\"{http://www.w3.org/2001/XMLSchema}import\")\n xs_import.attrib[\"namespace\"] = namespace\n xs_import.attrib[\"schemaLocation\"] = location\n xmlschema.getroot().insert(0, xs_import)\n return etree.XMLSchema(xmlschema)", "response": "Return an XMLSchema instance given the path to the XMLSchema file in xmlschema and the XMLTree instance mets_doc representing the METS file being parsed."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nvalidating a METS file using a schematron schema. 
Return a boolean indicating validity and a lxml. ElementTree instance indicating the validation result.", "response": "def schematron_validate(mets_doc, schematron=AM_SCT_PATH):\n \"\"\"Validate a METS file using a schematron schema. Return a boolean\n indicating validity and a report as an ``lxml.ElementTree`` instance.\n \"\"\"\n if isinstance(schematron, six.string_types):\n schematron = get_schematron(schematron)\n is_valid = schematron.validate(mets_doc)\n report = schematron.validation_report\n return is_valid, report"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreturn a human - readable string representation of the error report returned by lxml s schematron validator.", "response": "def sct_report_string(report):\n \"\"\"Return a human-readable string representation of the error report\n returned by lxml's schematron validator.\n \"\"\"\n ret = []\n namespaces = {\"svrl\": \"http://purl.oclc.org/dsdl/svrl\"}\n for index, failed_assert_el in enumerate(\n report.findall(\"svrl:failed-assert\", namespaces=namespaces)\n ):\n ret.append(\n \"{}. 
{}\".format(\n index + 1,\n failed_assert_el.find(\"svrl:text\", namespaces=namespaces).text,\n )\n )\n ret.append(\" test: {}\".format(failed_assert_el.attrib[\"test\"]))\n ret.append(\" location: {}\".format(failed_assert_el.attrib[\"location\"]))\n ret.append(\"\\n\")\n return \"\\n\".join(ret)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning a human - readable string representation of the error log returned by lxml s XMLSchema validator.", "response": "def xsd_error_log_string(xsd_error_log):\n \"\"\"Return a human-readable string representation of the error log\n returned by lxml's XMLSchema validator.\n \"\"\"\n ret = []\n for error in xsd_error_log:\n ret.append(\n \"ERROR ON LINE {}: {}\".format(error.line, error.message.encode(\"utf-8\"))\n )\n return \"\\n\".join(ret)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef validate_arguments(self, view_class, kwargs):\n\n # Check view_class inherit from django View\n if not issubclass(view_class, View):\n raise PopupViewIsNotSubclassView()\n\n self.view_class_name = view_class.__name__\n self.popup_dialog_title = kwargs.pop(\"popup_dialog_title\", _(\"Popup Dialog: Select value\"))\n\n self.callback_data = kwargs.pop(\"callback_data\", {})\n if not isinstance(self.callback_data, dict):\n raise AttributeError(\"callback_data argument must be a dictionary\")\n try:\n self.callback_data = urllib.urlencode(self.callback_data)\n except AttributeError:\n self.callback_data = urllib.parse.urlencode(self.callback_data)", "response": "Validate the arguments passed to the popup dialog."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_view_url(self):\n url = reverse(\"django_popup_view_field:get_popup_view\", args=(self.view_class_name,))\n return \"{url}?{cd}\".format(\n url=url,\n cd=self.callback_data\n )", "response": "Return url for ajax to view for render dialog 
content"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nexecutes flush. Ignores any errors and sets the timer to the next time it is called.", "response": "def on_timer(self):\n \"\"\"Executes flush(). Ignores any errors to make sure one exception\n doesn't halt the whole flushing process.\n \"\"\"\n try:\n self.flush()\n except Exception as e:\n log.exception('Error while flushing: %s', e)\n self._set_timer()"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nlogging timing information as the number of microseconds since the provided time", "response": "def timing_since(self, stat, start, sample_rate=1):\n \"\"\"\n Log timing information as the number of microseconds since the provided time float\n >>> start = time.time()\n >>> # do stuff\n >>> statsd_client.timing_since('some.time', start)\n \"\"\"\n self.timing(stat, int((time.time() - start) * 1000000), sample_rate)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef timing(self, stat, time, sample_rate=1):\n stats = {stat: \"%f|ms\" % time}\n self.send(stats, sample_rate)", "response": "Log timing information for a single stat"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef gauge(self, stat, value, sample_rate=1):\n stats = {stat: \"%f|g\" % value}\n self.send(stats, sample_rate)", "response": "Log gauge information for a single stat"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef update_stats(self, stats, delta, sample_rate=1):\n if not isinstance(stats, list):\n stats = [stats]\n\n data = dict((stat, \"%s|c\" % delta) for stat in stats)\n self.send(data, sample_rate)", "response": "Updates one or more stats counters by arbitrary amounts\n "} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nsends data over UDP and return 
the number of times the data was sent", "response": "def send(self, data, sample_rate=1):\n \"\"\"\n Squirt the metrics over UDP\n \"\"\"\n\n if self.prefix:\n data = dict((\".\".join((self.prefix, stat)), value) for stat, value in data.items())\n\n if sample_rate < 1:\n if random.random() > sample_rate:\n return\n sampled_data = dict((stat, \"%s|@%s\" % (value, sample_rate))\n for stat, value in data.items())\n else:\n sampled_data = data\n\n try:\n [self.udp_sock.sendto(bytes(bytearray(\"%s:%s\" % (stat, value),\n \"utf-8\")), self.addr)\n for stat, value in sampled_data.items()]\n except:\n self.log.exception(\"unexpected error\")"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef get_auth_providers(self, netloc):\n url = \"https://%s/info/system?null\" % (netloc)\n\n response = requests.get(url, verify=self.verify)\n if not response.ok or not hasattr(response, \"json\"):\n error_message = '%s Unexpected Error: %s for uri: %s\\nText: %r' %\\\n (response.status_code,\n response.reason,\n response.url,\n response.text)\n raise iControlUnexpectedHTTPError(error_message, response=response)\n respJson = response.json()\n result = respJson['providers']\n return result", "response": "BIG - IP specific query for auth providers"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef get_new_token(self, netloc):\n login_body = {\n 'username': self.username,\n 'password': self.password,\n }\n\n if self.auth_provider:\n if self.auth_provider == 'local':\n login_body['loginProviderName'] = 'local'\n elif self.auth_provider == 'tmos':\n login_body['loginProviderName'] = 'tmos'\n elif self.auth_provider not in ['none', 'default']:\n providers = self.get_auth_providers(netloc)\n for provider in providers:\n if self.auth_provider in provider['link']:\n login_body['loginProviderName'] = provider['name']\n break\n elif self.auth_provider == provider['name']:\n 
login_body['loginProviderName'] = provider['name']\n break\n else:\n if self.login_provider_name == 'tmos':\n login_body['loginProviderName'] = self.login_provider_name\n\n login_url = \"https://%s/mgmt/shared/authn/login\" % (netloc)\n\n response = requests.post(\n login_url,\n json=login_body,\n verify=self.verify,\n auth=HTTPBasicAuth(self.username, self.password)\n )\n self.attempts += 1\n if not response.ok or not hasattr(response, \"json\"):\n error_message = '%s Unexpected Error: %s for uri: %s\\nText: %r' %\\\n (response.status_code,\n response.reason,\n response.url,\n response.text)\n raise iControlUnexpectedHTTPError(error_message,\n response=response)\n respJson = response.json()\n\n token = self._get_token_from_response(respJson)\n created_bigip = self._get_last_update_micros(token)\n\n try:\n expiration_bigip = self._get_expiration_micros(\n token, created_bigip\n )\n except (KeyError, ValueError):\n error_message = \\\n '%s Unparseable Response: %s for uri: %s\\nText: %r' %\\\n (response.status_code,\n response.reason,\n response.url,\n response.text)\n raise iControlUnexpectedHTTPError(error_message,\n response=response)\n\n try:\n self.expiration = self._get_token_expiration_time(\n created_bigip, expiration_bigip\n )\n except iControlUnexpectedHTTPError:\n error_message = \\\n '%s Token already expired: %s for uri: %s\\nText: %r' % \\\n (response.status_code,\n time.ctime(expiration_bigip),\n response.url,\n response.text)\n raise iControlUnexpectedHTTPError(error_message,\n response=response)", "response": "Get a new token from BIG - IP and store it internally."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef generate_bigip_uri(base_uri, partition, name, sub_path, suffix, **kwargs):\n '''(str, str, str) --> str\n\n This function checks the supplied elements to see if each conforms to\n the specification for the appropriate part of the URI. 
These validations\n are conducted by the helper function _validate_uri_parts.\n After validation the parts are assembled into a valid BigIP REST URI\n string which is then submitted with appropriate metadata.\n\n >>> generate_bigip_uri('https://0.0.0.0/mgmt/tm/ltm/nat/', \\\n 'CUSTOMER1', 'nat52', params={'a':1})\n 'https://0.0.0.0/mgmt/tm/ltm/nat/~CUSTOMER1~nat52'\n >>> generate_bigip_uri('https://0.0.0.0/mgmt/tm/ltm/nat/', \\\n 'CUSTOMER1', 'nat52', params={'a':1}, suffix='/wacky')\n 'https://0.0.0.0/mgmt/tm/ltm/nat/~CUSTOMER1~nat52/wacky'\n >>> generate_bigip_uri('https://0.0.0.0/mgmt/tm/ltm/nat/', '', '', \\\n params={'a':1}, suffix='/thwocky')\n 'https://0.0.0.0/mgmt/tm/ltm/nat/thwocky'\n\n ::Warning: There are cases where '/' and '~' characters are valid in the\n object name or subPath. This is indicated by passing the 'transform_name' or 'transform_subpath' boolean\n respectively as True. By default this is set to False.\n '''\n\n _validate_uri_parts(base_uri, partition, name, sub_path, suffix,\n **kwargs)\n\n if kwargs.get('transform_name', False):\n if name != '':\n name = name.replace('/', '~')\n if kwargs.get('transform_subpath', False):\n if sub_path != '':\n sub_path = sub_path.replace('/', '~')\n if partition != '':\n partition = '~' + partition\n else:\n if sub_path:\n msg = 'When giving the subPath component include partition ' \\\n 'as well.'\n raise InvalidURIComponentPart(msg)\n if sub_path != '' and partition != '':\n sub_path = '~' + sub_path\n if name != '' and partition != '':\n name = '~' + name\n tilded_partition_and_instance = partition + sub_path + name\n if suffix and not tilded_partition_and_instance:\n suffix = suffix.lstrip('/')\n\n REST_uri = base_uri + tilded_partition_and_instance + suffix\n return REST_uri", "response": "This function generates a BigIP REST URI from the supplied parts."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nprepares and Post-Process HTTP VERB method for 
BigIP-RESTServer request. This function decorates all of the HTTP VERB methods in the iControlRESTSession class. It provides the core logic for this module. If necessary it validates and assembles a uri from parts with a call to `generate_bigip_uri`. Then it: 1. pre-logs the details of the request 2. submits the request 3. logs the response, including expected status codes 4. raises exceptions for unexpected status codes. (i.e. not doc'd as BigIP RESTServer codes.)", "response": "def decorate_HTTP_verb_method(method):\n \"\"\"Prepare and Post-Process HTTP VERB method for BigIP-RESTServer request.\n\n This function decorates all of the HTTP VERB methods in the\n iControlRESTSession class. It provides the core logic for this module.\n If necessary it validates and assembles a uri from parts with a call to\n `generate_bigip_uri`.\n\n Then it:\n\n 1. pre-logs the details of the request\n 2. submits the request\n 3. logs the response, including expected status codes\n 4. raises exceptions for unexpected status codes. (i.e. 
not doc'd as BigIP\n RESTServer codes.)\n \"\"\"\n @functools.wraps(method)\n def wrapper(self, RIC_base_uri, **kwargs):\n partition = kwargs.pop('partition', '')\n sub_path = kwargs.pop('subPath', '')\n suffix = kwargs.pop('suffix', '')\n identifier, kwargs = _unique_resource_identifier_from_kwargs(**kwargs)\n uri_as_parts = kwargs.pop('uri_as_parts', False)\n transform_name = kwargs.pop('transform_name', False)\n transform_subpath = kwargs.pop('transform_subpath', False)\n if uri_as_parts:\n REST_uri = generate_bigip_uri(RIC_base_uri, partition, identifier,\n sub_path, suffix,\n transform_name=transform_name,\n transform_subpath=transform_subpath,\n **kwargs)\n else:\n REST_uri = RIC_base_uri\n pre_message = \"%s WITH uri: %s AND suffix: %s AND kwargs: %s\" %\\\n (method.__name__, REST_uri, suffix, kwargs)\n\n logger = logging.getLogger(__name__)\n logger.debug(pre_message)\n response = method(self, REST_uri, **kwargs)\n post_message =\\\n \"RESPONSE::STATUS: %s Content-Type: %s Content-Encoding:\"\\\n \" %s\\nText: %r\" % (response.status_code,\n response.headers.get('Content-Type', None),\n response.headers.get('Content-Encoding', None),\n response.text)\n logger.debug(post_message)\n if response.status_code not in range(200, 207):\n error_message = '%s Unexpected Error: %s for uri: %s\\nText: %r' %\\\n (response.status_code,\n response.reason,\n response.url,\n response.text)\n raise iControlUnexpectedHTTPError(error_message, response=response)\n return response\n return wrapper"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nchooses an identifier given different choices from the given kwargs and returns it.", "response": "def _unique_resource_identifier_from_kwargs(**kwargs):\n \"\"\"Chooses an identifier given different choices\n\n The unique identifier in BIG-IP's REST API at the time of this writing\n is called 'name'. 
This is in contrast to the unique identifier that is\n used by iWorkflow and BIG-IQ which at some times is 'name' and other\n times is 'uuid'.\n\n For example, in iWorkflow, consider this URI\n\n * https://10.2.2.3/mgmt/cm/cloud/tenants/{0}/services/iapp\n\n Then consider this iWorkflow URI\n\n * https://localhost/mgmt/cm/cloud/connectors/local/{0}\n\n In the first example, the identifier, {0}, is what we would normally\n consider a name. For example, \"tenant1\". In the second example though,\n the value is expected to be what we would normally consider to be a\n UUID. For example, '244bd478-374e-4eb2-8c73-6e46d7112604'.\n\n This method only tries to rectify the problem of which to use.\n\n I believe there might be some chance that the two can appear together,\n although I have not yet experienced it. If it is possible, I believe it\n would happen in BIG-IQ/iWorkflow land where the UUID and Name both have\n significance. That's why I deliberately prefer the UUID when it exists\n in the parameters sent to the URL.\n\n :param kwargs:\n :return:\n \"\"\"\n name = kwargs.pop('name', '')\n uuid = kwargs.pop('uuid', '')\n id = kwargs.pop('id', '')\n if uuid:\n return uuid, kwargs\n elif id:\n # Used for /mgmt/cm/system/authn/providers/tmos on BIG-IP\n return id, kwargs\n else:\n return name, kwargs"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nsend an HTTP DELETE command to the BIGIP REST Server.", "response": "def delete(self, uri, **kwargs):\n \"\"\"Sends an HTTP DELETE command to the BIGIP REST Server.\n\n Use this method to send a DELETE command to the BIGIP. 
When calling\n this method with the optional arguments ``name`` and ``partition``\n as part of ``**kwargs`` they will be added to the ``uri`` passed\n in separated by ~ to create a proper BIGIP REST API URL for objects.\n\n All other parameters passed in as ``**kwargs`` are passed directly\n to the :meth:`requests.Session.delete`\n\n :param uri: An HTTP URI\n :type uri: str\n :param name: The object name that will be appended to the uri\n :type name: str\n :arg partition: The partition name that will be appended to the uri\n :type partition: str\n :param \\**kwargs: The :meth:`requests.Session.delete` optional params\n \"\"\"\n args1 = get_request_args(kwargs)\n args2 = get_send_args(kwargs)\n req = requests.Request('DELETE', uri, **args1)\n prepared = self.session.prepare_request(req)\n if self.debug:\n self._debug_output.append(debug_prepared_request(prepared))\n return self.session.send(prepared, **args2)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nappend text to the User-Agent header for the request.", "response": "def append_user_agent(self, user_agent):\n \"\"\"Append text to the User-Agent header for the request.\n\n Use this method to update the User-Agent header by appending the\n given string to the session's User-Agent header separated by a space.\n\n :param user_agent: A string to append to the User-Agent header\n :type user_agent: str\n \"\"\"\n old_ua = self.session.headers.get('User-Agent', '')\n ua = old_ua + ' ' + user_agent\n self.session.headers['User-Agent'] = ua.strip()"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef validate_public_key(value):\n is_valid = False\n exc = None\n\n for load in (load_pem_public_key, load_ssh_public_key):\n if not is_valid:\n try:\n load(value.encode('utf-8'), default_backend())\n is_valid = True\n except Exception as e:\n exc = e\n\n if not is_valid:\n raise ValidationError('Public key is invalid: %s' % exc)", 
"response": "Validate that the given value is a valid RSA Public key in either PEM or OpenSSH format."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nhandling pdf and epub format.", "response": "def to_file(self, output_filename):\n '''Handles pdf and epub format.\n Input: output_filename should have the proper extension.\n Output: The name of the file created, or an IOError if failed'''\n temp_file = NamedTemporaryFile(mode=\"w\", suffix=\".md\", delete=False)\n temp_file.write(self._content)\n temp_file.close()\n\n subprocess_arguments = [PANDOC_PATH, temp_file.name, '-o %s' % output_filename]\n subprocess_arguments.extend(self.arguments)\n cmd = \" \".join(subprocess_arguments)\n\n fin = os.popen(cmd)\n msg = fin.read()\n fin.close()\n if msg:\n print(\"Pandoc message: {}\".format(msg))\n\n os.remove(temp_file.name)\n\n if exists(output_filename):\n return output_filename\n else:\n raise IOError(\"Failed creating file: %s\" % output_filename)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncreate a signed JWT using the given username and RSA private key.", "response": "def sign(username, private_key, generate_nonce=None, iat=None, algorithm=DEFAULT_ALGORITHM):\n \"\"\"\n Create a signed JWT using the given username and RSA private key.\n\n :param username: Username (string) to authenticate as on the remote system.\n :param private_key: Private key to use to sign the JWT claim.\n :param generate_nonce: Optional. Callable to use to generate a new nonce. Defaults to\n `random.random `_.\n :param iat: Optional. Timestamp to include in the JWT claim. Defaults to\n `time.time `_.\n :param algorithm: Optional. Algorithm to use to sign the JWT claim. 
Defaults to ``RS512``.\n See `pyjwt.readthedocs.io `_ for other possible algorithms.\n :return: JWT claim as a string.\n \"\"\"\n iat = iat if iat else time.time()\n if not generate_nonce:\n generate_nonce = lambda username, iat: random.random() # NOQA\n\n token_data = {\n 'username': username,\n 'time': iat,\n 'nonce': generate_nonce(username, iat),\n }\n\n token = jwt.encode(token_data, private_key, algorithm=algorithm)\n return token"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ngive a JWT get the username that it is claiming to be without verifying that the signature is valid.", "response": "def get_claimed_username(token):\n \"\"\"\n Given a JWT, get the username that it is claiming to be `without verifying that the signature is valid`.\n\n :param token: JWT claim\n :return: Username\n \"\"\"\n unverified_data = jwt.decode(token, options={\n 'verify_signature': False\n })\n\n if 'username' not in unverified_data:\n return None\n return unverified_data['username']"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef verify(token, public_key, validate_nonce=None, algorithms=[DEFAULT_ALGORITHM]):\n try:\n token_data = jwt.decode(token, public_key, algorithms=algorithms)\n except jwt.InvalidTokenError:\n logger.debug('JWT failed verification')\n return False\n\n claimed_username = token_data.get('username')\n claimed_time = token_data.get('time', 0)\n claimed_nonce = token_data.get('nonce')\n\n # Ensure time is within acceptable bounds\n current_time = time.time()\n min_time, max_time = (current_time - TIMESTAMP_TOLERANCE, current_time + TIMESTAMP_TOLERANCE)\n if claimed_time < min_time or claimed_time > max_time:\n logger.debug('Claimed time is outside of allowable tolerances')\n return False\n\n # Ensure nonce is unique\n if validate_nonce:\n if not validate_nonce(claimed_username, claimed_time, claimed_nonce):\n logger.debug('Claimed nonce failed to validate')\n 
return False\n else:\n logger.warning('validate_nonce function was not supplied!')\n\n # If we've gotten this far, the token is valid\n return token_data", "response": "Verify the validity of a given JWT using the given public key."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nlogging a nonce as being used.", "response": "def log_used_nonce(self, username, iat, nonce):\n \"\"\"\n Log a nonce as being used, and therefore henceforth invalid.\n\n :param username: Username as a string.\n :param iat: Unix timestamp float or integer of when the nonce was used.\n :param nonce: Nonce value.\n \"\"\"\n # TODO: Figure out some way to do this in a thread-safe manner. It'd be better to use\n # a Redis Set or something, but we don't necessarily want to be tightly coupled to\n # Redis either since not everyone uses it.\n key = self.create_nonce_key(username, iat)\n used = cache.get(key, [])\n used.append(nonce)\n cache.set(key, set(used), token.TIMESTAMP_TOLERANCE * 2)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef validate_nonce(self, username, iat, nonce):\n key = self.create_nonce_key(username, iat)\n used = cache.get(key, [])\n return nonce not in used", "response": "Validate that the given nonce has not already been used."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nprocess a Django request and authenticate users.", "response": "def process_request(self, request):\n \"\"\"\n Process a Django request and authenticate users.\n\n If a JWT authentication header is detected and it is determined to be valid, the user is set as\n ``request.user`` and CSRF protection is disabled (``request._dont_enforce_csrf_checks = True``) on\n the request.\n\n :param request: Django Request instance\n \"\"\"\n if 'HTTP_AUTHORIZATION' not in request.META:\n return\n\n try:\n method, claim = request.META['HTTP_AUTHORIZATION'].split(' ', 1)\n except ValueError:\n 
return\n\n if method.upper() != AUTH_METHOD:\n return\n\n username = token.get_claimed_username(claim)\n if not username:\n return\n\n User = get_user_model()\n try:\n user = User.objects.get(username=username)\n except User.DoesNotExist:\n return\n\n claim_data = None\n for public in user.public_keys.all():\n claim_data = token.verify(claim, public.key, validate_nonce=self.validate_nonce)\n if claim_data:\n break\n if not claim_data:\n return\n\n logger.debug('Successfully authenticated %s using JWT', user.username)\n request._dont_enforce_csrf_checks = True\n request.user = user"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef generate_key_pair(size=2048, public_exponent=65537, as_string=True):\n private = rsa.generate_private_key(\n public_exponent=public_exponent,\n key_size=size,\n backend=default_backend()\n )\n public = private.public_key()\n\n if not as_string:\n return private, public\n\n pem_private = private.private_bytes(Encoding.PEM, PrivateFormat.PKCS8, NoEncryption()).decode(ENCODING)\n pem_public = public.public_bytes(Encoding.PEM, PublicFormat.SubjectPublicKeyInfo).decode(ENCODING)\n return pem_private, pem_public", "response": "Generate a private and public key pair."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nloads a private key from disk.", "response": "def load_private_key(key_file, key_password=None):\n \"\"\"\n Load a private key from disk.\n\n :param key_file: File path to key file.\n :param key_password: Optional. If the key file is encrypted, provide the password to decrypt it. 
Defaults to None.\n :return: PrivateKey\n \"\"\"\n key_file = os.path.expanduser(key_file)\n key_file = os.path.abspath(key_file)\n\n if not key_password:\n with open(key_file, 'r') as key:\n return key.read()\n\n with open(key_file, 'rb') as key:\n key_bytes = key.read()\n return decrypt_key(key_bytes, key_password).decode(ENCODING)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef decrypt_key(key, password):\n private = serialization.load_pem_private_key(key, password=password, backend=default_backend())\n return private.private_bytes(Encoding.PEM, PrivateFormat.PKCS8, NoEncryption())", "response": "Decrypt an encrypted private key."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ncreating an HTTP Authorization header using a private key file.", "response": "def create_auth_header(username, key=None, key_file=\"~/.ssh/id_rsa\", key_password=None):\n \"\"\"\n Create an HTTP Authorization header using a private key file.\n\n Either a key or a key_file must be provided.\n\n :param username: The username to authenticate as on the remote system.\n :param key: Optional. A private key as either a string or an instance of cryptography.hazmat.primitives.asymmetric.rsa.RSAPrivateKey.\n :param key_file: Optional. Path to a file containing the user's private key. Defaults to ~/.ssh/id_rsa. Should be in PEM format.\n :param key_password: Optional. Password to decrypt key_file. 
If set, should be a bytes object.\n :return: Authentication header value as a string.\n \"\"\"\n if not key:\n key = load_private_key(key_file, key_password)\n claim = token.sign(username, key)\n return \"%s %s\" % (AUTH_METHOD, claim.decode(ENCODING))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nconfigure logging to log level lvl and name.", "response": "def _config_logs(lvl=None, name=None):\n \"\"\"\n Set up or change logging configuration.\n\n _config_logs() => idempotent setup;\n _config_logs(L) => change log level\n \"\"\"\n # print('_config_log', 'from %s' %name if name else '')\n FORMAT = '%(message)s'\n # maybe better for log files\n # FORMAT='[%(levelname)s]:%(message)s',\n\n # Reset handlers\n for h in list(logging.root.handlers):\n logging.root.removeHandler(h)\n\n global _log_level\n if lvl: _log_level = lvl\n\n logging.basicConfig(level=_log_level, format=FORMAT, stream=sys.stdout)\n _log = logging.getLogger(__name__)\n _log.setLevel(_log_level)\n\n # external\n for log in ['urllib3', 'asyncio']:\n logging.getLogger(log).setLevel(_log_level)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef mk_set_headers(self, data, columns):\n columns = tuple(columns)\n lens = []\n\n for key in columns:\n value_len = max(len(str(each.get(key, ''))) for each in data)\n # account for header lengths\n lens.append(max(value_len, len(self._get_name(key))))\n\n fmt = self.mk_fmt(*lens)\n return fmt", "response": "figure out sizes and create header fmt"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _get_name(self, key):\n if key in self.display_names:\n return self.display_names[key]\n\n return key.capitalize()", "response": "get display name for a key or mangle for display"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef display_set(self, typ, data, columns):\n 
self.display_section(\"%s (%d)\" % (self._get_name(typ), len(data)))\n headers = tuple(map(self._get_name, columns))\n fmt = self.mk_set_headers(data, columns)\n self.display_headers(fmt, headers)\n\n for each in data:\n row = tuple(self._get_val(each, k) for k, v in each.items())\n self._print(fmt % row)\n\n self._print(\"\\n\")", "response": "display a list of dicts"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _print(self, *args):\n string = u\" \".join(args) + '\\n'\n self.fobj.write(string)", "response": "internal print to self. fobj"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ndisplaying section of typ with data", "response": "def display(self, typ, data):\n \"\"\" display section of typ with data \"\"\"\n if hasattr(self, 'print_' + typ):\n getattr(self, 'print_' + typ)(data)\n\n elif not data:\n self._print(\"%s: %s\" % (typ, data))\n\n elif isinstance(data, collections.Mapping):\n self._print(\"\\n\", typ)\n for k, v in data.items():\n self.print(k, v)\n\n elif isinstance(data, (list, tuple)):\n # tabular data layout for lists of dicts\n if isinstance(data[0], collections.Mapping):\n self.display_set(typ, data, self._get_columns(data[0]))\n else:\n for each in data:\n self.print(typ, each)\n else:\n self._print(\"%s: %s\" % (typ, data))\n\n self.fobj.flush()"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _handler(func):\n \"Decorate a command handler\"\n\n def _wrapped(*a, **k):\n r = func(*a, **k)\n if r is None: r = 0\n return r\n\n return staticmethod(_wrapped)", "response": "Decorate a command handler"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nadd commands to a parser", "response": "def add_subcommands(parser, commands):\n \"Add commands to a parser\"\n subps = parser.add_subparsers()\n for cmd, cls in commands:\n subp = subps.add_parser(cmd, 
help=cls.__doc__)\n add_args = getattr(cls, 'add_arguments', None)\n if add_args:\n add_args(subp)\n handler = getattr(cls, 'handle', None)\n if handler:\n subp.set_defaults(handler=handler)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef update(self, res, pk, depth=1, since=None):\n fetch = lambda: self._fetcher.fetch_latest(res, pk, 1, since=since)\n self._update(res, fetch, depth)", "response": "Update the cache with the given object pk."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef update_where(self, res, depth=0, since=None, **kwargs):\n \"Like update() but uses WHERE-style args\"\n fetch = lambda: self._fetcher.fetch_all_latest(res, 0, kwargs, since=since)\n self._update(res, fetch, depth)", "response": "Like update() but uses WHERE - style args"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef update_all(self, rs=None, since=None):\n \"Sync all objects for the relations rs (if None, sync all resources)\"\n self._log.info(\"Updating resources: %s\", ' '.join(r.tag for r in rs))\n\n if rs is None:\n rs = resource.all_resources()\n ctx = self._ContextClass(self)\n for r in rs:\n self._atomic_update(lambda: ctx.sync_resource(r, since=since))", "response": "Sync all objects for the relations rs ( if None sync all resources )"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nget a scheduled task or None if no such task exists.", "response": "def get_task(self, key):\n \"\"\"Get a scheduled task, or none\"\"\"\n res, pk = key\n jobs, lock = self._jobs\n with lock:\n return jobs[res].get(pk)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef set_job(self, key, func, args):\n res, pk = key\n jobs, lock = self._jobs\n task = _tasks.UpdateTask(func(*args), key)\n with lock:\n job = jobs[res].get(pk)\n had = bool(job)\n if not 
job:\n job = task\n jobs[res][pk] = job\n else:\n task.cancel()\n self._log.debug('Scheduling: %s-%s (%s)', res.tag, pk,\n 'new task' if not had else 'dup')\n return job", "response": "Set a scheduled task or cancel it if none exists."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef pending_tasks(self, res):\n \"Synchronized access to tasks\"\n jobs, lock = self._jobs\n with lock:\n return jobs[res].copy()", "response": "Synchronized access to tasks"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nfetches data with func return dict indexed by ID", "response": "def fetch_and_index(self, fetch_func):\n \"Fetch data with func, return dict indexed by ID\"\n data, e = fetch_func()\n if e: raise e\n yield {row['id']: row for row in data}"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nclean object intercepting and collecting any missing - relation errors and returning the relevant resource ids and fields.", "response": "def clean_helper(B, obj, clean_func):\n \"\"\"\n Clean object, intercepting and collecting any missing-relation or\n unique-constraint errors and returning the relevant resource ids/fields.\n\n Returns:\n - tuple: (, )\n \"\"\"\n try:\n clean_func(obj)\n except B.validation_error() as e:\n # _debug.log_validation_errors(B, e, obj, k)\n\n # Check if it's a uniqueness or missing relation error\n fields = B.detect_uniqueness_error(e)\n missing = B.detect_missing_relations(obj, e)\n return fields, missing\n return (None, None)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef initialize_object(B, res, row):\n B = get_backend()\n field_groups = FieldGroups(B.get_concrete(res))\n\n try:\n obj = B.get_object(B.get_concrete(res), row['id'])\n except B.object_missing_error(B.get_concrete(res)):\n tbl = B.get_concrete(res)\n obj = tbl()\n\n # Set attributes, refs\n for fname, field in 
field_groups['scalars'].items():\n value = row.get(fname, getattr(obj, fname, None))\n value = B.convert_field(obj.__class__, fname, value)\n setattr(obj, fname, value)\n # _debug('res, row: %s, %s', res, row)\n\n # Already-fetched, and id-only refs\n fetched, dangling = defaultdict(dict), defaultdict(set)\n\n # To handle subrows that might be shallow (id) or deep (dict)\n def _handle_subrow(R, subrow):\n if isinstance(subrow, dict):\n pk = subrow['id']\n fetched[R][pk] = subrow\n else:\n pk = subrow\n dangling[R].add(pk)\n return pk\n\n for fname, field in field_groups['one_refs'].items():\n fieldres = _field_resource(B, B.get_concrete(res), fname)\n key = field.column\n subrow = row.get(key)\n if subrow is None: # e.g. use \"org\" if \"org_id\" is missing\n key = fname\n subrow = row[key]\n pk = _handle_subrow(fieldres, subrow)\n setattr(obj, key, pk)\n\n for fname, field in field_groups['many_refs'].items():\n fieldres = _field_resource(B, B.get_concrete(res), fname)\n pks = [\n _handle_subrow(fieldres, subrow) for subrow in row.get(fname, [])\n ]\n\n return obj, fetched, dangling", "response": "Initialize an object with the given row."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nfinding and read config file for a directory return None if not found.", "response": "def read_config(conf_dir=DEFAULT_CONFIG_DIR):\n \"Find and read config file for a directory, return None if not found.\"\n\n conf_path = os.path.expanduser(conf_dir)\n if not os.path.exists(conf_path):\n # only throw if not default\n if conf_dir != DEFAULT_CONFIG_DIR:\n raise IOError(\"Config directory not found at %s\" % (conf_path, ))\n\n return munge.load_datafile('config', conf_path, default=None)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nloading config files from the specified directory using the specified schema.", "response": "def load_config(conf_dir=DEFAULT_CONFIG_DIR, schema=CLIENT_SCHEMA):\n \"\"\"\n Load config files 
from the specified directory, using defaults for missing values.\n Directory should contain a file named config. where is a\n supported config file format.\n \"\"\"\n data = default_config(schema)\n config = read_config(conf_dir)\n if config:\n recursive_update(data, config)\n return data"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef detect_old(data):\n \"Check for a config file with old schema\"\n if not data:\n return False\n ok, errors, warnings = _schema.validate(_OLD_SCHEMA, data)\n return ok and not (errors or warnings)", "response": "Check for a config file with old schema"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef convert_old(data):\n \"Convert config data with old schema to new schema\"\n ret = default_config()\n ret['sync'].update(data.get('peeringdb', {}))\n ret['orm']['database'].update(data.get('database', {}))\n return ret", "response": "Convert config data with old schema to new schema"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef write_config(data, conf_dir=DEFAULT_CONFIG_DIR, codec=\"yaml\",\n backup_existing=False):\n \"\"\"\n Write config values to a file.\n\n Arguments:\n - conf_dir: path to output directory\n - codec: output field format\n - backup_existing: if a config file exists,\n make a copy before overwriting\n \"\"\"\n if not codec:\n codec = 'yaml'\n codec = munge.get_codec(codec)()\n conf_dir = os.path.expanduser(conf_dir)\n if not os.path.exists(conf_dir):\n os.mkdir(conf_dir)\n\n # Check for existing file, back up if necessary\n outpath = os.path.join(conf_dir, 'config.' 
+ codec.extensions[0])\n if backup_existing and os.path.exists(outpath):\n os.rename(outpath, outpath + '.bak')\n codec.dump(data, open(outpath, 'w'))", "response": "Write config values to a file."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef prompt_config(sch, defaults=None, path=None):\n out = {}\n for name, attr in sch.attributes():\n fullpath = name\n if path:\n fullpath = '{}.{}'.format(path, name)\n if defaults is None:\n defaults = {}\n default = defaults.get(name)\n if isinstance(attr, _schema.Schema):\n # recurse on sub-schema\n value = prompt_config(attr, defaults=default, path=fullpath)\n else:\n if default is None:\n default = attr.default\n if default is None:\n default = ''\n value = prompt(fullpath, default)\n out[name] = value\n\n return sch.validate(out)", "response": "Utility function to recursively prompt for config values for a single resource."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef fetch(self, R, pk, depth=1):\n \"Request object from API\"\n d, e = self._fetcher.fetch(R, pk, depth)\n if e: raise e\n return d", "response": "Request object from API"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nrequest multiple objects from API", "response": "def fetch_all(self, R, depth=1, **kwargs):\n \"Request multiple objects from API\"\n d, e = self._fetcher.fetch_all(R, depth, kwargs)\n if e: raise e\n return d"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ngetting a resource instance by primary key ( id )", "response": "def get(self, res, pk):\n \"Get a resource instance by primary key (id)\"\n B = get_backend()\n return B.get_object(B.get_concrete(res), pk)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef all(self, res):\n \"Get resources using a filter condition\"\n B = 
get_backend()\n return B.get_objects(B.get_concrete(res))", "response": "Get resources using a filter condition"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef run_task(func):\n\n def _wrapped(*a, **k):\n gen = func(*a, **k)\n return _consume_task(gen)\n\n return _wrapped", "response": "Decorator to collect and return generator results returning a list\n if there are multiple results\n "} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nload type by id", "response": "def get(self, typ, id, **kwargs):\n \"\"\"\n Load type by id\n \"\"\"\n return self._load(self._request(typ, id=id, params=kwargs))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nsend the request to the specified resource type and return the response obj", "response": "def _request(self, typ, id=0, method='GET', params=None, data=None,\n url=None):\n \"\"\"\n send the request, return response obj\n \"\"\"\n backend, backend_version = peeringdb.get_backend_info()\n user_agent = 'PeeringDB/{} {}/{}'.format(peeringdb.__version__,\n backend, backend_version)\n headers = {\n \"Accept\": \"application/json\",\n \"User-Agent\": user_agent,\n }\n auth = None\n\n if self.user:\n auth = (self.user, self.password)\n if not url:\n if id:\n url = \"%s/%s/%s\" % (self.url, typ, id)\n else:\n url = \"%s/%s\" % (self.url, typ)\n\n return requests.request(method, url, params=params, data=data,\n auth=auth, headers=headers)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nsplit a string into a tag and id", "response": "def split_ref(string):\n \"\"\" splits a string into (tag, id) \"\"\"\n re_tag = re.compile('^(?P<tag>[a-zA-Z]+)[\\s-]*(?P<pk>\\d+)$')\n m = re_tag.search(string)\n if not m:\n raise ValueError(\"unable to split string '%s'\" % (string, ))\n\n return (m.group('tag').lower(), int(m.group('pk')))"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a 
brief explanation for the following Python 3 code\ndef limit_mem(limit=(4 * 1024**3)):\n \"Set soft memory limit\"\n rsrc = resource.RLIMIT_DATA\n soft, hard = resource.getrlimit(rsrc)\n resource.setrlimit(rsrc, (limit, hard)) # 4GB\n softnew, _ = resource.getrlimit(rsrc)\n assert softnew == limit\n\n _log = logging.getLogger(__name__)\n _log.debug('Set soft memory limit: %s => %s', soft, softnew)", "response": "Set soft memory limit"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nrendering a module with the given parameters.", "response": "def render_module(inits, calls, inputs, outputs, dst_dir, pytorch_dict, pytorch_module_name):\n \"\"\"\n Render model.\n \"\"\"\n\n inits = [i for i in inits if len(i) > 0]\n\n output = pytorch_model_template.format(**{\n 'module_name': pytorch_module_name,\n 'module_name_lower': pytorch_module_name.lower(),\n 'inits': '\\n'.join(inits),\n 'inputs': inputs,\n 'calls': '\\n'.join(calls),\n 'outputs': outputs,\n })\n\n if dst_dir is not None:\n import os\n import errno\n\n try:\n os.makedirs(dst_dir)\n except OSError as e:\n if e.errno != errno.EEXIST:\n raise\n\n with open(os.path.join(dst_dir, pytorch_module_name.lower() + '.py'), 'w+') as f:\n f.write(output)\n f.close()\n\n torch.save(pytorch_dict, os.path.join(dst_dir, pytorch_module_name.lower() + '.pt'))\n\n return output"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nfunctions to convert a gluon model into a pytorch model.", "response": "def gluon2pytorch(net, args, dst_dir, pytorch_module_name, debug=True, keep_names=False):\n \"\"\"\n Function to convert a model.\n \"\"\"\n\n x = [mx.nd.array(np.ones(i)) for i in args]\n x = net(*x)\n\n # Get network params\n params = net.collect_params()\n\n # Create a symbol to trace net\n x = [mx.sym.var('__input__' + str(i)) for i in range(len(args))]\n sym = net(*x)\n\n if len(sym) > 1:\n group = mx.sym.Group(sym)\n else:\n group = sym\n\n # Get JSON-definition 
of the model\n json_model = json.loads(group.tojson())['nodes']\n\n # Create empty accumulators\n nodes = []\n is_skipped = []\n pytorch_dict = {}\n inits = []\n calls = []\n inputs = []\n outputs = [i[0] for i in json.loads(group.tojson())['heads']] \n last = 0\n\n if keep_names:\n names_dict = {}\n else:\n names_dict = None\n\n # Trace model\n for i, node in enumerate(json_model):\n # If the node has 'null' op, it means, that it's not a real op, but only parameter\n # TODO: convert constants\n if keep_names:\n names_dict[i] = node['name']\n\n if node['op'] == 'null':\n if node['name'].find('__input__') == 0:\n inputs.append(int(node['name'][9:]))\n is_skipped.append(1)\n continue\n\n # It's not 'null'\n is_skipped.append(0)\n\n # Create dict with necessary node parameters\n op = {\n 'name': node['name'][:-4],\n 'type': node['op'],\n }\n print(op, node)\n if len(node['inputs']) > 0:\n orginal_inputs = [i for i in np.array(node['inputs'])[:, 0] if i in inputs]\n op['inputs'] = [i for i in np.array(node['inputs'])[:, 0] if is_skipped[i] != 1 or i in orginal_inputs]\n else:\n print(json_model)\n op['inputs'] = []\n try:\n # Not all nodes have 'attrs'\n op['attrs'] = node['attrs']\n except KeyError:\n op['attrs'] = {}\n\n # Debug output\n if debug:\n print(op)\n print('__')\n\n # Append new node to list\n nodes.append(op)\n\n # If operation is in available convertors, convert it\n if op['type'] in CONVERTERS:\n init_str, call_str = CONVERTERS[op['type']](i, op, nodes, params, pytorch_dict, names_dict, debug)\n inits.append(init_str)\n calls.append(call_str)\n else:\n raise AttributeError('Layer isn\\'t supported')\n\n if names_dict is not None:\n inputs = ', '.join([names_dict[i] for i in inputs])\n outputs = ', '.join([names_dict[i] for i in outputs])\n else:\n inputs = ', '.join(['x' + str(i) for i in inputs])\n outputs = ', '.join(['x' + str(i) for i in outputs])\n\n pytorch_source = render_module(inits, calls, inputs, outputs, dst_dir, pytorch_dict, 
pytorch_module_name)\n\n return eval_model(pytorch_source, pytorch_dict, pytorch_module_name)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nhooks for Odoo 10.", "response": "def hook_odoo(package):\n \"\"\" work around Odoo 10 issue\n https://github.com/acsone/setuptools-odoo/issues/10\n\n # This hook should runs after all *-nspkg.pth files because it is named\n # zzz_ and .pth file run in alphabetical order.\n \"\"\"\n if sys.version_info.major != 2:\n return\n if package.__name__ == 'odoo':\n if not hasattr(package, 'release'):\n # Since 'release' is not in the odoo package, it means\n # odoo/__init__.py did not run, so what we have here is a dummy\n # odoo package created by setuptools' *-nspkg.pth files.\n # We remove it so 'import odoo' that will be done in the actual\n # main program will have a chance to run odoo/__init__.py.\n if 'odoo.addons' in sys.modules:\n del sys.modules['odoo.addons']\n if 'odoo' in sys.modules:\n del sys.modules['odoo']"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreturns a dictionary containing active ftp connections.", "response": "def getAllFtpConnections(self):\n \"\"\"\n Returns a dictionary containing active ftp connections.\n \"\"\"\n outputMsg = \"Current ftp connections:\\n\"\n counter = 1\n for k in self.ftpList:\n outputMsg += str(counter) + \". \" + k + \" \"\n outputMsg += str(self.ftpList[k]) + \"\\n\"\n counter += 1\n if self.printOutput:\n logger.info(outputMsg)\n return self.ftpList"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef ftp_connect(self, host, user='anonymous', password='anonymous@', port=21, timeout=30, connId='default'):\n if connId in self.ftpList:\n errMsg = \"Connection with ID %s already exist. 
It should be deleted before this step.\" % connId\n raise FtpLibraryError(errMsg)\n else:\n newFtp = None\n outputMsg = \"\"\n try:\n timeout = int(timeout)\n port = int(port)\n newFtp = ftplib.FTP()\n outputMsg += newFtp.connect(host, port, timeout)\n outputMsg += newFtp.login(user,password)\n except socket.error as se:\n raise FtpLibraryError('Socket error exception occured.')\n except ftplib.all_errors as e:\n raise FtpLibraryError(str(e))\n except Exception as e:\n raise FtpLibraryError(str(e))\n if self.printOutput:\n logger.info(outputMsg)\n self.__addNewConnection(newFtp, connId)", "response": "Connect to a FTP server and login it."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef get_welcome(self, connId='default'):\n thisConn = self.__getConnection(connId)\n outputMsg = \"\"\n try:\n outputMsg += thisConn.getwelcome()\n except ftplib.all_errors as e:\n raise FtpLibraryError(str(e))\n if self.printOutput:\n logger.info(outputMsg)\n return outputMsg", "response": "Returns a string containing the welcome message of FTP server."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef dir(self, connId='default'):\n dirList = []\n thisConn = self.__getConnection(connId)\n outputMsg = \"\"\n try:\n thisConn.dir(dirList.append)\n for d in dirList:\n outputMsg += str(d) + \"\\n\"\n except ftplib.all_errors as e:\n raise FtpLibraryError(str(e))\n if self.printOutput:\n logger.info(outputMsg)\n return dirList", "response": "Returns list of raw lines returned as contens of current directory."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef dir_names(self, connId='default'):\n files_list = []\n thisConn = self.__getConnection(connId)\n try:\n files_list = thisConn.nlst()\n except:\n files_list = []\n return files_list", "response": "Returns list of files and directories of current directory."} 
{"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ndownloads a file from current directory on FTP server.", "response": "def download_file(self, remoteFileName, localFilePath=None, connId='default'):\n \"\"\"\n Downloads file from current directory on FTP server in binary mode. If\n localFilePath is not given, file is saved in current local directory (by\n default folder containing robot framework project file) with the same name\n as source file. Returns server output\n Parameters:\n - remoteFileName - file name on FTP server\n - localFilePath (optional) - local file name or path where remote file should be saved.\n - connId(optional) - connection identifier. By default equals 'default'\n localFilePath variable can have following meanings:\n 1. file name (will be saved in current default directory);\n 2. full path (dir + file name)\n 3. dir path (original file name will be added)\n Examples:\n | download file | a.txt | | |\n | download file | a.txt | b.txt | connId=ftp1 |\n | download file | a.txt | D:/rfftppy/tmp | |\n | download file | a.txt | D:/rfftppy/tmp/b.txt | |\n | download file | a.txt | D:\\\\rfftppy\\\\tmp\\\\c.txt | |\n \"\"\"\n thisConn = self.__getConnection(connId)\n outputMsg = \"\"\n localPath = \"\"\n if localFilePath == None:\n localPath = remoteFileName\n else:\n localPath = os.path.normpath(localFilePath)\n if os.path.isdir(localPath):\n localPath = os.path.join(localPath, remoteFileName)\n try:\n with open(localPath, 'wb') as localFile:\n outputMsg += thisConn.retrbinary(\"RETR \" + remoteFileName, localFile.write)\n except ftplib.all_errors as e:\n raise FtpLibraryError(str(e))\n if self.printOutput:\n logger.info(outputMsg)\n return outputMsg"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef upload_file(self, localFileName, remoteFileName=None, connId='default'):\n thisConn = self.__getConnection(connId)\n outputMsg = \"\"\n 
remoteFileName_ = \"\"\n localFilePath = os.path.normpath(localFileName)\n if not os.path.isfile(localFilePath):\n raise FtpLibraryError(\"Valid file path should be provided.\")\n else:\n if remoteFileName==None:\n fileTuple = os.path.split(localFileName)\n if len(fileTuple)==2:\n remoteFileName_ = fileTuple[1]\n else:\n remoteFileName_ = 'defaultFileName'\n else:\n remoteFileName_ = remoteFileName\n try:\n outputMsg += thisConn.storbinary(\"STOR \" + remoteFileName_, open(localFilePath, \"rb\"))\n except ftplib.all_errors as e:\n raise FtpLibraryError(str(e))\n if self.printOutput:\n logger.info(outputMsg)\n return outputMsg", "response": "Uploads a file from local drive to local directory on FTP server in binary mode."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef size(self, fileToCheck, connId='default'):\n thisConn = self.__getConnection(connId)\n outputMsg = \"\"\n try:\n tmpSize = thisConn.size(fileToCheck)\n outputMsg += str(tmpSize)\n except ftplib.all_errors as e:\n raise FtpLibraryError(str(e))\n if self.printOutput:\n logger.info(outputMsg)\n return outputMsg", "response": "This function checks the size of a file on FTP server and returns the size in bytes."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nsending any valid command to the FTP server. Returns the server output.", "response": "def send_cmd(self, command, connId='default'):\n \"\"\"\n Sends any command to FTP server. Returns server output.\n Parameters:\n - command - any valid command to be sent (invalid will result in exception).\n - connId(optional) - connection identifier. 
By default equals 'default'\n Example:\n | send cmd | HELP |\n \"\"\"\n thisConn = self.__getConnection(connId)\n outputMsg = \"\"\n try:\n outputMsg += str(thisConn.sendcmd(command))\n except ftplib.all_errors as e:\n raise FtpLibraryError(str(e))\n if self.printOutput:\n logger.info(outputMsg)\n return outputMsg"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef ftp_close(self, connId='default'):\n thisConn = self.__getConnection(connId)\n try:\n thisConn.quit()\n self.__removeConnection(connId)\n except Exception as e:\n try:\n thisConn.close()\n self.__removeConnection(connId)\n except ftplib.all_errors as x:\n raise FtpLibraryError(str(x))", "response": "Closes an FTP connection. Returns None."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef setup_logging(verbose=0, colors=False, name=None):\n root_logger = logging.getLogger(name)\n root_logger.setLevel(logging.DEBUG if verbose > 0 else logging.INFO)\n formatter = ColorFormatter(verbose > 0, colors)\n if colors:\n colorclass.Windows.enable()\n\n handler_stdout = logging.StreamHandler(sys.stdout)\n handler_stdout.setFormatter(formatter)\n handler_stdout.setLevel(logging.DEBUG)\n handler_stdout.addFilter(type('', (logging.Filter,), {'filter': staticmethod(lambda r: r.levelno <= logging.INFO)}))\n root_logger.addHandler(handler_stdout)\n\n handler_stderr = logging.StreamHandler(sys.stderr)\n handler_stderr.setFormatter(formatter)\n handler_stderr.setLevel(logging.WARNING)\n root_logger.addHandler(handler_stderr)", "response": "Configure console logging. Info and below go to stdout, others go to stderr.\n\n :param int verbose: Verbosity level. > 0 print debug statements. > 1 passed to sphinx-build.\n :param bool colors: Print color text in non-verbose mode.\n :param str name: Which logger name to set handlers to. 
Used for testing."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\napply little arrow and colors to the record.", "response": "def format(self, record):\n \"\"\"Apply little arrow and colors to the record.\n\n Arrow and colors are only applied to sphinxcontrib.versioning log statements.\n\n :param logging.LogRecord record: The log record object to log.\n \"\"\"\n formatted = super(ColorFormatter, self).format(record)\n if self.verbose or not record.name.startswith(self.SPECIAL_SCOPE):\n return formatted\n\n # Arrow.\n formatted = '=> ' + formatted\n\n # Colors.\n if not self.colors:\n return formatted\n if record.levelno >= logging.ERROR:\n formatted = str(colorclass.Color.red(formatted))\n elif record.levelno >= logging.WARNING:\n formatted = str(colorclass.Color.yellow(formatted))\n else:\n formatted = str(colorclass.Color.cyan(formatted))\n return formatted"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef chunk(iterator, max_size):\n gen = iter(iterator)\n while True:\n chunked = list()\n for i, item in enumerate(gen):\n chunked.append(item)\n if i >= max_size - 1:\n break\n if not chunked:\n return\n yield chunked", "response": "Yields a list of items from an iterator."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nruns a command and return the output.", "response": "def run_command(local_root, command, env_var=True, pipeto=None, retry=0, environ=None):\n \"\"\"Run a command and return the output.\n\n :raise CalledProcessError: Command exits non-zero.\n\n :param str local_root: Local path to git root directory.\n :param iter command: Command to run.\n :param dict environ: Environment variables to set/override in the command.\n :param bool env_var: Define GIT_DIR environment variable (on non-Windows).\n :param function pipeto: Pipe `command`'s stdout to this function (only parameter given).\n :param int retry: Retry this many times on 
CalledProcessError after 0.1 seconds.\n\n :return: Command output.\n :rtype: str\n \"\"\"\n log = logging.getLogger(__name__)\n\n # Setup env.\n env = os.environ.copy()\n if environ:\n env.update(environ)\n if env_var and not IS_WINDOWS:\n env['GIT_DIR'] = os.path.join(local_root, '.git')\n else:\n env.pop('GIT_DIR', None)\n\n # Run command.\n with open(os.devnull) as null:\n main = Popen(command, cwd=local_root, env=env, stdout=PIPE, stderr=PIPE if pipeto else STDOUT, stdin=null)\n if pipeto:\n pipeto(main.stdout)\n main_output = main.communicate()[1].decode('utf-8') # Might deadlock if stderr is written to a lot.\n else:\n main_output = main.communicate()[0].decode('utf-8')\n log.debug(json.dumps(dict(cwd=local_root, command=command, code=main.poll(), output=main_output)))\n\n # Verify success.\n if main.poll() != 0:\n if retry < 1:\n raise CalledProcessError(main.poll(), command, output=main_output)\n time.sleep(0.1)\n return run_command(local_root, command, env_var, pipeto, retry - 1)\n\n return main_output"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_root(directory):\n command = ['git', 'rev-parse', '--show-toplevel']\n try:\n output = run_command(directory, command, env_var=False)\n except CalledProcessError as exc:\n raise GitError('Failed to find local git repository root in {}.'.format(repr(directory)), exc.output)\n if IS_WINDOWS:\n output = output.replace('/', '\\\\')\n return output.strip()", "response": "Get the root directory of the local git repository from any subdirectory within it."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef list_remote(local_root):\n command = ['git', 'ls-remote', '--heads', '--tags']\n try:\n output = run_command(local_root, command)\n except CalledProcessError as exc:\n raise GitError('Git failed to list remote refs.', exc.output)\n\n # Dereference annotated tags if any. 
No need to fetch annotations.\n if '^{}' in output:\n parsed = list()\n for group in (m.groupdict() for m in RE_REMOTE.finditer(output)):\n dereferenced, name, kind = group['name'].endswith('^{}'), group['name'][:-3], group['kind']\n if dereferenced and parsed and kind == parsed[-1]['kind'] == 'tags' and name == parsed[-1]['name']:\n parsed[-1]['sha'] = group['sha']\n else:\n parsed.append(group)\n else:\n parsed = [m.groupdict() for m in RE_REMOTE.finditer(output)]\n\n return [[i['sha'], i['name'], i['kind']] for i in parsed]", "response": "Return list of remote branch and tag latest SHAs."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nfilter commits and return a dict of commit Unix timestamps and first matching conf. py path.", "response": "def filter_and_date(local_root, conf_rel_paths, commits):\n \"\"\"Get commit Unix timestamps and first matching conf.py path. Exclude commits with no conf.py file.\n\n :raise CalledProcessError: Unhandled git command failure.\n :raise GitError: A commit SHA has not been fetched.\n\n :param str local_root: Local path to git root directory.\n :param iter conf_rel_paths: List of possible relative paths (to git root) of Sphinx conf.py (e.g. docs/conf.py).\n :param iter commits: List of commit SHAs.\n\n :return: Commit time (seconds since Unix epoch) for each commit and conf.py path. 
SHA keys and [int, str] values.\n :rtype: dict\n \"\"\"\n dates_paths = dict()\n\n # Filter without docs.\n for commit in commits:\n if commit in dates_paths:\n continue\n command = ['git', 'ls-tree', '--name-only', '-r', commit] + conf_rel_paths\n try:\n output = run_command(local_root, command)\n except CalledProcessError as exc:\n raise GitError('Git ls-tree failed on {0}'.format(commit), exc.output)\n if output:\n dates_paths[commit] = [None, output.splitlines()[0].strip()]\n\n # Get timestamps by groups of 50.\n command_prefix = ['git', 'show', '--no-patch', '--pretty=format:%ct']\n for commits_group in chunk(dates_paths, 50):\n command = command_prefix + commits_group\n output = run_command(local_root, command)\n timestamps = [int(i) for i in RE_UNIX_TIME.findall(output)]\n for i, commit in enumerate(commits_group):\n dates_paths[commit][0] = timestamps[i]\n\n # Done.\n return dates_paths"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nfetching commits from origin.", "response": "def fetch_commits(local_root, remotes):\n \"\"\"Fetch from origin.\n\n :raise CalledProcessError: Unhandled git command failure.\n\n :param str local_root: Local path to git root directory.\n :param iter remotes: Output of list_remote().\n \"\"\"\n # Fetch all known branches.\n command = ['git', 'fetch', 'origin']\n run_command(local_root, command)\n\n # Fetch new branches/tags.\n for sha, name, kind in remotes:\n try:\n run_command(local_root, ['git', 'reflog', sha])\n except CalledProcessError:\n run_command(local_root, command + ['refs/{0}/{1}'.format(kind, name)])\n run_command(local_root, ['git', 'reflog', sha])"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef export(local_root, commit, target):\n log = logging.getLogger(__name__)\n target = os.path.realpath(target)\n mtimes = list()\n\n # Define extract function.\n def extract(stdout):\n \"\"\"Extract tar archive from \"git archive\" 
stdout.\n\n :param file stdout: Handle to git's stdout pipe.\n \"\"\"\n queued_links = list()\n try:\n with tarfile.open(fileobj=stdout, mode='r|') as tar:\n for info in tar:\n log.debug('name: %s; mode: %d; size: %s; type: %s', info.name, info.mode, info.size, info.type)\n path = os.path.realpath(os.path.join(target, info.name))\n if not path.startswith(target): # Handle bad paths.\n log.warning('Ignoring tar object path %s outside of target directory.', info.name)\n elif info.isdir(): # Handle directories.\n if not os.path.exists(path):\n os.makedirs(path, mode=info.mode)\n elif info.issym() or info.islnk(): # Queue links.\n queued_links.append(info)\n else: # Handle files.\n tar.extract(member=info, path=target)\n if os.path.splitext(info.name)[1].lower() == '.rst':\n mtimes.append(info.name)\n for info in (i for i in queued_links if os.path.exists(os.path.join(target, i.linkname))):\n tar.extract(member=info, path=target)\n except tarfile.TarError as exc:\n log.debug('Failed to extract output from \"git archive\" command: %s', str(exc))\n\n # Run command.\n run_command(local_root, ['git', 'archive', '--format=tar', commit], pipeto=extract)\n\n # Set mtime.\n for file_path in mtimes:\n last_committed = int(run_command(local_root, ['git', 'log', '-n1', '--format=%at', commit, '--', file_path]))\n os.utime(os.path.join(target, file_path), (last_committed, last_committed))", "response": "Exports all files at the given commit to the target directory."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef clone(local_root, new_root, remote, branch, rel_dest, exclude):\n log = logging.getLogger(__name__)\n output = run_command(local_root, ['git', 'remote', '-v'])\n remotes = dict()\n for match in RE_ALL_REMOTES.findall(output):\n remotes.setdefault(match[0], [None, None])\n if match[2] == 'fetch':\n remotes[match[0]][0] = match[1]\n else:\n remotes[match[0]][1] = match[1]\n if not remotes:\n raise 
GitError('Git repo has no remotes.', output)\n if remote not in remotes:\n raise GitError('Git repo missing remote \"{}\".'.format(remote), output)\n\n # Clone.\n try:\n run_command(new_root, ['git', 'clone', remotes[remote][0], '--depth=1', '--branch', branch, '.'])\n except CalledProcessError as exc:\n raise GitError('Failed to clone from remote repo URL.', exc.output)\n\n # Make sure user didn't select a tag as their DEST_BRANCH.\n try:\n run_command(new_root, ['git', 'symbolic-ref', 'HEAD'])\n except CalledProcessError as exc:\n raise GitError('Specified branch is not a real branch.', exc.output)\n\n # Copy all remotes from original repo.\n for name, (fetch, push) in remotes.items():\n try:\n run_command(new_root, ['git', 'remote', 'set-url' if name == 'origin' else 'add', name, fetch], retry=3)\n run_command(new_root, ['git', 'remote', 'set-url', '--push', name, push], retry=3)\n except CalledProcessError as exc:\n raise GitError('Failed to set git remote URL.', exc.output)\n\n # Done if no exclude.\n if not exclude:\n return\n\n # Resolve exclude paths.\n exclude_joined = [\n os.path.relpath(p, new_root) for e in exclude for p in glob.glob(os.path.join(new_root, rel_dest, e))\n ]\n log.debug('Expanded %s to %s', repr(exclude), repr(exclude_joined))\n\n # Do \"git rm\".\n try:\n run_command(new_root, ['git', 'rm', '-rf', rel_dest])\n except CalledProcessError as exc:\n raise GitError('\"git rm\" failed to remove ' + rel_dest, exc.output)\n\n # Restore files in exclude.\n run_command(new_root, ['git', 'reset', 'HEAD'] + exclude_joined)\n run_command(new_root, ['git', 'checkout', '--'] + exclude_joined)", "response": "Clone local_root origin into a new directory and check out a specific branch."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef commit_and_push(local_root, remote, versions):\n log = logging.getLogger(__name__)\n current_branch = run_command(local_root, ['git', 'rev-parse', 
'--abbrev-ref', 'HEAD']).strip()\n run_command(local_root, ['git', 'add', '.'])\n\n # Check if there are no changes.\n try:\n run_command(local_root, ['git', 'diff', 'HEAD', '--no-ext-diff', '--quiet', '--exit-code'])\n except CalledProcessError:\n pass # Repo is dirty, something has changed.\n else:\n log.info('No changes to commit.')\n return True\n\n # Check if there are changes excluding those files that always change.\n output = run_command(local_root, ['git', 'diff', 'HEAD', '--no-ext-diff', '--name-status'])\n for status, name in (l.split('\\t', 1) for l in output.splitlines()):\n if status != 'M':\n break # Only looking for modified files.\n components = name.split('/')\n if '.doctrees' not in components and components[-1] != 'searchindex.js':\n break # Something other than those two dirs/files has changed.\n else:\n log.info('No significant changes to commit.')\n return True\n\n # Commit.\n latest_commit = sorted(versions.remotes, key=lambda v: v['date'])[-1]\n commit_message_file = os.path.join(local_root, '_scv_commit_message.txt')\n with open(commit_message_file, 'w') as handle:\n handle.write('AUTO sphinxcontrib-versioning {} {}\\n\\n'.format(\n datetime.utcfromtimestamp(latest_commit['date']).strftime('%Y%m%d'),\n latest_commit['sha'][:11],\n ))\n for line in ('{}: {}\\n'.format(v, os.environ[v]) for v in WHITELIST_ENV_VARS if v in os.environ):\n handle.write(line)\n try:\n run_command(local_root, ['git', 'commit', '-F', commit_message_file])\n except CalledProcessError as exc:\n raise GitError('Failed to commit locally.', exc.output)\n os.remove(commit_message_file)\n\n # Push.\n try:\n run_command(local_root, ['git', 'push', remote, current_branch])\n except CalledProcessError as exc:\n if '[rejected]' in exc.output and '(fetch first)' in exc.output:\n log.debug('Remote has changed since cloning the repo. 
Must retry.')\n return False\n raise GitError('Failed to push to remote.', exc.output)\n\n log.info('Successfully pushed to remote repository.')\n return True", "response": "Commits the changes made in the local_root to the remote repository and attempts to push the changes to the remote repository."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef semvers(names):\n matches = [(RE_SEMVER.findall(n) or [[]])[0] for n in names]\n max_len_ints = 0\n max_len_str = 0\n\n # Get max lens for padding.\n for match in (m for m in matches if m):\n max_len_ints = len(match) # Never changes.\n max_len_str = max(max_len_str, len(match[-1]))\n if not max_len_ints:\n return matches # Nothing to do, all empty.\n invalid_template = [1] + [0] * (max_len_ints + max_len_str - 1)\n\n # Parse.\n exploded_semver = list()\n for match in matches:\n if not match:\n exploded_semver.append(invalid_template[:])\n continue\n version_ints = [-int(i or 0) for i in match[:-1]]\n ints_of_str = [ord(i) for i in match[-1]] + [0] * (max_len_str - len(match[-1]))\n exploded_semver.append([0] + version_ints + ints_of_str)\n\n return exploded_semver", "response": "Parses the names into integers and convert non - integer meta indicators into integers with ord."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nsorts the list of remotes in place. Allows sorting by multiple conditions.", "response": "def multi_sort(remotes, sort):\n \"\"\"Sort `remotes` in place. Allows sorting by multiple conditions.\n\n This is needed because Python 3 no longer supports sorting lists of multiple types. 
Sort keys must all be of the\n same type.\n\n Problem: the user expects versions to be sorted latest first and timelogical to be most recent first (when viewing\n the HTML documentation), yet expects alphabetical sorting to be A before Z.\n Solution: invert integers (dates and parsed versions).\n\n :param iter remotes: List of dicts from Versions().remotes.\n :param iter sort: What to sort by. May be one or more of: alpha, time, semver\n \"\"\"\n exploded_alpha = list()\n exploded_semver = list()\n\n # Convert name to int if alpha is in sort.\n if 'alpha' in sort:\n alpha_max_len = max(len(r['name']) for r in remotes)\n for name in (r['name'] for r in remotes):\n exploded_alpha.append([ord(i) for i in name] + [0] * (alpha_max_len - len(name)))\n\n # Parse versions if semver is in sort.\n if 'semver' in sort:\n exploded_semver = semvers(r['name'] for r in remotes)\n\n # Build sort_mapping dict.\n sort_mapping = dict()\n for i, remote in enumerate(remotes):\n key = list()\n for sort_by in sort:\n if sort_by == 'alpha':\n key.extend(exploded_alpha[i])\n elif sort_by == 'time':\n key.append(-remote['date'])\n elif sort_by == 'semver':\n key.extend(exploded_semver[i])\n sort_mapping[id(remote)] = key\n\n # Sort.\n remotes.sort(key=lambda k: sort_mapping.get(id(k)))"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef branches(self):\n return [(r['name'], self.vpathto(r['name'])) for r in self.remotes if r['kind'] == 'heads']", "response": "Return list of ( name urls ) only branches."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef tags(self):\n return [(r['name'], self.vpathto(r['name'])) for r in self.remotes if r['kind'] == 'tags']", "response": "Return list of ( name and urls ) only tags."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef vhasdoc(self, other_version):\n if self.context['current_version'] == other_version:\n return 
True\n return self.context['pagename'] in self[other_version]['found_docs']", "response": "Return True if the current document is in the other version."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns relative path to current document in another version. Like Sphinx s pathto.", "response": "def vpathto(self, other_version):\n \"\"\"Return relative path to current document in another version. Like Sphinx's pathto().\n\n If the current document doesn't exist in the other version its master_doc path is returned instead.\n\n :raise KeyError: If other_version doesn't exist.\n\n :param str other_version: Version to link to.\n\n :return: Relative path.\n :rtype: str\n \"\"\"\n is_root = self.context['scv_is_root']\n pagename = self.context['pagename']\n if self.context['current_version'] == other_version and not is_root:\n return '{}.html'.format(pagename.split('/')[-1])\n\n other_remote = self[other_version]\n other_root_dir = other_remote['root_dir']\n components = ['..'] * pagename.count('/')\n components += [other_root_dir] if is_root else ['..', other_root_dir]\n components += [pagename if self.vhasdoc(other_version) else other_remote['master_doc']]\n return '{}.html'.format(__import__('posixpath').join(*components))"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef read_local_conf(local_conf):\n log = logging.getLogger(__name__)\n\n # Attempt to read.\n log.info('Reading config from %s...', local_conf)\n try:\n config = read_config(os.path.dirname(local_conf), '')\n except HandledError:\n log.warning('Unable to read file, continuing with only CLI args.')\n return dict()\n\n # Filter and return.\n return {k[4:]: v for k, v in config.items() if k.startswith('scv_') and not k[4:].startswith('_')}", "response": "Read conf. 
py from local_conf and return it as a dict."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ngathering info about the remote git repository.", "response": "def gather_git_info(root, conf_rel_paths, whitelist_branches, whitelist_tags):\n \"\"\"Gather info about the remote git repository. Get list of refs.\n\n :raise HandledError: If function fails with a handled error. Will be logged before raising.\n\n :param str root: Root directory of repository.\n :param iter conf_rel_paths: List of possible relative paths (to git root) of Sphinx conf.py (e.g. docs/conf.py).\n :param iter whitelist_branches: Optional list of patterns to filter branches by.\n :param iter whitelist_tags: Optional list of patterns to filter tags by.\n\n :return: Commits with docs. A list of tuples: (sha, name, kind, date, conf_rel_path).\n :rtype: list\n \"\"\"\n log = logging.getLogger(__name__)\n\n # List remote.\n log.info('Getting list of all remote branches/tags...')\n try:\n remotes = list_remote(root)\n except GitError as exc:\n log.error(exc.message)\n log.error(exc.output)\n raise HandledError\n log.info('Found: %s', ' '.join(i[1] for i in remotes))\n\n # Filter and date.\n try:\n try:\n dates_paths = filter_and_date(root, conf_rel_paths, (i[0] for i in remotes))\n except GitError:\n log.info('Need to fetch from remote...')\n fetch_commits(root, remotes)\n try:\n dates_paths = filter_and_date(root, conf_rel_paths, (i[0] for i in remotes))\n except GitError as exc:\n log.error(exc.message)\n log.error(exc.output)\n raise HandledError\n except subprocess.CalledProcessError as exc:\n log.debug(json.dumps(dict(command=exc.cmd, cwd=root, code=exc.returncode, output=exc.output)))\n log.error('Failed to get dates for all remote commits.')\n raise HandledError\n filtered_remotes = [[i[0], i[1], i[2], ] + dates_paths[i[0]] for i in remotes if i[0] in dates_paths]\n log.info('With docs: %s', ' '.join(i[1] for i in filtered_remotes))\n if not whitelist_branches and not 
whitelist_tags:\n return filtered_remotes\n\n # Apply whitelist.\n whitelisted_remotes = list()\n for remote in filtered_remotes:\n if remote[2] == 'heads' and whitelist_branches:\n if not any(re.search(p, remote[1]) for p in whitelist_branches):\n continue\n if remote[2] == 'tags' and whitelist_tags:\n if not any(re.search(p, remote[1]) for p in whitelist_tags):\n continue\n whitelisted_remotes.append(remote)\n log.info('Passed whitelisting: %s', ' '.join(i[1] for i in whitelisted_remotes))\n\n return whitelisted_remotes"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef pre_build(local_root, versions):\n log = logging.getLogger(__name__)\n exported_root = TempDir(True).name\n\n # Extract all.\n for sha in {r['sha'] for r in versions.remotes}:\n target = os.path.join(exported_root, sha)\n log.debug('Exporting %s to temporary directory.', sha)\n export(local_root, sha, target)\n\n # Build root.\n remote = versions[Config.from_context().root_ref]\n with TempDir() as temp_dir:\n log.debug('Building root (before setting root_dirs) in temporary directory: %s', temp_dir)\n source = os.path.dirname(os.path.join(exported_root, remote['sha'], remote['conf_rel_path']))\n build(source, temp_dir, versions, remote['name'], True)\n existing = os.listdir(temp_dir)\n\n # Define root_dir for all versions to avoid file name collisions.\n for remote in versions.remotes:\n root_dir = RE_INVALID_FILENAME.sub('_', remote['name'])\n while root_dir in existing:\n root_dir += '_'\n remote['root_dir'] = root_dir\n log.debug('%s root directory is %s', remote['name'], root_dir)\n existing.append(root_dir)\n\n # Get found_docs and master_doc values for all versions.\n for remote in list(versions.remotes):\n log.debug('Partially running sphinx-build to read configuration for: %s', remote['name'])\n source = os.path.dirname(os.path.join(exported_root, remote['sha'], remote['conf_rel_path']))\n try:\n config = read_config(source, 
remote['name'])\n except HandledError:\n log.warning('Skipping. Will not be building: %s', remote['name'])\n versions.remotes.pop(versions.remotes.index(remote))\n continue\n remote['found_docs'] = config['found_docs']\n remote['master_doc'] = config['master_doc']\n\n return exported_root", "response": "Build all versions to determine root directory and master_doc names."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nbuild all versions.", "response": "def build_all(exported_root, destination, versions):\n \"\"\"Build all versions.\n\n :param str exported_root: Tempdir path with exported commits as subdirectories.\n :param str destination: Destination directory to copy/overwrite built docs to. Does not delete old files.\n :param sphinxcontrib.versioning.versions.Versions versions: Versions class instance.\n \"\"\"\n log = logging.getLogger(__name__)\n\n while True:\n # Build root.\n remote = versions[Config.from_context().root_ref]\n log.info('Building root: %s', remote['name'])\n source = os.path.dirname(os.path.join(exported_root, remote['sha'], remote['conf_rel_path']))\n build(source, destination, versions, remote['name'], True)\n\n # Build all refs.\n for remote in list(versions.remotes):\n log.info('Building ref: %s', remote['name'])\n source = os.path.dirname(os.path.join(exported_root, remote['sha'], remote['conf_rel_path']))\n target = os.path.join(destination, remote['root_dir'])\n try:\n build(source, target, versions, remote['name'], False)\n except HandledError:\n log.warning('Skipping. Will not be building %s. 
Rebuilding everything.', remote['name'])\n versions.remotes.pop(versions.remotes.index(remote))\n break # Break out of for loop.\n else:\n break"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ncalling by Sphinx during phase 0 (initialization).", "response": "def setup(app):\n \"\"\"Called by Sphinx during phase 0 (initialization).\n\n :param sphinx.application.Sphinx app: Sphinx application object.\n\n :returns: Extension version.\n :rtype: dict\n \"\"\"\n # Used internally. For rebuilding all pages when one or more versions fail.\n app.add_config_value('sphinxcontrib_versioning_versions', SC_VERSIONING_VERSIONS, 'html')\n\n # Needed for banner.\n app.config.html_static_path.append(STATIC_DIR)\n app.add_stylesheet('banner.css')\n\n # Tell Sphinx which config values can be set by the user.\n for name, default in Config():\n app.add_config_value('scv_{}'.format(name), default, 'html')\n\n # Event handlers.\n app.connect('builder-inited', EventHandlers.builder_inited)\n app.connect('env-updated', EventHandlers.env_updated)\n app.connect('html-page-context', EventHandlers.html_page_context)\n return dict(version=__version__)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _build(argv, config, versions, current_name, is_root):\n # Patch.\n application.Config = ConfigInject\n if config.show_banner:\n EventHandlers.BANNER_GREATEST_TAG = config.banner_greatest_tag\n EventHandlers.BANNER_MAIN_VERSION = config.banner_main_ref\n EventHandlers.BANNER_RECENT_TAG = config.banner_recent_tag\n EventHandlers.SHOW_BANNER = True\n EventHandlers.CURRENT_VERSION = current_name\n EventHandlers.IS_ROOT = is_root\n EventHandlers.VERSIONS = versions\n SC_VERSIONING_VERSIONS[:] = [p for r in versions.remotes for p in sorted(r.items()) if p[0] not in ('sha', 'date')]\n\n # Update argv.\n if config.verbose > 1:\n argv += ('-v',) * (config.verbose - 1)\n if config.no_colors:\n argv += ('-N',)\n if config.overflow:\n argv 
+= config.overflow\n\n # Build.\n result = build_main(argv)\n if result != 0:\n raise SphinxError", "response": "Build Sphinx docs via multiprocessing for isolation."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreads the Sphinx config via multiprocessing for isolation.", "response": "def _read_config(argv, config, current_name, queue):\n \"\"\"Read the Sphinx config via multiprocessing for isolation.\n\n :param tuple argv: Arguments to pass to Sphinx.\n :param sphinxcontrib.versioning.lib.Config config: Runtime configuration.\n :param str current_name: The ref name of the current version being built.\n :param multiprocessing.queues.Queue queue: Communication channel to parent process.\n \"\"\"\n # Patch.\n EventHandlers.ABORT_AFTER_READ = queue\n\n # Run.\n _build(argv, config, Versions(list()), current_name, False)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef build(source, target, versions, current_name, is_root):\n log = logging.getLogger(__name__)\n argv = ('sphinx-build', source, target)\n config = Config.from_context()\n\n log.debug('Running sphinx-build for %s with args: %s', current_name, str(argv))\n child = multiprocessing.Process(target=_build, args=(argv, config, versions, current_name, is_root))\n child.start()\n child.join() # Block.\n if child.exitcode != 0:\n log.error('sphinx-build failed for branch/tag: %s', current_name)\n raise HandledError", "response": "Build a single version of the docs in a child process."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreading the Sphinx config for one version.", "response": "def read_config(source, current_name):\n \"\"\"Read the Sphinx config for one version.\n\n :raise HandledError: If sphinx-build fails. 
Will be logged before raising.\n\n :param str source: Source directory to pass to sphinx-build.\n :param str current_name: The ref name of the current version being built.\n\n :return: Specific Sphinx config values.\n :rtype: dict\n \"\"\"\n log = logging.getLogger(__name__)\n queue = multiprocessing.Queue()\n config = Config.from_context()\n\n with TempDir() as temp_dir:\n argv = ('sphinx-build', source, temp_dir)\n log.debug('Running sphinx-build for config values with args: %s', str(argv))\n child = multiprocessing.Process(target=_read_config, args=(argv, config, current_name, queue))\n child.start()\n child.join() # Block.\n if child.exitcode != 0:\n log.error('sphinx-build failed for branch/tag while reading config: %s', current_name)\n raise HandledError\n\n config = queue.get()\n return config"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef builder_inited(app):\n # Add this extension's _templates directory to Sphinx.\n templates_dir = os.path.join(os.path.dirname(__file__), '_templates')\n app.builder.templates.pathchain.insert(0, templates_dir)\n app.builder.templates.loaders.insert(0, SphinxFileSystemLoader(templates_dir))\n app.builder.templates.templatepathlen += 1\n\n # Add versions.html to sidebar.\n if '**' not in app.config.html_sidebars:\n app.config.html_sidebars['**'] = StandaloneHTMLBuilder.default_sidebars + ['versions.html']\n elif 'versions.html' not in app.config.html_sidebars['**']:\n app.config.html_sidebars['**'].append('versions.html')", "response": "Update the Sphinx builder."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef env_updated(cls, app, env):\n if cls.ABORT_AFTER_READ:\n config = {n: getattr(app.config, n) for n in (a for a in dir(app.config) if a.startswith('scv_'))}\n config['found_docs'] = tuple(str(d) for d in env.found_docs)\n config['master_doc'] = str(app.config.master_doc)\n 
cls.ABORT_AFTER_READ.put(config)\n sys.exit(0)", "response": "Abort Sphinx after initializing config and discovering all pages to build."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef html_page_context(cls, app, pagename, templatename, context, doctree):\n assert templatename or doctree # Unused, for linting.\n cls.VERSIONS.context = context\n versions = cls.VERSIONS\n this_remote = versions[cls.CURRENT_VERSION]\n banner_main_remote = versions[cls.BANNER_MAIN_VERSION] if cls.SHOW_BANNER else None\n\n # Update Jinja2 context.\n context['bitbucket_version'] = cls.CURRENT_VERSION\n context['current_version'] = cls.CURRENT_VERSION\n context['github_version'] = cls.CURRENT_VERSION\n context['html_theme'] = app.config.html_theme\n context['scv_banner_greatest_tag'] = cls.BANNER_GREATEST_TAG\n context['scv_banner_main_ref_is_branch'] = banner_main_remote['kind'] == 'heads' if cls.SHOW_BANNER else None\n context['scv_banner_main_ref_is_tag'] = banner_main_remote['kind'] == 'tags' if cls.SHOW_BANNER else None\n context['scv_banner_main_version'] = banner_main_remote['name'] if cls.SHOW_BANNER else None\n context['scv_banner_recent_tag'] = cls.BANNER_RECENT_TAG\n context['scv_is_branch'] = this_remote['kind'] == 'heads'\n context['scv_is_greatest_tag'] = this_remote == versions.greatest_tag_remote\n context['scv_is_recent_branch'] = this_remote == versions.recent_branch_remote\n context['scv_is_recent_ref'] = this_remote == versions.recent_remote\n context['scv_is_recent_tag'] = this_remote == versions.recent_tag_remote\n context['scv_is_root'] = cls.IS_ROOT\n context['scv_is_tag'] = this_remote['kind'] == 'tags'\n context['scv_show_banner'] = cls.SHOW_BANNER\n context['versions'] = versions\n context['vhasdoc'] = versions.vhasdoc\n context['vpathto'] = versions.vpathto\n\n # Insert banner into body.\n if cls.SHOW_BANNER and 'body' in context:\n parsed = app.builder.templates.render('banner.html', context)\n 
context['body'] = parsed + context['body']\n # Handle overridden css_files.\n css_files = context.setdefault('css_files', list())\n if '_static/banner.css' not in css_files:\n css_files.append('_static/banner.css')\n # Handle overridden html_static_path.\n if STATIC_DIR not in app.config.html_static_path:\n app.config.html_static_path.append(STATIC_DIR)\n\n # Reset last_updated with file's mtime (will be last git commit authored date).\n if app.config.html_last_updated_fmt is not None:\n file_path = app.env.doc2path(pagename)\n if os.path.isfile(file_path):\n lufmt = app.config.html_last_updated_fmt or getattr(locale, '_')('%b %d, %Y')\n mtime = datetime.datetime.fromtimestamp(os.path.getmtime(file_path))\n context['last_updated'] = format_date(lufmt, mtime, language=app.config.language, warn=app.warn)", "response": "Update the Jinja2 HTML context with the current version and the banner main version."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef cli(config, **options):\n def pre(rel_source):\n \"\"\"To be executed in a Click sub command.\n\n Needed because if this code is in cli() it will be executed when the user runs: --help\n\n :param tuple rel_source: Possible relative paths (to git root) of Sphinx directory containing conf.py.\n \"\"\"\n # Setup logging.\n if not NO_EXECUTE:\n setup_logging(verbose=config.verbose, colors=not config.no_colors)\n log = logging.getLogger(__name__)\n\n # Change current working directory.\n if config.chdir:\n os.chdir(config.chdir)\n log.debug('Working directory: %s', os.getcwd())\n else:\n config.update(dict(chdir=os.getcwd()), overwrite=True)\n\n # Get and verify git root.\n try:\n config.update(dict(git_root=get_root(config.git_root or os.getcwd())), overwrite=True)\n except GitError as exc:\n log.error(exc.message)\n log.error(exc.output)\n raise HandledError\n\n # Look for local config.\n if config.no_local_conf:\n config.update(dict(local_conf=None), 
overwrite=True)\n elif not config.local_conf:\n candidates = [p for p in (os.path.join(s, 'conf.py') for s in rel_source) if os.path.isfile(p)]\n if candidates:\n config.update(dict(local_conf=candidates[0]), overwrite=True)\n else:\n log.debug(\"Didn't find a conf.py in any REL_SOURCE.\")\n elif os.path.basename(config.local_conf) != 'conf.py':\n log.error('Path \"%s\" must end with conf.py.', config.local_conf)\n raise HandledError\n config['pre'] = pre # To be called by Click sub commands.\n config.update(options)", "response": "A CLI for building versioned Sphinx docs for every branch and tag pushed to origin."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef build_options(func):\n func = click.option('-a', '--banner-greatest-tag', is_flag=True,\n help='Override banner-main-ref to be the tag with the highest version number.')(func)\n func = click.option('-A', '--banner-recent-tag', is_flag=True,\n help='Override banner-main-ref to be the most recent committed tag.')(func)\n func = click.option('-b', '--show-banner', help='Show a warning banner.', is_flag=True)(func)\n func = click.option('-B', '--banner-main-ref',\n help=\"Don't show banner on this ref and point banner URLs to this ref. Default master.\")(func)\n func = click.option('-i', '--invert', help='Invert/reverse order of versions.', is_flag=True)(func)\n func = click.option('-p', '--priority', type=click.Choice(('branches', 'tags')),\n help=\"Group these kinds of versions at the top (for themes that don't separate them).\")(func)\n func = click.option('-r', '--root-ref',\n help='The branch/tag at the root of DESTINATION. Will also be in subdir. Default master.')(func)\n func = click.option('-s', '--sort', multiple=True, type=click.Choice(('semver', 'alpha', 'time')),\n help='Sort versions. 
Specify multiple times to sort equal values of one kind.')(func)\n func = click.option('-t', '--greatest-tag', is_flag=True,\n help='Override root-ref to be the tag with the highest version number.')(func)\n func = click.option('-T', '--recent-tag', is_flag=True,\n help='Override root-ref to be the most recent committed tag.')(func)\n func = click.option('-w', '--whitelist-branches', multiple=True,\n help='Whitelist branches that match the pattern. Can be specified more than once.')(func)\n func = click.option('-W', '--whitelist-tags', multiple=True,\n help='Whitelist tags that match the pattern. Can be specified more than once.')(func)\n\n return func", "response": "Add build click options to function."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\noverriding root_ref or banner_main_ref with tags in config if user requested.", "response": "def override_root_main_ref(config, remotes, banner):\n \"\"\"Override root_ref or banner_main_ref with tags in config if user requested.\n\n :param sphinxcontrib.versioning.lib.Config config: Runtime configuration.\n :param iter remotes: List of dicts from Versions.remotes.\n :param bool banner: Evaluate banner main ref instead of root ref.\n\n :return: If root/main ref exists.\n :rtype: bool\n \"\"\"\n log = logging.getLogger(__name__)\n greatest_tag = config.banner_greatest_tag if banner else config.greatest_tag\n recent_tag = config.banner_recent_tag if banner else config.recent_tag\n\n if greatest_tag or recent_tag:\n candidates = [r for r in remotes if r['kind'] == 'tags']\n if candidates:\n multi_sort(candidates, ['semver' if greatest_tag else 'time'])\n config.update({'banner_main_ref' if banner else 'root_ref': candidates[0]['name']}, overwrite=True)\n else:\n flag = '--banner-main-ref' if banner else '--root-ref'\n log.warning('No git tags with docs found in remote. 
Falling back to %s value.', flag)\n\n ref = config.banner_main_ref if banner else config.root_ref\n return ref in [r['name'] for r in remotes]"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nfetching branches/tags and building all versions locally.", "response": "def build(config, rel_source, destination, **options):\n \"\"\"Fetch branches/tags and build all locally.\n\n Doesn't push anything to remote. Just fetch all remote branches and tags, export them to a temporary directory, run\n sphinx-build on each one, and then store all built documentation in DESTINATION.\n\n REL_SOURCE is the path to the docs directory relative to the git root. If the source directory has moved around\n between git tags you can specify additional directories.\n\n DESTINATION is the path to the local directory that will hold all generated docs for all versions.\n\n To pass options to sphinx-build (run for every branch/tag) use a double hyphen\n (e.g. build docs docs/_build/html -- -D setting=value).\n \\f\n\n :param sphinxcontrib.versioning.lib.Config config: Runtime configuration.\n :param tuple rel_source: Possible relative paths (to git root) of Sphinx directory containing conf.py (e.g. docs).\n :param str destination: Destination directory to copy/overwrite built docs to. 
Does not delete old files.\n :param dict options: Additional Click options.\n \"\"\"\n if 'pre' in config:\n config.pop('pre')(rel_source)\n config.update({k: v for k, v in options.items() if v})\n if config.local_conf:\n config.update(read_local_conf(config.local_conf), ignore_set=True)\n if NO_EXECUTE:\n raise RuntimeError(config, rel_source, destination)\n log = logging.getLogger(__name__)\n\n # Gather git data.\n log.info('Gathering info about the remote git repository...')\n conf_rel_paths = [os.path.join(s, 'conf.py') for s in rel_source]\n remotes = gather_git_info(config.git_root, conf_rel_paths, config.whitelist_branches, config.whitelist_tags)\n if not remotes:\n log.error('No docs found in any remote branch/tag. Nothing to do.')\n raise HandledError\n versions = Versions(\n remotes,\n sort=config.sort,\n priority=config.priority,\n invert=config.invert,\n )\n\n # Get root ref.\n if not override_root_main_ref(config, versions.remotes, False):\n log.error('Root ref %s not found in: %s', config.root_ref, ' '.join(r[1] for r in remotes))\n raise HandledError\n log.info('Root ref is: %s', config.root_ref)\n\n # Get banner main ref.\n if not config.show_banner:\n config.update(dict(banner_greatest_tag=False, banner_main_ref=None, banner_recent_tag=False), overwrite=True)\n elif not override_root_main_ref(config, versions.remotes, True):\n log.warning('Banner main ref %s not found in: %s', config.banner_main_ref, ' '.join(r[1] for r in remotes))\n log.warning('Disabling banner.')\n config.update(dict(banner_greatest_tag=False, banner_main_ref=None, banner_recent_tag=False, show_banner=False),\n overwrite=True)\n else:\n log.info('Banner main ref is: %s', config.banner_main_ref)\n\n # Pre-build.\n log.info(\"Pre-running Sphinx to collect versions' master_doc and other info.\")\n exported_root = pre_build(config.git_root, versions)\n if config.banner_main_ref and config.banner_main_ref not in [r['name'] for r in versions.remotes]:\n log.warning('Banner main ref 
%s failed during pre-run. Disabling banner.', config.banner_main_ref)\n config.update(dict(banner_greatest_tag=False, banner_main_ref=None, banner_recent_tag=False, show_banner=False),\n overwrite=True)\n\n # Build.\n build_all(exported_root, destination, versions)\n\n # Cleanup.\n log.debug('Removing: %s', exported_root)\n shutil.rmtree(exported_root)\n\n # Store versions in state for push().\n config['versions'] = versions"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nbuild locally and then push to remote branch.", "response": "def push(ctx, config, rel_source, dest_branch, rel_dest, **options):\n \"\"\"Build locally and then push to remote branch.\n\n First the build sub command is invoked which takes care of building all versions of your documentation in a\n temporary directory. If that succeeds then all built documents will be pushed to a remote branch.\n\n REL_SOURCE is the path to the docs directory relative to the git root. If the source directory has moved around\n between git tags you can specify additional directories.\n\n DEST_BRANCH is the branch name where generated docs will be committed to. The branch will then be pushed to remote.\n If there is a race condition with another job pushing to remote the docs will be re-generated and pushed again.\n\n REL_DEST is the path to the directory that will hold all generated docs for all versions relative to the git root of\n DEST_BRANCH.\n\n To pass options to sphinx-build (run for every branch/tag) use a double hyphen\n (e.g. push docs gh-pages . -- -D setting=value).\n \\f\n\n :param click.core.Context ctx: Click context.\n :param sphinxcontrib.versioning.lib.Config config: Runtime configuration.\n :param tuple rel_source: Possible relative paths (to git root) of Sphinx directory containing conf.py (e.g. 
docs).\n :param str dest_branch: Branch to clone and push to.\n :param str rel_dest: Relative path (to git root) to write generated docs to.\n :param dict options: Additional Click options.\n \"\"\"\n if 'pre' in config:\n config.pop('pre')(rel_source)\n config.update({k: v for k, v in options.items() if v})\n if config.local_conf:\n config.update(read_local_conf(config.local_conf), ignore_set=True)\n if NO_EXECUTE:\n raise RuntimeError(config, rel_source, dest_branch, rel_dest)\n log = logging.getLogger(__name__)\n\n # Clone, build, push.\n for _ in range(PUSH_RETRIES):\n with TempDir() as temp_dir:\n log.info('Cloning %s into temporary directory...', dest_branch)\n try:\n clone(config.git_root, temp_dir, config.push_remote, dest_branch, rel_dest, config.grm_exclude)\n except GitError as exc:\n log.error(exc.message)\n log.error(exc.output)\n raise HandledError\n\n log.info('Building docs...')\n ctx.invoke(build, rel_source=rel_source, destination=os.path.join(temp_dir, rel_dest))\n versions = config.pop('versions')\n\n log.info('Attempting to push to branch %s on remote repository.', dest_branch)\n try:\n if commit_and_push(temp_dir, config.push_remote, versions):\n return\n except GitError as exc:\n log.error(exc.message)\n log.error(exc.output)\n raise HandledError\n log.warning('Failed to push to remote repository. Retrying in %d seconds...', PUSH_SLEEP)\n time.sleep(PUSH_SLEEP)\n\n # Failed if this is reached.\n log.error('Ran out of retries, giving up.')\n raise HandledError"}
{"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef custom_sort(param):\n option = param.opts[0].lstrip('-')\n if param.param_type_name != 'option':\n return False,\n return True, option == 'version', option.lower(), option.swapcase()", "response": "Custom click.core.params sorter."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get_params(self, ctx):\n self.params.sort(key=self.custom_sort)\n return super(ClickGroup, self).get_params(ctx)", "response": "Sort order of options before displaying."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ninject overflow arguments into context state.", "response": "def invoke(self, ctx):\n \"\"\"Inject overflow arguments into context state.\n\n :param click.core.Context ctx: Click context.\n\n :return: super() return value.\n \"\"\"\n if self.overflow:\n ctx.ensure_object(Config).update(dict(overflow=self.overflow))\n return super(ClickGroup, self).invoke(ctx)"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nretrieves this class instance from the current Click context.", "response": "def from_context(cls):\n \"\"\"Retrieve this class' instance from the current Click context.\n\n :return: Instance of this class.\n :rtype: Config\n \"\"\"\n try:\n ctx = click.get_current_context()\n except RuntimeError:\n return cls()\n return ctx.find_object(cls)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nupdating the object with the values from a dictionary.", "response": "def update(self, params, ignore_set=False, overwrite=False):\n \"\"\"Set instance values from dictionary.\n\n :param dict params: Click context params.\n :param bool ignore_set: Skip already-set values instead of raising AttributeError.\n :param bool overwrite: Allow overwriting already-set values.\n \"\"\"\n log = logging.getLogger(__name__)\n valid = {i[0] for i in self}\n for key, value in params.items():\n if not hasattr(self, key):\n raise AttributeError(\"'{}' object has no attribute '{}'\".format(self.__class__.__name__, key))\n if key not in valid:\n message = \"'{}' object does not support item assignment on '{}'\"\n raise 
AttributeError(message.format(self.__class__.__name__, key))\n if key in self._already_set:\n if ignore_set:\n log.debug('%s already set in config, skipping.', key)\n continue\n if not overwrite:\n message = \"'{}' object does not support item re-assignment on '{}'\"\n raise AttributeError(message.format(self.__class__.__name__, key))\n setattr(self, key, value)\n self._already_set.add(key)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nassign a hosting device to a cfg agent.", "response": "def assign_hosting_device_to_cfg_agent(self, context, cfg_agent_id,\n hosting_device_id):\n \"\"\"Make config agent handle an (unassigned) hosting device.\"\"\"\n hd_db = self._get_hosting_device(context, hosting_device_id)\n if hd_db.cfg_agent_id:\n if hd_db.cfg_agent_id == cfg_agent_id:\n return\n LOG.debug('Hosting device %(hd_id)s has already been assigned to '\n 'Cisco cfg agent %(agent_id)s',\n {'hd_id': hosting_device_id, 'agent_id': cfg_agent_id})\n raise ciscocfgagentscheduler.HostingDeviceAssignedToCfgAgent(\n hosting_device_id=hosting_device_id, agent_id=cfg_agent_id)\n cfg_agent_db = get_agent_db_obj(self._get_agent(context, cfg_agent_id))\n if (cfg_agent_db.agent_type != c_constants.AGENT_TYPE_CFG or\n cfg_agent_db.admin_state_up is not True):\n raise ciscocfgagentscheduler.InvalidCfgAgent(agent_id=cfg_agent_id)\n self._bind_hosting_device_to_cfg_agent(context, hd_db, cfg_agent_db)\n cfg_notifier = self.agent_notifiers.get(c_constants.AGENT_TYPE_CFG)\n if cfg_notifier:\n cfg_notifier.hosting_devices_assigned_to_cfg_agent(\n context, [hosting_device_id], cfg_agent_db.host)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef unassign_hosting_device_from_cfg_agent(self, context, cfg_agent_id,\n hosting_device_id):\n \"\"\"Make config agent handle an (unassigned) hosting device.\"\"\"\n hd_db = self._get_hosting_device(context, hosting_device_id)\n if 
hd_db.cfg_agent_id is None and cfg_agent_id is None:\n return\n elif hd_db.cfg_agent_id != cfg_agent_id:\n LOG.debug('Hosting device %(hd_id)s is not assigned to Cisco '\n 'cfg agent %(agent_id)s',\n {'hd_id': hosting_device_id,\n 'agent_id': cfg_agent_id})\n raise ciscocfgagentscheduler.HostingDeviceNotAssignedToCfgAgent(\n hosting_device_id=hosting_device_id, agent_id=cfg_agent_id)\n cfg_agent_db = get_agent_db_obj(self._get_agent(context, cfg_agent_id))\n cfg_notifier = self.agent_notifiers.get(c_constants.AGENT_TYPE_CFG)\n if cfg_notifier:\n cfg_notifier.hosting_devices_unassigned_from_cfg_agent(\n context, [hosting_device_id], cfg_agent_db.host)\n self._bind_hosting_device_to_cfg_agent(context, hd_db, None)", "response": "Unassigns a hosting device from a cfg agent."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nsetting port attributes for trunk subports. For baremetal deployments only, set the neutron port attributes during the bind_port event.", "response": "def update_subports(self, port):\n \"\"\"Set port attributes for trunk subports.\n\n For baremetal deployments only, set the neutron port attributes\n during the bind_port event.\n \"\"\"\n trunk_details = port.get('trunk_details')\n subports = trunk_details['sub_ports']\n\n host_id = port.get(bc.dns.DNSNAME)\n context = bc.get_context()\n el_context = context.elevated()\n\n for subport in subports:\n bc.get_plugin().update_port(el_context, subport['port_id'],\n {'port':\n {bc.portbindings.HOST_ID: host_id,\n 'device_owner': bc.trunk_consts.TRUNK_SUBPORT_OWNER}})\n\n # Set trunk to ACTIVE status.\n trunk_obj = bc.trunk_objects.Trunk.get_object(\n el_context, id=trunk_details['trunk_id'])\n trunk_obj.update(status=bc.trunk_consts.ACTIVE_STATUS)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreads the static uplink from file if given.", "response": "def read_static_uplink(self):\n \"\"\"Read the static uplink from file, if 
given.\"\"\"\n if self.node_list is None or self.node_uplink_list is None:\n return\n for node, port in zip(self.node_list.split(','),\n self.node_uplink_list.split(',')):\n if node.strip() == self.host_name:\n self.static_uplink = True\n self.static_uplink_port = port.strip()\n return"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef vdp_vlan_change_cb(self, port_uuid, lvid, vdp_vlan, fail_reason):\n LOG.info(\"Vlan change CB lvid %(lvid)s VDP %(vdp)s\",\n {'lvid': lvid, 'vdp': vdp_vlan})\n self.update_vm_result(port_uuid, constants.RESULT_SUCCESS,\n lvid=lvid, vdp_vlan=vdp_vlan,\n fail_reason=fail_reason)", "response": "Callback function for update the VDP VLAN in DB."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nprocessing the bulk VM event.", "response": "def process_bulk_vm_event(self, msg, phy_uplink):\n \"\"\"Process the VM bulk event usually after a restart. \"\"\"\n LOG.info(\"In processing Bulk VM Event status %s\", msg)\n time.sleep(3)\n if (not self.uplink_det_compl or\n phy_uplink not in self.ovs_vdp_obj_dict):\n # This condition shouldn't be hit as only when uplink is obtained,\n # save_uplink is called and that in turns calls this process_bulk.\n LOG.error(\"Uplink Port Event not received,\"\n \"yet in bulk process\")\n return\n ovs_vdp_obj = self.ovs_vdp_obj_dict[phy_uplink]\n for vm_dict in msg.msg_dict.get('vm_bulk_list'):\n if vm_dict['status'] == 'down':\n ovs_vdp_obj.pop_local_cache(vm_dict['port_uuid'],\n vm_dict['vm_mac'],\n vm_dict['net_uuid'],\n vm_dict['local_vlan'],\n vm_dict['vdp_vlan'],\n vm_dict['segmentation_id'])\n vm_msg = VdpQueMsg(constants.VM_MSG_TYPE,\n port_uuid=vm_dict['port_uuid'],\n vm_mac=vm_dict['vm_mac'],\n net_uuid=vm_dict['net_uuid'],\n segmentation_id=vm_dict['segmentation_id'],\n status=vm_dict['status'],\n oui=vm_dict['oui'],\n phy_uplink=phy_uplink)\n self.process_vm_event(vm_msg, phy_uplink)"} {"SOURCE": "codesearchnet", "instruction": 
"Given the following Python 3 function, write the documentation\ndef is_openstack_running(self):\n try:\n if (ovs_vdp.is_bridge_present(self.br_ex, self.root_helper) and\n ovs_vdp.is_bridge_present(self.br_integ,\n self.root_helper)):\n return True\n else:\n return False\n except Exception as e:\n LOG.error(\"Exception in is_openstack_running %s\", str(e))\n return False", "response": "Checks if the openstack is running."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _fill_topology_cfg(self, topo_dict):\n cfg_dict = {}\n if topo_dict.bond_member_ports is not None:\n cfg_dict.update({'bond_member_ports':\n topo_dict.bond_member_ports})\n if topo_dict.bond_interface is not None:\n cfg_dict.update({'bond_interface':\n topo_dict.bond_interface})\n return cfg_dict", "response": "Fills the extra configurations in the topology."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef uplink_bond_intf_process(self):\n bond_intf = sys_utils.get_bond_intf(self.phy_uplink)\n if bond_intf is None:\n return False\n self.save_uplink(\n fail_reason=constants.port_transition_bond_down_reason)\n self.process_uplink_ongoing = True\n upl_msg = VdpQueMsg(constants.UPLINK_MSG_TYPE, status='down',\n phy_uplink=self.phy_uplink,\n br_int=self.br_integ, br_ex=self.br_ex,\n root_helper=self.root_helper)\n self.que.enqueue(constants.Q_UPL_PRIO, upl_msg)\n self.phy_uplink = None\n self.veth_intf = None\n self.uplink_det_compl = False\n\n # No veth interface\n self.save_uplink(\n uplink=bond_intf,\n fail_reason=constants.port_transition_bond_up_reason)\n self.phy_uplink = bond_intf\n self.process_uplink_ongoing = True\n upl_msg = VdpQueMsg(constants.UPLINK_MSG_TYPE, status='up',\n phy_uplink=self.phy_uplink,\n br_int=self.br_integ, br_ex=self.br_ex,\n root_helper=self.root_helper)\n self.que.enqueue(constants.Q_UPL_PRIO, upl_msg)\n return True", "response": "This function 
processes the case when uplink interface becomes part of a bond."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nchecks if bulk VM event received from the agent.", "response": "def check_periodic_bulk_vm_notif_rcvd(self):\n \"\"\"Bulk VM check handler called from periodic uplink detection.\n\n This gets called by the 'normal' stage of uplink detection.\n The bulk VM event sends all the VM's running in this agent.\n Sometimes during upgrades, it was found that due to some race\n condition, the server does not send the Bulk VM event.\n Whenever, a save_uplink is done by the agent, the server sends\n the Bulk VM event.\n If Bulk VM event is not received after few attempts, save_uplink is\n done to request the Bulk VM list.\n It's not protected with a mutex, since worst case,\n Bulk VM event will be sent twice, which is not that bad. When\n uplink is detected for the first time, it will hit the below\n else case and there a save_uplink is anyways done.\n \"\"\"\n if not self.bulk_vm_rcvd_flag:\n if self.bulk_vm_check_cnt >= 1:\n self.bulk_vm_check_cnt = 0\n self.save_uplink(uplink=self.phy_uplink,\n veth_intf=self.veth_intf)\n LOG.info(\"Doing save_uplink again to request \"\n \"Bulk VM's\")\n else:\n LOG.info(\"Bulk VM not received, incrementing count\")\n self.bulk_vm_check_cnt += 1"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef static_uplink_detect(self, veth):\n LOG.info(\"In static_uplink_detect %(veth)s\", {'veth': veth})\n if self.static_uplink_first:\n self.static_uplink_first = False\n if self.phy_uplink is not None and (\n self.phy_uplink != self.static_uplink_port):\n return 'down'\n if veth is None:\n return self.static_uplink_port\n else:\n return 'normal'", "response": "Return the static uplink based on argument passed."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef 
get_hosting_device_config(self, client, hosting_device_id):\n return client.get((self.resource_path + HOSTING_DEVICE_CONFIG) %\n hosting_device_id)", "response": "Get config of hosting_device."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get_client_class(self, client_class_name):\n request_url = self._build_url(['ClientClass', client_class_name])\n return self._do_request('GET', request_url)", "response": "Returns a specific client class details from CPNR server."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_vpn(self, vpn_name):\n request_url = self._build_url(['VPN', vpn_name])\n return self._do_request('GET', request_url)", "response": "Returns a specific VPN name details from CPNR server."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn a list of all the scopes from CPNR server.", "response": "def get_scopes(self, vpnid='.*'):\n \"\"\"Returns a list of all the scopes from CPNR server.\"\"\"\n request_url = self._build_url(['Scope'], vpn=vpnid)\n return self._do_request('GET', request_url)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn a specific scope name details from CPNR server.", "response": "def get_scope(self, scope_name):\n \"\"\"Returns a specific scope name details from CPNR server.\"\"\"\n request_url = self._build_url(['Scope', scope_name])\n return self._do_request('GET', request_url)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning a specific client entry name details from CPNR server.", "response": "def get_client_entry(self, client_entry_name):\n \"\"\"Returns a specific client entry name details from CPNR server.\"\"\"\n request_url = self._build_url(['ClientEntry', client_entry_name])\n return self._do_request('GET', request_url)"} {"SOURCE": "codesearchnet", "instruction": "How 
would you explain what the following Python 3 function does\ndef release_address(self, address, vpnid):\n query = address + \"?action=releaseAddress&vpnId=\" + vpnid\n request_url = self._build_url(['Lease', query])\n return self._do_request('DELETE', request_url)", "response": "Release a specific lease"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef qsize(self, qname):\n if qname in self._queues:\n return self._queues[qname].qsize()\n else:\n raise ValueError(_(\"queue %s is not defined\"), qname)", "response": "Return the approximate size of the queue."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nlisting a nexusport binding.", "response": "def get_nexusport_binding(port_id, vlan_id, switch_ip, instance_id):\n \"\"\"Lists a nexusport binding.\"\"\"\n LOG.debug(\"get_nexusport_binding() called\")\n return _lookup_all_nexus_bindings(port_id=port_id,\n vlan_id=vlan_id,\n switch_ip=switch_ip,\n instance_id=instance_id)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nlist all bindings for this switch & port.", "response": "def get_nexus_switchport_binding(port_id, switch_ip):\n \"\"\"Lists all bindings for this switch & port.\"\"\"\n LOG.debug(\"get_nexus_switchport_binding() called\")\n return _lookup_all_nexus_bindings(port_id=port_id,\n switch_ip=switch_ip)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nlist a vlan and switch binding.", "response": "def get_nexusvlan_binding(vlan_id, switch_ip):\n \"\"\"Lists a vlan and switch binding.\"\"\"\n LOG.debug(\"get_nexusvlan_binding() called\")\n return _lookup_all_nexus_bindings(vlan_id=vlan_id, switch_ip=switch_ip)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef update_reserved_binding(vlan_id, switch_ip, instance_id,\n port_id, is_switch_binding=True,\n is_native=False, ch_grp=0):\n \"\"\"Updates reserved 
binding.\n\n This overloads port bindings to support reserved Switch binding\n used to maintain the state of a switch so it can be viewed by\n all other neutron processes. There's also the case of\n a reserved port binding to keep switch information on a given\n interface.\n\n The values of these arguments is as follows:\n :param vlan_id: 0\n :param switch_ip: ip address of the switch\n :param instance_id: fixed string RESERVED_NEXUS_SWITCH_DEVICE_ID_R1\n :param port_id: switch-state of ACTIVE, RESTORE_S1, RESTORE_S2, INACTIVE\n : port-expected port_id\n :param ch_grp: 0 if no port-channel else non-zero integer\n \"\"\"\n if not port_id:\n LOG.warning(\"update_reserved_binding called with no state\")\n return\n LOG.debug(\"update_reserved_binding called\")\n session = bc.get_writer_session()\n if is_switch_binding:\n # For reserved switch binding\n binding = _lookup_one_nexus_binding(session=session,\n vlan_id=vlan_id,\n switch_ip=switch_ip,\n instance_id=instance_id)\n binding.port_id = port_id\n else:\n # For reserved port binding\n binding = _lookup_one_nexus_binding(session=session,\n vlan_id=vlan_id,\n switch_ip=switch_ip,\n instance_id=instance_id,\n port_id=port_id)\n binding.is_native = is_native\n binding.channel_group = ch_grp\n session.merge(binding)\n session.flush()\n return binding", "response": "Updates the reserved binding for a given switch."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nremove a reserved binding from the reserved switch.", "response": "def remove_reserved_binding(vlan_id, switch_ip, instance_id,\n port_id):\n \"\"\"Removes reserved binding.\n\n This overloads port bindings to support reserved Switch binding\n used to maintain the state of a switch so it can be viewed by\n all other neutron processes. 
There's also the case of\n a reserved port binding to keep switch information on a given\n interface.\n The values of these arguments is as follows:\n :param vlan_id: 0\n :param switch_ip: ip address of the switch\n :param instance_id: fixed string RESERVED_NEXUS_SWITCH_DEVICE_ID_R1\n :param port_id: switch-state of ACTIVE, RESTORE_S1, RESTORE_S2, INACTIVE\n : port-expected port_id\n \"\"\"\n if not port_id:\n LOG.warning(\"remove_reserved_binding called with no state\")\n return\n LOG.debug(\"remove_reserved_binding called\")\n session = bc.get_writer_session()\n binding = _lookup_one_nexus_binding(session=session,\n vlan_id=vlan_id,\n switch_ip=switch_ip,\n instance_id=instance_id,\n port_id=port_id)\n for bind in binding:\n session.delete(bind)\n session.flush()\n return binding"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nadds a reserved switch binding.", "response": "def add_reserved_switch_binding(switch_ip, state):\n \"\"\"Add a reserved switch binding.\"\"\"\n\n # overload port_id to contain switch state\n add_nexusport_binding(\n state,\n const.NO_VLAN_OR_VNI_ID,\n const.NO_VLAN_OR_VNI_ID,\n switch_ip,\n const.RESERVED_NEXUS_SWITCH_DEVICE_ID_R1)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nupdate a reserved switch binding.", "response": "def update_reserved_switch_binding(switch_ip, state):\n \"\"\"Update a reserved switch binding.\"\"\"\n\n # overload port_id to contain switch state\n update_reserved_binding(\n const.NO_VLAN_OR_VNI_ID,\n switch_ip,\n const.RESERVED_NEXUS_SWITCH_DEVICE_ID_R1,\n state)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef add_nexusport_binding(port_id, vlan_id, vni, switch_ip, instance_id,\n is_native=False, ch_grp=0):\n \"\"\"Adds a nexusport binding.\"\"\"\n LOG.debug(\"add_nexusport_binding() called\")\n session = bc.get_writer_session()\n binding = 
nexus_models_v2.NexusPortBinding(port_id=port_id,\n vlan_id=vlan_id,\n vni=vni,\n switch_ip=switch_ip,\n instance_id=instance_id,\n is_native=is_native,\n channel_group=ch_grp)\n session.add(binding)\n session.flush()\n return binding", "response": "Adds a nexusport binding."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nremove a nexusport binding.", "response": "def remove_nexusport_binding(port_id, vlan_id, vni, switch_ip, instance_id):\n \"\"\"Removes a nexusport binding.\"\"\"\n LOG.debug(\"remove_nexusport_binding() called\")\n session = bc.get_writer_session()\n binding = _lookup_all_nexus_bindings(session=session,\n vlan_id=vlan_id,\n vni=vni,\n switch_ip=switch_ip,\n port_id=port_id,\n instance_id=instance_id)\n for bind in binding:\n session.delete(bind)\n session.flush()\n return binding"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nremoves all nexusport bindings.", "response": "def remove_all_nexusport_bindings():\n \"\"\"Removes all nexusport bindings.\"\"\"\n\n LOG.debug(\"remove_all_nexusport_bindings() called\")\n session = bc.get_writer_session()\n session.query(nexus_models_v2.NexusPortBinding).delete()\n session.flush()"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_port_vlan_switch_binding(port_id, vlan_id, switch_ip):\n LOG.debug(\"get_port_vlan_switch_binding() called\")\n return _lookup_all_nexus_bindings(port_id=port_id,\n switch_ip=switch_ip,\n vlan_id=vlan_id)", "response": "Lists nexus vm bindings."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get_port_switch_bindings(port_id, switch_ip):\n LOG.debug(\"get_port_switch_bindings() called, \"\n \"port:'%(port_id)s', switch:'%(switch_ip)s'\",\n {'port_id': port_id, 'switch_ip': switch_ip})\n try:\n return _lookup_all_nexus_bindings(port_id=port_id,\n 
switch_ip=switch_ip)\n except c_exc.NexusPortBindingNotFound:\n pass", "response": "List all vm bindings on a Nexus switch port."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _lookup_nexus_bindings(query_type, session=None, **bfilter):\n if session is None:\n session = bc.get_reader_session()\n query_method = getattr(session.query(\n nexus_models_v2.NexusPortBinding).filter_by(**bfilter), query_type)\n try:\n bindings = query_method()\n if bindings:\n return bindings\n except sa_exc.NoResultFound:\n pass\n raise c_exc.NexusPortBindingNotFound(**bfilter)", "response": "Look up query_type Nexus bindings matching the filter."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef add_nexusnve_binding(vni, switch_ip, device_id, mcast_group):\n LOG.debug(\"add_nexusnve_binding() called\")\n session = bc.get_writer_session()\n binding = nexus_models_v2.NexusNVEBinding(vni=vni,\n switch_ip=switch_ip,\n device_id=device_id,\n mcast_group=mcast_group)\n session.add(binding)\n session.flush()\n return binding", "response": "Adds a nexus nve binding."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef remove_nexusnve_binding(vni, switch_ip, device_id):\n LOG.debug(\"remove_nexusnve_binding() called\")\n session = bc.get_writer_session()\n binding = (session.query(nexus_models_v2.NexusNVEBinding).\n filter_by(vni=vni, switch_ip=switch_ip,\n device_id=device_id).one())\n if binding:\n session.delete(binding)\n session.flush()\n return binding", "response": "Remove the nexus nve binding."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef remove_all_nexusnve_bindings():\n\n LOG.debug(\"remove_all_nexusport_bindings() called\")\n session = bc.get_writer_session()\n session.query(nexus_models_v2.NexusNVEBinding).delete()\n session.flush()", "response": "Removes all nexusnve bindings."} 
{"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn the nexus nve binding ( s ) per switch.", "response": "def get_nve_vni_switch_bindings(vni, switch_ip):\n \"\"\"Return the nexus nve binding(s) per switch.\"\"\"\n LOG.debug(\"get_nve_vni_switch_bindings() called\")\n session = bc.get_reader_session()\n try:\n return (session.query(nexus_models_v2.NexusNVEBinding).\n filter_by(vni=vni, switch_ip=switch_ip).all())\n except sa_exc.NoResultFound:\n return None"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nsearching for Nexus Host mappings matching the filter.", "response": "def _lookup_host_mappings(query_type, session=None, **bfilter):\n \"\"\"Look up 'query_type' Nexus mappings matching the filter.\n\n :param query_type: 'all', 'one' or 'first'\n :param session: db session\n :param bfilter: filter for mappings query\n :returns: mappings if query gave a result, else\n raise NexusHostMappingNotFound.\n \"\"\"\n if session is None:\n session = bc.get_reader_session()\n query_method = getattr(session.query(\n nexus_models_v2.NexusHostMapping).filter_by(**bfilter), query_type)\n try:\n mappings = query_method()\n if mappings:\n return mappings\n except sa_exc.NoResultFound:\n pass\n raise c_exc.NexusHostMappingNotFound(**bfilter)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nadds Host to interface mapping entry into mapping data base.", "response": "def add_host_mapping(host_id, nexus_ip, interface, ch_grp, is_static):\n \"\"\"Add Host to interface mapping entry into mapping data base.\n\n :param host_id: is the name of the host to add\n :param interface: is the interface for this host\n :param nexus_ip: is the ip addr of the nexus switch for this interface\n :param ch_grp: is the port channel this interface belos\n :param is_static: whether this is from conf file or learned from baremetal.\n \"\"\"\n\n LOG.debug(\"add_nexusport_binding() called\")\n 
session = bc.get_writer_session()\n mapping = nexus_models_v2.NexusHostMapping(host_id=host_id,\n if_id=interface,\n switch_ip=nexus_ip,\n ch_grp=ch_grp,\n is_static=is_static)\n try:\n session.add(mapping)\n session.flush()\n except db_exc.DBDuplicateEntry:\n with excutils.save_and_reraise_exception() as ctxt:\n if is_static:\n ctxt.reraise = False\n LOG.debug(\"Duplicate static entry encountered \"\n \"host=%(host)s, if=%(if)s, ip=%(ip)s\",\n {'host': host_id, 'if': interface,\n 'ip': nexus_ip})\n\n return mapping"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef update_host_mapping(host_id, interface, nexus_ip, new_ch_grp):\n\n LOG.debug(\"update_host_mapping called\")\n session = bc.get_writer_session()\n mapping = _lookup_one_host_mapping(\n session=session,\n host_id=host_id,\n if_id=interface,\n switch_ip=nexus_ip)\n mapping.ch_grp = new_ch_grp\n session.merge(mapping)\n session.flush()\n return mapping", "response": "Update the host mapping data base."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nremove host to interface mapping entry from mapping data base.", "response": "def remove_host_mapping(interface, nexus_ip):\n \"\"\"Remove host to interface mapping entry from mapping data base.\"\"\"\n\n LOG.debug(\"remove_host_mapping() called\")\n session = bc.get_writer_session()\n try:\n mapping = _lookup_one_host_mapping(\n session=session,\n if_id=interface,\n switch_ip=nexus_ip)\n session.delete(mapping)\n session.flush()\n except c_exc.NexusHostMappingNotFound:\n pass"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef remove_all_static_host_mappings():\n\n LOG.debug(\"remove_host_mapping() called\")\n session = bc.get_writer_session()\n try:\n mapping = _lookup_all_host_mappings(\n session=session,\n is_static=True)\n for host in mapping:\n session.delete(host)\n session.flush()\n except 
c_exc.NexusHostMappingNotFound:\n pass", "response": "Remove all entries defined in config file from mapping data base."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _lookup_vpc_allocs(query_type, session=None, order=None, **bfilter):\n\n if session is None:\n session = bc.get_reader_session()\n\n if order:\n query_method = getattr(session.query(\n nexus_models_v2.NexusVPCAlloc).filter_by(**bfilter).order_by(\n order),\n query_type)\n else:\n query_method = getattr(session.query(\n nexus_models_v2.NexusVPCAlloc).filter_by(**bfilter), query_type)\n\n try:\n vpcs = query_method()\n if vpcs:\n return vpcs\n except sa_exc.NoResultFound:\n pass\n\n raise c_exc.NexusVPCAllocNotFound(**bfilter)", "response": "Searches for Nexus VPC Allocs matching the filter."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nlook up count min and max for given switch.", "response": "def _lookup_vpc_count_min_max(session=None, **bfilter):\n \"\"\"Look up count/min/max Nexus VPC Allocs for given switch.\n\n :param session: db session\n :param bfilter: filter for mappings query\n :returns: number of VPCs and min value if query gave a result,\n else raise NexusVPCAllocNotFound.\n \"\"\"\n\n if session is None:\n session = bc.get_reader_session()\n\n try:\n res = session.query(\n func.count(nexus_models_v2.NexusVPCAlloc.vpc_id),\n func.min(nexus_models_v2.NexusVPCAlloc.vpc_id),\n func.max(nexus_models_v2.NexusVPCAlloc.vpc_id),\n ).filter(nexus_models_v2.NexusVPCAlloc.switch_ip ==\n bfilter['switch_ip']).one()\n\n count = res[0]\n sw_min = res[1]\n sw_max = res[2]\n\n return count, sw_min, sw_max\n\n except sa_exc.NoResultFound:\n pass\n\n raise c_exc.NexusVPCAllocNotFound(**bfilter)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _get_free_vpcids_on_switches(switch_ip_list):\n '''Get intersect list of free vpcids in list 
of switches.'''\n\n session = bc.get_reader_session()\n\n prev_view = aliased(nexus_models_v2.NexusVPCAlloc)\n query = session.query(prev_view.vpc_id)\n prev_swip = switch_ip_list[0]\n\n for ip in switch_ip_list[1:]:\n cur_view = aliased(nexus_models_v2.NexusVPCAlloc)\n cur_swip = ip\n query = query.join(cur_view, sa.and_(\n prev_view.switch_ip == prev_swip, prev_view.active == False, # noqa\n cur_view.switch_ip == cur_swip, cur_view.active == False, # noqa\n prev_view.vpc_id == cur_view.vpc_id))\n prev_view = cur_view\n prev_swip = cur_swip\n\n unique_vpcids = query.all()\n shuffle(unique_vpcids)\n return unique_vpcids", "response": "Get intersect list of free vpcids in list of switches."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef init_vpc_entries(nexus_ip, vpc_list):\n\n LOG.debug(\"init_vpc_entries() called\")\n\n if not vpc_list:\n return\n session = bc.get_writer_session()\n\n for vpc in vpc_list:\n vpc_alloc = nexus_models_v2.NexusVPCAlloc(\n switch_ip=nexus_ip,\n vpc_id=vpc,\n learned=False,\n active=False)\n session.add(vpc_alloc)\n session.flush()", "response": "Initialize switch entries in vpc alloc data base."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nupdates the active state in vpc_allocate data base.", "response": "def update_vpc_entry(nexus_ips, vpc_id, learned, active):\n \"\"\"Change active state in vpc_allocate data base.\"\"\"\n\n LOG.debug(\"update_vpc_entry called\")\n\n session = bc.get_writer_session()\n\n with session.begin():\n for n_ip in nexus_ips:\n flipit = not active\n x = session.execute(\n sa.update(nexus_models_v2.NexusVPCAlloc).values({\n 'learned': learned,\n 'active': active}).where(sa.and_(\n nexus_models_v2.NexusVPCAlloc.switch_ip == n_ip,\n nexus_models_v2.NexusVPCAlloc.vpc_id == vpc_id,\n nexus_models_v2.NexusVPCAlloc.active == flipit\n )))\n if x.rowcount != 1:\n raise c_exc.NexusVPCAllocNotFound(\n 
switch_ip=n_ip, vpc_id=vpc_id, active=active)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef alloc_vpcid(nexus_ips):\n\n LOG.debug(\"alloc_vpc() called\")\n\n vpc_id = 0\n intersect = _get_free_vpcids_on_switches(nexus_ips)\n for intersect_tuple in intersect:\n try:\n update_vpc_entry(nexus_ips, intersect_tuple.vpc_id,\n False, True)\n vpc_id = intersect_tuple.vpc_id\n break\n except Exception:\n LOG.exception(\n \"This exception is expected if another controller \"\n \"beat us to vpcid %(vpcid)s for nexus %(ip)s\",\n {'vpcid': intersect_tuple.vpc_id,\n 'ip': ', '.join(map(str, nexus_ips))})\n\n return vpc_id", "response": "Allocate a vpc id for the given list of switch_ips."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef free_vpcid_for_switch_list(vpc_id, nexus_ips):\n\n LOG.debug(\"free_vpcid_for_switch_list() called\")\n if vpc_id != 0:\n update_vpc_entry(nexus_ips, vpc_id, False, False)", "response": "Free a vpc id for the given list of switch_ips."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef free_vpcid_for_switch(vpc_id, nexus_ip):\n\n LOG.debug(\"free_vpcid_for_switch() called\")\n if vpc_id != 0:\n update_vpc_entry([nexus_ip], vpc_id, False, False)", "response": "Free a vpc id for the given switch_ip."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nremove the vpcid for a switch.", "response": "def delete_vpcid_for_switch(vpc_id, switch_ip):\n \"\"\"Removes unused vpcid for a switch.\n\n :param vpc_id: vpc id to remove\n :param switch_ip: ip address of the switch\n \"\"\"\n\n LOG.debug(\"delete_vpcid_for_switch called\")\n session = bc.get_writer_session()\n\n vpc = _lookup_one_vpc_allocs(vpc_id=vpc_id,\n switch_ip=switch_ip,\n active=False)\n session.delete(vpc)\n session.flush()"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the 
documentation for the following Python 3 function\ndef create_process(cmd, root_helper=None, addl_env=None, log_output=True):\n if root_helper:\n cmd = shlex.split(root_helper) + cmd\n cmd = map(str, cmd)\n\n log_output and LOG.info(\"Running command: %s\", cmd)\n env = os.environ.copy()\n if addl_env:\n env.update(addl_env)\n\n obj = subprocess_popen(cmd, shell=False, stdin=subprocess.PIPE,\n stdout=subprocess.PIPE, stderr=subprocess.PIPE,\n env=env)\n return obj, cmd", "response": "Create a process object for the given command."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef is_intf_up(intf):\n intf_path = '/'.join(('/sys/class/net', intf))\n intf_exist = os.path.exists(intf_path)\n if not intf_exist:\n LOG.error(\"Unable to get interface %(intf)s, Interface dir \"\n \"%(dir)s does not exist\",\n {'intf': intf, 'dir': intf_path})\n return False\n try:\n oper_file = '/'.join((intf_path, 'operstate'))\n with open(oper_file, 'r') as fd:\n oper_state = fd.read().strip('\\n')\n if oper_state == 'up':\n return True\n except Exception as e:\n LOG.error(\"Exception in reading %s\", str(e))\n return False", "response": "Function to check if a given interface is up."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef get_all_run_phy_intf():\n intf_list = []\n base_dir = '/sys/class/net'\n dir_exist = os.path.exists(base_dir)\n if not dir_exist:\n LOG.error(\"Unable to get interface list :Base dir %s does not \"\n \"exist\", base_dir)\n return intf_list\n dir_cont = os.listdir(base_dir)\n for subdir in dir_cont:\n dev_dir = base_dir + '/' + subdir + '/' + 'device'\n dev_exist = os.path.exists(dev_dir)\n if dev_exist:\n oper_state = is_intf_up(subdir)\n if oper_state is True:\n intf_list.append(subdir)\n else:\n LOG.info(\"Dev dir %s does not exist, not physical intf\",\n dev_dir)\n return intf_list", "response": "Retrieve all physical interfaces that are operationally 
up."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef check_vnic_type_and_vendor_info(self, vnic_type, profile):\n # Check for vnic_type\n if vnic_type not in self.supported_sriov_vnic_types:\n LOG.info('Non SR-IOV vnic_type: %s.', vnic_type)\n return False\n\n if not profile:\n return False\n\n # Check for vendor_info\n return self._check_for_supported_vendor(profile)", "response": "Checks if this vnic_type and vendor device info are supported."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nchecks if the port belongs to a supported vendor, returning True only when the port profile's pci_vendor_info is in supported_pci_devs.", "response": "def _check_for_supported_vendor(self, profile):\n \"\"\"Checks if the port belongs to a supported vendor.\n\n Returns True for supported_pci_devs.\n \"\"\"\n vendor_info = profile.get('pci_vendor_info')\n if not vendor_info:\n return False\n if vendor_info not in self.supported_pci_devs:\n return False\n return True"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _import_ucsmsdk(self):\n # Check if SSL certificate checking has been disabled.\n # If so, warn the user before proceeding.\n if not CONF.ml2_cisco_ucsm.ucsm_https_verify:\n LOG.warning(const.SSL_WARNING)\n\n # Monkey patch the UCS sdk version of urllib2 to disable\n # https verify if required.\n from networking_cisco.ml2_drivers.ucsm import ucs_urllib2\n ucsmsdkhandle = importutils.import_module('UcsSdk.UcsHandle')\n ucsmsdkhandle.urllib2 = ucs_urllib2\n\n ucsmsdk = importutils.import_module('UcsSdk')\n return ucsmsdk", "response": "Imports the Ucsm SDK module."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nget the contents of the Name field associated with a UCS Server.", "response": 
"def _get_server_name(self, handle, service_profile_mo, ucsm_ip):\n \"\"\"Get the contents of the 'Name' field associated with UCS Server.\n\n When a valid connection hande to UCS Manager is handed in, the Name\n field associated with a UCS Server is returned.\n \"\"\"\n try:\n resolved_dest = handle.ConfigResolveDn(service_profile_mo.PnDn)\n server_list = resolved_dest.OutConfig.GetChild()\n if not server_list:\n return \"\"\n return server_list[0].Name\n except Exception as e:\n # Raise a Neutron exception. Include a description of\n # the original exception.\n raise cexc.UcsmConfigReadFailed(ucsm_ip=ucsm_ip, exc=e)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _create_ucsm_host_to_service_profile_mapping(self):\n ucsm_ips = list(CONF.ml2_cisco_ucsm.ucsms)\n for ucsm_ip in ucsm_ips:\n with self.ucsm_connect_disconnect(ucsm_ip) as handle:\n try:\n sp_list_temp = handle.ConfigResolveClass('lsServer', None,\n inHierarchical=False)\n if sp_list_temp and sp_list_temp.OutConfigs is not None:\n sp_list = sp_list_temp.OutConfigs.GetChild() or []\n for sp in sp_list:\n if sp.PnDn:\n server_name = self._get_server_name(handle, sp,\n ucsm_ip)\n if (server_name and not sp.OperSrcTemplName):\n LOG.debug('Server %s info retrieved '\n 'from UCSM %s', server_name, ucsm_ip)\n key = (ucsm_ip, server_name)\n self.ucsm_sp_dict[key] = str(sp.Dn)\n self.ucsm_host_dict[server_name] = ucsm_ip\n except Exception as e:\n # Raise a Neutron exception. 
Include a description of\n # the original exception.\n raise cexc.UcsmConfigReadFailed(ucsm_ip=ucsm_ip, exc=e)", "response": "Reads list of Service profiles and finds associated Server and creates a mapping from server name to service profile name."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ncreates a VLAN Profile for the given vlan_id.", "response": "def _create_vlanprofile(self, handle, vlan_id, ucsm_ip):\n \"\"\"Creates VLAN profile to be assosiated with the Port Profile.\"\"\"\n vlan_name = self.make_vlan_name(vlan_id)\n vlan_profile_dest = (const.VLAN_PATH + const.VLAN_PROFILE_PATH_PREFIX +\n vlan_name)\n\n try:\n handle.StartTransaction()\n vp1 = handle.GetManagedObject(\n None,\n self.ucsmsdk.FabricLanCloud.ClassId(),\n {self.ucsmsdk.FabricLanCloud.DN: const.VLAN_PATH})\n if not vp1:\n LOG.warning('UCS Manager network driver Vlan Profile '\n 'path at %s missing', const.VLAN_PATH)\n return False\n\n # Create a vlan profile with the given vlan_id\n vp2 = handle.AddManagedObject(\n vp1,\n self.ucsmsdk.FabricVlan.ClassId(),\n {self.ucsmsdk.FabricVlan.COMPRESSION_TYPE:\n const.VLAN_COMPRESSION_TYPE,\n self.ucsmsdk.FabricVlan.DN: vlan_profile_dest,\n self.ucsmsdk.FabricVlan.SHARING: const.NONE,\n self.ucsmsdk.FabricVlan.PUB_NW_NAME: \"\",\n self.ucsmsdk.FabricVlan.ID: str(vlan_id),\n self.ucsmsdk.FabricVlan.MCAST_POLICY_NAME: \"\",\n self.ucsmsdk.FabricVlan.NAME: vlan_name,\n self.ucsmsdk.FabricVlan.DEFAULT_NET: \"no\"})\n\n handle.CompleteTransaction()\n if vp2:\n LOG.debug('UCS Manager network driver Created Vlan '\n 'Profile %s at %s', vlan_name, vlan_profile_dest)\n return True\n\n except Exception as e:\n return self._handle_ucsm_exception(e, 'Vlan Profile',\n vlan_name, ucsm_ip)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ncreating a Port Profile on the UCS Manager.", "response": "def _create_port_profile(self, handle, profile_name, vlan_id,\n vnic_type, ucsm_ip, trunk_vlans, 
qos_policy):\n \"\"\"Creates a Port Profile on the UCS Manager.\n\n Significant parameters set in the port profile are:\n 1. Port profile name - Should match what was set in vif_details\n 2. High performance mode - For VM-FEX to be enabled/configured on\n the port using this port profile, this mode should be enabled.\n 3. Vlan id - Vlan id used by traffic to and from the port.\n \"\"\"\n port_profile_dest = (const.PORT_PROFILESETDN + const.VNIC_PATH_PREFIX +\n profile_name)\n\n vlan_name = self.make_vlan_name(vlan_id)\n vlan_associate_path = (const.PORT_PROFILESETDN +\n const.VNIC_PATH_PREFIX + profile_name +\n const.VLAN_PATH_PREFIX + vlan_name)\n cl_profile_name = const.CLIENT_PROFILE_NAME_PREFIX + str(vlan_id)\n cl_profile_dest = (const.PORT_PROFILESETDN + const.VNIC_PATH_PREFIX +\n profile_name + const.CLIENT_PROFILE_PATH_PREFIX +\n cl_profile_name)\n\n # Remove this Port Profile from the delete DB table if it was\n # addded there due to a previous delete.\n self.ucsm_db.remove_port_profile_to_delete(profile_name, ucsm_ip)\n\n # Check if direct or macvtap mode\n if vnic_type == bc.portbindings.VNIC_DIRECT:\n port_mode = const.HIGH_PERF\n else:\n port_mode = const.NONE\n\n try:\n handle.StartTransaction()\n port_profile = handle.GetManagedObject(\n None,\n self.ucsmsdk.VnicProfileSet.ClassId(),\n {self.ucsmsdk.VnicProfileSet.DN: const.PORT_PROFILESETDN})\n\n if not port_profile:\n LOG.warning('UCS Manager network driver Port Profile '\n 'path at %s missing',\n const.PORT_PROFILESETDN)\n return False\n\n # Create a port profile on the UCS Manager\n p_profile = handle.AddManagedObject(\n port_profile,\n self.ucsmsdk.VnicProfile.ClassId(),\n {self.ucsmsdk.VnicProfile.NAME: profile_name,\n self.ucsmsdk.VnicProfile.POLICY_OWNER: \"local\",\n self.ucsmsdk.VnicProfile.NW_CTRL_POLICY_NAME: \"\",\n self.ucsmsdk.VnicProfile.PIN_TO_GROUP_NAME: \"\",\n self.ucsmsdk.VnicProfile.DN: port_profile_dest,\n self.ucsmsdk.VnicProfile.DESCR: const.DESCR,\n 
self.ucsmsdk.VnicProfile.QOS_POLICY_NAME: qos_policy,\n self.ucsmsdk.VnicProfile.HOST_NW_IOPERF: port_mode,\n self.ucsmsdk.VnicProfile.MAX_PORTS: const.MAX_PORTS})\n if not p_profile:\n LOG.warning('UCS Manager network driver could not '\n 'create Port Profile %s.', profile_name)\n return False\n\n LOG.debug('UCS Manager network driver associating Vlan '\n 'Profile with Port Profile at %s',\n vlan_associate_path)\n # Associate port profile with vlan profile\n mo = handle.AddManagedObject(\n p_profile,\n self.ucsmsdk.VnicEtherIf.ClassId(),\n {self.ucsmsdk.VnicEtherIf.DN: vlan_associate_path,\n self.ucsmsdk.VnicEtherIf.NAME: vlan_name,\n self.ucsmsdk.VnicEtherIf.DEFAULT_NET: \"yes\"}, True)\n if not mo:\n LOG.warning('UCS Manager network driver cannot '\n 'associate Vlan Profile to Port '\n 'Profile %s', profile_name)\n return False\n LOG.debug('UCS Manager network driver created Port Profile %s '\n 'at %s', profile_name, port_profile_dest)\n\n # For Multi VLAN trunk support\n if trunk_vlans:\n for vlan in trunk_vlans:\n vlan_name = self.make_vlan_name(vlan)\n vlan_associate_path = (const.PORT_PROFILESETDN +\n const.VNIC_PATH_PREFIX + profile_name +\n const.VLAN_PATH_PREFIX + vlan_name)\n # Associate port profile with vlan profile\n # for the trunk vlans\n mo = handle.AddManagedObject(\n p_profile,\n self.ucsmsdk.VnicEtherIf.ClassId(),\n {self.ucsmsdk.VnicEtherIf.DN: vlan_associate_path,\n self.ucsmsdk.VnicEtherIf.NAME: vlan_name,\n self.ucsmsdk.VnicEtherIf.DEFAULT_NET: \"no\"}, True)\n if not mo:\n LOG.warning('UCS Manager network driver cannot '\n 'associate Vlan %(vlan)d to Port '\n 'Profile %(profile)s',\n {'vlan': vlan, 'profile': profile_name})\n\n cl_profile = handle.AddManagedObject(\n p_profile,\n self.ucsmsdk.VmVnicProfCl.ClassId(),\n {self.ucsmsdk.VmVnicProfCl.ORG_PATH: \".*\",\n self.ucsmsdk.VmVnicProfCl.DN: cl_profile_dest,\n self.ucsmsdk.VmVnicProfCl.NAME: cl_profile_name,\n self.ucsmsdk.VmVnicProfCl.POLICY_OWNER: \"local\",\n 
self.ucsmsdk.VmVnicProfCl.SW_NAME: \".*\",\n self.ucsmsdk.VmVnicProfCl.DC_NAME: \".*\",\n self.ucsmsdk.VmVnicProfCl.DESCR: const.DESCR})\n handle.CompleteTransaction()\n\n if not cl_profile:\n LOG.warning('UCS Manager network driver could not '\n 'create Client Profile %s.',\n cl_profile_name)\n return False\n\n LOG.debug('UCS Manager network driver created Client Profile '\n '%s at %s', cl_profile_name, cl_profile_dest)\n return True\n\n except Exception as e:\n return self._handle_ucsm_exception(e, 'Port Profile',\n profile_name, ucsm_ip)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef create_portprofile(self, profile_name, vlan_id, vnic_type, host_id,\n trunk_vlans):\n \"\"\"Top level method to create Port Profiles on the UCS Manager.\n\n Calls all the methods responsible for the individual tasks that\n ultimately result in the creation of the Port Profile on the UCS\n Manager.\n \"\"\"\n ucsm_ip = self.get_ucsm_ip_for_host(host_id)\n if not ucsm_ip:\n LOG.info('UCS Manager network driver does not have UCSM IP '\n 'for Host_id %s', str(host_id))\n return False\n\n with self.ucsm_connect_disconnect(ucsm_ip) as handle:\n # Create Vlan Profile\n if not self._create_vlanprofile(handle, vlan_id, ucsm_ip):\n LOG.error('UCS Manager network driver failed to create '\n 'Vlan Profile for vlan %s', str(vlan_id))\n return False\n if trunk_vlans:\n for vlan in trunk_vlans:\n if not self._create_vlanprofile(handle, vlan, ucsm_ip):\n LOG.error('UCS Manager network driver failed to '\n 'create Vlan Profile for vlan %s', vlan)\n return False\n\n qos_policy = CONF.ml2_cisco_ucsm.ucsms[ucsm_ip].sriov_qos_policy\n if qos_policy:\n LOG.debug('UCS Manager Network driver applying QoS Policy '\n '%(qos)s to Port Profile %(port_profile)s',\n {'qos': qos_policy, 'port_profile': profile_name})\n\n # Create Port Profile\n if not self._create_port_profile(handle, profile_name,\n vlan_id, vnic_type,\n ucsm_ip, trunk_vlans,\n qos_policy):\n 
LOG.error('UCS Manager network driver failed to create '\n 'Port Profile %s', profile_name)\n return False\n\n return True", "response": "Create Port Profiles on the UCS Manager."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nupdating the Service Profile on the UCS Manager.", "response": "def _update_service_profile(self, handle, service_profile,\n vlan_id, ucsm_ip):\n \"\"\"Updates Service Profile on the UCS Manager.\n\n Each of the ethernet ports on the Service Profile representing\n the UCS Server, is updated with the VLAN profile corresponding\n to the vlan_id passed in.\n \"\"\"\n virtio_port_list = (\n CONF.ml2_cisco_ucsm.ucsms[ucsm_ip].ucsm_virtio_eth_ports)\n\n eth_port_paths = [\"%s%s\" % (service_profile, ep)\n for ep in virtio_port_list]\n\n vlan_name = self.make_vlan_name(vlan_id)\n\n try:\n handle.StartTransaction()\n obj = handle.GetManagedObject(\n None,\n self.ucsmsdk.LsServer.ClassId(),\n {self.ucsmsdk.LsServer.DN: service_profile})\n\n if not obj:\n LOG.debug('UCS Manager network driver could not find '\n 'Service Profile %s in UCSM %s',\n service_profile, ucsm_ip)\n return False\n\n for eth_port_path in eth_port_paths:\n eth = handle.GetManagedObject(\n obj, self.ucsmsdk.VnicEther.ClassId(),\n {self.ucsmsdk.VnicEther.DN: eth_port_path}, True)\n\n if eth:\n vlan_path = (eth_port_path + const.VLAN_PATH_PREFIX +\n vlan_name)\n\n eth_if = handle.AddManagedObject(eth,\n self.ucsmsdk.VnicEtherIf.ClassId(),\n {self.ucsmsdk.VnicEtherIf.DN: vlan_path,\n self.ucsmsdk.VnicEtherIf.NAME: vlan_name,\n self.ucsmsdk.VnicEtherIf.DEFAULT_NET: \"no\"}, True)\n\n if not eth_if:\n LOG.debug('UCS Manager network driver could not '\n 'update Service Profile %s with vlan %d',\n service_profile, vlan_id)\n return False\n else:\n LOG.debug('UCS Manager network driver did not find '\n 'ethernet port at %s', eth_port_path)\n\n handle.CompleteTransaction()\n return True\n\n except Exception as e:\n return self._handle_ucsm_exception(e, 'Service 
Profile',\n vlan_name, ucsm_ip)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef update_serviceprofile(self, host_id, vlan_id):\n ucsm_ip = self.get_ucsm_ip_for_host(host_id)\n if not ucsm_ip:\n LOG.info('UCS Manager network driver does not have UCSM IP '\n 'for Host_id %s', str(host_id))\n return False\n\n service_profile = self.ucsm_sp_dict.get((ucsm_ip, host_id))\n if service_profile:\n LOG.debug('UCS Manager network driver Service Profile : %s',\n service_profile)\n else:\n LOG.info('UCS Manager network driver does not support '\n 'Host_id %s', host_id)\n return False\n\n with self.ucsm_connect_disconnect(ucsm_ip) as handle:\n # Create Vlan Profile\n if not self._create_vlanprofile(handle, vlan_id, ucsm_ip):\n LOG.error('UCS Manager network driver failed to create '\n 'Vlan Profile for vlan %s', str(vlan_id))\n return False\n\n # Update Service Profile\n if not self._update_service_profile(handle,\n service_profile,\n vlan_id,\n ucsm_ip):\n LOG.error('UCS Manager network driver failed to update '\n 'Service Profile %(service_profile)s in UCSM '\n '%(ucsm_ip)s',\n {'service_profile': service_profile, 'ucsm_ip': ucsm_ip})\n return False\n\n return True", "response": "Top level method to update Service Profiles on UCS Manager."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef update_vnic_template(self, host_id, vlan_id, physnet,\n vnic_template_path, vnic_template):\n \"\"\"Updates VNIC Template with the vlan_id.\"\"\"\n ucsm_ip = self.get_ucsm_ip_for_host(host_id)\n\n if not ucsm_ip:\n LOG.info('UCS Manager network driver does not have UCSM IP '\n 'for Host_id %s', str(host_id))\n return False\n\n vlan_name = self.make_vlan_name(vlan_id)\n\n with self.ucsm_connect_disconnect(ucsm_ip) as handle:\n # Create Vlan Profile\n if not self._create_vlanprofile(handle, vlan_id, ucsm_ip):\n LOG.error('UCS Manager network driver failed to create '\n 'Vlan Profile for vlan %s', 
vlan_id)\n return False\n\n try:\n LOG.debug('VNIC Template Path: %s', vnic_template_path)\n vnic_template_full_path = (vnic_template_path +\n const.VNIC_TEMPLATE_PREFIX + str(vnic_template))\n LOG.debug('VNIC Template Path: %s for physnet %s',\n vnic_template_full_path, physnet)\n\n handle.StartTransaction()\n mo = handle.GetManagedObject(\n None,\n self.ucsmsdk.VnicLanConnTempl.ClassId(),\n {self.ucsmsdk.VnicLanConnTempl.DN:\n vnic_template_full_path}, True)\n if not mo:\n LOG.error('UCS Manager network driver could '\n 'not find VNIC template %s',\n vnic_template_full_path)\n return False\n\n vlan_dn = (vnic_template_full_path + const.VLAN_PATH_PREFIX +\n vlan_name)\n LOG.debug('VNIC Template VLAN path: %s', vlan_dn)\n eth_if = handle.AddManagedObject(mo,\n self.ucsmsdk.VnicEtherIf.ClassId(),\n {self.ucsmsdk.VnicEtherIf.DN: vlan_dn,\n self.ucsmsdk.VnicEtherIf.NAME: vlan_name,\n self.ucsmsdk.VnicEtherIf.DEFAULT_NET: \"no\"}, True)\n if not eth_if:\n LOG.error('UCS Manager network driver could '\n 'not add VLAN %(vlan_name)s to VNIC '\n 'template %(vnic_template_full_path)s',\n {'vlan_name': vlan_name,\n 'vnic_template_full_path': vnic_template_full_path})\n return False\n\n handle.CompleteTransaction()\n return True\n except Exception as e:\n return self._handle_ucsm_exception(e, 'VNIC Template',\n vlan_id, ucsm_ip)", "response": "Updates the VNIC Template with the vlan_id."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _delete_vlan_profile(self, handle, vlan_id, ucsm_ip):\n vlan_name = self.make_vlan_name(vlan_id)\n vlan_profile_dest = (const.VLAN_PATH + const.VLAN_PROFILE_PATH_PREFIX +\n vlan_name)\n try:\n handle.StartTransaction()\n obj = handle.GetManagedObject(\n None,\n self.ucsmsdk.FabricVlan.ClassId(),\n {self.ucsmsdk.FabricVlan.DN: vlan_profile_dest})\n\n if obj:\n handle.RemoveManagedObject(obj)\n\n handle.CompleteTransaction()\n\n except Exception as e:\n # Raise a Neutron exception. 
Include a description of\n # the original exception.\n raise cexc.UcsmConfigFailed(config=vlan_id,\n ucsm_ip=ucsm_ip, exc=e)", "response": "Deletes VLAN Profile from UCS Manager."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ndelete Port Profile from UCS Manager.", "response": "def _delete_port_profile_from_ucsm(self, handle, port_profile, ucsm_ip):\n \"\"\"Deletes Port Profile from UCS Manager.\"\"\"\n port_profile_dest = (const.PORT_PROFILESETDN + const.VNIC_PATH_PREFIX +\n port_profile)\n handle.StartTransaction()\n\n # Find port profile on the UCS Manager\n p_profile = handle.GetManagedObject(\n None,\n self.ucsmsdk.VnicProfile.ClassId(),\n {self.ucsmsdk.VnicProfile.NAME: port_profile,\n self.ucsmsdk.VnicProfile.DN: port_profile_dest})\n\n if p_profile:\n handle.RemoveManagedObject(p_profile)\n else:\n LOG.warning('UCS Manager network driver did not find '\n 'Port Profile %s to delete.',\n port_profile)\n\n handle.CompleteTransaction()"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _delete_port_profile(self, handle, port_profile, ucsm_ip):\n try:\n self._delete_port_profile_from_ucsm(handle, port_profile, ucsm_ip)\n\n except Exception as e:\n # Add the Port Profile that we could not delete to the Port Profile\n # delete table. 
A periodic task will attempt to delete it.\n LOG.debug('Received Port Profile delete exception %s', e)\n self.ucsm_db.add_port_profile_to_delete_table(port_profile,\n ucsm_ip)", "response": "Calls method to delete Port Profile from UCS Manager."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nremoves VLAN Profile config from all service profiles.", "response": "def _remove_vlan_from_all_service_profiles(self, handle, vlan_id, ucsm_ip):\n \"\"\"Deletes VLAN Profile config from server's ethernet ports.\"\"\"\n service_profile_list = []\n for key, value in six.iteritems(self.ucsm_sp_dict):\n if (ucsm_ip in key) and value:\n service_profile_list.append(value)\n\n if not service_profile_list:\n # Nothing to do\n return\n\n try:\n handle.StartTransaction()\n for service_profile in service_profile_list:\n virtio_port_list = (\n CONF.ml2_cisco_ucsm.ucsms[ucsm_ip].ucsm_virtio_eth_ports)\n eth_port_paths = [\"%s%s\" % (service_profile, ep)\n for ep in virtio_port_list]\n\n # 1. From the Service Profile config, access the\n # configuration for its ports.\n # 2. Check if that Vlan has been configured on each port\n # 3. If Vlan config found, remove it.\n obj = handle.GetManagedObject(\n None,\n self.ucsmsdk.LsServer.ClassId(),\n {self.ucsmsdk.LsServer.DN: service_profile})\n\n if obj:\n # Check if this vlan_id has been configured on the\n # ports in this Service profile\n for eth_port_path in eth_port_paths:\n eth = handle.GetManagedObject(\n obj, self.ucsmsdk.VnicEther.ClassId(),\n {self.ucsmsdk.VnicEther.DN: eth_port_path},\n True)\n if eth:\n vlan_name = self.make_vlan_name(vlan_id)\n vlan_path = eth_port_path + \"/if-\" + vlan_name\n vlan = handle.GetManagedObject(eth,\n self.ucsmsdk.VnicEtherIf.ClassId(),\n {self.ucsmsdk.VnicEtherIf.DN: vlan_path})\n if vlan:\n # Found vlan config. Now remove it.\n handle.RemoveManagedObject(vlan)\n handle.CompleteTransaction()\n\n except Exception as e:\n # Raise a Neutron exception. 
Include a description of\n # the original exception.\n raise cexc.UcsmConfigDeleteFailed(config=vlan_id,\n ucsm_ip=ucsm_ip,\n exc=e)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ndeletes VLAN config from all SP Templates that have it.", "response": "def _remove_vlan_from_all_sp_templates(self, handle, vlan_id, ucsm_ip):\n \"\"\"Deletes VLAN config from all SP Templates that have it.\"\"\"\n sp_template_info_list = (\n CONF.ml2_cisco_ucsm.ucsms[ucsm_ip].sp_template_list.values())\n\n vlan_name = self.make_vlan_name(vlan_id)\n virtio_port_list = (\n CONF.ml2_cisco_ucsm.ucsms[ucsm_ip].ucsm_virtio_eth_ports)\n\n try:\n handle.StartTransaction()\n # sp_template_info_list is a list of tuples.\n # Each tuple is of the form :\n # (ucsm_ip, sp_template_path, sp_template)\n for sp_template_info in sp_template_info_list:\n sp_template_path = sp_template_info.path\n sp_template = sp_template_info.name\n\n sp_template_full_path = (sp_template_path +\n const.SP_TEMPLATE_PREFIX + sp_template)\n\n obj = handle.GetManagedObject(\n None,\n self.ucsmsdk.LsServer.ClassId(),\n {self.ucsmsdk.LsServer.DN: sp_template_full_path})\n if not obj:\n LOG.error('UCS Manager network driver could not '\n 'find Service Profile template %s',\n sp_template_full_path)\n continue\n\n eth_port_paths = [\"%s%s\" % (sp_template_full_path, ep)\n for ep in virtio_port_list]\n for eth_port_path in eth_port_paths:\n eth = handle.GetManagedObject(\n obj, self.ucsmsdk.VnicEther.ClassId(),\n {self.ucsmsdk.VnicEther.DN: eth_port_path}, True)\n\n if eth:\n vlan_path = (eth_port_path +\n const.VLAN_PATH_PREFIX + vlan_name)\n vlan = handle.GetManagedObject(eth,\n self.ucsmsdk.VnicEtherIf.ClassId(),\n {self.ucsmsdk.VnicEtherIf.DN: vlan_path})\n if vlan:\n # Found vlan config. 
Now remove it.\n handle.RemoveManagedObject(vlan)\n else:\n LOG.debug('UCS Manager network driver did not '\n 'find VLAN %s at %s', vlan_name, eth_port_path)\n else:\n LOG.debug('UCS Manager network driver did not '\n 'find ethernet port at %s', eth_port_path)\n handle.CompleteTransaction()\n return True\n except Exception as e:\n # Raise a Neutron exception. Include a description of\n # the original exception.\n raise cexc.UcsmConfigDeleteFailed(config=vlan_id,\n ucsm_ip=ucsm_ip,\n exc=e)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nremoving VLAN from all VNIC templates that have it enabled.", "response": "def _remove_vlan_from_vnic_templates(self, handle, vlan_id, ucsm_ip):\n \"\"\"Removes VLAN from all VNIC templates that have it enabled.\"\"\"\n ucsm = CONF.ml2_cisco_ucsm.ucsms[ucsm_ip]\n vnic_template_info = ucsm.vnic_template_list.values()\n\n vlan_name = self.make_vlan_name(vlan_id)\n\n if not vnic_template_info:\n # Nothing to do\n return\n try:\n handle.StartTransaction()\n for temp_info in vnic_template_info:\n vnic_template = temp_info.template\n vnic_template_path = temp_info.path\n\n vnic_template_full_path = (vnic_template_path +\n const.VNIC_TEMPLATE_PREFIX + str(vnic_template))\n LOG.debug('vnic_template_full_path: %s',\n vnic_template_full_path)\n mo = handle.GetManagedObject(\n None,\n self.ucsmsdk.VnicLanConnTempl.ClassId(),\n {self.ucsmsdk.VnicLanConnTempl.DN: (\n vnic_template_full_path)},\n True)\n if not mo:\n LOG.error('UCS Manager network driver could '\n 'not find VNIC template %s at',\n vnic_template_full_path)\n continue\n\n vlan_dn = (vnic_template_full_path +\n const.VLAN_PATH_PREFIX + vlan_name)\n LOG.debug('VNIC Template VLAN path; %s', vlan_dn)\n eth_if = handle.GetManagedObject(mo,\n self.ucsmsdk.VnicEtherIf.ClassId(),\n {self.ucsmsdk.VnicEtherIf.DN: vlan_dn})\n\n if not eth_if:\n LOG.error('UCS Manager network driver could not '\n 'delete VLAN %(vlan_name)s from VNIC '\n 'template 
%(vnic_template_full_path)s',\n {'vlan_name': vlan_name,\n 'vnic_template_full_path':\n vnic_template_full_path})\n if eth_if:\n handle.RemoveManagedObject(eth_if)\n handle.CompleteTransaction()\n return True\n except Exception as e:\n return self._handle_ucsm_exception(e, 'VNIC Template',\n vlan_id, ucsm_ip)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef delete_all_config_for_vlan(self, vlan_id, port_profile,\n trunk_vlans):\n \"\"\"Top level method to delete all config for vlan_id.\"\"\"\n ucsm_ips = list(CONF.ml2_cisco_ucsm.ucsms)\n for ucsm_ip in ucsm_ips:\n\n with self.ucsm_connect_disconnect(ucsm_ip) as handle:\n LOG.debug('Deleting config for VLAN %d from UCSM %s', vlan_id,\n ucsm_ip)\n if (port_profile):\n self._delete_port_profile(handle, port_profile, ucsm_ip)\n\n ucsm = CONF.ml2_cisco_ucsm.ucsms[ucsm_ip]\n if ucsm.sp_template_list:\n self._remove_vlan_from_all_sp_templates(handle,\n vlan_id,\n ucsm_ip)\n if ucsm.vnic_template_list:\n self._remove_vlan_from_vnic_templates(handle,\n vlan_id,\n ucsm_ip)\n if not (ucsm.sp_template_list and\n ucsm.vnic_template_list):\n self._remove_vlan_from_all_service_profiles(handle,\n vlan_id,\n ucsm_ip)\n self._delete_vlan_profile(handle, vlan_id, ucsm_ip)\n if trunk_vlans:\n for vlan_id in trunk_vlans:\n self._delete_vlan_profile(handle, vlan_id, ucsm_ip)", "response": "Delete all config for a given vlan."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef ucs_manager_disconnect(self, handle, ucsm_ip):\n try:\n handle.Logout()\n except Exception as e:\n # Raise a Neutron exception. 
Include a description of\n # the original exception.\n raise cexc.UcsmDisconnectFailed(ucsm_ip=ucsm_ip, exc=e)", "response": "Disconnects from the UCS Manager."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef add_events(self, **kwargs):\n\n event_q = kwargs.get('event_queue')\n pri = kwargs.get('priority')\n if not event_q or not pri:\n return\n\n try:\n event_type = 'server.failure.recovery'\n payload = {}\n timestamp = time.ctime()\n data = (event_type, payload)\n event_q.put((pri, timestamp, data))\n LOG.debug('Added failure recovery event to the queue.')\n except Exception as exc:\n LOG.exception('Error: %(exc)s for event %(event)s',\n {'exc': str(exc), 'event': event_type})\n raise exc", "response": "Add a failure recovery event into the queue."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef init_params(self, protocol_interface, phy_interface):\n self.lldp_cfgd = False\n self.local_intf = protocol_interface\n self.phy_interface = phy_interface\n self.remote_evb_cfgd = False\n self.remote_evb_mode = None\n self.remote_mgmt_addr = None\n self.remote_system_desc = None\n self.remote_system_name = None\n self.remote_port = None\n self.remote_chassis_id_mac = None\n self.remote_port_id_mac = None\n self.local_evb_cfgd = False\n self.local_evb_mode = None\n self.local_mgmt_address = None\n self.local_system_desc = None\n self.local_system_name = None\n self.local_port = None\n self.local_chassis_id_mac = None\n self.local_port_id_mac = None\n self.db_retry_status = False\n self.topo_send_cnt = 0\n self.bond_interface = None\n self.bond_member_ports = None", "response": "Initialize the parameters of the class."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nupdate the bond interface and its members. 
Return True if there s a change.", "response": "def cmp_update_bond_intf(self, bond_interface):\n \"\"\"Update the bond interface and its members.\n\n Update the bond interface, if this interface is a part of bond\n Return True if there's a change.\n \"\"\"\n if bond_interface != self.bond_interface:\n self.bond_interface = bond_interface\n self.bond_member_ports = sys_utils.get_member_ports(bond_interface)\n return True\n return False"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef remote_evb_mode_uneq_store(self, remote_evb_mode):\n if remote_evb_mode != self.remote_evb_mode:\n self.remote_evb_mode = remote_evb_mode\n return True\n return False", "response": "Saves the EVB mode if it is not the same as stored."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef remote_mgmt_addr_uneq_store(self, remote_mgmt_addr):\n if remote_mgmt_addr != self.remote_mgmt_addr:\n self.remote_mgmt_addr = remote_mgmt_addr\n return True\n return False", "response": "This function saves the MGMT address if different from stored."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef remote_sys_desc_uneq_store(self, remote_system_desc):\n if remote_system_desc != self.remote_system_desc:\n self.remote_system_desc = remote_system_desc\n return True\n return False", "response": "This function saves the system desc if different from stored."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef remote_sys_name_uneq_store(self, remote_system_name):\n if remote_system_name != self.remote_system_name:\n self.remote_system_name = remote_system_name\n return True\n return False", "response": "This function saves the system name if different from stored."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef 
remote_chassis_id_mac_uneq_store(self, remote_chassis_id_mac):\n if remote_chassis_id_mac != self.remote_chassis_id_mac:\n self.remote_chassis_id_mac = remote_chassis_id_mac\n return True\n return False", "response": "This function saves the Chassis MAC if different from stored."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nretrieves the LLDP status.", "response": "def get_lldp_status(cls, intf):\n \"\"\"Retrieves the LLDP status. \"\"\"\n if intf not in cls.topo_intf_obj_dict:\n LOG.error(\"Interface %s not configured at all\", intf)\n return False\n intf_obj = cls.topo_intf_obj_dict.get(intf)\n return intf_obj.get_lldp_status()"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nconfiguring the interfaces during init time.", "response": "def _init_cfg_interfaces(self, cb, intf_list=None, all_intf=True):\n \"\"\"Configure the interfaces during init time. \"\"\"\n if not all_intf:\n self.intf_list = intf_list\n else:\n self.intf_list = sys_utils.get_all_run_phy_intf()\n self.cb = cb\n self.intf_attr = {}\n self.cfg_lldp_interface_list(self.intf_list)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ncalls by application to add an interface to the list of interfaces.", "response": "def cfg_intf(self, protocol_interface, phy_interface=None):\n \"\"\"Called by application to add an interface to the list. \"\"\"\n self.intf_list.append(protocol_interface)\n self.cfg_lldp_interface(protocol_interface, phy_interface)"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ncreating the local interface attribute object and stores it.", "response": "def create_attr_obj(self, protocol_interface, phy_interface):\n \"\"\"Creates the local interface attribute object and stores it. 
\"\"\"\n self.intf_attr[protocol_interface] = TopoIntfAttr(\n protocol_interface, phy_interface)\n self.store_obj(protocol_interface, self.intf_attr[protocol_interface])"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ncomparing and store the received TLV. Compare and store the received TLV.", "response": "def cmp_store_tlv_params(self, intf, tlv_data):\n \"\"\"Compare and store the received TLV.\n\n Compares the received TLV with stored TLV. Store the new TLV if it is\n different.\n \"\"\"\n flag = False\n attr_obj = self.get_attr_obj(intf)\n remote_evb_mode = self.pub_lldp.get_remote_evb_mode(tlv_data)\n if attr_obj.remote_evb_mode_uneq_store(remote_evb_mode):\n flag = True\n remote_evb_cfgd = self.pub_lldp.get_remote_evb_cfgd(tlv_data)\n if attr_obj.remote_evb_cfgd_uneq_store(remote_evb_cfgd):\n flag = True\n remote_mgmt_addr = self.pub_lldp.get_remote_mgmt_addr(tlv_data)\n if attr_obj.remote_mgmt_addr_uneq_store(remote_mgmt_addr):\n flag = True\n remote_sys_desc = self.pub_lldp.get_remote_sys_desc(tlv_data)\n if attr_obj.remote_sys_desc_uneq_store(remote_sys_desc):\n flag = True\n remote_sys_name = self.pub_lldp.get_remote_sys_name(tlv_data)\n if attr_obj.remote_sys_name_uneq_store(remote_sys_name):\n flag = True\n remote_port = self.pub_lldp.get_remote_port(tlv_data)\n if attr_obj.remote_port_uneq_store(remote_port):\n flag = True\n remote_chassis_id_mac = self.pub_lldp.\\\n get_remote_chassis_id_mac(tlv_data)\n if attr_obj.remote_chassis_id_mac_uneq_store(remote_chassis_id_mac):\n flag = True\n remote_port_id_mac = self.pub_lldp.get_remote_port_id_mac(tlv_data)\n if attr_obj.remote_port_id_mac_uneq_store(remote_port_id_mac):\n flag = True\n return flag"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef cfg_lldp_interface(self, protocol_interface, phy_interface=None):\n if phy_interface is None:\n phy_interface = protocol_interface\n 
self.create_attr_obj(protocol_interface, phy_interface)\n ret = self.pub_lldp.enable_lldp(protocol_interface)\n attr_obj = self.get_attr_obj(protocol_interface)\n attr_obj.update_lldp_status(ret)", "response": "Cfg LLDP on interface and create object."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef periodic_discovery_task(self):\n try:\n self._periodic_task_int()\n except Exception as exc:\n LOG.error(\"Exception caught in periodic discovery task %s\",\n str(exc))", "response": "Periodic task that checks the TLV attributes."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _check_bond_interface_change(self, phy_interface, attr_obj):\n bond_phy = sys_utils.get_bond_intf(phy_interface)\n if sys_utils.is_intf_bond(phy_interface):\n bond_intf = phy_interface\n else:\n bond_intf = bond_phy\n # This can be an addition or removal of the interface to a bond.\n bond_intf_change = attr_obj.cmp_update_bond_intf(bond_intf)\n return bond_intf_change", "response": "Check if there s any change in bond interface."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nperforming authentication and retries cookie.", "response": "def _get_cookie(self, mgmt_ip, config, refresh=False):\n \"\"\"Performs authentication and retries cookie.\"\"\"\n\n if mgmt_ip not in self.credentials:\n return None\n\n security_data = self.credentials[mgmt_ip]\n verify = security_data[const.HTTPS_CERT_TUPLE]\n if not verify:\n verify = security_data[const.HTTPS_VERIFY_TUPLE]\n\n if not refresh and security_data[const.COOKIE_TUPLE]:\n return security_data[const.COOKIE_TUPLE], verify\n\n payload = {\"aaaUser\": {\"attributes\": {\n \"name\": security_data[const.UNAME_TUPLE],\n \"pwd\": security_data[const.PW_TUPLE]}}}\n headers = {\"Content-type\": \"application/json\", \"Accept\": \"text/plain\"}\n\n url = 
\"{0}://{1}/api/aaaLogin.json\".format(DEFAULT_SCHEME, mgmt_ip)\n\n try:\n response = self.session.request('POST',\n url,\n data=jsonutils.dumps(payload),\n headers=headers,\n verify=verify,\n timeout=self.timeout * 2)\n except Exception as e:\n raise cexc.NexusConnectFailed(nexus_host=mgmt_ip,\n exc=e)\n\n self.status = response.status_code\n if response.status_code == requests.codes.OK:\n cookie = response.headers.get('Set-Cookie')\n security_data = (\n security_data[const.UNAME_TUPLE:const.COOKIE_TUPLE] +\n (cookie,))\n self.credentials[mgmt_ip] = security_data\n return cookie, verify\n else:\n e = \"REST API connect returned Error code: \"\n e += str(self.status)\n raise cexc.NexusConnectFailed(nexus_host=mgmt_ip,\n exc=e)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef send_request(self, method, action, body=None,\n headers=None, ipaddr=None):\n \"\"\"Perform the HTTP request.\n\n The response is in either JSON format or plain text. A GET method will\n invoke a JSON response while a PUT/POST/DELETE returns message from the\n the server in plain text format.\n Exception is raised when server replies with an INTERNAL SERVER ERROR\n status code (500) i.e. an error has occurred on the server or SERVICE\n UNAVAILABLE (404) i.e. server is not reachable.\n\n :param method: type of the HTTP request. 
POST, GET, PUT or DELETE\n :param action: path to which the client makes request\n :param body: dict of arguments which are sent as part of the request\n :param headers: header for the HTTP request\n :param server_ip: server_ip for the HTTP request.\n :returns: JSON or plain text in HTTP response\n \"\"\"\n\n action = ''.join([self.scheme, '://%s/', action])\n if netaddr.valid_ipv6(ipaddr):\n # Enclose IPv6 address in [] in the URL\n action = action % (\"[%s]\" % ipaddr)\n else:\n # IPv4 address\n action = action % ipaddr\n\n config = action + \" : \" + body if body else action\n\n # if cookie needed and one not previously created\n if self.request_cookie:\n cookie, verify = self._get_cookie(ipaddr, config)\n headers = {\"Content-type\": \"application/json\",\n \"Accept\": \"text/plain\", \"Cookie\": cookie}\n else:\n if ipaddr not in self.credentials:\n raise cexc.NexusCredentialNotFound(switch_ip=ipaddr)\n else:\n headers = {'Content-Type': 'application/json'}\n security_data = self.credentials[ipaddr]\n verify = security_data[const.HTTPS_CERT_TUPLE]\n if not verify:\n verify = security_data[const.HTTPS_VERIFY_TUPLE]\n self.session.auth = (security_data[0], security_data[1])\n if self.status != requests.codes.OK:\n return {}\n\n for attempt in range(self.max_retries + 1):\n try:\n LOG.debug(\"[Nexus %(ipaddr)s attempt %(id)s]: Connecting..\" %\n {\"ipaddr\": ipaddr, \"id\": attempt})\n response = self.session.request(\n method,\n action,\n data=body,\n headers=headers,\n verify=verify,\n timeout=self.timeout)\n if (self.request_cookie and\n response.status_code in CREDENTIAL_EXPIRED):\n # if need new cookie\n cookie, verify = self._get_cookie(\n ipaddr, config, refresh=True)\n headers = {\"Content-type\": \"application/json\",\n \"Accept\": \"text/plain\", \"Cookie\": cookie}\n continue\n except Exception as e:\n LOG.error(\n \"Exception raised %(err)s for Rest/NXAPI %(cfg)s\",\n {'err': str(e), 'cfg': config})\n raise cexc.NexusConfigFailed(nexus_host=ipaddr,\n 
config=config,\n exc=e)\n else:\n break\n\n status_string = requests.status_codes._codes[response.status_code][0]\n if response.status_code in self.accepted_codes:\n LOG.debug(\n \"Good status %(status)s(%(code)d) returned for %(url)s\",\n {'status': status_string,\n 'code': response.status_code,\n 'url': action})\n # 'text/json' used with nxapi else application/json with restapi\n output = {}\n if ('application/json' in response.headers['content-type'] or\n 'text/json' in response.headers['content-type']):\n try:\n output = response.json()\n except Exception as e:\n LOG.exception(\n \"Unexpected error encountered extracting \"\n \"json body from response.\")\n if 'ins_api' in output:\n # do special nxapi response handling\n try:\n cli_resp = output['ins_api']['outputs']['output']\n except Exception:\n cli_resp = []\n # Check results for each command\n for cli in cli_resp:\n try:\n status = int((cli['code']))\n except ValueError:\n status = 'bad_status %s' % cli['code']\n if status not in self.accepted_codes:\n excpt = (\"ins_api CLI failure occurred \"\n \"with cli return code %s\" % str(status))\n raise cexc.NexusConfigFailed(\n nexus_host=ipaddr, config=config,\n exc=excpt)\n return output\n else:\n LOG.error(\n \"Bad status %(status)s(%(code)d) returned for %(url)s\",\n {'status': status_string,\n 'code': response.status_code,\n 'url': action})\n LOG.error(\"Response text: %(txt)s\",\n {'txt': response.text})\n raise cexc.NexusConfigFailed(nexus_host=ipaddr,\n config=config,\n exc=response.text)", "response": "Send the HTTP request to the Nexus server."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ninitializing all nexus interfaces to trunk allowed none.", "response": "def _initialize_trunk_interfaces_to_none(self, switch_ip, replay=True):\n \"\"\"Initialize all nexus interfaces to trunk allowed none.\"\"\"\n\n try:\n # The following determines if the switch interfaces are\n # in place. 
If so, make sure they have a basic trunk\n # configuration applied to none.\n switch_ifs = self._mdriver._get_switch_interfaces(\n switch_ip, cfg_only=(False if replay else True))\n if not switch_ifs:\n LOG.debug(\"Skipping switch %s which has no configured \"\n \"interfaces\",\n switch_ip)\n return\n self._driver.initialize_all_switch_interfaces(\n switch_ifs, switch_ip)\n except Exception:\n with excutils.save_and_reraise_exception():\n LOG.warning(\"Unable to initialize interfaces to \"\n \"switch %(switch_ip)s\",\n {'switch_ip': switch_ip})\n self._mdriver.register_switch_as_inactive(switch_ip,\n 'replay init_interface')\n\n if self._mdriver.is_replay_enabled():\n return"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nsends pending config data in OpenStack to Nexus.", "response": "def replay_config(self, switch_ip):\n \"\"\"Sends pending config data in OpenStack to Nexus.\"\"\"\n LOG.debug(\"Replaying config for switch ip %(switch_ip)s\",\n {'switch_ip': switch_ip})\n\n # Before replaying all config, initialize trunk interfaces\n # to none as required. If this fails, the switch may not\n # be up all the way. 
Quit and retry later.\n try:\n self._initialize_trunk_interfaces_to_none(switch_ip)\n except Exception:\n return\n\n nve_bindings = nxos_db.get_nve_switch_bindings(switch_ip)\n\n # If configured to set global VXLAN values and\n # there exists VXLAN data base entries, then configure\n # the \"interface nve\" entry on the switch.\n if (len(nve_bindings) > 0 and\n cfg.CONF.ml2_cisco.vxlan_global_config):\n LOG.debug(\"Nexus: Replay NVE Interface\")\n loopback = self._mdriver.get_nve_loopback(switch_ip)\n self._driver.enable_vxlan_feature(switch_ip,\n const.NVE_INT_NUM, loopback)\n\n for x in nve_bindings:\n try:\n self._driver.create_nve_member(switch_ip,\n const.NVE_INT_NUM, x.vni, x.mcast_group)\n except Exception as e:\n LOG.error(\"Failed to configure nve_member for \"\n \"switch %(switch_ip)s, vni %(vni)s\"\n \"Reason:%(reason)s \",\n {'switch_ip': switch_ip, 'vni': x.vni,\n 'reason': e})\n self._mdriver.register_switch_as_inactive(switch_ip,\n 'replay create_nve_member')\n return\n\n try:\n port_bindings = nxos_db.get_nexusport_switch_bindings(switch_ip)\n except excep.NexusPortBindingNotFound:\n LOG.warning(\"No port entries found for switch ip \"\n \"%(switch_ip)s during replay.\",\n {'switch_ip': switch_ip})\n return\n\n try:\n self._mdriver.configure_switch_entries(\n switch_ip, port_bindings)\n except Exception as e:\n LOG.error(\"Unexpected exception while replaying \"\n \"entries for switch %(switch_ip)s, Reason:%(reason)s \",\n {'switch_ip': switch_ip, 'reason': e})\n self._mdriver.register_switch_as_inactive(switch_ip,\n 'replay switch_entries')"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef check_connections(self):\n switch_connections = self._mdriver.get_all_switch_ips()\n\n for switch_ip in switch_connections:\n state = self._mdriver.get_switch_ip_and_active_state(switch_ip)\n config_failure = self._mdriver.get_switch_replay_failure(\n const.FAIL_CONFIG, switch_ip)\n contact_failure 
= self._mdriver.get_switch_replay_failure(\n const.FAIL_CONTACT, switch_ip)\n LOG.debug(\"check_connections() thread %(thid)d, switch \"\n \"%(switch_ip)s state %(state)s \"\n \"contact_failure %(contact_failure)d \"\n \"config_failure %(config_failure)d \",\n {'thid': threading.current_thread().ident,\n 'switch_ip': switch_ip, 'state': state,\n 'contact_failure': contact_failure,\n 'config_failure': config_failure})\n try:\n # Send a simple get nexus type to determine if\n # the switch is up\n nexus_type = self._driver.get_nexus_type(switch_ip)\n except Exception:\n if state != const.SWITCH_INACTIVE:\n LOG.error(\"Lost connection to switch ip \"\n \"%(switch_ip)s\", {'switch_ip': switch_ip})\n self._mdriver.set_switch_ip_and_active_state(\n switch_ip, const.SWITCH_INACTIVE)\n else:\n self._mdriver.incr_switch_replay_failure(\n const.FAIL_CONTACT, switch_ip)\n else:\n if state == const.SWITCH_RESTORE_S2:\n try:\n self._mdriver.configure_next_batch_of_vlans(switch_ip)\n except Exception as e:\n LOG.error(\"Unexpected exception while replaying \"\n \"entries for switch %(switch_ip)s, \"\n \"Reason:%(reason)s \",\n {'switch_ip': switch_ip, 'reason': e})\n self._mdriver.register_switch_as_inactive(\n switch_ip, 'replay next_vlan_batch')\n continue\n\n if state == const.SWITCH_INACTIVE:\n self._configure_nexus_type(switch_ip, nexus_type)\n LOG.info(\"Re-established connection to switch \"\n \"ip %(switch_ip)s\",\n {'switch_ip': switch_ip})\n\n self._mdriver.set_switch_ip_and_active_state(\n switch_ip, const.SWITCH_RESTORE_S1)\n self.replay_config(switch_ip)\n\n # If replay failed, it stops trying to configure db entries\n # and sets switch state to inactive so this caller knows\n # it failed. 
If it did fail, we increment the\n # retry counter else reset it to 0.\n if self._mdriver.get_switch_ip_and_active_state(\n switch_ip) == const.SWITCH_INACTIVE:\n self._mdriver.incr_switch_replay_failure(\n const.FAIL_CONFIG, switch_ip)\n LOG.warning(\"Replay config failed for \"\n \"ip %(switch_ip)s\",\n {'switch_ip': switch_ip})\n else:\n self._mdriver.reset_switch_replay_failure(\n const.FAIL_CONFIG, switch_ip)\n self._mdriver.reset_switch_replay_failure(\n const.FAIL_CONTACT, switch_ip)\n LOG.info(\"Replay config successful for \"\n \"ip %(switch_ip)s\",\n {'switch_ip': switch_ip})", "response": "Check connection between OpenStack and Nexus switch."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nloading Nexus Config driver.", "response": "def _load_nexus_cfg_driver(self):\n \"\"\"Load Nexus Config driver.\n :raises SystemExit of 1 if driver cannot be loaded\n \"\"\"\n\n try:\n loaded_class = runtime_utils.load_class_by_alias_or_classname(\n 'networking_cisco.ml2.nexus_driver', 'restapi')\n return loaded_class(CONF.ml2_cisco.nexus_switches)\n except ImportError:\n LOG.error(\"Error loading Nexus Config driver 'restapi'\")\n raise SystemExit(1)"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nverifying this ip address is defined (for Nexus)", "response": "def _switch_defined(self, switch_ip):\n \"\"\"Verify this ip address is defined (for Nexus).\"\"\"\n switch = cfg.CONF.ml2_cisco.nexus_switches.get(switch_ip)\n if switch and switch.username and switch.password:\n return True\n else:\n return False"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _pop_vlan_range(self, switch_ip, size):\n vlan_range = self._get_switch_vlan_range(switch_ip)\n sized_range = ''\n fr = 0\n to = 0\n # if vlan_range not empty and haven't met requested size\n while size > 0 and vlan_range:\n vlan_id, vni = vlan_range.pop(0)\n size -= 1\n if fr == 0 and to == 
0:\n fr = vlan_id\n to = vlan_id\n else:\n diff = vlan_id - to\n if diff == 1:\n to = vlan_id\n else:\n if fr == to:\n sized_range += str(to) + ','\n else:\n sized_range += str(fr) + '-'\n sized_range += str(to) + ','\n fr = vlan_id\n to = vlan_id\n if fr != 0:\n if fr == to:\n sized_range += str(to)\n else:\n sized_range += str(fr) + '-'\n sized_range += str(to)\n self._save_switch_vlan_range(switch_ip, vlan_range)\n\n return sized_range", "response": "Pop up to 'size' vlan entries from the switch's saved vlan range and return them as a compressed vlan range string (for example '5,7-9'); the unused remainder is saved back for the switch."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ngetting all switch ips.", "response": "def get_all_switch_ips(self):\n \"\"\"Using reserved switch binding get all switch ips.\"\"\"\n\n switch_connections = []\n try:\n bindings = nxos_db.get_reserved_switch_binding()\n except excep.NexusPortBindingNotFound:\n LOG.error(\"No switch bindings in the port data base\")\n bindings = []\n for switch in bindings:\n switch_connections.append(switch.switch_ip)\n\n return switch_connections"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _get_baremetal_switch_info(self, link_info):\n\n try:\n switch_info = link_info['switch_info']\n if not isinstance(switch_info, dict):\n switch_info = jsonutils.loads(switch_info)\n except Exception as e:\n LOG.error(\"switch_info can't be decoded: %(exp)s\",\n {\"exp\": e})\n switch_info = {}\n\n return switch_info", "response": "Get switch_info dictionary from context."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _supported_baremetal_transaction(self, context):\n\n port = context.current\n\n if self.trunk.is_trunk_subport_baremetal(port):\n return self._baremetal_set_binding(context)\n\n if not nexus_help.is_baremetal(port):\n return False\n\n if bc.portbindings.PROFILE not in port:\n return False\n\n profile = port[bc.portbindings.PROFILE]\n\n 
if 'local_link_information' not in profile:\n return False\n\n all_link_info = profile['local_link_information']\n\n selected = False\n for link_info in all_link_info:\n\n if 'port_id' not in link_info:\n return False\n\n switch_info = self._get_baremetal_switch_info(\n link_info)\n if 'switch_ip' in switch_info:\n switch_ip = switch_info['switch_ip']\n else:\n return False\n\n if self._switch_defined(switch_ip):\n selected = True\n else:\n LOG.warning(\"Skip switch %s. Not configured \"\n \"in ini file\" % switch_ip)\n\n if not selected:\n return False\n\n selected = self._baremetal_set_binding(context, all_link_info)\n if selected:\n self._init_baremetal_trunk_interfaces(\n context.current, context.top_bound_segment)\n\n if self.trunk.is_trunk_parentport(port):\n self.trunk.update_subports(port)\n\n return selected", "response": "Verify transaction is complete and for us."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nget switch ip addresses from baremetal transaction.", "response": "def _get_baremetal_switches(self, port):\n \"\"\"Get switch ip addresses from baremetal transaction.\n\n This method is used to extract switch information\n from the transaction where VNIC_TYPE is baremetal.\n\n :param port: Received port transaction\n :returns: list of all switches\n :returns: list of only switches which are active\n \"\"\"\n\n all_switches = set()\n active_switches = set()\n all_link_info = port[bc.portbindings.PROFILE]['local_link_information']\n for link_info in all_link_info:\n switch_info = self._get_baremetal_switch_info(link_info)\n if not switch_info:\n continue\n switch_ip = switch_info['switch_ip']\n\n # If not for Nexus\n if not self._switch_defined(switch_ip):\n continue\n\n all_switches.add(switch_ip)\n if self.is_switch_active(switch_ip):\n active_switches.add(switch_ip)\n\n return list(all_switches), list(active_switches)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef 
_get_baremetal_connections(self, port,\n only_active_switch=False,\n from_segment=False):\n \"\"\"Get switch ips and interfaces from baremetal transaction.\n\n This method is used to extract switch/interface\n information from transactions where VNIC_TYPE is\n baremetal.\n\n :param port: Received port transaction\n :param only_active_switch: Indicator for selecting\n connections with switches that are active\n :param from_segment: only return interfaces from the\n segment/transaction as opposed to\n say port channels which are learned.\n :Returns: list of switch_ip, intf_type, port_id, is_native\n \"\"\"\n\n connections = []\n\n is_native = False if self.trunk.is_trunk_subport(port) else True\n\n all_link_info = port[bc.portbindings.PROFILE]['local_link_information']\n\n for link_info in all_link_info:\n\n # Extract port info\n intf_type, port = nexus_help.split_interface_name(\n link_info['port_id'])\n\n # Determine if this switch is to be skipped\n switch_info = self._get_baremetal_switch_info(\n link_info)\n if not switch_info:\n continue\n switch_ip = switch_info['switch_ip']\n\n # If not for Nexus\n if not self._switch_defined(switch_ip):\n continue\n\n # Requested connections for only active switches\n if (only_active_switch and\n not self.is_switch_active(switch_ip)):\n continue\n\n ch_grp = 0\n if not from_segment:\n try:\n reserved = nxos_db.get_switch_if_host_mappings(\n switch_ip,\n nexus_help.format_interface_name(\n intf_type, port))\n if reserved[0].ch_grp > 0:\n ch_grp = reserved[0].ch_grp\n intf_type, port = nexus_help.split_interface_name(\n '', ch_grp)\n except excep.NexusHostMappingNotFound:\n pass\n\n connections.append((switch_ip, intf_type, port,\n is_native, ch_grp))\n\n return connections", "response": "This method returns a list of switch ips and interfaces from baremetal transaction."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _init_baremetal_trunk_interfaces(self, port_seg, 
segment):\n\n # interfaces list requiring switch initialization and\n # reserved port and port_binding db entry creation\n list_to_init = []\n\n # interfaces list requiring reserved port and port_binding\n # db entry creation\n inactive_switch = []\n\n connections = self._get_baremetal_connections(\n port_seg, False, True)\n for switch_ip, intf_type, port, is_native, _ in connections:\n try:\n nxos_db.get_switch_if_host_mappings(\n switch_ip,\n nexus_help.format_interface_name(intf_type, port))\n except excep.NexusHostMappingNotFound:\n if self.is_switch_active(switch_ip):\n # channel-group added later\n list_to_init.append(\n (switch_ip, intf_type, port, is_native, 0))\n else:\n inactive_switch.append(\n (switch_ip, intf_type, port, is_native, 0))\n\n # channel_group is appended to tuples in list_to_init\n self.driver.initialize_baremetal_switch_interfaces(list_to_init)\n\n host_id = port_seg.get('dns_name')\n if host_id is None:\n host_id = const.RESERVED_PORT_HOST_ID\n\n # Add inactive list to list_to_init to create RESERVED\n # port data base entries\n list_to_init += inactive_switch\n for switch_ip, intf_type, port, is_native, ch_grp in list_to_init:\n nxos_db.add_host_mapping(\n host_id,\n switch_ip,\n nexus_help.format_interface_name(intf_type, port),\n ch_grp, False)", "response": "Initialize baremetal switch interfaces and DB entry."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nget switch IPs from configured host mapping.", "response": "def _get_host_switches(self, host_id):\n \"\"\"Get switch IPs from configured host mapping.\n\n This method is used to extract switch information\n from transactions where VNIC_TYPE is normal.\n Information is extracted from ini file which\n is stored in _nexus_switches.\n\n :param host_id: host_name from transaction\n :returns: list of all switches\n :returns: list of only switches which are active\n \"\"\"\n\n all_switches = set()\n active_switches = set()\n\n try:\n host_list 
= nxos_db.get_host_mappings(host_id)\n for mapping in host_list:\n all_switches.add(mapping.switch_ip)\n if self.is_switch_active(mapping.switch_ip):\n active_switches.add(mapping.switch_ip)\n except excep.NexusHostMappingNotFound:\n pass\n\n return list(all_switches), list(active_switches)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _get_host_connections(self, host_id,\n only_active_switch=False):\n \"\"\"Get switch IPs and interfaces from config host mapping.\n\n This method is used to extract switch/interface\n information from ini files when VNIC_TYPE is\n normal. The ini files contain host to interface\n mappings.\n\n :param host_id: Host name from transaction\n :param only_active_switch: Indicator for selecting only\n connections for switches that are active\n :returns: list of switch_ip, intf_type, port_id, is_native\n \"\"\"\n\n host_found = False\n host_connections = []\n try:\n host_ifs = nxos_db.get_host_mappings(host_id)\n except excep.NexusHostMappingNotFound:\n host_ifs = []\n for ifs in host_ifs:\n host_found = True\n if (only_active_switch and\n not self.is_switch_active(ifs.switch_ip)):\n continue\n intf_type, port = nexus_help.split_interface_name(\n ifs.if_id, ifs.ch_grp)\n # is_native set to const.NOT_NATIVE for\n # VNIC_TYPE of normal\n host_connections.append((\n ifs.switch_ip, intf_type, port,\n const.NOT_NATIVE, ifs.ch_grp))\n\n if not host_found:\n LOG.warning(HOST_NOT_FOUND, host_id)\n\n return host_connections", "response": "Get switch IPs and interfaces from config host mapping."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _get_switch_interfaces(self, requested_switch_ip, cfg_only=False):\n\n switch_ifs = []\n\n try:\n port_info = nxos_db.get_switch_host_mappings(\n requested_switch_ip)\n except excep.NexusHostMappingNotFound:\n port_info = []\n\n for binding in port_info:\n if cfg_only and 
not binding.is_static:\n continue\n intf_type, port = nexus_help.split_interface_name(\n binding.if_id)\n switch_ifs.append(\n (requested_switch_ip, intf_type, port,\n const.NOT_NATIVE, binding.ch_grp))\n return switch_ifs", "response": "Get switch interfaces from host mapping DB."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _configure_nve_db(self, vni, device_id, mcast_group, host_id):\n host_nve_connections = self._get_switch_nve_info(host_id)\n for switch_ip in host_nve_connections:\n if not nxos_db.get_nve_vni_member_bindings(vni, switch_ip,\n device_id):\n nxos_db.add_nexusnve_binding(vni, switch_ip, device_id,\n mcast_group)", "response": "Configure the nexus NVE database entry."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nconfigure NVE member vni on the switch.", "response": "def _configure_nve_member(self, vni, device_id, mcast_group, host_id):\n \"\"\"Add \"member vni\" configuration to the NVE interface.\n\n Called during update postcommit port event.\n \"\"\"\n host_nve_connections = self._get_switch_nve_info(host_id)\n\n for switch_ip in host_nve_connections:\n\n # If configured to set global VXLAN values then\n # If this is the first database entry for this switch_ip\n # then configure the \"interface nve\" entry on the switch.\n if cfg.CONF.ml2_cisco.vxlan_global_config:\n nve_bindings = nxos_db.get_nve_switch_bindings(switch_ip)\n if len(nve_bindings) == 1:\n LOG.debug(\"Nexus: create NVE interface\")\n loopback = self.get_nve_loopback(switch_ip)\n self.driver.enable_vxlan_feature(switch_ip,\n const.NVE_INT_NUM, loopback)\n\n # If this is the first database entry for this (VNI, switch_ip)\n # then configure the \"member vni #\" entry on the switch.\n member_bindings = nxos_db.get_nve_vni_switch_bindings(vni,\n switch_ip)\n if len(member_bindings) == 1:\n LOG.debug(\"Nexus: add member\")\n self.driver.create_nve_member(switch_ip, const.NVE_INT_NUM,\n vni, 
mcast_group)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ndelete the nexus NVE database entry.", "response": "def _delete_nve_db(self, vni, device_id, mcast_group, host_id):\n \"\"\"Delete the nexus NVE database entry.\n\n Called during delete precommit port event.\n \"\"\"\n rows = nxos_db.get_nve_vni_deviceid_bindings(vni, device_id)\n for row in rows:\n nxos_db.remove_nexusnve_binding(vni, row.switch_ip, device_id)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _delete_nve_member(self, vni, device_id, mcast_group, host_id):\n host_nve_connections = self._get_switch_nve_info(host_id)\n for switch_ip in host_nve_connections:\n\n if not nxos_db.get_nve_vni_switch_bindings(vni, switch_ip):\n self.driver.delete_nve_member(switch_ip,\n const.NVE_INT_NUM, vni)\n if (cfg.CONF.ml2_cisco.vxlan_global_config and\n not nxos_db.get_nve_switch_bindings(switch_ip)):\n self.driver.disable_vxlan_feature(switch_ip)", "response": "Delete member vni configuration from NVE interface."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nconfigure the nexus database entry.", "response": "def _configure_nxos_db(self, port, vlan_id, device_id, host_id, vni,\n is_provider_vlan):\n \"\"\"Create the nexus database entry.\n\n Called during update precommit port event.\n \"\"\"\n connections = self._get_port_connections(port, host_id)\n for switch_ip, intf_type, nexus_port, is_native, ch_grp in connections:\n port_id = nexus_help.format_interface_name(\n intf_type, nexus_port, ch_grp)\n try:\n nxos_db.get_nexusport_binding(port_id, vlan_id, switch_ip,\n device_id)\n except excep.NexusPortBindingNotFound:\n nxos_db.add_nexusport_binding(port_id, str(vlan_id), str(vni),\n switch_ip, device_id,\n is_native)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncollect auto_create auto_trunk from config.", "response": "def 
_gather_config_parms(self, is_provider_vlan, vlan_id):\n \"\"\"Collect auto_create, auto_trunk from config.\"\"\"\n if is_provider_vlan:\n auto_create = cfg.CONF.ml2_cisco.provider_vlan_auto_create\n auto_trunk = cfg.CONF.ml2_cisco.provider_vlan_auto_trunk\n else:\n auto_create = True\n auto_trunk = True\n return auto_create, auto_trunk"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _configure_port_binding(self, is_provider_vlan, duplicate_type,\n is_native,\n switch_ip, vlan_id,\n intf_type, nexus_port, vni):\n \"\"\"Conditionally calls vlan and port Nexus drivers.\"\"\"\n\n # This implies VLAN, VNI, and Port are all duplicate.\n # Then there is nothing to configure in Nexus.\n if duplicate_type == const.DUPLICATE_PORT:\n return\n\n auto_create, auto_trunk = self._gather_config_parms(\n is_provider_vlan, vlan_id)\n\n # if type DUPLICATE_VLAN, don't create vlan\n if duplicate_type == const.DUPLICATE_VLAN:\n auto_create = False\n\n if auto_create and auto_trunk:\n LOG.debug(\"Nexus: create vlan %s and add to interface\", vlan_id)\n self.driver.create_and_trunk_vlan(\n switch_ip, vlan_id, intf_type,\n nexus_port, vni, is_native)\n elif auto_create:\n LOG.debug(\"Nexus: create vlan %s\", vlan_id)\n self.driver.create_vlan(switch_ip, vlan_id, vni)\n elif auto_trunk:\n LOG.debug(\"Nexus: trunk vlan %s\", vlan_id)\n self.driver.send_enable_vlan_on_trunk_int(\n switch_ip, vlan_id,\n intf_type, nexus_port, is_native)", "response": "Conditionally calls vlan and port Nexus drivers."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _get_compressed_vlan_list(self, pvlan_ids):\n\n if not pvlan_ids:\n return []\n\n pvlan_list = list(pvlan_ids)\n pvlan_list.sort()\n compressed_list = []\n begin = -1\n prev_vlan = -1\n for port_vlan in pvlan_list:\n if prev_vlan == -1:\n prev_vlan = port_vlan\n else:\n if (port_vlan - prev_vlan) == 1:\n if begin == 
-1:\n begin = prev_vlan\n prev_vlan = port_vlan\n else:\n if begin == -1:\n compressed_list.append(str(prev_vlan))\n else:\n compressed_list.append(\"%d-%d\" % (begin, prev_vlan))\n begin = -1\n prev_vlan = port_vlan\n\n if begin == -1:\n compressed_list.append(str(prev_vlan))\n else:\n compressed_list.append(\"%s-%s\" % (begin, prev_vlan))\n return compressed_list", "response": "Generate a list of compressed VLANs ready for XML using a vlan set."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nrestoring a set of vlans for a given port.", "response": "def _restore_port_binding(self,\n switch_ip, pvlan_ids,\n port, native_vlan):\n \"\"\"Restores a set of vlans for a given port.\"\"\"\n\n intf_type, nexus_port = nexus_help.split_interface_name(port)\n\n # If native_vlan is configured, this is isolated since\n # two configs (native + trunk) must be sent for this vlan only.\n if native_vlan != 0:\n self.driver.send_enable_vlan_on_trunk_int(\n switch_ip, native_vlan,\n intf_type, nexus_port, True)\n # If this is the only vlan\n if len(pvlan_ids) == 1:\n return\n\n concat_vlans = ''\n compressed_vlans = self._get_compressed_vlan_list(pvlan_ids)\n for pvlan in compressed_vlans:\n\n if concat_vlans == '':\n concat_vlans = \"%s\" % pvlan\n else:\n concat_vlans += \",%s\" % pvlan\n\n # if string starts getting a bit long, send it.\n if len(concat_vlans) >= const.CREATE_PORT_VLAN_LENGTH:\n self.driver.send_enable_vlan_on_trunk_int(\n switch_ip, concat_vlans,\n intf_type, nexus_port, False)\n concat_vlans = ''\n\n # Send remaining vlans if any\n if len(concat_vlans):\n self.driver.send_enable_vlan_on_trunk_int(\n switch_ip, concat_vlans,\n intf_type, nexus_port, False)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nrestores vxlan entries on a Nexus switch.", "response": "def _restore_vxlan_entries(self, switch_ip, vlans):\n \"\"\"Restore vxlan entries on a Nexus switch.\"\"\"\n\n count = 1\n conf_str = 
''\n vnsegment_sent = 0\n path_str, conf_str = self.driver.start_create_vlan()\n # At this time, this will only configure vni information when needed\n while vnsegment_sent < const.CREATE_VLAN_BATCH and vlans:\n vlan_id, vni = vlans.pop(0)\n # Add it to the batch\n conf_str = self.driver.get_create_vlan(\n switch_ip, vlan_id, vni, conf_str)\n # batch size has been met\n if (count == const.CREATE_VLAN_SEND_SIZE):\n conf_str = self.driver.end_create_vlan(conf_str)\n self.driver.send_edit_string(switch_ip, path_str, conf_str)\n vnsegment_sent += count\n conf_str = ''\n count = 1\n else:\n count += 1\n\n # batch size was not met\n if conf_str:\n vnsegment_sent += count\n conf_str = self.driver.end_create_vlan(conf_str)\n self.driver.send_edit_string(switch_ip, path_str, conf_str)\n conf_str = ''\n\n LOG.debug(\"Switch %s VLAN vn-segment replay summary: %d\",\n switch_ip, vnsegment_sent)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncreate a nexus switch entry.", "response": "def _configure_port_entries(self, port, vlan_id, device_id, host_id, vni,\n is_provider_vlan):\n \"\"\"Create a nexus switch entry.\n\n if needed, create a VLAN in the appropriate switch or port and\n configure the appropriate interfaces for this VLAN.\n\n Called during update postcommit port event.\n \"\"\"\n connections = self._get_active_port_connections(port, host_id)\n\n # (nexus_port,switch_ip) will be unique in each iteration.\n # But switch_ip will repeat if host has >1 connection to same switch.\n # So track which switch_ips already have vlan created in this loop.\n vlan_already_created = []\n starttime = time.time()\n\n for switch_ip, intf_type, nexus_port, is_native, _ in connections:\n\n try:\n all_bindings = nxos_db.get_nexusvlan_binding(\n vlan_id, switch_ip)\n except excep.NexusPortBindingNotFound:\n LOG.warning(\"Switch %(switch_ip)s and Vlan \"\n \"%(vlan_id)s not found in port binding \"\n \"database. 
Skipping this update\",\n {'switch_ip': switch_ip, 'vlan_id': vlan_id})\n continue\n\n previous_bindings = [row for row in all_bindings\n if row.instance_id != device_id]\n if previous_bindings and (switch_ip in vlan_already_created):\n duplicate_type = const.DUPLICATE_VLAN\n else:\n vlan_already_created.append(switch_ip)\n duplicate_type = const.NO_DUPLICATE\n port_starttime = time.time()\n try:\n self._configure_port_binding(\n is_provider_vlan, duplicate_type,\n is_native,\n switch_ip, vlan_id,\n intf_type, nexus_port,\n vni)\n except Exception:\n with excutils.save_and_reraise_exception():\n self.driver.capture_and_print_timeshot(\n port_starttime, \"port_configerr\",\n switch=switch_ip)\n self.driver.capture_and_print_timeshot(\n starttime, \"configerr\",\n switch=switch_ip)\n self.driver.capture_and_print_timeshot(\n port_starttime, \"port_config\",\n switch=switch_ip)\n self.driver.capture_and_print_timeshot(\n starttime, \"config\")"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ngetting next batch of vlans and send them to Nexus switch.", "response": "def configure_next_batch_of_vlans(self, switch_ip):\n \"\"\"Get next batch of vlans and send them to Nexus.\"\"\"\n\n next_range = self._pop_vlan_range(\n switch_ip, const.CREATE_VLAN_BATCH)\n if next_range:\n try:\n self.driver.set_all_vlan_states(\n switch_ip, next_range)\n except Exception:\n with excutils.save_and_reraise_exception():\n LOG.error(\"Error encountered restoring vlans \"\n \"for switch %(switch_ip)s\",\n {'switch_ip': switch_ip})\n self._save_switch_vlan_range(switch_ip, [])\n\n vxlan_range = self._get_switch_vxlan_range(switch_ip)\n if vxlan_range:\n try:\n self._restore_vxlan_entries(switch_ip, vxlan_range)\n except Exception:\n with excutils.save_and_reraise_exception():\n LOG.error(\"Error encountered restoring vxlans \"\n \"for switch %(switch_ip)s\",\n {'switch_ip': switch_ip})\n self._save_switch_vxlan_range(switch_ip, [])\n\n # if no more vlans to restore, 
we're done. go active.\n if (not self._get_switch_vlan_range(switch_ip) and\n not self._get_switch_vxlan_range(switch_ip)):\n self.set_switch_ip_and_active_state(\n switch_ip, const.SWITCH_ACTIVE)\n LOG.info(\"Restore of Nexus switch \"\n \"ip %(switch_ip)s is complete\",\n {'switch_ip': switch_ip})\n else:\n LOG.debug((\"Restored batch of VLANS on \"\n \"Nexus switch ip %(switch_ip)s\"),\n {'switch_ip': switch_ip})"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef configure_switch_entries(self, switch_ip, port_bindings):\n prev_vlan = -1\n prev_vni = -1\n prev_port = None\n prev_native_vlan = 0\n starttime = time.time()\n\n port_bindings.sort(key=lambda x: (x.port_id, x.vlan_id, x.vni))\n self.driver.capture_and_print_timeshot(\n starttime, \"replay_t2_aft_sort\",\n switch=switch_ip)\n\n # Let's make these lists a set to exclude duplicates\n vlans = set()\n pvlans = set()\n interface_count = 0\n duplicate_port = 0\n vlan_count = 0\n for port in port_bindings:\n if nxos_db.is_reserved_binding(port):\n continue\n\n auto_create, auto_trunk = self._gather_config_parms(\n nxos_db.is_provider_vlan(port.vlan_id), port.vlan_id)\n if port.port_id == prev_port:\n if port.vlan_id == prev_vlan and port.vni == prev_vni:\n # Same port/Same Vlan - skip duplicate\n duplicate_port += 1\n continue\n else:\n # Same port/different Vlan - track it\n vlan_count += 1\n if auto_create:\n vlans.add((port.vlan_id, port.vni))\n if auto_trunk:\n pvlans.add(port.vlan_id)\n if port.is_native:\n prev_native_vlan = port.vlan_id\n else:\n # Different port - write out interface trunk on previous port\n if prev_port:\n interface_count += 1\n LOG.debug(\"Switch %s port %s replay summary: unique vlan \"\n \"count %d, duplicate port entries %d\",\n switch_ip, prev_port, vlan_count, duplicate_port)\n duplicate_port = 0\n vlan_count = 0\n if pvlans:\n self._restore_port_binding(\n switch_ip, pvlans, prev_port, prev_native_vlan)\n pvlans.clear()\n 
prev_native_vlan = 0\n # Start tracking new port\n if auto_create:\n vlans.add((port.vlan_id, port.vni))\n if auto_trunk:\n pvlans.add(port.vlan_id)\n prev_port = port.port_id\n if port.is_native:\n prev_native_vlan = port.vlan_id\n\n if pvlans:\n LOG.debug(\"Switch %s port %s replay summary: unique vlan \"\n \"count %d, duplicate port entries %d\",\n switch_ip, port.port_id, vlan_count, duplicate_port)\n self._restore_port_binding(\n switch_ip, pvlans, prev_port, prev_native_vlan)\n\n LOG.debug(\"Replayed total %d ports for Switch %s\",\n interface_count + 1, switch_ip)\n\n self.driver.capture_and_print_timeshot(\n starttime, \"replay_part_1\",\n switch=switch_ip)\n vlans = list(vlans)\n if vlans:\n vlans.sort()\n vlan, vni = vlans[0]\n if vni == 0:\n self._save_switch_vlan_range(switch_ip, vlans)\n else:\n self._save_switch_vxlan_range(switch_ip, vlans)\n\n self.set_switch_ip_and_active_state(\n switch_ip, const.SWITCH_RESTORE_S2)\n self.configure_next_batch_of_vlans(switch_ip)\n self.driver.capture_and_print_timeshot(\n starttime, \"replay_part_2\",\n switch=switch_ip)", "response": "Create a nexus switch entry in Nexus."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ndeletes the nexus database entry.", "response": "def _delete_nxos_db(self, unused, vlan_id, device_id, host_id, vni,\n is_provider_vlan):\n \"\"\"Delete the nexus database entry.\n\n Called during delete precommit port event.\n \"\"\"\n try:\n rows = nxos_db.get_nexusvm_bindings(vlan_id, device_id)\n for row in rows:\n nxos_db.remove_nexusport_binding(row.port_id, row.vlan_id,\n row.vni, row.switch_ip, row.instance_id)\n except excep.NexusPortBindingNotFound:\n return"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _delete_port_channel_resources(self, host_id, switch_ip,\n intf_type, nexus_port, port_id):\n '''This determines if port channel id needs to be freed.'''\n\n # if this 
connection is not a port-channel, nothing to do.\n if intf_type != 'port-channel':\n return\n\n # Check if this driver created it and its no longer needed.\n try:\n vpc = nxos_db.get_switch_vpc_alloc(\n switch_ip, nexus_port)\n except excep.NexusVPCAllocNotFound:\n # This can occur for non-baremetal configured\n # port-channels. Nothing more to do.\n LOG.debug(\"Switch %s portchannel %s vpc entry not \"\n \"found in vpcid alloc table.\",\n switch_ip, nexus_port)\n return\n\n # if this isn't one which was allocated or learned,\n # don't do any further processing.\n if not vpc.active:\n LOG.debug(\"Switch %s portchannel %s vpc entry not \"\n \"active.\",\n switch_ip, nexus_port)\n return\n\n # Is this port-channel still in use?\n # If so, nothing more to do.\n try:\n nxos_db.get_nexus_switchport_binding(port_id, switch_ip)\n LOG.debug(\"Switch %s portchannel %s port entries \"\n \"in use. Skipping port-channel clean-up.\",\n switch_ip, nexus_port)\n return\n except excep.NexusPortBindingNotFound:\n pass\n\n # need to get ethernet interface name\n try:\n mapping = nxos_db.get_switch_and_host_mappings(\n host_id, switch_ip)\n eth_type, eth_port = nexus_help.split_interface_name(\n mapping[0].if_id)\n except excep.NexusHostMappingNotFound:\n LOG.warning(\"Switch %s hostid %s host_mapping not \"\n \"found. 
Skipping port-channel clean-up.\",\n switch_ip, host_id)\n return\n\n # Remove the channel group from ethernet interface\n # and remove port channel from this switch.\n if not vpc.learned:\n self.driver.delete_ch_grp_to_interface(\n switch_ip, eth_type, eth_port,\n nexus_port)\n self.driver.delete_port_channel(switch_ip,\n nexus_port)\n try:\n nxos_db.free_vpcid_for_switch(nexus_port, switch_ip)\n LOG.info(\"Released portchannel %s resources for \"\n \"switch %s\",\n nexus_port, switch_ip)\n except excep.NexusVPCAllocNotFound:\n # Not all learned port channels will be in this db when\n # they're outside the configured vpc_pool so\n # this exception may be possible.\n LOG.warning(\"Failed to free vpcid %s for switch %s \"\n \"since it did not exist in table.\",\n nexus_port, switch_ip)", "response": "This function deletes the port - channel resource."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _delete_switch_entry(self, port, vlan_id, device_id, host_id, vni,\n is_provider_vlan):\n \"\"\"Delete the nexus switch entry.\n\n By accessing the current db entries determine if switch\n configuration can be removed.\n\n Called during delete postcommit port event.\n \"\"\"\n\n connections = self._get_active_port_connections(port, host_id)\n\n # (nexus_port,switch_ip) will be unique in each iteration.\n # But switch_ip will repeat if host has >1 connection to same switch.\n # So track which switch_ips already have vlan removed in this loop.\n vlan_already_removed = []\n for switch_ip, intf_type, nexus_port, is_native, _ in connections:\n\n # if there are no remaining db entries using this vlan on this\n # nexus switch port then remove vlan from the switchport trunk.\n port_id = nexus_help.format_interface_name(intf_type, nexus_port)\n auto_create = True\n auto_trunk = True\n if is_provider_vlan:\n auto_create = cfg.CONF.ml2_cisco.provider_vlan_auto_create\n auto_trunk = 
cfg.CONF.ml2_cisco.provider_vlan_auto_trunk\n\n try:\n nxos_db.get_port_vlan_switch_binding(port_id, vlan_id,\n switch_ip)\n except excep.NexusPortBindingNotFound:\n pass\n else:\n continue\n\n if auto_trunk:\n self.driver.disable_vlan_on_trunk_int(\n switch_ip, vlan_id, intf_type, nexus_port,\n is_native)\n\n # if there are no remaining db entries using this vlan on this\n # nexus switch then remove the vlan.\n if auto_create:\n try:\n nxos_db.get_nexusvlan_binding(vlan_id, switch_ip)\n except excep.NexusPortBindingNotFound:\n # Do not perform a second time on same switch\n if switch_ip not in vlan_already_removed:\n self.driver.delete_vlan(switch_ip, vlan_id)\n vlan_already_removed.append(switch_ip)\n\n self._delete_port_channel_resources(\n host_id, switch_ip, intf_type, nexus_port, port_id)\n\n if nexus_help.is_baremetal(port):\n connections = self._get_baremetal_connections(\n port, False, True)\n for switch_ip, intf_type, nexus_port, is_native, _ in connections:\n if_id = nexus_help.format_interface_name(\n intf_type, nexus_port)\n try:\n mapping = nxos_db.get_switch_if_host_mappings(\n switch_ip, if_id)\n ch_grp = mapping[0].ch_grp\n except excep.NexusHostMappingNotFound:\n ch_grp = 0\n bind_port_id = nexus_help.format_interface_name(\n intf_type, nexus_port, ch_grp)\n binding = nxos_db.get_port_switch_bindings(\n bind_port_id,\n switch_ip)\n if not binding:\n nxos_db.remove_host_mapping(if_id, switch_ip)", "response": "Delete the nexus switch entry."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _port_action_vlan(self, port, segment, func, vni):\n\n # Verify segment.\n if not self._is_valid_segment(segment):\n return\n\n device_id = self._get_port_uuid(port)\n\n if nexus_help.is_baremetal(port):\n host_id = port.get('dns_name')\n else:\n host_id = port.get(bc.portbindings.HOST_ID)\n\n vlan_id = segment.get(api.SEGMENTATION_ID)\n is_provider = nxos_db.is_provider_vlan(vlan_id)\n\n 
settings = {\"vlan_id\": vlan_id,\n \"device_id\": device_id,\n \"host_id\": host_id}\n missing_fields = [field for field, value in settings.items()\n if (field != 'host_id' and not value)]\n if not missing_fields:\n func(port, vlan_id, device_id, host_id, vni, is_provider)\n else:\n raise excep.NexusMissingRequiredFields(\n fields=' '.join(missing_fields))", "response": "Verify configuration and then process event."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _port_action_vxlan(self, port, segment, func):\n\n # If the segment is None, just log a warning message and return.\n if segment is None:\n self._log_missing_segment()\n return\n\n device_id = port.get('device_id')\n mcast_group = segment.get(api.PHYSICAL_NETWORK)\n host_id = port.get(bc.portbindings.HOST_ID)\n vni = segment.get(api.SEGMENTATION_ID)\n\n if vni and device_id and mcast_group and host_id:\n func(vni, device_id, mcast_group, host_id)\n return vni\n else:\n fields = \"vni \" if not vni else \"\"\n fields += \"device_id \" if not device_id else \"\"\n fields += \"mcast_group \" if not mcast_group else \"\"\n fields += \"host_id\" if not host_id else \"\"\n raise excep.NexusMissingRequiredFields(fields=fields)", "response": "Verify configuration and then process event."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncreate port non - database commit event.", "response": "def create_port_postcommit(self, context):\n \"\"\"Create port non-database commit event.\"\"\"\n\n # No new events are handled until replay\n # thread has put the switch in active state.\n # If a switch is in active state, verify\n # the switch is still in active state\n # before accepting this new event.\n #\n # If create_port_postcommit fails, it causes\n # other openstack dbs to be cleared and\n # retries for new VMs will stop. 
Subnet\n # transactions will continue to be retried.\n\n vlan_segment, vxlan_segment = self._get_segments(\n context.top_bound_segment,\n context.bottom_bound_segment)\n # Verify segment.\n if not self._is_valid_segment(vlan_segment):\n return\n\n port = context.current\n if self._is_supported_deviceowner(port):\n if nexus_help.is_baremetal(context.current):\n all_switches, active_switches = (\n self._get_baremetal_switches(context.current))\n else:\n host_id = context.current.get(bc.portbindings.HOST_ID)\n all_switches, active_switches = (\n self._get_host_switches(host_id))\n\n # Verify switch is still up before replay\n # thread checks.\n verified_active_switches = []\n for switch_ip in active_switches:\n try:\n self.driver.get_nexus_type(switch_ip)\n verified_active_switches.append(switch_ip)\n except Exception as e:\n LOG.error(\"Failed to ping \"\n \"switch ip %(switch_ip)s error %(exp_err)s\",\n {'switch_ip': switch_ip, 'exp_err': e})\n\n LOG.debug(\"Create Stats: thread %(thid)d, \"\n \"all_switches %(all)d, \"\n \"active %(active)d, verified %(verify)d\",\n {'thid': threading.current_thread().ident,\n 'all': len(all_switches),\n 'active': len(active_switches),\n 'verify': len(verified_active_switches)})\n\n # if host_id is valid and there is no active\n # switches remaining\n if all_switches and not verified_active_switches:\n raise excep.NexusConnectFailed(\n nexus_host=all_switches[0], config=\"None\",\n exc=\"Create Failed: Port event can not \"\n \"be processed at this time.\")"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef update_port_precommit(self, context):\n vlan_segment, vxlan_segment = self._get_segments(\n context.top_bound_segment, context.bottom_bound_segment)\n orig_vlan_segment, orig_vxlan_segment = self._get_segments(\n context.original_top_bound_segment,\n context.original_bottom_bound_segment)\n\n if (self._is_vm_migrating(context, vlan_segment, orig_vlan_segment) or\n 
self._is_status_down(context.current)):\n vni = (self._port_action_vxlan(\n context.original, orig_vxlan_segment, self._delete_nve_db)\n if orig_vxlan_segment else 0)\n self._port_action_vlan(context.original, orig_vlan_segment,\n self._delete_nxos_db, vni)\n elif self._is_supported_deviceowner(context.current):\n vni = self._port_action_vxlan(context.current, vxlan_segment,\n self._configure_nve_db) if vxlan_segment else 0\n self._port_action_vlan(context.current, vlan_segment,\n self._configure_nxos_db, vni)", "response": "Update port pre - database transaction commit event."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef update_port_postcommit(self, context):\n vlan_segment, vxlan_segment = self._get_segments(\n context.top_bound_segment, context.bottom_bound_segment)\n orig_vlan_segment, orig_vxlan_segment = self._get_segments(\n context.original_top_bound_segment,\n context.original_bottom_bound_segment)\n\n if (self._is_vm_migrating(context, vlan_segment, orig_vlan_segment) or\n self._is_status_down(context.current)):\n vni = (self._port_action_vxlan(\n context.original, orig_vxlan_segment,\n self._delete_nve_member) if orig_vxlan_segment else 0)\n self._port_action_vlan(context.original, orig_vlan_segment,\n self._delete_switch_entry, vni)\n elif self._is_supported_deviceowner(context.current):\n if nexus_help.is_baremetal(context.current):\n all_switches, active_switches = (\n self._get_baremetal_switches(context.current))\n else:\n host_id = context.current.get(bc.portbindings.HOST_ID)\n all_switches, active_switches = (\n self._get_host_switches(host_id))\n\n # if switches not active but host_id is valid\n if not active_switches and all_switches:\n raise excep.NexusConnectFailed(\n nexus_host=all_switches[0], config=\"None\",\n exc=\"Update Port Failed: Nexus Switch \"\n \"is down or replay in progress\")\n vni = self._port_action_vxlan(context.current, vxlan_segment,\n 
self._configure_nve_member) if vxlan_segment else 0\n self._port_action_vlan(context.current, vlan_segment,\n self._configure_port_entries, vni)", "response": "Update port non - database commit event."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef delete_port_precommit(self, context):\n if self._is_supported_deviceowner(context.current):\n vlan_segment, vxlan_segment = self._get_segments(\n context.top_bound_segment,\n context.bottom_bound_segment)\n vni = self._port_action_vxlan(context.current, vxlan_segment,\n self._delete_nve_db) if vxlan_segment else 0\n self._port_action_vlan(context.current, vlan_segment,\n self._delete_nxos_db, vni)", "response": "Delete port pre - database commit event."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ndeletes port non - database commit event.", "response": "def delete_port_postcommit(self, context):\n \"\"\"Delete port non-database commit event.\"\"\"\n if self._is_supported_deviceowner(context.current):\n vlan_segment, vxlan_segment = self._get_segments(\n context.top_bound_segment,\n context.bottom_bound_segment)\n vni = self._port_action_vxlan(context.current, vxlan_segment,\n self._delete_nve_member) if vxlan_segment else 0\n self._port_action_vlan(context.current, vlan_segment,\n self._delete_switch_entry, vni)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _detect_iplus(self):\n\n ver_expr = \"([0-9]+)\\.([0-9]+)\\((.*)\\)\"\n re.compile(ver_expr)\n v1 = re.match(ver_expr, self._cur_ver)\n v2 = re.match(ver_expr, self._base_ver)\n\n if int(v1.group(1)) > int(v2.group(1)):\n self._is_iplus = True\n elif int(v1.group(1)) == int(v2.group(1)):\n if int(v1.group(2)) > int(v2.group(2)):\n self._is_iplus = True\n elif int(v1.group(2)) == int(v2.group(2)):\n self._is_iplus = v1.group(3) >= v2.group(3)\n\n LOG.info(\"DCNM version: %(cur_ver)s, iplus: 
%(is_iplus)s\",\n {'cur_ver': self._cur_ver, 'is_iplus': self._is_iplus})", "response": "Detects if the DCNM version is for iplus"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nget segment id range from DCNM.", "response": "def get_segmentid_range(self, orchestrator_id):\n \"\"\"Get segment id range from DCNM. \"\"\"\n\n url = \"%s/%s\" % (self._segmentid_ranges_url, orchestrator_id)\n\n res = self._send_request('GET', url, None, 'segment-id range')\n if res and res.status_code in self._resp_ok:\n return res.json()"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nset segment id range in DCNM.", "response": "def set_segmentid_range(self, orchestrator_id, segid_min, segid_max):\n \"\"\"set segment id range in DCNM. \"\"\"\n\n url = self._segmentid_ranges_url\n\n payload = {'orchestratorId': orchestrator_id,\n 'segmentIdRanges': \"%s-%s\" % (segid_min, segid_max)}\n\n res = self._send_request('POST', url, payload, 'segment-id range')\n if not (res and res.status_code in self._resp_ok):\n LOG.error(\"Failed to set segment id range for orchestrator \"\n \"%(orch)s on DCNM: %(text)s\",\n {'orch': orchestrator_id, 'text': res.text})\n raise dexc.DfaClientRequestFailed(reason=self._failure_msg(res))"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nsetting default network config profile.", "response": "def _set_default_cfg_profile(self):\n \"\"\"Set default network config profile.\n\n Check whether the default_cfg_profile value exist in the current\n version of DCNM. 
If not, set it to new default value which is supported\n by latest version.\n \"\"\"\n try:\n cfgplist = self.config_profile_list()\n if self.default_cfg_profile not in cfgplist:\n self.default_cfg_profile = ('defaultNetworkUniversalEfProfile'\n if self._is_iplus else\n 'defaultNetworkIpv4EfProfile')\n except dexc.DfaClientRequestFailed:\n LOG.error(\"Failed to send request to DCNM.\")\n self.default_cfg_profile = 'defaultNetworkIpv4EfProfile'"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nsends create network request to DCNM.", "response": "def _create_network(self, network_info):\n \"\"\"Send create network request to DCNM.\n\n :param network_info: network parameters to be created on DCNM\n \"\"\"\n url = self._create_network_url % (network_info['organizationName'],\n network_info['partitionName'])\n payload = network_info\n\n LOG.info('url %(url)s payload %(payload)s',\n {'url': url, 'payload': payload})\n return self._send_request('POST', url, payload, 'network')"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _config_profile_get(self, thisprofile):\n url = self._cfg_profile_get_url % (thisprofile)\n payload = {}\n\n res = self._send_request('GET', url, payload, 'config-profile')\n if res and res.status_code in self._resp_ok:\n return res.json()", "response": "Get information of a config profile from DCNM."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ngetting list of supported config profile from DCNM.", "response": "def _config_profile_list(self):\n \"\"\"Get list of supported config profile from DCNM.\"\"\"\n url = self._cfg_profile_list_url\n payload = {}\n\n try:\n res = self._send_request('GET', url, payload, 'config-profile')\n if res and res.status_code in self._resp_ok:\n return res.json()\n except dexc.DfaClientRequestFailed:\n LOG.error(\"Failed to send request to DCNM.\")"} {"SOURCE": "codesearchnet", 
"instruction": "Explain what the following Python 3 code does\ndef _get_settings(self):\n url = self._global_settings_url\n payload = {}\n res = self._send_request('GET', url, payload, 'settings')\n if res and res.status_code in self._resp_ok:\n return res.json()", "response": "Get global settings from DCNM."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ncreating organization on the DCNM.", "response": "def _create_org(self, orch_id, name, desc):\n \"\"\"Create organization on the DCNM.\n\n :param orch_id: orchestrator ID\n :param name: Name of organization\n :param desc: Description of organization\n \"\"\"\n url = self._org_url\n payload = {\n \"organizationName\": name,\n \"description\": name if len(desc) == 0 else desc,\n \"orchestrationSource\": orch_id}\n\n return self._send_request('POST', url, payload, 'organization')"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _create_or_update_partition(self, org_name, part_name, desc,\n dci_id=UNKNOWN_DCI_ID, vrf_prof=None,\n service_node_ip=UNKNOWN_SRVN_NODE_IP,\n operation='POST'):\n \"\"\"Send create or update partition request to the DCNM.\n\n :param org_name: name of organization\n :param part_name: name of partition\n :param desc: description of partition\n :dci_id: DCI ID for inter-DC\n :vrf_prof: VRF Profile Name\n :service_node_ip: Service Node's Address\n \"\"\"\n if part_name is None:\n part_name = self._part_name\n if vrf_prof is None or dci_id == UNKNOWN_DCI_ID or (\n service_node_ip == UNKNOWN_SRVN_NODE_IP):\n part_info = self._get_partition(org_name, part_name)\n if vrf_prof is None:\n vrf_prof = self.get_partition_vrfProf(org_name, part_name,\n part_info=part_info)\n if dci_id == UNKNOWN_DCI_ID:\n dci_id = self.get_partition_dciId(org_name, part_name,\n part_info=part_info)\n if service_node_ip == UNKNOWN_SRVN_NODE_IP:\n service_node_ip = self.get_partition_serviceNodeIp(\n org_name, part_name, 
part_info=part_info)\n url = ((self._create_part_url % (org_name)) if operation == 'POST' else\n self._update_part_url % (org_name, part_name))\n\n payload = {\n \"partitionName\": part_name,\n \"description\": part_name if len(desc) == 0 else desc,\n \"serviceNodeIpAddress\": service_node_ip,\n \"organizationName\": org_name}\n\n # Check the DCNM version and find out whether it is necessary to have\n # extra payload for the new version when creating/updating a partition.\n if self._is_iplus:\n # Need to add extra payload for the new version.\n enable_dci = \"true\" if dci_id and int(dci_id) != 0 else \"false\"\n extra_payload = {\n \"vrfProfileName\": vrf_prof,\n \"vrfName\": ':'.join((org_name, part_name)),\n \"dciId\": dci_id,\n \"enableDCIExtension\": enable_dci}\n payload.update(extra_payload)\n\n return self._send_request(operation, url, payload, 'partition')", "response": "Create or update the partition in the DCNM."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nsending get partition request to the DCNM.", "response": "def _get_partition(self, org_name, part_name=None):\n \"\"\"Send get partition request to the DCNM.\n\n :param org_name: name of organization\n :param part_name: name of partition\n \"\"\"\n if part_name is None:\n part_name = self._part_name\n url = self._update_part_url % (org_name, part_name)\n res = self._send_request(\"GET\", url, '', 'partition')\n if res and res.status_code in self._resp_ok:\n return res.json()"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef update_partition_static_route(self, org_name, part_name,\n static_ip_list, vrf_prof=None,\n service_node_ip=None):\n \"\"\"Send static route update requests to DCNM.\n\n :param org_name: name of organization\n :param part_name: name of partition\n :static_ip_list: List of static IP addresses\n :vrf_prof: VRF Profile\n :service_node_ip: Service Node IP address\n \"\"\"\n if part_name is None:\n 
part_name = self._part_name\n if vrf_prof is None:\n vrf_prof = self.default_vrf_profile\n operation = 'PUT'\n url = (self._update_part_url % (org_name, part_name))\n ip_str = ''\n ip_cnt = 0\n for ip in static_ip_list:\n ip_sub = \"$n0\" + str(ip_cnt) + \"=\" + str(ip) + \";\"\n ip_str = ip_str + ip_sub\n ip_cnt = ip_cnt + 1\n cfg_args = {\n \"$vrfName=\" + org_name + ':' + part_name + \";\"\n \"$include_serviceNodeIpAddress=\" + service_node_ip + \";\" + ip_str\n }\n cfg_args = ';'.join(cfg_args)\n payload = {\n \"partitionName\": part_name,\n \"organizationName\": org_name,\n \"dciExtensionStatus\": \"Not configured\",\n \"vrfProfileName\": vrf_prof,\n \"vrfName\": ':'.join((org_name, part_name)),\n \"configArg\": cfg_args}\n\n res = self._send_request(operation, url, payload, 'partition')\n return (res is not None and res.status_code in self._resp_ok)", "response": "Send static route update requests to DCNM."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nsends organization delete request to DCNM.", "response": "def _delete_org(self, org_name):\n \"\"\"Send organization delete request to DCNM.\n\n :param org_name: name of organization to be deleted\n \"\"\"\n url = self._del_org_url % (org_name)\n return self._send_request('DELETE', url, '', 'organization')"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nsend partition delete request to DCNM.", "response": "def _delete_partition(self, org_name, partition_name):\n \"\"\"Send partition delete request to DCNM.\n\n :param org_name: name of organization\n :param partition_name: name of partition\n \"\"\"\n url = self._del_part % (org_name, partition_name)\n return self._send_request('DELETE', url, '', 'partition')"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nsending network delete request to DCNM.", "response": "def _delete_network(self, network_info):\n \"\"\"Send network delete request to DCNM.\n\n 
:param network_info: contains network info to be deleted.\n \"\"\"\n org_name = network_info.get('organizationName', '')\n part_name = network_info.get('partitionName', '')\n segment_id = network_info['segmentId']\n if 'mobDomainName' in network_info:\n vlan_id = network_info['vlanId']\n mob_dom_name = network_info['mobDomainName']\n url = self._network_mob_url % (org_name, part_name, vlan_id,\n mob_dom_name)\n else:\n url = self._network_url % (org_name, part_name, segment_id)\n return self._send_request('DELETE', url, '', 'network')"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nsend network get request to DCNM.", "response": "def _get_network(self, network_info):\n \"\"\"Send network get request to DCNM.\n\n :param network_info: contains network info to query.\n \"\"\"\n org_name = network_info.get('organizationName', '')\n part_name = network_info.get('partitionName', '')\n segment_id = network_info['segmentId']\n url = self._network_url % (org_name, part_name, segment_id)\n return self._send_request('GET', url, '', 'network')"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _login_request(self, url_login):\n\n expiration_time = self._exp_time\n payload = {'expirationTime': expiration_time}\n # TODO(padkrish), after testing with certificates, make the\n # verify option configurable.\n res = requests.post(url_login,\n data=jsonutils.dumps(payload),\n headers=self._req_headers,\n auth=(self._user, self._pwd),\n timeout=self.timeout_resp, verify=False)\n session_id = ''\n if res and res.status_code in self._resp_ok:\n session_id = res.json().get('Dcnm-Token')\n self._req_headers.update({'Dcnm-Token': session_id})", "response": "Internal function to send login request."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _logout_request(self, url_logout):\n\n requests.post(url_logout,\n headers=self._req_headers,\n 
timeout=self.timeout_resp, verify=False)", "response": "Internal logout request to DCNM."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _send_request(self, operation, url, payload, desc):\n\n res = None\n try:\n payload_json = None\n if payload and payload != '':\n payload_json = jsonutils.dumps(payload)\n self._login()\n desc_lookup = {'POST': ' creation', 'PUT': ' update',\n 'DELETE': ' deletion', 'GET': ' get'}\n\n res = requests.request(operation, url, data=payload_json,\n headers=self._req_headers,\n timeout=self.timeout_resp, verify=False)\n desc += desc_lookup.get(operation, operation.lower())\n LOG.info(\"DCNM-send_request: %(desc)s %(url)s %(pld)s\",\n {'desc': desc, 'url': url, 'pld': payload})\n\n self._logout()\n except (requests.HTTPError, requests.Timeout,\n requests.ConnectionError) as exc:\n LOG.exception('Error during request: %s', exc)\n raise dexc.DfaClientRequestFailed(reason=exc)\n\n return res", "response": "Send request to DCNM."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef config_profile_list(self):\n\n these_profiles = self._config_profile_list() or []\n profile_list = [q for p in these_profiles for q in\n [p.get('profileName')]]\n return profile_list", "response": "Return config profile list from DCNM."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn forwarding mode of given config profile.", "response": "def config_profile_fwding_mode_get(self, profile_name):\n \"\"\"Return forwarding mode of given config profile.\"\"\"\n\n profile_params = self._config_profile_get(profile_name)\n fwd_cli = 'fabric forwarding mode proxy-gateway'\n if profile_params and fwd_cli in profile_params['configCommands']:\n return 'proxy-gateway'\n else:\n return 'anycast-gateway'"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nget the config profile and fwding mode 
for a network.", "response": "def get_config_profile_for_network(self, net_name):\n \"\"\"Get the config profile and forwarding mode for a network.\"\"\"\n\n cfgplist = self.config_profile_list()\n cfgname = net_name.partition(':')[2]\n\n cfgtuple = set()\n for cfg_prof in cfgplist:\n if cfg_prof.startswith('defaultNetwork'):\n cfg_alias = (cfg_prof.split('defaultNetwork')[1].\n split('Profile')[0])\n elif cfg_prof.endswith('Profile'):\n cfg_alias = cfg_prof.split('Profile')[0]\n else:\n cfg_alias = cfg_prof\n cfgtuple.update([(cfg_prof, cfg_alias)])\n cfgp = [a for a, b in cfgtuple if cfgname == b]\n prof = cfgp[0] if cfgp else self.default_cfg_profile\n fwd_mod = self.config_profile_fwding_mode_get(prof)\n return (prof, fwd_mod)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ncreating the network on the DCNM.", "response": "def create_network(self, tenant_name, network, subnet,\n dhcp_range=True):\n \"\"\"Create network on the DCNM.\n\n :param tenant_name: name of tenant the network belongs to\n :param network: network parameters\n :param subnet: subnet parameters of the network\n \"\"\"\n seg_id = str(network.segmentation_id)\n subnet_ip_mask = subnet.cidr.split('/')\n gw_ip = subnet.gateway_ip\n cfg_args = [\n \"$segmentId=\" + seg_id,\n \"$netMaskLength=\" + subnet_ip_mask[1],\n \"$gatewayIpAddress=\" + gw_ip,\n \"$networkName=\" + network.name,\n \"$vlanId=0\",\n \"$vrfName=\" + tenant_name + ':' + self._part_name\n ]\n cfg_args = ';'.join(cfg_args)\n\n ip_range = ','.join([\"%s-%s\" % (p['start'], p['end']) for p in\n subnet.allocation_pools])\n\n dhcp_scopes = {'ipRange': ip_range,\n 'subnet': subnet.cidr,\n 'gateway': gw_ip}\n\n network_info = {\"segmentId\": seg_id,\n \"vlanId\": \"0\",\n \"mobilityDomainId\": \"None\",\n \"profileName\": network.config_profile,\n \"networkName\": network.name,\n \"configArg\": cfg_args,\n \"organizationName\": tenant_name,\n \"partitionName\": self._part_name,\n \"description\": network.name,\n \"netmaskLength\": 
subnet_ip_mask[1],\n \"gateway\": gw_ip}\n if dhcp_range:\n network_info[\"dhcpScope\"] = dhcp_scopes\n\n if self._is_iplus:\n # Need to add the vrf name to the network info\n prof = self._config_profile_get(network.config_profile)\n if prof and prof.get('profileSubType') == 'network:universal':\n # For universal profile vrf has to be organization:partition\n network_info[\"vrfName\"] = ':'.join((tenant_name,\n self._part_name))\n else:\n # Otherwise, it should be left empty.\n network_info[\"vrfName\"] = \"\"\n\n LOG.info(\"Creating %s network in DCNM.\", network_info)\n\n res = self._create_network(network_info)\n if res and res.status_code in self._resp_ok:\n LOG.info(\"Created %s network in DCNM.\", network_info)\n else:\n LOG.error(\"Failed to create %s network in DCNM.\",\n network_info)\n raise dexc.DfaClientRequestFailed(reason=res)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncreate the service network on the DCNM.", "response": "def create_service_network(self, tenant_name, network, subnet,\n dhcp_range=True):\n \"\"\"Create service network on the DCNM.\n\n :param tenant_name: name of tenant the network belongs to\n :param network: network parameters\n :param subnet: subnet parameters of the network\n \"\"\"\n network_info = {}\n subnet_ip_mask = subnet.cidr.split('/')\n if self._default_md is None:\n self._set_default_mobility_domain()\n vlan_id = '0'\n gw_ip = subnet.gateway_ip\n part_name = network.part_name\n if not part_name:\n part_name = self._part_name\n\n if network.vlan_id:\n vlan_id = str(network.vlan_id)\n if network.mob_domain_name is not None:\n mob_domain_name = network.mob_domain_name\n else:\n mob_domain_name = self._default_md\n else:\n mob_domain_name = None\n\n seg_id = str(network.segmentation_id)\n seg_str = \"$segmentId=\" + seg_id\n cfg_args = [\n seg_str,\n \"$netMaskLength=\" + subnet_ip_mask[1],\n \"$gatewayIpAddress=\" + gw_ip,\n \"$networkName=\" + network.name,\n \"$vlanId=\" + vlan_id,\n 
\"$vrfName=\" + tenant_name + ':' + part_name\n ]\n cfg_args = ';'.join(cfg_args)\n\n ip_range = ','.join([\"%s-%s\" % (p['start'], p['end']) for p in\n subnet.allocation_pools])\n\n dhcp_scopes = {'ipRange': ip_range,\n 'subnet': subnet.cidr,\n 'gateway': gw_ip}\n\n network_info = {\"vlanId\": vlan_id,\n \"mobilityDomainId\": mob_domain_name,\n \"profileName\": network.config_profile,\n \"networkName\": network.name,\n \"configArg\": cfg_args,\n \"organizationName\": tenant_name,\n \"partitionName\": part_name,\n \"description\": network.name,\n \"netmaskLength\": subnet_ip_mask[1],\n \"gateway\": gw_ip}\n if seg_id:\n network_info[\"segmentId\"] = seg_id\n if dhcp_range:\n network_info[\"dhcpScope\"] = dhcp_scopes\n if hasattr(subnet, 'secondary_gw'):\n network_info[\"secondaryGateway\"] = subnet.secondary_gw\n if self._is_iplus:\n # Need to add the vrf name to the network info\n prof = self._config_profile_get(network.config_profile)\n if prof and prof.get('profileSubType') == 'network:universal':\n # For universal profile vrf has to be organization:partition\n network_info[\"vrfName\"] = ':'.join((tenant_name, part_name))\n else:\n # Otherwise, it should be left empty.\n network_info[\"vrfName\"] = \"\"\n\n LOG.info(\"Creating %s network in DCNM.\", network_info)\n\n res = self._create_network(network_info)\n if res and res.status_code in self._resp_ok:\n LOG.info(\"Created %s network in DCNM.\", network_info)\n else:\n LOG.error(\"Failed to create %s network in DCNM.\",\n network_info)\n raise dexc.DfaClientRequestFailed(reason=self._failure_msg(res))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ndelete the network on the DCNM.", "response": "def delete_network(self, tenant_name, network):\n \"\"\"Delete network on the DCNM.\n\n :param tenant_name: name of tenant the network belongs to\n :param network: object that contains network parameters\n \"\"\"\n seg_id = network.segmentation_id\n network_info = {\n 'organizationName': 
tenant_name,\n 'partitionName': self._part_name,\n 'segmentId': seg_id,\n }\n LOG.debug(\"Deleting %s network in DCNM.\", network_info)\n\n res = self._delete_network(network_info)\n if res and res.status_code in self._resp_ok:\n LOG.debug(\"Deleted %s network in DCNM.\", network_info)\n else:\n LOG.error(\"Failed to delete %s network in DCNM.\",\n network_info)\n raise dexc.DfaClientRequestFailed(reason=res)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ndelete the service network on the DCNM.", "response": "def delete_service_network(self, tenant_name, network):\n \"\"\"Delete service network on the DCNM.\n\n :param tenant_name: name of tenant the network belongs to\n :param network: object that contains network parameters\n \"\"\"\n\n network_info = {}\n part_name = network.part_name\n if not part_name:\n part_name = self._part_name\n seg_id = str(network.segmentation_id)\n if network.vlan:\n vlan_id = str(network.vlan)\n if network.mob_domain_name is not None:\n mob_domain_name = network.mob_domain_name\n else:\n # The current way will not work since _default_md is obtained\n # during create_service_network. 
It's preferable to get it\n # during init TODO(padkrish)\n if self._default_md is None:\n self._set_default_mobility_domain()\n mob_domain_name = self._default_md\n network_info = {\n 'organizationName': tenant_name,\n 'partitionName': part_name,\n 'mobDomainName': mob_domain_name,\n 'vlanId': vlan_id,\n 'segmentId': seg_id,\n }\n else:\n network_info = {\n 'organizationName': tenant_name,\n 'partitionName': part_name,\n 'segmentId': seg_id,\n }\n LOG.debug(\"Deleting %s network in DCNM.\", network_info)\n\n res = self._delete_network(network_info)\n if res and res.status_code in self._resp_ok:\n LOG.debug(\"Deleted %s network in DCNM.\", network_info)\n else:\n LOG.error(\"Failed to delete %s network in DCNM.\",\n network_info)\n raise dexc.DfaClientRequestFailed(reason=self._failure_msg(res))"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ndeletes the project on the DCNM.", "response": "def delete_project(self, tenant_name, part_name):\n \"\"\"Delete project on the DCNM.\n\n :param tenant_name: name of project.\n :param part_name: name of partition.\n \"\"\"\n res = self._delete_partition(tenant_name, part_name)\n if res and res.status_code in self._resp_ok:\n LOG.debug(\"Deleted %s partition in DCNM.\", part_name)\n else:\n LOG.error(\"Failed to delete %(part)s partition in DCNM.\"\n \"Response: %(res)s\", {'part': part_name, 'res': res})\n raise dexc.DfaClientRequestFailed(reason=res)\n\n res = self._delete_org(tenant_name)\n if res and res.status_code in self._resp_ok:\n LOG.debug(\"Deleted %s organization in DCNM.\", tenant_name)\n else:\n LOG.error(\"Failed to delete %(org)s organization in DCNM.\"\n \"Response: %(res)s\", {'org': tenant_name, 'res': res})\n raise dexc.DfaClientRequestFailed(reason=res)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef delete_partition(self, org_name, partition_name):\n res = self._delete_partition(org_name, partition_name)\n if res 
and res.status_code in self._resp_ok:\n LOG.debug(\"Deleted %s partition in DCNM.\", partition_name)\n else:\n LOG.error(\"Failed to delete %(part)s partition in DCNM.\"\n \"Response: %(res)s\",\n ({'part': partition_name, 'res': res}))\n raise dexc.DfaClientRequestFailed(reason=self._failure_msg(res))", "response": "Send partition delete request to DCNM."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef create_project(self, orch_id, org_name, part_name, dci_id, desc=None):\n desc = desc or org_name\n res = self._create_org(orch_id, org_name, desc)\n if res and res.status_code in self._resp_ok:\n LOG.debug(\"Created %s organization in DCNM.\", org_name)\n else:\n LOG.error(\"Failed to create %(org)s organization in DCNM.\"\n \"Response: %(res)s\", {'org': org_name, 'res': res})\n raise dexc.DfaClientRequestFailed(reason=res)\n\n self.create_partition(org_name, part_name, dci_id,\n self.default_vrf_profile, desc=desc)", "response": "Create a project on the DCNM."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef update_project(self, org_name, part_name, dci_id=UNKNOWN_DCI_ID,\n service_node_ip=UNKNOWN_SRVN_NODE_IP,\n vrf_prof=None, desc=None):\n \"\"\"Update project on the DCNM.\n\n :param org_name: name of organization.\n :param part_name: name of partition.\n :param dci_id: Data Center interconnect id.\n :param desc: description of project.\n \"\"\"\n desc = desc or org_name\n res = self._create_or_update_partition(org_name, part_name, desc,\n dci_id=dci_id,\n service_node_ip=service_node_ip,\n vrf_prof=vrf_prof,\n operation='PUT')\n if res and res.status_code in self._resp_ok:\n LOG.debug(\"Update %s partition in DCNM.\", part_name)\n else:\n LOG.error(\"Failed to update %(part)s partition in DCNM.\"\n \"Response: %(res)s\", {'part': part_name, 'res': res})\n raise dexc.DfaClientRequestFailed(reason=res)", "response": 
"Update the project on the DCNM."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef create_partition(self, org_name, part_name, dci_id, vrf_prof,\n service_node_ip=None, desc=None):\n \"\"\"Create partition on the DCNM.\n\n :param org_name: name of organization to be created\n :param part_name: name of partition to be created\n :param dci_id: DCI ID\n :vrf_prof: VRF profile for the partition\n :param service_node_ip: Specifies the Default route IP address.\n :param desc: string that describes organization\n \"\"\"\n desc = desc or org_name\n res = self._create_or_update_partition(org_name, part_name,\n desc, dci_id=dci_id,\n service_node_ip=service_node_ip,\n vrf_prof=vrf_prof)\n if res and res.status_code in self._resp_ok:\n LOG.debug(\"Created %s partition in DCNM.\", part_name)\n else:\n LOG.error(\"Failed to create %(part)s partition in DCNM.\"\n \"Response: %(res)s\", ({'part': part_name, 'res': res}))\n raise dexc.DfaClientRequestFailed(reason=self._failure_msg(res))", "response": "Create a new partition on the DCNM."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ngetting VRF Profile for the partition from the DCNM.", "response": "def get_partition_vrfProf(self, org_name, part_name=None, part_info=None):\n \"\"\"get VRF Profile for the partition from the DCNM.\n\n :param org_name: name of organization\n :param part_name: name of partition\n \"\"\"\n vrf_profile = None\n if part_info is None:\n part_info = self._get_partition(org_name, part_name)\n LOG.info(\"query result from dcnm for partition info is %s\",\n part_info)\n if (\"vrfProfileName\" in part_info):\n vrf_profile = part_info.get(\"vrfProfileName\")\n return vrf_profile"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ngetting DCI ID for the partition.", "response": "def get_partition_dciId(self, org_name, part_name, part_info=None):\n \"\"\"get DCI ID for the partition.\n\n 
:param org_name: name of organization\n :param part_name: name of partition\n \"\"\"\n if part_info is None:\n part_info = self._get_partition(org_name, part_name)\n LOG.info(\"query result from dcnm for partition info is %s\",\n part_info)\n if part_info is not None and \"dciId\" in part_info:\n return part_info.get(\"dciId\")"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef list_networks(self, org, part):\n if org and part:\n list_url = self._del_part + '/networks'\n list_url = list_url % (org, part)\n res = self._send_request('GET', list_url, '', 'networks')\n if res and res.status_code in self._resp_ok:\n return res.json()", "response": "Return list of networks from DCNM."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef list_organizations(self):\n\n try:\n res = self._send_request('GET', self._org_url, '', 'organizations')\n if res and res.status_code in self._resp_ok:\n return res.json()\n except dexc.DfaClientRequestFailed:\n LOG.error(\"Failed to send request to DCNM.\")", "response": "Return list of organizations from DCNM."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ngets the given network from DCNM.", "response": "def get_network(self, org, segid):\n \"\"\"Return given network from DCNM.\n\n :param org: name of organization.\n :param segid: segmentation id of the network.\n \"\"\"\n network_info = {\n 'organizationName': org,\n 'partitionName': self._part_name,\n 'segmentId': segid,\n }\n res = self._get_network(network_info)\n if res and res.status_code in self._resp_ok:\n return res.json()"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ngets the DCNM version.", "response": "def get_version(self):\n \"\"\"Get the DCNM version.\"\"\"\n\n url = '%s://%s/rest/dcnm-version' % (self.dcnm_protocol, self._ip)\n payload = {}\n\n try:\n res = 
self._send_request('GET', url, payload, 'dcnm-version')\n if res and res.status_code in self._resp_ok:\n return res.json().get('Dcnm-Version')\n except dexc.DfaClientRequestFailed as exc:\n LOG.error(\"Failed to get DCNM version.\")\n sys.exit(\"ERROR: Failed to connect to DCNM: %s\" % exc)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef fill_urls(self):\n\n protocol = self.dcnm_protocol\n self._org_url = '%s://%s/rest/auto-config/organizations' % (\n (protocol, self._ip))\n self._create_network_url = ('%s://%s/' % (protocol, self._ip) +\n 'rest/auto-config/organizations'\n '/%s/partitions/%s/networks')\n self.host_protocol_url = '%s://%s/' % (protocol, self._ip)\n self._create_network_url = self._build_url(\n 'rest/auto-config/organizations'\n '/%s/partitions/%s/networks')\n self._cfg_profile_list_url = '%s://%s/rest/auto-config/profiles' % (\n (protocol, self._ip))\n self._cfg_profile_get_url = self._cfg_profile_list_url + '/%s'\n self._global_settings_url = self._build_url(\n 'rest/auto-config/settings')\n self._create_part_url = self._build_url(\n 'rest/auto-config/organizations/%s/partitions')\n self._update_part_url = self._build_url(\n 'rest/auto-config/organizations/%s/partitions/%s')\n self._del_org_url = self._build_url(\n 'rest/auto-config/organizations/%s')\n self._del_part = self._build_url(\n 'rest/auto-config/organizations/%s/partitions/%s')\n self._network_url = self._build_url(\n 'rest/auto-config/organizations/%s/partitions/'\n '%s/networks/segment/%s')\n self._network_mob_url = self._build_url(\n 'rest/auto-config/organizations/%s/partitions/'\n '%s/networks/vlan/%s/mobility-domain/%s')\n self._segmentid_ranges_url = self._build_url(\n 'rest/settings/segmentid-ranges')\n self._login_url = self._build_url('rest/logon')\n self._logout_url = self._build_url('rest/logout')", "response": "This method assigns the URLs based on the protocol and IP."} {"SOURCE": "codesearchnet", 
"instruction": "Can you write a function in Python 3 where it\nreturns a list of tuples that is built based on retrieved tables where column with name tenant_id exists.", "response": "def get_data():\n \"\"\"Returns combined list of tuples: [(table, column)].\n List is built, based on retrieved tables, where column with name\n ``tenant_id`` exists.\n \"\"\"\n\n output = []\n tables = get_tables()\n for table in tables:\n columns = get_columns(table)\n\n for column in columns:\n if column['name'] == 'tenant_id':\n output.append((table, column))\n\n return output"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nnotifying the Cisco cfg agent handling a particular hosting_device.", "response": "def _agent_notification_bulk(self, context, method, routers,\n hosting_device, operation):\n \"\"\"Notify the Cisco cfg agent handling a particular hosting_device.\n\n A single notification can contain multiple routers.\n \"\"\"\n admin_context = context.is_admin and context or context.elevated()\n dmplugin = bc.get_plugin(cisco_constants.DEVICE_MANAGER)\n if (hosting_device is not None and extensions.is_extension_supported(\n dmplugin, CFGAGENT_SCHED)):\n agents = dmplugin.get_cfg_agents_for_hosting_devices(\n admin_context, [hosting_device['id']], admin_state_up=True,\n schedule=True)\n if agents:\n agent = agents[0]\n LOG.debug('Notify %(agent_type)s at %(topic)s.%(host)s the '\n 'message %(method)s [BULK]',\n {'agent_type': agent.agent_type,\n 'topic': CFG_AGENT_L3_ROUTING,\n 'host': agent.host,\n 'method': method})\n cctxt = self.client.prepare(server=agent.host,\n version='1.1')\n cctxt.cast(context, method, routers=routers)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _notification(self, context, method, routers, operation,\n shuffle_agents):\n \"\"\"Notify all or individual Cisco cfg agents.\"\"\"\n if extensions.is_extension_supported(self._l3plugin, 
L3AGENT_SCHED):\n adm_context = (context.is_admin and context or context.elevated())\n # This is where hosting device gets scheduled to Cisco cfg agent\n self._l3plugin.schedule_routers(adm_context, routers)\n self._agent_notification(\n context, method, routers, operation, shuffle_agents)\n else:\n cctxt = self.client.prepare(topics=topics.L3_AGENT, fanout=True)\n cctxt.cast(context, method, routers=[r['id'] for r in routers])", "response": "Notify all or individual Cisco cfg agents."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef routers_updated(self, context, routers, operation=None, data=None,\n shuffle_agents=False):\n \"\"\"Notify cfg agents about configuration changes to routers.\n\n This includes operations performed on the router like when a\n router interface is added or removed.\n \"\"\"\n if routers:\n self._notification(context, 'routers_updated', routers, operation,\n shuffle_agents)", "response": "Notify cfg agents about routers configuration changes."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nnotify cfg agent about router removed from hosting device.", "response": "def router_removed_from_hosting_device(self, context, router):\n \"\"\"Notify cfg agent about router removed from hosting device.\"\"\"\n self._notification(context, 'router_removed_from_hosting_device',\n [router], operation=None, shuffle_agents=False)"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nnotifies cfg agent about router added to hosting device.", "response": "def router_added_to_hosting_device(self, context, router):\n \"\"\"Notify cfg agent about router added to hosting device.\"\"\"\n self._notification(context, 'router_added_to_hosting_device',\n [router], operation=None, shuffle_agents=False)"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nnotifying cfg agent that routers have been removed from hosting 
device.", "response": "def routers_removed_from_hosting_device(self, context, router_ids,\n hosting_device):\n \"\"\"Notify cfg agent that routers have been removed from hosting device.\n @param: context - information about tenant, user etc\n @param: router-ids - list of ids\n @param: hosting_device - device hosting the routers\n \"\"\"\n self._agent_notification_bulk(\n context, 'router_removed_from_hosting_device', router_ids,\n hosting_device, operation=None)"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nprocessing the messages in the queue and call callbacks to process them.", "response": "def process_amqp_msgs(self):\n \"\"\"Process AMQP queue messages.\n\n It connects to AMQP server and calls callbacks to process DCNM events,\n i.e. routing key containing '.cisco.dcnm.', once they arrive in the\n queue.\n \"\"\"\n\n LOG.info('Starting process_amqp_msgs...')\n while True:\n (mtd_fr, hdr_fr, body) = (None, None, None)\n try:\n if self.consume_channel:\n (mtd_fr, hdr_fr, body) = self.consume_channel.basic_get(\n self._dcnm_queue_name)\n if mtd_fr:\n # Queue has messages.\n LOG.info('RX message: %s', body)\n self._cb_dcnm_msg(mtd_fr, body)\n self.consume_channel.basic_ack(mtd_fr.delivery_tag)\n else:\n # Queue is empty.\n try:\n self._conn.sleep(1)\n except AttributeError:\n time.sleep(1)\n except Exception:\n exc_type, exc_value, exc_tb = sys.exc_info()\n tb_str = traceback.format_exception(exc_type,\n exc_value, exc_tb)\n LOG.exception(\"Failed to read from queue: %(queue)s \"\n \"%(exc_type)s, %(exc_value)s, %(exc_tb)s.\", {\n 'queue': self._dcnm_queue_name,\n 'exc_type': exc_type,\n 'exc_value': exc_value,\n 'exc_tb': tb_str})"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nchecks whether an IP address is reachable by pinging.", "response": "def _is_pingable(ip):\n \"\"\"Checks whether an IP address is reachable by pinging.\n\n Use linux utils to execute the ping (ICMP ECHO) 
command.\n Sends 5 packets with an interval of 0.2 seconds and timeout of 1\n seconds. Runtime error implies unreachability else IP is pingable.\n :param ip: IP to check\n :return: bool - True or False depending on pingability.\n \"\"\"\n ping_cmd = ['ping',\n '-c', '5',\n '-W', '1',\n '-i', '0.2',\n ip]\n try:\n linux_utils.execute(ping_cmd, check_exit_code=True)\n return True\n except RuntimeError:\n LOG.warning(\"Cannot ping ip address: %s\", ip)\n return False"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nchecking if a TCP port at IP address is possible to connect to the base TCP server.", "response": "def _can_connect(ip, port):\n \"\"\"Checks if a TCP port at IP address is possible to connect to\"\"\"\n cs = socket.socket()\n try:\n cs.connect((ip, port))\n cs.close()\n return True\n except socket.error:\n return False"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nget a list of dead hosting devices that have been marked dead.", "response": "def get_dead_hosting_devices_info(self):\n \"\"\"\n Get a list of hosting devices that have been marked dead\n :return: List of dead hosting device ids\n \"\"\"\n res = []\n for hd_id in self.hosting_devices_backlog:\n hd = self.hosting_devices_backlog[hd_id]['hd']\n if hd['hd_state'] == cc.HD_DEAD:\n res.append(hd['id'])\n return res"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef is_hosting_device_reachable(self, hosting_device):\n ret_val = False\n\n hd = hosting_device\n hd_id = hosting_device['id']\n hd_mgmt_ip = hosting_device['management_ip_address']\n\n dead_hd_list = self.get_dead_hosting_devices_info()\n if hd_id in dead_hd_list:\n LOG.debug(\"Hosting device: %(hd_id)s@%(ip)s is already marked as\"\n \" Dead. 
It is assigned as non-reachable\",\n {'hd_id': hd_id, 'ip': hd_mgmt_ip})\n return False\n\n # Modifying the 'created_at' to a date time object if it is not\n if not isinstance(hd['created_at'], datetime.datetime):\n hd['created_at'] = datetime.datetime.strptime(hd['created_at'],\n '%Y-%m-%d %H:%M:%S')\n\n if _is_pingable(hd_mgmt_ip):\n LOG.debug(\"Hosting device: %(hd_id)s@%(ip)s is reachable.\",\n {'hd_id': hd_id, 'ip': hd_mgmt_ip})\n hd['hd_state'] = cc.HD_ACTIVE\n ret_val = True\n else:\n LOG.debug(\"Hosting device: %(hd_id)s@%(ip)s is NOT reachable.\",\n {'hd_id': hd_id, 'ip': hd_mgmt_ip})\n hd['hd_state'] = cc.HD_NOT_RESPONDING\n ret_val = False\n\n if self.enable_heartbeat is True or ret_val is False:\n self.backlog_hosting_device(hd)\n\n return ret_val", "response": "Check the hosting device which hosts this resource is reachable."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncheck the status of the backlogged hosting devices.", "response": "def check_backlogged_hosting_devices(self, driver_mgr):\n \"\"\"Checks the status of backlogged hosting devices.\n\n Skips newly spun up instances during their booting time as specified\n in the boot time parameter.\n\n Each hosting-device tracked has a key, hd_state, that represents the\n last known state for the hosting device. Valid values for hd_state\n are ['Active', 'Unknown', 'Dead']\n\n Each time check_backlogged_hosting_devices is invoked, a ping-test\n is performed to determine the current state. If the current state\n differs, hd_state is updated.\n\n The hd_state transitions/actions are represented by the following\n table.\n\n +------------+---------------------+----------------+----------------+\n | current / | Active | Unknown | Dead |\n | last state | | | |\n +============+=====================+================+================+\n | Active | Device is reachable.| Device was | Dead device |\n | | No state change | temporarily | recovered. |\n | | | unreachable. 
| Trigger resync |\n +------------+---------------------+----------------+----------------+\n | Unknown | Device connectivity | Device | Not a valid |\n | | test failed. Set | connectivity | state |\n | | backlog timestamp | test failed. | transition. |\n | | and wait for dead | Dead timeout | |\n | | timeout to occur. | has not | |\n | | | occurred yet. | |\n +------------+---------------------+----------------+----------------+\n | Dead | Not a valid state | Dead timeout | Device is |\n | | transition. | for device has | still dead. |\n | | | elapsed. | No state |\n | | | Notify plugin | change. |\n +------------+---------------------+----------------+----------------+\n\n :returns: A dict of the format:\n\n ::\n\n {\"reachable\": [,..],\n \"dead\": [,..],\n \"revived\": [,..]}\n\n * reachable - a list of hosting devices that are now reachable\n * dead - a list of hosting devices deemed dead\n * revived - a list of hosting devices (dead to active)\n \"\"\"\n response_dict = {'reachable': [], 'revived': [], 'dead': []}\n LOG.debug(\"Current Backlogged hosting devices: \\n%s\\n\",\n self.hosting_devices_backlog.keys())\n for hd_id in self.hosting_devices_backlog.keys():\n hd = self.hosting_devices_backlog[hd_id]['hd']\n if not timeutils.is_older_than(hd['created_at'],\n hd['booting_time']):\n LOG.info(\"Hosting device: %(hd_id)s @ %(ip)s hasn't \"\n \"passed minimum boot time. Skipping it. 
\",\n {'hd_id': hd_id, 'ip': hd['management_ip_address']})\n continue\n LOG.info(\"Checking hosting device: %(hd_id)s @ %(ip)s for \"\n \"reachability.\", {'hd_id': hd_id,\n 'ip': hd['management_ip_address']})\n hd_state = hd['hd_state']\n if _is_pingable(hd['management_ip_address']):\n if hd_state == cc.HD_NOT_RESPONDING:\n LOG.debug(\"hosting devices revived & reachable, %s\" %\n (pprint.pformat(hd)))\n hd['hd_state'] = cc.HD_ACTIVE\n # hosting device state\n response_dict['reachable'].append(hd_id)\n elif hd_state == cc.HD_DEAD:\n # test if management port is available\n if _can_connect(hd['management_ip_address'],\n hd['protocol_port']) is True:\n LOG.debug(\"Dead hosting devices revived %s\" %\n (pprint.pformat(hd)))\n hd['hd_state'] = cc.HD_ACTIVE\n response_dict['revived'].append(hd_id)\n else:\n LOG.debug(\"Cannot connect to management port %(p)d on \"\n \"hosting device with ip %(ip)s\",\n {'p': hd['protocol_port'],\n 'ip': hd['management_ip_address']})\n else:\n LOG.debug(\"No-op.\"\n \"_is_pingable is True and current\"\n \" hd['hd_state']=%s\" % hd_state)\n\n LOG.info(\"Hosting device: %(hd_id)s @ %(ip)s is now \"\n \"reachable. Adding it to response\",\n {'hd_id': hd_id, 'ip': hd['management_ip_address']})\n else:\n LOG.info(\"Hosting device: %(hd_id)s %(hd_state)s\"\n \" @ %(ip)s not reachable \",\n {'hd_id': hd_id,\n 'hd_state': hd['hd_state'],\n 'ip': hd['management_ip_address']})\n if hd_state == cc.HD_ACTIVE:\n LOG.debug(\"hosting device lost connectivity, %s\" %\n (pprint.pformat(hd)))\n hd['backlog_insertion_ts'] = timeutils.utcnow()\n hd['hd_state'] = cc.HD_NOT_RESPONDING\n\n elif hd_state == cc.HD_NOT_RESPONDING:\n if timeutils.is_older_than(\n hd['backlog_insertion_ts'],\n cfg.CONF.cfg_agent.hosting_device_dead_timeout):\n # current hd_state is now dead, previous state: Unknown\n hd['hd_state'] = cc.HD_DEAD\n LOG.debug(\"Hosting device: %(hd_id)s @ %(ip)s hasn't \"\n \"been reachable for the \"\n \"last %(time)d seconds. 
\"\n \"Marking it dead.\",\n {'hd_id': hd_id,\n 'ip': hd['management_ip_address'],\n 'time': cfg.CONF.cfg_agent.\n hosting_device_dead_timeout})\n response_dict['dead'].append(hd_id)\n LOG.debug(\"Response: %s\", response_dict)\n return response_dict"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncreate a floating IP object.", "response": "def create_floatingip(self, context, floatingip):\n \"\"\"Create floating IP.\n\n :param context: Neutron request context\n :param floatingip: data for the floating IP being created\n :returns: A floating IP object on success\n\n As the l3 router plugin asynchronously creates floating IPs\n leveraging the l3 agent and l3 cfg agent, the initial status for the\n floating IP object will be DOWN.\n \"\"\"\n return super(CiscoRouterPlugin, self).create_floatingip(\n context, floatingip,\n initial_status=bc.constants.FLOATINGIP_STATUS_DOWN)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef attach_intf_router(self, tenant_id, tenant_name, router_id):\n in_sub = self.get_in_subnet_id(tenant_id)\n out_sub = self.get_out_subnet_id(tenant_id)\n # Modify Hard coded Name fixme\n subnet_lst = set()\n subnet_lst.add(in_sub)\n subnet_lst.add(out_sub)\n ret = self.os_helper.add_intf_router(router_id, tenant_id, subnet_lst)\n return ret, in_sub, out_sub", "response": "Routine to attach the interface to the router."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nretrieves the router ID.", "response": "def get_router_id(self, tenant_id, tenant_name):\n \"\"\"Retrieve the router ID. 
\"\"\"\n router_id = None\n if tenant_id in self.tenant_dict:\n router_id = self.tenant_dict.get(tenant_id).get('router_id')\n if not router_id:\n router_list = self.os_helper.get_rtr_by_name(\n 'FW_RTR_' + tenant_name)\n if len(router_list) > 0:\n router_id = router_list[0].get('id')\n return router_id"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef delete_intf_router(self, tenant_id, tenant_name, router_id):\n in_sub = self.get_in_subnet_id(tenant_id)\n out_sub = self.get_out_subnet_id(tenant_id)\n subnet_lst = set()\n subnet_lst.add(in_sub)\n subnet_lst.add(out_sub)\n router_id = self.get_router_id(tenant_id, tenant_name)\n if router_id:\n ret = self.os_helper.delete_intf_router(tenant_name, tenant_id,\n router_id, subnet_lst)\n if not ret:\n LOG.error(\"Failed to delete router intf id %(rtr)s, \"\n \"tenant %(tenant)s\",\n {'rtr': router_id, 'tenant': tenant_id})\n return ret\n LOG.error(\"Invalid router ID, can't delete interface from \"\n \"router\")", "response": "Routine to delete the interfaces from the router."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nsend the native FW message to the queue.", "response": "def send_router_port_msg(self, tenant_id, tenant_name, router_id, net_id,\n subnet_id, seg, status):\n \"\"\"Sends the router port message to the queue. 
\"\"\"\n data = self.prepare_router_vm_msg(tenant_id, tenant_name, router_id,\n net_id, subnet_id, seg, status)\n if data is None:\n return False\n timestamp = time.ctime()\n pri = Q_PRIORITY\n LOG.info(\"Sending native FW data into queue %(data)s\",\n {'data': data})\n self.que_obj.put((pri, timestamp, data))\n return True"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef update_dcnm_partition_static_route(self, tenant_id, arg_dict):\n ip_list = self.os_helper.get_subnet_nwk_excl(tenant_id,\n arg_dict.get('excl_list'))\n srvc_node_ip = self.get_out_srvc_node_ip_addr(tenant_id)\n ret = self.dcnm_obj.update_partition_static_route(\n arg_dict.get('tenant_name'), fw_const.SERV_PART_NAME, ip_list,\n vrf_prof=self.cfg.firewall.fw_service_part_vrf_profile,\n service_node_ip=srvc_node_ip)\n if not ret:\n LOG.error(\"Unable to update DCNM ext profile with static \"\n \"route %s\", arg_dict.get('router_id'))\n self.delete_intf_router(tenant_id, arg_dict.get('tenant_name'),\n arg_dict.get('router_id'))\n return False\n return True", "response": "Update DCNM partition static route in DCNM s partition."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _create_arg_dict(self, tenant_id, data, in_sub, out_sub):\n in_seg, in_vlan = self.get_in_seg_vlan(tenant_id)\n out_seg, out_vlan = self.get_out_seg_vlan(tenant_id)\n in_ip_dict = self.get_in_ip_addr(tenant_id)\n out_ip_dict = self.get_out_ip_addr(tenant_id)\n excl_list = [in_ip_dict.get('subnet'), out_ip_dict.get('subnet')]\n\n arg_dict = {'tenant_id': tenant_id,\n 'tenant_name': data.get('tenant_name'),\n 'in_seg': in_seg, 'in_vlan': in_vlan,\n 'out_seg': out_seg, 'out_vlan': out_vlan,\n 'router_id': data.get('router_id'),\n 'in_sub': in_sub, 'out_sub': out_sub,\n 'in_gw': in_ip_dict.get('gateway'),\n 'out_gw': out_ip_dict.get('gateway'),\n 'excl_list': excl_list}\n return arg_dict", "response": "Create the 
argument dictionary."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef create_fw(self, tenant_id, data):\n try:\n return self._create_fw(tenant_id, data)\n except Exception as exc:\n LOG.error(\"Failed to create FW for device native, tenant \"\n \"%(tenant)s data %(data)s Exc %(exc)s\",\n {'tenant': tenant_id, 'data': data, 'exc': exc})\n return False", "response": "Top level routine called when a FW is created."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef delete_fw(self, tenant_id, data):\n try:\n ret = self._delete_fw(tenant_id, data)\n return ret\n except Exception as exc:\n LOG.error(\"Failed to delete FW for device native, tenant \"\n \"%(tenant)s data %(data)s Exc %(exc)s\",\n {'tenant': tenant_id, 'data': data, 'exc': exc})\n return False", "response": "Top level routine called when a FW is deleted."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nprograming DCNM static route.", "response": "def _program_dcnm_static_route(self, tenant_id, tenant_name):\n \"\"\"Program DCNM Static Route. \"\"\"\n in_ip_dict = self.get_in_ip_addr(tenant_id)\n in_gw = in_ip_dict.get('gateway')\n in_ip = in_ip_dict.get('subnet')\n if in_gw is None:\n LOG.error(\"No FW service GW present\")\n return False\n out_ip_dict = self.get_out_ip_addr(tenant_id)\n out_ip = out_ip_dict.get('subnet')\n\n # Program DCNM to update profile's static IP address on OUT part\n excl_list = []\n excl_list.append(in_ip)\n excl_list.append(out_ip)\n subnet_lst = self.os_helper.get_subnet_nwk_excl(tenant_id, excl_list,\n excl_part=True)\n # This count is for telling DCNM to insert the static route in a\n # particular position. 
Total networks created - exclusive list as\n # above - the network that just got created.\n srvc_node_ip = self.get_out_srvc_node_ip_addr(tenant_id)\n ret = self.dcnm_obj.update_partition_static_route(\n tenant_name, fw_const.SERV_PART_NAME, subnet_lst,\n vrf_prof=self.cfg.firewall.fw_service_part_vrf_profile,\n service_node_ip=srvc_node_ip)\n if not ret:\n LOG.error(\"Unable to update DCNM ext profile with static \"\n \"route\")\n return False\n return True"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef network_create_notif(self, tenant_id, tenant_name, cidr):\n router_id = self.get_router_id(tenant_id, tenant_name)\n if not router_id:\n LOG.error(\"Router ID not present for tenant\")\n return False\n ret = self._program_dcnm_static_route(tenant_id, tenant_name)\n if not ret:\n LOG.error(\"Program DCNM with static routes failed \"\n \"for router %s\", router_id)\n return False\n\n # Program router namespace to have this network to be routed\n # to IN service network\n in_ip_dict = self.get_in_ip_addr(tenant_id)\n in_gw = in_ip_dict.get('gateway')\n if in_gw is None:\n LOG.error(\"No FW service GW present\")\n return False\n ret = self.os_helper.program_rtr_nwk_next_hop(router_id, in_gw, cidr)\n if not ret:\n LOG.error(\"Unable to program default router next hop %s\",\n router_id)\n return False\n return True", "response": "Tenant Network create Notification."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef network_delete_notif(self, tenant_id, tenant_name, network_id):\n router_id = self.get_router_id(tenant_id, tenant_name)\n if router_id is None:\n LOG.error(\"Router ID not present for tenant\")\n return False\n ret = self._program_dcnm_static_route(tenant_id, tenant_name)\n if not ret:\n LOG.error(\"Program DCNM with static routes failed for \"\n \"router %s\", router_id)\n return False\n\n # Program router namespace to have this network to be 
routed\n # to IN service network\n in_ip_dict = self.get_in_ip_addr(tenant_id)\n in_gw = in_ip_dict.get('gateway')\n in_ip = in_ip_dict.get('subnet')\n if in_gw is None:\n LOG.error(\"No FW service GW present\")\n return False\n out_ip_dict = self.get_out_ip_addr(tenant_id)\n out_ip = out_ip_dict.get('subnet')\n excl_list = []\n excl_list.append(in_ip)\n excl_list.append(out_ip)\n subnet_lst = self.os_helper.get_subnet_nwk_excl(tenant_id, excl_list,\n excl_part=True)\n ret = self.os_helper.remove_rtr_nwk_next_hop(router_id, in_gw,\n subnet_lst, excl_list)\n if not ret:\n LOG.error(\"Unable to program default router next hop %s\",\n router_id)\n return False\n return True", "response": "Tenant Network delete Notification."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef create_hosting_device_resources(self, context, complementary_id,\n tenant_id, mgmt_context, max_hosted):\n \"\"\"Create resources for a hosting device in a plugin specific way.\"\"\"\n mgmt_port = None\n if mgmt_context and mgmt_context.get('mgmt_nw_id') and tenant_id:\n # Create port for mgmt interface\n p_spec = {'port': {\n 'tenant_id': tenant_id,\n 'admin_state_up': True,\n 'name': 'mgmt',\n 'network_id': mgmt_context['mgmt_nw_id'],\n 'mac_address': bc.constants.ATTR_NOT_SPECIFIED,\n 'fixed_ips': self._mgmt_subnet_spec(context, mgmt_context),\n 'device_id': \"\",\n # Use device_owner attribute to ensure we can query for these\n # ports even before Nova has set device_id attribute.\n 'device_owner': complementary_id}}\n try:\n mgmt_port = self._core_plugin.create_port(context, p_spec)\n except n_exc.NeutronException as e:\n LOG.error('Error %s when creating management port. 
'\n 'Cleaning up.', e)\n self.delete_hosting_device_resources(\n context, tenant_id, mgmt_port)\n mgmt_port = None\n # We are setting the 'ports' to an empty list as it is expected by\n # the callee: device_handling_db._create_svc_vm_hosting_devices()\n return {'mgmt_port': mgmt_port, 'ports': []}", "response": "Create resources for a hosting device in a specific way."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning information about all resources for a hosting device.", "response": "def get_hosting_device_resources(self, context, id, complementary_id,\n tenant_id, mgmt_nw_id):\n \"\"\"Returns information about all resources for a hosting device.\"\"\"\n mgmt_port = None\n # Ports for hosting device may not yet have 'device_id' set to\n # Nova assigned uuid of VM instance. However, those ports will still\n # have 'device_owner' attribute set to complementary_id. Hence, we\n # use both attributes in the query to ensure we find all ports.\n query = context.session.query(models_v2.Port)\n query = query.filter(expr.or_(\n models_v2.Port.device_id == id,\n models_v2.Port.device_owner == complementary_id))\n for port in query:\n if port['network_id'] != mgmt_nw_id:\n raise Exception\n else:\n mgmt_port = port\n return {'mgmt_port': mgmt_port}"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ndeletes resources for a hosting device in a plugin specific way.", "response": "def delete_hosting_device_resources(self, context, tenant_id, mgmt_port,\n **kwargs):\n \"\"\"Deletes resources for a hosting device in a plugin specific way.\"\"\"\n\n if mgmt_port is not None:\n try:\n self._cleanup_hosting_port(context, mgmt_port['id'])\n except n_exc.NeutronException as e:\n LOG.error(\"Unable to delete port:%(port)s after %(tries)d\"\n \" attempts due to exception %(exception)s. 
\"\n \"Skipping it\", {'port': mgmt_port['id'],\n 'tries': DELETION_ATTEMPTS,\n 'exception': str(e)})"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef setup_logical_port_connectivity(self, context, port_db,\n hosting_device_id):\n \"\"\"Establishes connectivity for a logical port.\n\n This is done by hot plugging the interface(VIF) corresponding to the\n port from the VM.\n \"\"\"\n hosting_port = port_db.hosting_info.hosting_port\n if hosting_port:\n try:\n self._dev_mgr.svc_vm_mgr.interface_attach(hosting_device_id,\n hosting_port.id)\n LOG.debug(\"Setup logical port completed for port:%s\",\n port_db.id)\n except nova_exc.Conflict as e:\n # VM is still in vm_state building\n LOG.debug(\"Failed to attach interface - spawn thread \"\n \"error %(error)s\", {'error': str(e)})\n self._gt_pool.spawn_n(self._attach_hosting_port,\n hosting_device_id, hosting_port.id)\n except Exception as e:\n LOG.error(\"Failed to attach interface mapped to port:\"\n \"%(p_id)s on hosting device:%(hd_id)s due to \"\n \"error %(error)s\", {'p_id': hosting_port.id,\n 'hd_id': hosting_device_id,\n 'error': str(e)})", "response": "Setup connectivity for a logical port."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef teardown_logical_port_connectivity(self, context, port_db,\n hosting_device_id):\n \"\"\"Removes connectivity for a logical port.\n\n Unplugs the corresponding data interface from the VM.\n \"\"\"\n if port_db is None or port_db.get('id') is None:\n LOG.warning(\"Port id is None! 
Cannot remove port \"\n \"from hosting_device:%s\", hosting_device_id)\n return\n hosting_port_id = port_db.hosting_info.hosting_port.id\n try:\n self._dev_mgr.svc_vm_mgr.interface_detach(hosting_device_id,\n hosting_port_id)\n self._gt_pool.spawn_n(self._cleanup_hosting_port, context,\n hosting_port_id)\n LOG.debug(\"Teardown logicalport completed for port:%s\", port_db.id)\n\n except Exception as e:\n LOG.error(\"Failed to detach interface corresponding to port:\"\n \"%(p_id)s on hosting device:%(hd_id)s due to \"\n \"error %(error)s\", {'p_id': hosting_port_id,\n 'hd_id': hosting_device_id,\n 'error': str(e)})", "response": "Removes connectivity for a logical port."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef allocate_hosting_port(self, context, router_id, port_db, network_type,\n hosting_device_id):\n \"\"\"Allocates a hosting port for a logical port.\n\n We create a hosting port for the router port\n \"\"\"\n l3admin_tenant_id = self._dev_mgr.l3_tenant_id()\n hostingport_name = 'hostingport_' + port_db['id'][:8]\n p_spec = {'port': {\n 'tenant_id': l3admin_tenant_id,\n 'admin_state_up': True,\n 'name': hostingport_name,\n 'network_id': port_db['network_id'],\n 'mac_address': bc.constants.ATTR_NOT_SPECIFIED,\n 'fixed_ips': [],\n 'device_id': '',\n 'device_owner': '',\n 'port_security_enabled': False}}\n try:\n hosting_port = self._core_plugin.create_port(context, p_spec)\n except n_exc.NeutronException as e:\n LOG.error('Error %s when creating hosting port'\n 'Cleaning up.', e)\n self.delete_hosting_device_resources(\n context, l3admin_tenant_id, hosting_port)\n hosting_port = None\n finally:\n if hosting_port:\n return {'allocated_port_id': hosting_port['id'],\n 'allocated_vlan': None}\n else:\n return None", "response": "Allocates a hosting port for a logical port."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef 
disable(self, retain_port=False):\n self.update_server(disabled=True)\n if retain_port:\n return\n self.update_device(disabled=True)\n if self.conf.dhcp_delete_namespaces and self.network.namespace:\n ns_ip = ip_lib.IPWrapper(self.root_helper,\n self.network.namespace)\n try:\n ns_ip.netns.delete(self.network.namespace)\n except RuntimeError:\n msg = _('Failed trying to delete namespace: %s')\n LOG.exception(msg, self.network.namespace)", "response": "Teardown DHCP. This method disables DHCP for this network."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ncreates global dict to track device names across driver invocations and populates based on current devices configured on the system.", "response": "def recover_devices(cls):\n \"\"\"Track devices.\n\n Creates global dict to track device names across driver invocations\n and populates based on current devices configured on the system.\n \"\"\"\n\n if \"_devices\" in globals():\n return\n\n global _devices\n confs_dir = os.path.abspath(os.path.normpath(cfg.CONF.dhcp_confs))\n for netid in os.listdir(confs_dir):\n conf_dir = os.path.join(confs_dir, netid)\n intf_filename = os.path.join(conf_dir, 'interface')\n try:\n with open(intf_filename, 'r') as f:\n ifname = f.read()\n _devices[netid] = ifname\n except IOError:\n LOG.error('Unable to read interface file: %s',\n intf_filename)\n LOG.debug(\"Recovered device %s for network %s'\",\n ifname, netid)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncheck server version against minimum required version.", "response": "def check_version(cls):\n \"\"\"Checks server version against minimum required version.\"\"\"\n super(SimpleCpnrDriver, cls).check_version()\n model.configure_pnr()\n cls.recover_networks()\n ver = model.get_version()\n if ver < cls.MIN_VERSION:\n LOG.warning(\"CPNR version does not meet minimum requirements, \"\n \"expected: %(ever)f, actual: %(rver)f\",\n {'ever': cls.MIN_VERSION, 
'rver': ver})\n return ver"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns a list of existing networks ids that we have configs for.", "response": "def existing_dhcp_networks(cls, conf):\n \"\"\"Return a list of existing networks ids that we have configs for.\"\"\"\n global _networks\n sup = super(SimpleCpnrDriver, cls)\n superkeys = sup.existing_dhcp_networks(conf)\n return set(_networks.keys()) & set(superkeys)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nupdate server with latest network configuration.", "response": "def _unsafe_update_server(self, disabled=False):\n \"\"\"Update server with latest network configuration.\"\"\"\n id = self.network.id\n net = model.Network.from_neutron(self.network)\n if id not in _networks:\n if disabled:\n return\n _networks[id] = net\n _networks[id].create()\n elif disabled:\n _networks[id].delete()\n del _networks[id]\n else:\n _networks[id].update(net)\n _networks[id] = net"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ncreating the openstack network including the subnet.", "response": "def create_network(self, name, tenant_id, subnet, gw=None):\n \"\"\"Create the openstack network, including the subnet. 
\"\"\"\n\n try:\n body = {'network': {'name': name, 'tenant_id': tenant_id,\n 'admin_state_up': True}}\n netw = self.neutronclient.create_network(body=body)\n net_dict = netw.get('network')\n net_id = net_dict.get('id')\n except Exception as exc:\n LOG.error(\"Failed to create network %(name)s, Exc %(exc)s\",\n {'name': name, 'exc': str(exc)})\n return None, None\n\n try:\n if gw is None:\n body = {'subnet': {'cidr': subnet,\n 'ip_version': 4,\n 'network_id': net_id,\n 'tenant_id': tenant_id,\n 'enable_dhcp': False}}\n else:\n body = {'subnet': {'cidr': subnet,\n 'ip_version': 4,\n 'network_id': net_id,\n 'tenant_id': tenant_id,\n 'enable_dhcp': False,\n 'gateway_ip': gw}}\n subnet_ret = self.neutronclient.create_subnet(body=body)\n subnet_dict = subnet_ret.get('subnet')\n subnet_id = subnet_dict.get('id')\n except Exception as exc:\n LOG.error(\"Failed to create subnet %(sub)s, exc %(exc)s\",\n {'sub': subnet, 'exc': str(exc)})\n try:\n self.neutronclient.delete_network(net_id)\n except Exception as exc:\n LOG.error(\"Failed to delete network %(net)s, exc %(exc)s\",\n {'net': net_id, 'exc': str(exc)})\n return None, None\n return net_id, subnet_id"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef delete_network(self, name, tenant_id, subnet_id, net_id):\n try:\n self.neutronclient.delete_subnet(subnet_id)\n except Exception as exc:\n LOG.error(\"Failed to delete subnet %(sub)s exc %(exc)s\",\n {'sub': subnet_id, 'exc': str(exc)})\n return\n try:\n self.neutronclient.delete_network(net_id)\n except Exception as exc:\n LOG.error(\"Failed to delete network %(name)s exc %(exc)s\",\n {'name': name, 'exc': str(exc)})", "response": "Delete the openstack subnet and network."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef delete_network_all_subnets(self, net_id):\n try:\n body = {'network_id': net_id}\n subnet_list = 
self.neutronclient.list_subnets(body=body)\n subnet_list = subnet_list.get('subnets')\n for subnet in subnet_list:\n if subnet.get('network_id') == net_id:\n subnet_id = subnet.get('id')\n self.neutronclient.delete_subnet(subnet_id)\n except Exception as exc:\n LOG.error(\"Failed to delete subnet for net %(net)s \"\n \"Exc %(exc)s\", {'net': net_id, 'exc': str(exc)})\n return False\n try:\n self.neutronclient.delete_network(net_id)\n except Exception as exc:\n LOG.error(\"Failed to delete network %(net)s Exc %(exc)s\",\n {'net': net_id, 'exc': str(exc)})\n return False\n return True", "response": "Delete the openstack network including all its subnets."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef is_subnet_present(self, subnet_addr):\n try:\n subnet_list = self.neutronclient.list_subnets(body={})\n subnet_dat = subnet_list.get('subnets')\n for sub in subnet_dat:\n if sub.get('cidr') == subnet_addr:\n return True\n return False\n except Exception as exc:\n LOG.error(\"Failed to list subnet %(sub)s, Exc %(exc)s\",\n {'sub': subnet_addr, 'exc': str(exc)})\n return False", "response": "Returns if a subnet is present."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning all the subnets.", "response": "def get_all_subnets_cidr(self, no_mask=False):\n \"\"\"Returns all the subnets. 
\"\"\"\n body = {}\n subnet_cidrs = []\n try:\n subnet_list = self.neutronclient.list_subnets(body=body)\n subnet_dat = subnet_list.get('subnets')\n for sub in subnet_dat:\n if no_mask:\n subnet_cidrs.append(sub.get('cidr').split('/')[0])\n else:\n subnet_cidrs.append(sub.get('cidr'))\n except Exception as exc:\n LOG.error(\"Failed to list subnet Exc %s\", str(exc))\n return subnet_cidrs"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef get_subnets_for_net(self, net):\n try:\n subnet_list = self.neutronclient.list_subnets(network_id=net)\n subnet_dat = subnet_list.get('subnets')\n return subnet_dat\n except Exception as exc:\n LOG.error(\"Failed to list subnet net %(net)s, Exc: %(exc)s\",\n {'net': net, 'exc': str(exc)})\n return None", "response": "Returns the subnets in a network."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_subnet_cidr(self, subnet_id):\n try:\n subnet_list = self.neutronclient.list_subnets(id=subnet_id)\n subnet_dat = subnet_list.get('subnets')[0]\n return subnet_dat.get('cidr')\n except Exception as exc:\n LOG.error(\"Failed to list subnet for ID %(subnet)s, \"\n \"exc %(exc)s\", {'subnet': subnet_id, 'exc': exc})\n return None", "response": "retrieve the CIDR associated with a subnet given its ID."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef delete_network_subname(self, sub_name):\n try:\n body = {}\n net_list = self.neutronclient.list_networks(body=body)\n for net in net_list:\n if net.get('name').find(sub_name) != -1:\n self.delete_network_all_subnets(net.get('net_id'))\n except Exception as exc:\n LOG.error(\"Failed to get network by subname %(name)s, \"\n \"Exc %(exc)s\",\n {'name': sub_name, 'exc': str(exc)})", "response": "Delete the network by part of its name use with caution."} {"SOURCE": "codesearchnet", "instruction": "Can you 
write a function in Python 3 where it\nsearches for a openstack network by name.", "response": "def get_network_by_name(self, nwk_name):\n \"\"\"Search for a openstack network by name. \"\"\"\n ret_net_lst = []\n try:\n body = {}\n net_list = self.neutronclient.list_networks(body=body)\n net_list = net_list.get('networks')\n for net in net_list:\n if net.get('name') == nwk_name:\n ret_net_lst.append(net)\n except Exception as exc:\n LOG.error(\"Failed to get network by name %(name)s, \"\n \"Exc %(exc)s\",\n {'name': nwk_name, 'exc': str(exc)})\n return ret_net_lst"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn the list of networks in a given tenant.", "response": "def get_network_by_tenant(self, tenant_id):\n \"\"\"Returns the network of a given tenant. \"\"\"\n ret_net_lst = []\n try:\n net_list = self.neutronclient.list_networks(body={})\n for net in net_list.get('networks'):\n if net.get('tenant_id') == tenant_id:\n ret_net_lst.append(net)\n except Exception as exc:\n LOG.error(\"Failed to get network by tenant %(tenant)s, \"\n \"Exc %(exc)s\",\n {'tenant': tenant_id, 'exc': str(exc)})\n return ret_net_lst"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nsearching a router by its name.", "response": "def get_rtr_by_name(self, rtr_name):\n \"\"\"Search a router by its name. 
\"\"\"\n upd_rtr_list = []\n try:\n rtr_list = self.neutronclient.list_routers()\n for rtr in rtr_list.get('routers'):\n if rtr_name == rtr['name']:\n upd_rtr_list.append(rtr)\n except Exception as exc:\n LOG.error(\"Failed to get router by name %(name)s, \"\n \"Exc %(exc)s\",\n {'name': rtr_name, 'exc': str(exc)})\n return upd_rtr_list"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef create_router(self, name, tenant_id, subnet_lst):\n try:\n body = {'router': {'name': name, 'tenant_id': tenant_id,\n 'admin_state_up': True}}\n router = self.neutronclient.create_router(body=body)\n rout_dict = router.get('router')\n rout_id = rout_dict.get('id')\n except Exception as exc:\n LOG.error(\"Failed to create router with name %(name)s\"\n \" Exc %(exc)s\", {'name': name, 'exc': str(exc)})\n return None\n\n ret = self.add_intf_router(rout_id, tenant_id, subnet_lst)\n if not ret:\n try:\n ret = self.neutronclient.delete_router(rout_id)\n except Exception as exc:\n LOG.error(\"Failed to delete router %(name)s, Exc %(exc)s\",\n {'name': name, 'exc': str(exc)})\n return None\n return rout_id", "response": "Create a openstack router and add the interfaces."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef add_intf_router(self, rout_id, tenant_id, subnet_lst):\n try:\n for subnet_id in subnet_lst:\n body = {'subnet_id': subnet_id}\n intf = self.neutronclient.add_interface_router(rout_id,\n body=body)\n intf.get('port_id')\n except Exception as exc:\n LOG.error(\"Failed to create router intf ID %(id)s,\"\n \" Exc %(exc)s\", {'id': rout_id, 'exc': str(exc)})\n return False\n return True", "response": "Add the interfaces to a router."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ndelete the openstack router and remove the interfaces attached to it.", "response": "def delete_router(self, name, tenant_id, rout_id, subnet_lst):\n \"\"\"Delete 
the openstack router.\n\n Delete the router and remove the interfaces attached to it.\n \"\"\"\n ret = self.delete_intf_router(name, tenant_id, rout_id, subnet_lst)\n if not ret:\n return False\n\n try:\n ret = self.neutronclient.delete_router(rout_id)\n except Exception as exc:\n LOG.error(\"Failed to delete router %(name)s ret %(ret)s \"\n \"Exc %(exc)s\",\n {'name': name, 'ret': str(ret), 'exc': str(exc)})\n return False\n return True"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ndeletes the openstack router and remove the interfaces attached.", "response": "def delete_intf_router(self, name, tenant_id, rout_id, subnet_lst):\n \"\"\"Delete the openstack router and remove the interfaces attached. \"\"\"\n try:\n for subnet_id in subnet_lst:\n body = {'subnet_id': subnet_id}\n intf = self.neutronclient.remove_interface_router(rout_id,\n body=body)\n intf.get('id')\n except Exception as exc:\n LOG.error(\"Failed to delete router interface %(name)s, \"\n \" Exc %(exc)s\", {'name': name, 'exc': str(exc)})\n return False\n return True"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ndeletes the openstack router and its interfaces given its name.", "response": "def delete_router_by_name(self, rtr_name, tenant_id):\n \"\"\"Delete the openstack router and its interfaces given its name.\n\n The interfaces should be already removed prior to calling this\n function.\n \"\"\"\n try:\n routers = self.neutronclient.list_routers()\n rtr_list = routers.get('routers')\n for rtr in rtr_list:\n if rtr_name == rtr['name']:\n self.neutronclient.delete_router(rtr['id'])\n except Exception as exc:\n LOG.error(\"Failed to get and delete router by name %(name)s, \"\n \"Exc %(exc)s\",\n {'name': rtr_name, 'exc': str(exc)})\n return False\n return True"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_router_intf(self, router_id):\n 
try:\n body = {}\n self.neutronclient.show_router(router_id, body=body)\n except Exception as exc:\n LOG.error(\"Failed to show router interface %(id)s \"\n \"Exc %(exc)s\", {'id': router_id, 'exc': str(exc)})\n return", "response": "Retrieve the router interface. Incomplete."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nretrieve the router name. Incomplete.", "response": "def get_rtr_name(self, router_id):\n \"\"\"Retrieve the router name. Incomplete. \"\"\"\n try:\n body = {}\n router = self.neutronclient.show_router(router_id, body=body)\n return router.get('router').get('name')\n except Exception as exc:\n LOG.error(\"Failed to show router interface %(id)s \"\n \"Exc %(exc)s\", {'id': router_id, 'exc': str(exc)})"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef find_rtr_namespace(self, rout_id):\n if rout_id is None:\n return None\n args = ['ip', 'netns', 'list']\n try:\n ns_list = utils.execute(args, root_helper=self.root_helper)\n except Exception as exc:\n LOG.error(\"Unable to find the namespace list Exception %s\",\n exc)\n return None\n for ns in ns_list.split():\n if 'router' in ns and rout_id in ns:\n return ns", "response": "Find the namespace associated with the router."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef program_rtr(self, args, rout_id, namespace=None):\n if namespace is None:\n namespace = self.find_rtr_namespace(rout_id)\n if namespace is None:\n LOG.error(\"Unable to find namespace for router %s\", rout_id)\n return False\n final_args = ['ip', 'netns', 'exec', namespace] + args\n try:\n utils.execute(final_args, root_helper=self.root_helper)\n except Exception as e:\n LOG.error(\"Unable to execute %(cmd)s. 
\"\n \"Exception: %(exception)s\",\n {'cmd': final_args, 'exception': e})\n return False\n return True", "response": "Execute the command against the namespace."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef program_rtr_default_gw(self, tenant_id, rout_id, gw):\n args = ['route', 'add', 'default', 'gw', gw]\n ret = self.program_rtr(args, rout_id)\n if not ret:\n LOG.error(\"Program router returned error for %s\", rout_id)\n return False\n return True", "response": "Program the default gateway of a router."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nretrieve the subnets inside a network after applying the exclusion list.", "response": "def get_subnet_nwk_excl(self, tenant_id, excl_list, excl_part=False):\n \"\"\"Retrieve the subnets of a network.\n\n Get the subnets inside a network after applying the exclusion\n list.\n \"\"\"\n net_list = self.get_network_by_tenant(tenant_id)\n ret_subnet_list = []\n for net in net_list:\n if excl_part:\n name = net.get('name')\n part = name.partition('::')[2]\n if part:\n continue\n subnet_lst = self.get_subnets_for_net(net.get('id'))\n for subnet_elem in subnet_lst:\n subnet = subnet_elem.get('cidr').split('/')[0]\n subnet_and_mask = subnet_elem.get('cidr')\n if subnet not in excl_list:\n ret_subnet_list.append(subnet_and_mask)\n return ret_subnet_list"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nprograms the next hop for all networks of a tenant.", "response": "def program_rtr_all_nwk_next_hop(self, tenant_id, rout_id, next_hop,\n excl_list):\n \"\"\"Program the next hop for all networks of a tenant. 
\"\"\"\n namespace = self.find_rtr_namespace(rout_id)\n if namespace is None:\n LOG.error(\"Unable to find namespace for router %s\", rout_id)\n return False\n\n net_list = self.get_network_by_tenant(tenant_id)\n for net in net_list:\n subnet_lst = self.get_subnets_for_net(net.get('id'))\n for subnet_elem in subnet_lst:\n subnet = subnet_elem.get('cidr').split('/')[0]\n subnet_and_mask = subnet_elem.get('cidr')\n if subnet not in excl_list:\n args = ['route', 'add', '-net', subnet_and_mask, 'gw',\n next_hop]\n ret = self.program_rtr(args, rout_id, namespace=namespace)\n if not ret:\n LOG.error(\"Program router returned error for %s\",\n rout_id)\n return False\n return True"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef program_rtr_nwk_next_hop(self, rout_id, next_hop, cidr):\n namespace = self.find_rtr_namespace(rout_id)\n if namespace is None:\n LOG.error(\"Unable to find namespace for router %s\", rout_id)\n return False\n\n args = ['route', 'add', '-net', cidr, 'gw', next_hop]\n ret = self.program_rtr(args, rout_id, namespace=namespace)\n if not ret:\n LOG.error(\"Program router returned error for %s\", rout_id)\n return False\n return True", "response": "Program the next hop for a single network (given by CIDR) on a router."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nremove the next hop for all networks of a tenant.", "response": "def remove_rtr_nwk_next_hop(self, rout_id, next_hop, subnet_lst,\n excl_list):\n \"\"\"Remove the next hop for all networks of a tenant. 
\"\"\"\n namespace = self.find_rtr_namespace(rout_id)\n if namespace is None:\n LOG.error(\"Unable to find namespace for router %s\", rout_id)\n return False\n\n args = ['ip', 'route']\n ret = self.program_rtr_return(args, rout_id, namespace=namespace)\n if ret is None:\n LOG.error(\"Get routes return None %s\", rout_id)\n return False\n routes = ret.split('\\n')\n concat_lst = subnet_lst + excl_list\n for rout in routes:\n if len(rout) == 0:\n continue\n nwk = rout.split()[0]\n if nwk == 'default':\n continue\n nwk_no_mask = nwk.split('/')[0]\n if nwk_no_mask not in concat_lst and nwk not in concat_lst:\n args = ['route', 'del', '-net', nwk, 'gw', next_hop]\n ret = self.program_rtr(args, rout_id, namespace=namespace)\n if not ret:\n LOG.error(\"Program router returned error for %s\",\n rout_id)\n return False\n return True"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nget the Firewall given its ID.", "response": "def get_fw(self, fw_id):\n \"\"\"Return the Firewall given its ID. \"\"\"\n fw = None\n try:\n fw = self.neutronclient.show_firewall(fw_id)\n except Exception as exc:\n LOG.error(\"Failed to get firewall list for id %(id)s, \"\n \"Exc %(exc)s\", {'id': fw_id, 'exc': str(exc)})\n return fw"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_fw_rule(self, rule_id):\n rule = None\n try:\n rule = self.neutronclient.show_firewall_rule(rule_id)\n except Exception as exc:\n LOG.error(\"Failed to get firewall rule for id %(id)s \"\n \"Exc %(exc)s\", {'id': rule_id, 'exc': str(exc)})\n return rule", "response": "Get the firewall rule given its ID."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ngetting the firewall policy given its ID.", "response": "def get_fw_policy(self, policy_id):\n \"\"\"Return the firewall policy, given its ID. 
\"\"\"\n policy = None\n try:\n policy = self.neutronclient.show_firewall_policy(policy_id)\n except Exception as exc:\n LOG.error(\"Failed to get firewall plcy for id %(id)s \"\n \"Exc %(exc)s\",\n {'id': policy_id, 'exc': str(exc)})\n return policy"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nensure that the HA configuration is compliant.", "response": "def _ensure_create_ha_compliant(self, router, router_type):\n \"\"\"To be called in create_router() BEFORE router is created in DB.\"\"\"\n details = router.pop(ha.DETAILS, {})\n if details == ATTR_NOT_SPECIFIED:\n details = {}\n res = {ha.ENABLED: router.pop(ha.ENABLED, ATTR_NOT_SPECIFIED),\n ha.DETAILS: details}\n\n if not is_attr_set(res[ha.ENABLED]):\n res[ha.ENABLED] = router_type['ha_enabled_by_default']\n if res[ha.ENABLED] and not cfg.CONF.ha.ha_support_enabled:\n raise ha.HADisabled()\n if not res[ha.ENABLED]:\n return res\n if not is_attr_set(details.get(ha.TYPE, ATTR_NOT_SPECIFIED)):\n details[ha.TYPE] = cfg.CONF.ha.default_ha_mechanism\n if details[ha.TYPE] in cfg.CONF.ha.disabled_ha_mechanisms:\n raise ha.HADisabledHAType(ha_type=details[ha.TYPE])\n if not is_attr_set(details.get(ha.REDUNDANCY_LEVEL,\n ATTR_NOT_SPECIFIED)):\n details[ha.REDUNDANCY_LEVEL] = (\n cfg.CONF.ha.default_ha_redundancy_level)\n if not is_attr_set(details.get(ha.PROBE_CONNECTIVITY,\n ATTR_NOT_SPECIFIED)):\n details[ha.PROBE_CONNECTIVITY] = (\n cfg.CONF.ha.connectivity_probing_enabled_by_default)\n if not is_attr_set(details.get(ha.PROBE_TARGET, ATTR_NOT_SPECIFIED)):\n details[ha.PROBE_TARGET] = cfg.CONF.ha.default_probe_target\n if not is_attr_set(details.get(ha.PROBE_INTERVAL, ATTR_NOT_SPECIFIED)):\n details[ha.PROBE_INTERVAL] = cfg.CONF.ha.default_ping_interval\n return res"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nensuring that the HA attributes of the current HD router are compliant.", "response": "def _ensure_update_ha_compliant(self, router, current_router,\n 
r_hd_binding_db):\n \"\"\"To be called in update_router() BEFORE router has been\n updated in DB.\n \"\"\"\n if r_hd_binding_db.role == ROUTER_ROLE_HA_REDUNDANCY:\n return {ha.ENABLED: False}\n auto_enable_ha = r_hd_binding_db.router_type.ha_enabled_by_default\n requested_ha_details = router.pop(ha.DETAILS, {})\n # If ha_details are given then ha is assumed to be enabled even if\n # it is not explicitly specified or if auto_enable_ha says so.\n # Note that None is used to indicate that request did not include any\n # ha information was provided!\n requested_ha_enabled = router.pop(\n ha.ENABLED, True if requested_ha_details or auto_enable_ha is True\n else None)\n res = {}\n ha_currently_enabled = current_router.get(ha.ENABLED, False)\n # Note: must check for 'is True' as None implies attribute not given\n if requested_ha_enabled is True or ha_currently_enabled is True:\n if not cfg.CONF.ha.ha_support_enabled:\n raise ha.HADisabled()\n curr_ha_details = current_router.get(ha.DETAILS, {})\n if ha.TYPE in requested_ha_details:\n requested_ha_type = requested_ha_details[ha.TYPE]\n if (ha.TYPE in curr_ha_details and\n requested_ha_type != curr_ha_details[ha.TYPE]):\n raise ha.HATypeCannotBeChanged()\n elif requested_ha_type in cfg.CONF.ha.disabled_ha_mechanisms:\n raise ha.HADisabledHAType(ha_type=requested_ha_type)\n if requested_ha_enabled:\n res[ha.ENABLED] = requested_ha_enabled\n if requested_ha_details:\n res[ha.DETAILS] = requested_ha_details\n elif requested_ha_enabled is False:\n res[ha.ENABLED] = False\n return res"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _teardown_redundancy_router_gw_connectivity(self, context, router,\n router_db,\n plugging_driver):\n \"\"\"To be called in update_router() if the router gateway is to change\n BEFORE router has been updated in DB .\n \"\"\"\n if not router[ha.ENABLED]:\n # No HA currently enabled so we're done\n return\n e_context = 
context.elevated()\n # since gateway is about to change the ha group for the current gateway\n # is removed, a new one will be created later\n self._delete_ha_group(e_context, router_db.gw_port_id)\n # teardown connectivity for the gw ports on the redundancy routers\n # and remove those ports as new ones will be created later\n rr_ids = []\n for r_b_db in router_db.redundancy_bindings:\n if plugging_driver is not None:\n plugging_driver.teardown_logical_port_connectivity(\n e_context, r_b_db.redundancy_router.gw_port,\n r_b_db.redundancy_router.hosting_info.hosting_device_id)\n self._update_router_no_notify(\n e_context, r_b_db.redundancy_router_id,\n {'router': {EXTERNAL_GW_INFO: None, ha.ENABLED: False}})\n rr_ids.append(r_b_db.redundancy_router_id)\n self.notify_routers_updated(e_context, rr_ids)", "response": "Teardown connectivity for the redundancy router gateway ports on the redundancy routers"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nupdating the redundancy routers for the given context.", "response": "def _update_redundancy_routers(self, context, updated_router,\n update_specification, requested_ha_settings,\n updated_router_db, gateway_changed):\n \"\"\"To be called in update_router() AFTER router has been\n updated in DB.\n \"\"\"\n router_requested = update_specification['router']\n ha_settings_db = updated_router_db.ha_settings\n ha_enabled_requested = requested_ha_settings.get(ha.ENABLED, False)\n if not (updated_router[ha.ENABLED] or ha_enabled_requested):\n # No HA currently enabled and no HA requested so we're done\n return\n # The redundancy routers need interfaces on the same networks as the\n # user visible router.\n ports = self._get_router_interfaces(updated_router_db)\n e_context = context.elevated()\n if not updated_router[ha.ENABLED] and ha_enabled_requested:\n # No HA currently enabled but HA requested\n router_requested.update(requested_ha_settings)\n router_requested[EXTERNAL_GW_INFO] = (\n 
updated_router[EXTERNAL_GW_INFO])\n requested_ha_settings = self._ensure_create_ha_compliant(\n router_requested, updated_router_db.hosting_info.router_type)\n self._create_redundancy_routers(\n e_context, updated_router, requested_ha_settings,\n updated_router_db, ports, expire_db=True)\n return\n rr_ids = self._get_redundancy_router_ids(context, updated_router['id'])\n ha_details_update_spec = requested_ha_settings.get(ha.DETAILS)\n if (updated_router[ha.ENABLED] and not requested_ha_settings.get(\n ha.ENABLED, updated_router[ha.ENABLED])):\n # HA currently enabled but HA disable requested\n # delete ha settings and extra port for gateway (VIP) port\n self._delete_ha_group(e_context, updated_router_db.gw_port_id)\n self._remove_redundancy_routers(e_context, rr_ids, ports, True)\n with context.session.begin(subtransactions=True):\n context.session.delete(ha_settings_db)\n elif ha_details_update_spec:\n # HA currently enabled and HA setting update (other than\n # disable HA) requested\n old_redundancy_level = ha_settings_db.redundancy_level\n ha_settings_db.update(ha_details_update_spec)\n diff = (ha_details_update_spec.get(ha.REDUNDANCY_LEVEL,\n old_redundancy_level) -\n old_redundancy_level)\n with context.session.begin(subtransactions=True):\n context.session.add(ha_settings_db)\n if diff < 0:\n # Remove -diff redundancy routers\n #TODO(bobmel): Ensure currently active router is excluded\n to_remove = rr_ids[len(rr_ids) + diff:]\n rr_ids = rr_ids[:len(rr_ids) + diff]\n self._remove_redundancy_routers(e_context, to_remove, ports)\n elif diff > 0:\n # Add diff redundancy routers\n start = old_redundancy_level + 1\n stop = start + diff\n self._add_redundancy_routers(e_context, start, stop,\n updated_router, ports,\n ha_settings_db, False)\n if gateway_changed is True:\n self._change_ha_for_gateway(e_context, updated_router,\n updated_router_db, ha_settings_db,\n router_requested, expire=True)\n else:\n # Notify redundancy routers about changes\n 
self.notify_routers_updated(e_context, rr_ids)\n\n elif gateway_changed is True:\n # HA currently enabled (and to remain so), no HA setting update\n # requested, but gateway has changed\n self._change_ha_for_gateway(e_context, updated_router,\n updated_router_db, ha_settings_db,\n router_requested)\n # pick up updates to other attributes where it makes sense\n # and push - right now it is admin_state_up, name and routes.\n other_updates_spec = {'router': {}}\n if 'admin_state_up' in update_specification['router']:\n other_updates_spec['router']['admin_state_up'] = (\n update_specification['router']['admin_state_up'])\n if 'name' in update_specification['router']:\n other_updates_spec['router']['name'] = (\n update_specification['router']['name'])\n if (other_updates_spec['router'] or\n 'routes' in update_specification['router']):\n self._process_other_router_updates(e_context, updated_router_db,\n other_updates_spec)\n # Ensure we get latest state from DB\n context.session.expire(updated_router_db)\n self._extend_router_dict_ha(updated_router, updated_router_db)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nadds redundancy routers to the user visible router and its interfaces.", "response": "def _add_redundancy_routers(self, context, start_index, stop_index,\n user_visible_router, ports=None,\n ha_settings_db=None, create_ha_group=True):\n \"\"\"Creates a redundancy router and its interfaces on\n the specified subnets.\n \"\"\"\n priority = (DEFAULT_MASTER_PRIORITY +\n (start_index - 1) * PRIORITY_INCREASE_STEP)\n r = copy.deepcopy(user_visible_router)\n # No tenant_id so redundancy routers are hidden from user\n r['tenant_id'] = ''\n name = r['name']\n redundancy_r_ids = []\n for i in range(start_index, stop_index):\n del r['id']\n # We don't replicate the user visible router's routes, instead\n # they are populated to redundancy routers for get router(s) ops\n r.pop('routes', None)\n # Redundancy routers will never have a route 
spec themselves\n # The redundancy routers must have HA disabled\n r[ha.ENABLED] = False\n r['name'] = name + REDUNDANCY_ROUTER_SUFFIX + str(i)\n # set role so that purpose of this router can be easily determined\n r[routerrole.ROUTER_ROLE_ATTR] = ROUTER_ROLE_HA_REDUNDANCY\n gw_info = r[EXTERNAL_GW_INFO]\n if gw_info and gw_info['external_fixed_ips']:\n # Ensure ip addresses are not specified as they cannot be\n # same as visible router's ip addresses.\n for e_fixed_ip in gw_info['external_fixed_ips']:\n e_fixed_ip.pop('ip_address', None)\n r = self.create_router(context, {'router': r})\n LOG.debug(\"Created redundancy router %(index)d with router id \"\n \"%(r_id)s\", {'index': i, 'r_id': r['id']})\n priority += PRIORITY_INCREASE_STEP\n r_b_b = RouterRedundancyBinding(\n redundancy_router_id=r['id'],\n priority=priority,\n user_router_id=user_visible_router['id'])\n context.session.add(r_b_b)\n redundancy_r_ids.append(r['id'])\n for port_db in ports or []:\n port = self._core_plugin._make_port_dict(port_db)\n self._add_redundancy_router_interfaces(\n context, user_visible_router, None, port,\n redundancy_r_ids, ha_settings_db, create_ha_group)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nremoves all interfaces of the specified redundancy routers and then the redundancy routers themselves.", "response": "def _remove_redundancy_routers(self, context, router_ids, ports,\n delete_ha_groups=False):\n \"\"\"Deletes all interfaces of the specified redundancy routers\n and then the redundancy routers themselves.\n \"\"\"\n subnets_info = [{'subnet_id': port['fixed_ips'][0]['subnet_id']}\n for port in ports]\n for r_id in router_ids:\n for i in range(len(subnets_info)):\n self.remove_router_interface(context, r_id, subnets_info[i])\n LOG.debug(\"Removed interface on %(s_id)s to redundancy router \"\n \"with %(r_id)s\",\n {'s_id': ports[i]['network_id'], 'r_id': r_id})\n # There is only one ha group per network so only delete 
once\n if delete_ha_groups and r_id == router_ids[0]:\n self._delete_ha_group(context, ports[i]['id'])\n self.delete_router(context, r_id)\n LOG.debug(\"Deleted redundancy router %s\", r_id)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ndeletes all redundancy routers and ha groups.", "response": "def _delete_redundancy_routers(self, context, router_db):\n \"\"\"To be called in delete_router() BEFORE router has been\n deleted in DB. The router should have no interfaces.\n \"\"\"\n e_context = context.elevated()\n for binding in router_db.redundancy_bindings:\n self.delete_router(e_context, binding.redundancy_router_id)\n LOG.debug(\"Deleted redundancy router %s\",\n binding.redundancy_router_id)\n if router_db.gw_port_id:\n # delete ha settings and extra port for gateway (VIP) port\n self._delete_ha_group(e_context, router_db.gw_port_id)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nadds redundancy router interfaces to the context.", "response": "def _add_redundancy_router_interfaces(self, context, router, itfc_info,\n new_port, redundancy_router_ids=None,\n ha_settings_db=None,\n create_ha_group=True):\n \"\"\"To be called in add_router_interface() AFTER interface has been\n added to router in DB.\n \"\"\"\n # There are essentially three cases where we add interface to a\n # redundancy router:\n # 1. HA is enabled on a user visible router that has one or more\n # interfaces.\n # 2. Redundancy level is increased so one or more redundancy routers\n # are added.\n # 3. An interface is added to a user visible router.\n #\n # For 1: An HA GROUP MUST BE CREATED and EXTRA PORTS MUST BE CREATED\n # for each redundancy router. 
The id of extra port should be\n # specified in the interface_info argument of the\n # add_router_interface call so that we ADD BY PORT.\n # For 2: HA group need NOT be created as it will already exist (since\n # there is already at least one redundancy router). EXTRA PORTS\n # MUST BE CREATED for each added redundancy router. The id\n # of extra port should be specified in the interface_info\n # argument of the add_router_interface call so that we ADD BY\n # PORT.\n # For 3: if the interface for the user_visible_router was added by ...\n # a) PORT: An HA GROUP MUST BE CREATED and EXTRA PORTS MUST BE\n # CREATED for each redundancy router. The id of extra port\n # should be specified in the interface_info argument of\n # the add_router_interface call so that we ADD BY PORT.\n # b) SUBNET: There are two cases to consider. If the added interface\n # of the user_visible_router has ...\n # b1) 1 SUBNET: An HA GROUP MUST BE CREATED and EXTRA\n # PORTS MUST BE CREATED for each redundancy\n # router. The id of extra port should be\n # specified in the interface_info argument of\n # the add_router_interface call so we ADD BY\n # PORT.\n # b2) >1 SUBNETS: HA group need NOT be created as it will\n # already exist (since the redundancy routers\n # should already have extra ports to which the\n # (IPv6) subnet is added. Extra ports need\n # thus NOT be created. 
The subnet id should be\n # added to the existing extra ports.\n router_id = router['id']\n if ha_settings_db is None:\n ha_settings_db = self._get_ha_settings_by_router_id(context,\n router_id)\n if ha_settings_db is None:\n return\n e_context = context.elevated()\n add_by_subnet = (itfc_info is not None and 'subnet_id' in itfc_info and\n len(new_port['fixed_ips']) > 1)\n if (add_by_subnet is False or (itfc_info is None and\n create_ha_group is True)):\n # generate ha settings and extra port for router (VIP) port\n self._create_ha_group(e_context, router, new_port, ha_settings_db)\n fixed_ips = self._get_fixed_ips_subnets(new_port['fixed_ips'])\n for r_id in (redundancy_router_ids or\n self._get_redundancy_router_ids(e_context, router_id)):\n if add_by_subnet is True:\n # need to add subnet to redundancy router port\n ports = self._core_plugin.get_ports(\n e_context,\n filters={'device_id': [r_id],\n 'network_id': [new_port['network_id']]},\n fields=['fixed_ips', 'id'])\n redundancy_port = ports[0]\n fixed_ips = redundancy_port['fixed_ips']\n fixed_ip = {'subnet_id': itfc_info['subnet_id']}\n fixed_ips.append(fixed_ip)\n self._core_plugin.update_port(\n e_context, redundancy_port['id'],\n {'port': {'fixed_ips': fixed_ips}})\n else:\n redundancy_port = self._create_hidden_port(\n e_context, new_port['network_id'], '', fixed_ips)\n interface_info = {'port_id': redundancy_port['id']}\n self.add_router_interface(e_context, r_id, interface_info)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _update_redundancy_router_interfaces(self, context, router,\n port, modified_port_data,\n redundancy_router_ids=None,\n ha_settings_db=None):\n \"\"\"To be called when the router interfaces are updated,\n like in the case of change in port admin_state_up status\n \"\"\"\n router_id = router['id']\n if ha_settings_db is None:\n ha_settings_db = self._get_ha_settings_by_router_id(context,\n router_id)\n if ha_settings_db is None:\n 
return\n e_context = context.elevated()\n\n rr_ids = self._get_redundancy_router_ids(e_context, router_id)\n port_info_list = self._core_plugin.get_ports(\n e_context, filters={'device_id': rr_ids,\n 'network_id': [port['network_id']]},\n fields=['device_id', 'id'])\n for port_info in port_info_list:\n self._core_plugin.update_port(e_context, port_info['id'],\n modified_port_data)\n self._update_hidden_port(e_context, port['id'], modified_port_data)", "response": "To be called when the router interfaces are updated"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _remove_redundancy_router_interfaces(self, context, router_id,\n old_port):\n \"\"\"To be called in delete_router_interface() BEFORE interface has been\n removed from router in DB.\n \"\"\"\n ha_settings = self._get_ha_settings_by_router_id(context, router_id)\n if ha_settings is None or old_port is None:\n return\n e_context = context.elevated()\n rr_ids = self._get_redundancy_router_ids(e_context, router_id)\n port_info_list = self._core_plugin.get_ports(\n e_context, filters={'device_id': rr_ids,\n 'network_id': [old_port['network_id']]},\n fields=['device_id', 'fixed_ips', 'id'])\n subnet_id = old_port['fixed_ips'][0]['subnet_id']\n for port_info in port_info_list:\n if port_info['fixed_ips'][0]['subnet_id'] == subnet_id:\n interface_info = {'port_id': port_info['id']}\n self.remove_router_interface(e_context, port_info['device_id'],\n interface_info)\n self._delete_ha_group(e_context, old_port['id'])", "response": "Removes the router interfaces from the DB."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nget the redundancy routers for the given context and router_id.", "response": "def _redundancy_routers_for_floatingip(\n self, context, router_id, redundancy_router_ids=None,\n ha_settings_db=None):\n \"\"\"To be called in update_floatingip() to get the\n redundant router ids.\n \"\"\"\n if 
ha_settings_db is None:\n ha_settings_db = self._get_ha_settings_by_router_id(context,\n router_id)\n if ha_settings_db is None:\n return\n\n e_context = context.elevated()\n router_ids = []\n for r_id in (redundancy_router_ids or\n self._get_redundancy_router_ids(e_context, router_id)):\n router_ids.append(r_id)\n\n return router_ids"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _populate_ha_information(self, context, router):\n r_r_b = self._get_redundancy_router_bindings(\n context, redundancy_router_id=router['id'])\n if not r_r_b:\n if router[ha.ENABLED]:\n # The router is a user visible router with HA enabled.\n user_router_id = router['id']\n fips = []\n else:\n # The router is a user visible router with HA disabled.\n # Nothing more to do here.\n return\n else:\n # The router is a redundancy router.\n # Need to fetch floatingip configurations from user visible router\n # so they can be added to the redundancy routers.\n user_router_id = r_r_b[0].user_router_id\n fips = self.get_floatingips(context,\n {'router_id': [user_router_id]})\n if router['id'] != user_router_id:\n # We add the HA settings from user visible router to\n # its redundancy routers.\n user_router_db = self._get_router(context, user_router_id)\n self._extend_router_dict_ha(router, user_router_db)\n # The interfaces of the user visible router must use the\n # IP configuration of the extra ports in the HA groups.\n hag_dbs = self._get_subnet_id_indexed_ha_groups(context,\n user_router_id)\n e_context = context.elevated()\n if router.get('gw_port'):\n modified_interfaces = []\n interface_port = self._populate_port_ha_information(\n e_context, router['gw_port'], router['id'], hag_dbs,\n user_router_id, modified_interfaces)\n if not interface_port:\n # The router has a gw_port but cannot find the port info yet\n # so mark this router to have incomplete info and bail.\n # The cfg_agent puts this in the updated_routers to ask 
again.\n router['status'] = cisco_constants.ROUTER_INFO_INCOMPLETE\n return\n if modified_interfaces:\n router['gw_port'] = interface_port\n modified_interfaces = []\n for itfc in router.get(bc.constants.INTERFACE_KEY, []):\n interface_port = self._populate_port_ha_information(\n e_context, itfc, router['id'], hag_dbs, user_router_id,\n modified_interfaces)\n if not interface_port:\n # the router has interfaces but cannot find the port info yet\n # so mark this router to have incomplete info and bail\n # the cfg_agent will put this in the updated_list to ask again\n router['status'] = cisco_constants.ROUTER_INFO_INCOMPLETE\n return\n if modified_interfaces:\n router[bc.constants.INTERFACE_KEY] = modified_interfaces\n if fips:\n router[bc.constants.FLOATINGIP_KEY] = fips", "response": "Populate the HA information of the user visible router with the user visible router."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncreate hidden port used in HA purposes.", "response": "def _create_hidden_port(self, context, network_id, device_id, fixed_ips,\n port_type=DEVICE_OWNER_ROUTER_INTF):\n \"\"\"Creates port used specially for HA purposes.\"\"\"\n port = {'port': {\n 'tenant_id': '', # intentionally not set\n 'network_id': network_id,\n 'mac_address': ATTR_NOT_SPECIFIED,\n 'fixed_ips': fixed_ips,\n 'device_id': device_id,\n 'device_owner': port_type,\n 'admin_state_up': True,\n 'name': ''}}\n if extensions.is_extension_supported(self._core_plugin,\n \"dns-integration\"):\n port['port'].update(dns_name='')\n core_plugin = bc.get_plugin()\n return core_plugin.create_port(context, port)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get_router_for_floatingip(self, context, internal_port,\n internal_subnet, external_network_id):\n \"\"\"We need to over-load this function so that we only return the\n user visible router and never its redundancy routers (as they never\n have 
floatingips associated with them).\n \"\"\"\n gw_port = orm.aliased(models_v2.Port, name=\"gw_port\")\n routerport_qry = context.session.query(\n RouterPort.router_id, models_v2.IPAllocation.ip_address).join(\n models_v2.Port, models_v2.IPAllocation).filter(\n models_v2.Port.network_id == internal_port['network_id'],\n RouterPort.port_type.in_(bc.constants.ROUTER_INTERFACE_OWNERS),\n models_v2.IPAllocation.subnet_id == internal_subnet['id']\n ).join(gw_port, gw_port.device_id == RouterPort.router_id).filter(\n gw_port.network_id == external_network_id,\n gw_port.device_owner == bc.constants.DEVICE_OWNER_ROUTER_GW\n ).distinct()\n\n # Ensure that redundancy routers (in a ha group) are not returned,\n # since only the user visible router should have floatingips.\n # This can be done by checking that the id of routers does not\n # appear in the 'redundancy_router_id' column in the\n # 'cisco_router_redundancy_bindings' table.\n routerport_qry = routerport_qry.outerjoin(\n RouterRedundancyBinding,\n RouterRedundancyBinding.redundancy_router_id ==\n RouterPort.router_id)\n routerport_qry = routerport_qry.filter(\n RouterRedundancyBinding.redundancy_router_id == expr.null())\n\n first_router_id = None\n for router_id, interface_ip in routerport_qry:\n if interface_ip == internal_subnet['gateway_ip']:\n return router_id\n if not first_router_id:\n first_router_id = router_id\n if first_router_id:\n return first_router_id\n\n raise l3_exceptions.ExternalGatewayForFloatingIPNotFound(\n subnet_id=internal_subnet['id'],\n external_network_id=external_network_id,\n port_id=internal_port['id'])", "response": "Get the router for a given internal port and subnet."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nallocating the firewall device.", "response": "def allocate_fw_dev(self, fw_id):\n \"\"\"Allocate firewall device.\n\n Allocate the first Firewall device which has resources available.\n \"\"\"\n for cnt in self.res:\n used = 
self.res.get(cnt).get('used')\n if used < self.res.get(cnt).get('quota'):\n self.res[cnt]['used'] = used + 1\n self.res[cnt]['fw_id_lst'].append(fw_id)\n return self.res[cnt].get('obj_dict'), (\n self.res[cnt].get('mgmt_ip'))\n return None, None"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef populate_fw_dev(self, fw_id, mgmt_ip, new):\n for cnt in self.res:\n used = self.res.get(cnt).get('used')\n if mgmt_ip == self.res[cnt].get('mgmt_ip'):\n if new:\n self.res[cnt]['used'] = used + 1\n self.res[cnt]['fw_id_lst'].append(fw_id)\n return self.res[cnt].get('obj_dict'), (\n self.res[cnt].get('mgmt_ip'))\n return None, None", "response": "Populate the class after a restart."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nreturn the object dict and mgmt ip for a firewall.", "response": "def get_fw_dev_map(self, fw_id):\n \"\"\"Return the object dict and mgmt ip for a firewall. \"\"\"\n for cnt in self.res:\n if fw_id in self.res.get(cnt).get('fw_id_lst'):\n return self.res[cnt].get('obj_dict'), (\n self.res[cnt].get('mgmt_ip'))\n return None, None"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreleases the firewall resource.", "response": "def deallocate_fw_dev(self, fw_id):\n \"\"\"Release the firewall resource. \"\"\"\n for cnt in self.res:\n if fw_id in self.res.get(cnt).get('fw_id_lst'):\n self.res[cnt]['used'] = self.res[cnt]['used'] - 1\n self.res.get(cnt).get('fw_id_lst').remove(fw_id)\n return"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\npopulates the local cache from FW DB after restart.", "response": "def populate_local_sch_cache(self, fw_dict):\n \"\"\"Populate the local cache from FW DB after restart. 
\"\"\"\n for fw_id in fw_dict:\n fw_data = fw_dict.get(fw_id)\n mgmt_ip = fw_data.get('fw_mgmt_ip')\n dev_status = fw_data.get('device_status')\n if dev_status == 'SUCCESS':\n new = True\n else:\n new = False\n if mgmt_ip is not None:\n drvr_dict, mgmt_ip = self.sched_obj.populate_fw_dev(fw_id,\n mgmt_ip,\n new)\n if drvr_dict is None or mgmt_ip is None:\n LOG.info(\"Pop cache for FW sch: drvr_dict or mgmt_ip \"\n \"is None\")"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef drvr_initialize(self, cfg):\n cnt = 0\n for ip in self.obj_dict:\n cfg_dict = {}\n drvr_obj = self.obj_dict.get(ip).get('drvr_obj')\n cfg_dict['mgmt_ip_addr'] = ip\n if self.user_list is not None:\n cfg_dict['user'] = self.user_list[cnt]\n if self.pwd_list is not None:\n cfg_dict['pwd'] = self.pwd_list[cnt]\n if self.interface_in_list is not None:\n cfg_dict['interface_in'] = self.interface_in_list[cnt]\n if self.interface_out_list is not None:\n cfg_dict['interface_out'] = self.interface_out_list[cnt]\n drvr_obj.initialize(cfg_dict)\n cnt = cnt + 1", "response": "Initialize the driver routines."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef populate_event_que(self, que_obj):\n for ip in self.obj_dict:\n drvr_obj = self.obj_dict.get(ip).get('drvr_obj')\n drvr_obj.populate_event_que(que_obj)", "response": "Populates the event queue object."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef populate_dcnm_obj(self, dcnm_obj):\n for ip in self.obj_dict:\n drvr_obj = self.obj_dict.get(ip).get('drvr_obj')\n drvr_obj.populate_dcnm_obj(dcnm_obj)", "response": "Populates the DCNM object."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef is_device_virtual(self):\n for ip in self.obj_dict:\n drvr_obj = self.obj_dict.get(ip).get('drvr_obj')\n ret = 
drvr_obj.is_device_virtual()\n # No way to pin a device as of now, so return the first\n # TODO(padkrish)\n return ret", "response": "Returns if the device is physical or virtual."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nmodify the firewall cfg.", "response": "def modify_fw_device(self, tenant_id, fw_id, data):\n \"\"\"Modifies the firewall cfg. \"\"\"\n drvr_dict, mgmt_ip = self.sched_obj.get_fw_dev_map(fw_id)\n return drvr_dict.get('drvr_obj').modify_fw(tenant_id, data)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nreturn the router IDs scheduled to l3 agent on host", "response": "def get_cfg_router_ids(self, context, host, router_ids=None,\n hosting_device_ids=None):\n \"\"\"Returns IDs of routers scheduled to l3 agent on \"\"\"\n return self._l3plugin.cfg_list_router_ids_on_host(context, host,\n router_ids,\n hosting_device_ids)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef cfg_sync_routers(self, context, host, router_ids=None,\n hosting_device_ids=None):\n \"\"\"Sync routers according to filters to a specific Cisco cfg agent.\n\n :param context: contains user information\n :param host: originator of callback\n :param router_ids: list of router ids to return information about\n :param hosting_device_ids: list of hosting device ids to get\n routers for.\n :returns: a list of routers with their hosting devices, interfaces and\n floating_ips\n \"\"\"\n adm_context = bc.context.get_admin_context()\n try:\n routers = (\n self._l3plugin.list_active_sync_routers_on_hosting_devices(\n adm_context, host, router_ids, hosting_device_ids))\n except AttributeError:\n routers = []\n LOG.debug('Routers returned to Cisco cfg agent@%(agt)s:\\n %(routers)s',\n {'agt': host, 'routers': jsonutils.dumps(routers, indent=5)})\n return routers", "response": "Sync routers according to filters to a specific Cisco cfg agent."} {"SOURCE": "codesearchnet", 
"instruction": "Can you generate the documentation for the following Python 3 function\ndef update_floatingip_statuses_cfg(self, context, router_id, fip_statuses):\n with context.session.begin(subtransactions=True):\n for (floatingip_id, status) in six.iteritems(fip_statuses):\n LOG.debug(\"New status for floating IP %(floatingip_id)s: \"\n \"%(status)s\", {'floatingip_id': floatingip_id,\n 'status': status})\n try:\n self._l3plugin.update_floatingip_status(\n context, floatingip_id, status)\n except l3_exceptions.FloatingIPNotFound:\n LOG.debug(\"Floating IP: %s no longer present.\",\n floatingip_id)\n # Find all floating IPs known to have been associated with the\n # given router for which an update was not received. Set them DOWN\n # mercilessly. This situation might occur for some asynchronous\n # backends if notifications were missed.\n known_router_fips = self._l3plugin.get_floatingips(\n context, {'last_known_router_id': [router_id]})\n # Consider only floating ips which were disassociated in the API\n fips_to_disable = (fip['id'] for fip in known_router_fips\n if not fip['router_id'])\n for fip_id in fips_to_disable:\n LOG.debug(\"update_fip_statuses: disable: %s\", fip_id)\n self._l3plugin.update_floatingip_status(\n context, fip_id, bc.constants.FLOATINGIP_STATUS_DOWN)", "response": "Update the operational status for one or several floating IPs."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef update_port_statuses_cfg(self, context, port_ids, status):\n self._l3plugin.update_router_port_statuses(context, port_ids,\n status)", "response": "Update the operational statuses of a list of router ports."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef get_mysql_credentials(cfg_file):\n\n try:\n parser = ConfigParser.ConfigParser()\n cfg_fp = open(cfg_file)\n parser.readfp(cfg_fp)\n cfg_fp.close()\n except ConfigParser.NoOptionError:\n cfg_fp.close()\n 
print('Failed to find mysql connections credentials.')\n sys.exit(1)\n except IOError:\n print('ERROR: Cannot open %s.' % cfg_file)\n sys.exit(1)\n\n value = parser.get('dfa_mysql', 'connection')\n\n try:\n # Find location of pattern in connection parameter as shown below:\n # http://username:password@host/databasename?characterset=encoding'\n sobj = re.search(r\"(://).*(@).*(/).*(\\?)\", value)\n\n # The list parameter contains:\n # indices[0], is the index of '://'\n # indices[1], is the index of '@'\n # indices[2], is the index of '/'\n # indices[3], is the index of '?'\n indices = [sobj.start(1), sobj.start(2), sobj.start(3), sobj.start(4)]\n\n # Get the credentials\n cred = value[indices[0] + 3:indices[1]].split(':')\n\n # Get the host name\n host = value[indices[1] + 1:indices[2]]\n\n # Get the database name\n db_name = value[indices[2] + 1:indices[3]]\n\n # Get the character encoding\n charset = value[indices[3] + 1:].split('=')[1]\n\n return cred[0], cred[1], host, db_name, charset\n except (ValueError, IndexError, AttributeError):\n print('Failed to find mysql connections credentials.')\n sys.exit(1)", "response": "Get the credentials and database name from options in config file."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nmodifying config file neutron and keystone to include enabler options.", "response": "def modify_conf(cfgfile, service_name, outfn):\n \"\"\"Modify config file neutron and keystone to include enabler options.\"\"\"\n\n if not cfgfile or not outfn:\n print('ERROR: There is no config file.')\n sys.exit(0)\n\n options = service_options[service_name]\n with open(cfgfile, 'r') as cf:\n lines = cf.readlines()\n\n for opt in options:\n op = opt.get('option')\n res = [line for line in lines if line.startswith(op)]\n if len(res) > 1:\n print('ERROR: There is more than one %s option.' 
% res)\n sys.exit(0)\n if res:\n (op, sep, val) = (res[0].strip('\\n').replace(' ', '').\n partition('='))\n new_val = None\n if opt.get('is_list'):\n # Value for this option can contain list of values.\n # Append the value if it does not exist.\n if not any(opt.get('value') == value for value in\n val.split(',')):\n new_val = ','.join((val, opt.get('value')))\n else:\n if val != opt.get('value'):\n new_val = opt.get('value')\n if new_val:\n opt_idx = lines.index(res[0])\n # The setting is different, replace it with new one.\n lines.pop(opt_idx)\n lines.insert(opt_idx, '='.join((opt.get('option'),\n new_val + '\\n')))\n else:\n # Option does not exist. Add the option.\n try:\n sec_idx = lines.index('[' + opt.get('section') + ']\\n')\n lines.insert(sec_idx + 1, '='.join(\n (opt.get('option'), opt.get('value') + '\\n')))\n except ValueError:\n print('Invalid %s section name.' % opt.get('section'))\n sys.exit(0)\n\n with open(outfn, 'w') as fwp:\n all_lines = ''\n for line in lines:\n all_lines += line\n\n fwp.write(all_lines)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_all_hosting_devices(self, context):\n cctxt = self.client.prepare()\n return cctxt.call(context,\n 'get_all_hosting_devices',\n host=self.host)", "response": "Get a list of all hosting devices."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get_all_hosted_routers(self, context):\n cctxt = self.client.prepare()\n return cctxt.call(context, 'cfg_sync_all_hosted_routers',\n host=self.host)", "response": "Make a remote process call to retrieve the sync data for\n routers that have been scheduled to a hosting device."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nget the ID for the ASR1k hardware router type.", "response": "def get_hardware_router_type_id(self, context):\n \"\"\"Get the ID for the ASR1k hardware router type.\"\"\"\n cctxt = 
self.client.prepare()\n return cctxt.call(context,\n 'get_hardware_router_type_id',\n host=self.host)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nprocesses heartbeat messages from agents on compute nodes.", "response": "def heartbeat(self, context, msg):\n \"\"\"Process heartbeat message from agents on compute nodes.\"\"\"\n\n args = jsonutils.loads(msg)\n when = args.get('when')\n agent = args.get('agent')\n # The configurations in here, only used once when creating entry\n # for an agent in DB for the first time.\n configurations = {'uplink': ''}\n LOG.debug('heartbeat received: %(time)s - %(agent)s', (\n {'time': when, 'agent': agent}))\n\n if self.obj.neutron_event:\n self.obj.neutron_event.create_rpc_client(agent)\n # Other option is to add the event to the queue for processing it later.\n\n self.obj.update_agent_status(agent, when)\n\n # Update the agents database.\n agent_info = dict(timestamp=utils.utc_time(when), host=agent,\n config=jsonutils.dumps(configurations))\n self.obj.update_agent_db(agent_info)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef request_uplink_info(self, context, agent):\n\n LOG.debug('request_uplink_info from %(agent)s', {'agent': agent})\n\n # Add the request into queue for processing.\n event_type = 'agent.request.uplink'\n payload = {'agent': agent}\n timestamp = time.ctime()\n data = (event_type, payload)\n pri = self.obj.PRI_LOW_START + 1\n self.obj.pqueue.put((pri, timestamp, data))\n LOG.debug('Added request uplink info into queue.')\n\n return 0", "response": "Process uplink message from an agent."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef set_static_ip_address(self, context, msg):\n args = jsonutils.loads(msg)\n macaddr = args.get('mac')\n ipaddr = args.get('ip')\n LOG.debug('set_static_ip_address received: %(mac)s %(ip)s', (\n 
{'mac': macaddr, 'ip': ipaddr}))\n\n # Add the request into queue for processing.\n event_type = 'cli.static_ip.set'\n payload = {'mac': macaddr, 'ip': ipaddr}\n timestamp = time.ctime()\n data = (event_type, payload)\n pri = self.obj.PRI_LOW_START\n self.obj.pqueue.put((pri, timestamp, data))\n LOG.debug('Added request to add static ip into queue.')\n\n return 0", "response": "Process a set static ip address request."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nupdating VM s result field in the DB.", "response": "def update_vm_result(self, context, msg):\n \"\"\"Update VM's result field in the DB.\n\n The result reflects the success of failure of operation when an\n agent processes the vm info.\n \"\"\"\n args = jsonutils.loads(msg)\n agent = context.get('agent')\n port_id = args.get('port_uuid')\n result = args.get('result')\n LOG.debug('update_vm_result received from %(agent)s: '\n '%(port_id)s %(result)s', {'agent': agent,\n 'port_id': port_id,\n 'result': result})\n\n # Add the request into queue for processing.\n event_type = 'agent.vm_result.update'\n payload = {'port_id': port_id, 'result': result}\n timestamp = time.ctime()\n data = (event_type, payload)\n # TODO(nlahouti) use value defined in constants\n pri = self.obj.PRI_LOW_START + 10\n self.obj.pqueue.put((pri, timestamp, data))\n LOG.debug('Added request vm result update into queue.')\n\n return 0"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef register_segment_dcnm(self, cfg, seg_id_min, seg_id_max):\n\n orch_id = cfg.dcnm.orchestrator_id\n try:\n segid_range = self.dcnm_client.get_segmentid_range(orch_id)\n if segid_range is None:\n self.dcnm_client.set_segmentid_range(orch_id, seg_id_min,\n seg_id_max)\n else:\n conf_min, _, conf_max = segid_range[\n \"segmentIdRanges\"].partition(\"-\")\n if int(conf_min) != seg_id_min or int(conf_max) != seg_id_max:\n self.dcnm_client.update_segmentid_range(orch_id,\n 
seg_id_min,\n seg_id_max)\n except dexc.DfaClientRequestFailed as exc:\n LOG.error(\"Segment ID range could not be created/updated\"\n \" on DCNM: %s\", exc)\n raise SystemExit(exc)", "response": "Register segmentation id pool with DCNM."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef project_create_func(self, proj_id, proj=None):\n\n if self.get_project_name(proj_id):\n LOG.info(\"project %s exists, returning\", proj_id)\n return\n\n if not proj:\n try:\n proj = self.keystone_event._service.projects.get(proj_id)\n except Exception:\n LOG.error(\"Failed to find project %s.\", proj_id)\n return\n\n # In the project name, dci_id may be included. Check if this is the\n # case and extract the dci_id from the name, and provide dci_id when\n # creating the project.\n proj_name, dci_id = self._get_dci_id_and_proj_name(proj.name)\n if proj_name in reserved_project_name:\n proj_name = \"_\".join((proj_name, self.cfg.dcnm.orchestrator_id))\n # The default partition name is 'os' (i.e. openstack) which reflects\n # it is created by openstack.\n part_name = self.cfg.dcnm.default_partition_name\n if len(':'.join((proj_name, part_name))) > 32:\n LOG.error('Invalid project name length: %s. 
The length of '\n 'org:part name is greater than 32',\n len(':'.join((proj_name, part_name))))\n return\n try:\n self.dcnm_client.create_project(self.cfg.dcnm.orchestrator_id,\n proj_name, part_name, dci_id,\n proj.description)\n except dexc.DfaClientRequestFailed:\n # Failed to send create project in DCNM.\n # Save the info and mark it as failure and retry it later.\n self.update_project_info_cache(proj_id, name=proj_name,\n dci_id=dci_id,\n result=constants.CREATE_FAIL)\n LOG.error(\"Failed to create project %s on DCNM.\", proj_name)\n else:\n self.update_project_info_cache(proj_id, name=proj_name,\n dci_id=dci_id)\n LOG.debug('project %(name)s %(dci)s %(desc)s', (\n {'name': proj_name, 'dci': dci_id, 'desc': proj.description}))\n self.project_create_notif(proj_id, proj_name)", "response": "Create project given project uuid"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef project_create_event(self, proj_info):\n\n LOG.debug(\"Processing create %(proj)s event.\", {'proj': proj_info})\n proj_id = proj_info.get('resource_info')\n\n self.project_create_func(proj_id)", "response": "Process a create project event."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef project_update_event(self, proj_info):\n\n LOG.debug(\"Processing project_update_event %(proj)s.\",\n {'proj': proj_info})\n proj_id = proj_info.get('resource_info')\n try:\n proj = self.keystone_event._service.projects.get(proj_id)\n except Exception:\n LOG.error(\"Failed to find project %s.\", proj_id)\n return\n\n new_proj_name, new_dci_id = self._get_dci_id_and_proj_name(proj.name)\n # Check if project name and dci_id are the same, there is no change.\n orig_proj_name = self.get_project_name(proj_id)\n orig_dci_id = self.get_dci_id(proj_id)\n if orig_proj_name == new_proj_name and new_dci_id == orig_dci_id:\n # This is an invalid update event.\n LOG.warning('Project 
update event for %(proj)s is received '\n 'without changing in the project name: '\n '%(orig_proj)s. Ignoring the event.',\n {'proj': proj_id, 'orig_proj': orig_proj_name})\n return\n\n if orig_proj_name != new_proj_name:\n # Project has new name and in DCNM the name of project cannot be\n # modified. It is an invalid update. Do not process the event.\n LOG.debug('Update request cannot be processed as name of project'\n ' is changed: %(proj)s %(orig_name)s %(orig_dci)s to '\n '%(new_name)s %(new_dci)s.', (\n {'proj': proj_id, 'orig_name': orig_proj_name,\n 'orig_dci': orig_dci_id, 'new_name': new_proj_name,\n 'new_dci': new_dci_id}))\n return\n\n # Valid update request.\n LOG.debug('Changing project DCI id for %(proj)s from %(orig_dci)s to '\n '%(new_dci)s.', {'proj': proj_id,\n 'orig_dci': orig_dci_id,\n 'new_dci': new_dci_id})\n\n try:\n self.dcnm_client.update_project(new_proj_name,\n self.cfg.dcnm.\n default_partition_name,\n dci_id=new_dci_id)\n except dexc.DfaClientRequestFailed:\n # Failed to update project in DCNM.\n # Save the info and mark it as failure and retry it later.\n LOG.error(\"Failed to update project %s on DCNM.\",\n new_proj_name)\n self.update_project_info_cache(proj_id, name=new_proj_name,\n dci_id=new_dci_id,\n opcode='update',\n result=constants.UPDATE_FAIL)\n else:\n self.update_project_info_cache(proj_id, name=new_proj_name,\n dci_id=new_dci_id,\n opcode='update')\n LOG.debug('Updated project %(proj)s %(name)s.',\n {'proj': proj_id, 'name': proj.name})", "response": "Process project update event."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nprocess a project delete event.", "response": "def project_delete_event(self, proj_info):\n \"\"\"Process project delete event.\"\"\"\n\n LOG.debug(\"Processing project_delete_event...\")\n proj_id = proj_info.get('resource_info')\n proj_name = self.get_project_name(proj_id)\n if proj_name:\n try:\n self.dcnm_client.delete_project(proj_name,\n self.cfg.dcnm.\n 
default_partition_name)\n except dexc.DfaClientRequestFailed:\n # Failed to delete project in DCNM.\n # Save the info and mark it as failure and retry it later.\n LOG.error(\"Failed to delete project %s on DCNM.\",\n proj_name)\n self.update_project_info_cache(proj_id, name=proj_name,\n opcode='delete',\n result=constants.DELETE_FAIL)\n else:\n self.update_project_info_cache(proj_id, opcode='delete')\n LOG.debug('Deleted project:%s', proj_name)\n self.project_delete_notif(proj_id, proj_name)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef subnet_create_event(self, subnet_info):\n\n subnet = subnet_info.get('subnet')\n if subnet:\n self.create_subnet(subnet)\n else:\n # Check whether request is for subnets.\n subnets = subnet_info.get('subnets')\n if subnets:\n for subnet in subnets:\n self.create_subnet(subnet)", "response": "Process subnet create event."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef create_subnet(self, snet):\n\n snet_id = snet.get('id')\n # This checks if the source of the subnet creation is FW,\n # If yes, this event is ignored.\n if self.fw_api.is_subnet_source_fw(snet.get('tenant_id'),\n snet.get('cidr')):\n LOG.info(\"Service subnet %s, returning\", snet.get('cidr'))\n return\n if snet_id not in self.subnet:\n self.subnet[snet_id] = {}\n self.subnet[snet_id].update(snet)\n\n net = self.network.get(self.subnet[snet_id].get('network_id'))\n if not net:\n LOG.error('Network %(network_id)s does not exist.',\n {'network_id': self.subnet[snet_id].get('network_id')})\n return\n\n # Check if the network is created by DCNM.\n query_net = self.get_network(net.get('id'))\n if query_net.result != constants.SUBNET_PENDING:\n LOG.info(\"Subnet exists, returning\")\n return\n if query_net and query_net.source.lower() == 'dcnm':\n # The network is created by DCNM.\n # No need to process this event.\n LOG.info('create_subnet: network 
%(name)s '\n 'was created by DCNM. Ignoring processing the '\n 'event.', {'name': query_net.name})\n return\n\n tenant_name = self.get_project_name(snet['tenant_id'])\n subnet = utils.Dict2Obj(snet)\n dcnm_net = utils.Dict2Obj(net)\n if not tenant_name:\n LOG.error('Project %(tenant_id)s does not exist.',\n {'tenant_id': subnet.tenant_id})\n self.update_network_db(dcnm_net.id, constants.CREATE_FAIL)\n return\n\n try:\n self.dcnm_client.create_network(tenant_name, dcnm_net, subnet,\n self.dcnm_dhcp)\n self.update_network_db(net.get('id'), constants.RESULT_SUCCESS)\n except dexc.DfaClientRequestFailed:\n LOG.exception('Failed to create network %(net)s.',\n {'net': dcnm_net.name})\n # Update network database with failure result.\n self.update_network_db(dcnm_net.id, constants.CREATE_FAIL)\n self.network_sub_create_notif(snet.get('tenant_id'), tenant_name,\n snet.get('cidr'))", "response": "Create a subnet and return the related class."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nfunctioning to create the network in database and dcnm", "response": "def network_create_func(self, net):\n \"\"\"Create network in database and dcnm\n :param net: network dictionary\n \"\"\"\n net_id = net['id']\n net_name = net.get('name')\n network_db_elem = self.get_network(net_id)\n # Check if the source of network creation is FW and if yes, skip\n # this event.\n # Check if there's a way to read the DB from service class\n # TODO(padkrish)\n if self.fw_api.is_network_source_fw(network_db_elem, net_name):\n LOG.info(\"Service network %s, returning\", net_name)\n return\n if not network_db_elem:\n self.network[net_id] = {}\n self.network[net_id].update(net)\n\n net_name = net.get('name')\n tenant_id = net.get('tenant_id')\n\n # Extract segmentation_id from the network name\n net_ext_name = self.cfg.dcnm.dcnm_net_ext\n nobj = re.search(net_ext_name, net_name)\n try:\n seg_id = int((net_name[nobj.start(0) + len(net_ext_name) - 1:]\n if nobj else None))\n 
except (IndexError, TypeError, ValueError):\n seg_id = None\n\n # Check if network is already created.\n query_net = self.get_network_by_segid(seg_id) if seg_id else None\n if query_net:\n # The network is already created no need to process the event.\n if query_net.source.lower() == 'dcnm':\n # DCNM created the network. Only update network id in database.\n prev_id = query_net.network_id\n params = dict(columns=dict(network_id=net_id))\n self.update_network(prev_id, **params)\n\n # Update the network cache.\n prev_info = self.network.pop(prev_id)\n prev_info['id'] = net_id\n self.network[net_id] = prev_info\n\n # Update the network name. After extracting the segmentation_id\n # no need to keep it in the name. Removing it and update\n # the network.\n updated_net_name = (\n net_name[:nobj.start(0) + len(net_ext_name) - 1])\n try:\n body = {'network': {'name': updated_net_name, }}\n dcnm_net = self.neutronclient.update_network(\n net_id, body=body).get('network')\n LOG.debug('Updated network %(network)s', dcnm_net)\n except Exception as exc:\n LOG.exception('Failed to update network '\n '%(network)s. Reason %(err)s.',\n {'network': updated_net_name,\n 'err': str(exc)})\n return\n\n LOG.info('network_create_event: network %(name)s was created '\n 'by %(source)s. Ignoring processing the event.',\n {'name': net_name, 'source': 'dcnm'})\n return\n if network_db_elem:\n LOG.debug(\"Network %s exists, not processing\" % net_name)\n return\n\n # Check if project (i.e. tenant) exist.\n tenant_name = self.get_project_name(tenant_id)\n if not tenant_name:\n LOG.error('Failed to create network %(name)s. 
Project '\n '%(tenant_id)s does not exist.',\n {'name': net_name, 'tenant_id': tenant_id})\n return\n\n pseg_id = self.network[net_id].get('provider:segmentation_id')\n seg_id = self._get_segmentation_id(net_id, pseg_id, 'openstack')\n self.network[net_id]['segmentation_id'] = seg_id\n try:\n cfgp, fwd_mod = self.dcnm_client.get_config_profile_for_network(\n net.get('name'))\n self.network[net_id]['config_profile'] = cfgp\n self.network[net_id]['fwd_mod'] = fwd_mod\n self.add_network_db(net_id, self.network[net_id],\n 'openstack',\n constants.SUBNET_PENDING)\n LOG.debug('network_create_event: network=%s', self.network)\n except dexc.DfaClientRequestFailed:\n # Fail to get config profile from DCNM.\n # Save the network info with failure result and send the request\n # to DCNM later.\n self.add_network_db(net_id, self.network[net_id], 'openstack',\n constants.CREATE_FAIL)\n LOG.error('Failed to create network=%s.', self.network)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nprocess network delete event.", "response": "def network_delete_event(self, network_info):\n \"\"\"Process network delete event.\"\"\"\n\n net_id = network_info['network_id']\n if net_id not in self.network:\n LOG.error('network_delete_event: net_id %s does not exist.',\n net_id)\n return\n\n segid = self.network[net_id].get('segmentation_id')\n tenant_id = self.network[net_id].get('tenant_id')\n tenant_name = self.get_project_name(tenant_id)\n net = utils.Dict2Obj(self.network[net_id])\n if not tenant_name:\n LOG.error('Project %(tenant_id)s does not exist.',\n {'tenant_id': tenant_id})\n self.update_network_db(net.id, constants.DELETE_FAIL)\n return\n\n try:\n self.dcnm_client.delete_network(tenant_name, net)\n # Put back the segmentation id into the pool.\n self.seg_drvr.release_segmentation_id(segid)\n\n # Remove entry from database and cache.\n self.delete_network_db(net_id)\n del self.network[net_id]\n snets = [k for k in self.subnet if (\n 
self.subnet[k].get('network_id') == net_id)]\n [self.subnet.pop(s) for s in snets]\n except dexc.DfaClientRequestFailed:\n LOG.error('Failed to delete network %(net)s.',\n {'net': net.name})\n self.update_network_db(net_id, constants.DELETE_FAIL)\n # deleting all related VMs\n instances = self.get_vms()\n instances_related = [k for k in instances if k.network_id == net_id]\n for vm in instances_related:\n LOG.debug(\"deleting vm %s because network is deleted\", vm.name)\n self.delete_vm_function(vm.port_id, vm)\n self.network_del_notif(tenant_id, tenant_name, net_id)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nProcess a DCNM network create event from DCNM.", "response": "def dcnm_network_create_event(self, network_info):\n \"\"\"Process network create event from DCNM.\"\"\"\n\n # 1. Add network info to database before sending request to\n # neutron to create the network.\n # Check if network is already created.\n pre_seg_id = network_info.get('segmentation_id')\n pre_project_name = network_info.get('project_name')\n pre_partition_name = network_info.get('partition_name')\n if not pre_seg_id or not pre_partition_name or not pre_project_name:\n LOG.error('Invalid network event: %s', network_info)\n return\n\n # Check if partition name is the one that openstack created.\n if pre_partition_name != self.cfg.dcnm.default_partition_name:\n LOG.error('Failed to create network. Partition %(part)s is '\n 'not %(os_part)s which is created by openstack.',\n {'part': pre_partition_name,\n 'os_part': self.cfg.dcnm.default_partition_name})\n return\n\n query_net = self.get_network_by_segid(pre_seg_id)\n if query_net:\n # The network is already created no need to process the event.\n LOG.info('dcnm_network_create_event: network %(name)s was '\n 'created. 
Ignoring processing the event.',\n {'name': query_net.name})\n return\n\n dcnm_net_info = self.dcnm_client.get_network(pre_project_name,\n pre_seg_id)\n if not dcnm_net_info:\n LOG.info('No network details for %(org)s and %(segid)s',\n {'org': pre_project_name, 'segid': pre_seg_id})\n return\n\n net_id = utils.get_uuid()\n pseg_id = dcnm_net_info.get('segmentId')\n seg_id = self._get_segmentation_id(net_id, pseg_id, 'DCNM')\n cfgp = dcnm_net_info.get('profileName')\n net_name = dcnm_net_info.get('networkName')\n fwd_mod = self.dcnm_client.config_profile_fwding_mode_get(cfgp)\n tenant_name = dcnm_net_info.get('organizationName')\n tenant_id = self.get_project_id(tenant_name)\n\n # Get the subnet details.\n subnet = dcnm_net_info.get('dhcpScope')\n if not subnet:\n # The dhcpScope is not provided. Calculating the cidr based on\n # gateway ip and netmask.\n gw_addr = dcnm_net_info.get('gateway')\n net_mask = dcnm_net_info.get('netmaskLength')\n cidr = utils.make_cidr(gw_addr, net_mask)\n if not cidr:\n LOG.error('Failed to create network: '\n 'cidr is None for %(gw)s %(mask)s',\n {'gw': gw_addr, 'mask': net_mask})\n return\n subnet = dict(gateway=gw_addr, subnet=cidr)\n\n # Check if parameters are provided.\n if not (net_name and tenant_id and seg_id and subnet):\n LOG.error('Invalid value: network %(name)s tenant_id '\n '%(tenant_id)s segmentation_id %(seg_id)s '\n 'subnet %(subnet)s.', {'name': net_name,\n 'tenant_id': tenant_id,\n 'seg_id': seg_id,\n 'subnet': subnet})\n return\n\n # Update network cache and add the network to the database.\n net_ext_name = self.cfg.dcnm.dcnm_net_ext\n self.network[net_id] = dict(segmentation_id=seg_id,\n config_profile=cfgp,\n fwd_mod=fwd_mod,\n tenant_id=tenant_id,\n name=net_name + net_ext_name,\n id=net_id,\n source='DCNM')\n self.add_network_db(net_id, self.network[net_id], 'DCNM',\n constants.RESULT_SUCCESS)\n\n # 2. 
Send network create request to neutron\n try:\n # With create_network (called below), the same request comes as\n # notification and it will be processed in the\n # create_network_event. The request should not be processed as it\n # is already processed here.\n # The only way to decide whether it is for a new network or not is\n # the segmentation_id (DCNM does not have uuid for network) which\n # is unique. For that reason it is needed to send segmentation_id\n # when creating network in openstack.\n # Moreover, we are using network_type=local and for that reason\n # provider:segmentation_id cannot be added as parameter when\n # creating network. One solution is to embed segmentation_id in the\n # network name. Then, when processing the notification, if the\n # request is from DCNM, the segmentation_id will be extracted from\n # network name. With that create_network_event can decide to\n # process or deny an event.\n updated_net_name = net_name + net_ext_name + str(seg_id)\n body = {'network': {'name': updated_net_name,\n 'tenant_id': tenant_id,\n 'admin_state_up': True}}\n dcnm_net = self.neutronclient.create_network(\n body=body).get('network')\n net_id = dcnm_net.get('id')\n\n except Exception as exc:\n # Failed to create network, do clean up.\n # Remove the entry from database and local cache.\n del self.network[net_id]\n self.delete_network_db(net_id)\n LOG.exception('dcnm_network_create_event: Failed to create '\n '%(network)s. Reason %(err)s.',\n {'network': body, 'err': str(exc)})\n return\n\n LOG.debug('dcnm_network_create_event: Created network %(network)s', (\n body))\n\n # 3. 
Send subnet create request to neutron.\n pool = subnet.get('ipRange')\n allocation_pools = []\n if pool:\n allocation_pools = [{'start': s, 'end': e} for s, e in\n [p.split('-') for p in pool.split(',')]]\n\n try:\n body = {'subnet': {'cidr': subnet.get('subnet'),\n 'gateway_ip': subnet.get('gateway'),\n 'ip_version': 4,\n 'network_id': net_id,\n 'tenant_id': tenant_id,\n 'enable_dhcp': not self.dcnm_dhcp,\n 'allocation_pools': allocation_pools, }}\n if not self.dcnm_dhcp:\n body.get('subnet').pop('allocation_pools')\n # Send requenst to create subnet in neutron.\n LOG.debug('Creating subnet %(subnet)s for DCNM request.', body)\n dcnm_subnet = self.neutronclient.create_subnet(\n body=body).get('subnet')\n subnet_id = dcnm_subnet.get('id')\n # Update subnet cache.\n self.subnet[subnet_id] = {}\n self.subnet[subnet_id].update(body.get('subnet'))\n except Exception as exc:\n # Failed to create network, do clean up if necessary.\n LOG.exception('Failed to create subnet %(subnet)s for DCNM '\n 'request. Error %(err)s',\n {'subnet': body['subnet'], 'err': str(exc)})\n\n LOG.debug('dcnm_network_create_event: Created subnet %(subnet)s', (\n body))"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nprocess network delete event from DCNM.", "response": "def dcnm_network_delete_event(self, network_info):\n \"\"\"Process network delete event from DCNM.\"\"\"\n seg_id = network_info.get('segmentation_id')\n if not seg_id:\n LOG.error('Failed to delete network. 
Invalid network '\n 'info %s.', network_info)\n query_net = self.get_network_by_segid(seg_id)\n if not query_net:\n LOG.info('dcnm_network_delete_event: network %(segid)s '\n 'does not exist.', {'segid': seg_id})\n return\n if self.fw_api.is_network_source_fw(query_net, query_net.name):\n LOG.info(\"Service network %s, returning\", query_net.name)\n return\n # Send network delete request to neutron\n try:\n del_net = self.network.pop(query_net.network_id)\n self.neutronclient.delete_network(query_net.network_id)\n self.delete_network_db(query_net.network_id)\n except Exception as exc:\n # Failed to delete network.\n # Put back the entry to the local cache???\n self.network[query_net.network_id] = del_net\n LOG.exception('dcnm_network_delete_event: Failed to delete '\n '%(network)s. Reason %(err)s.',\n {'network': query_net.name, 'err': str(exc)})"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nfinding the ip address that is assigned to a port via DHCP The port database will be updated with the ip address.", "response": "def update_port_ip_address(self):\n \"\"\"Find the ip address that is assigned to a port via DHCP\n\n The port database will be updated with the ip address.\n \"\"\"\n leases = None\n req = dict(ip='0.0.0.0')\n instances = self.get_vms_for_this_req(**req)\n if instances is None:\n return\n\n for vm in instances:\n if not leases:\n # For the first time finding the leases file.\n leases = self._get_ip_leases()\n if not leases:\n # File does not exist.\n return\n\n for line in leases:\n if line.startswith('lease') and line.endswith('{\\n'):\n ip_addr = line.split()[1]\n if 'hardware ethernet' in line:\n if vm.mac == line.replace(';', '').split()[2]:\n LOG.info('Find IP address %(ip)s for %(mac)s',\n {'ip': ip_addr, 'mac': vm.mac})\n try:\n rule_info = dict(ip=ip_addr, mac=vm.mac,\n port=vm.port_id,\n status='up')\n self.neutron_event.update_ip_rule(str(vm.host),\n str(rule_info))\n except (rpc.MessagingTimeout, rpc.RPCException,\n 
rpc.RemoteError):\n LOG.error(\"RPC error: Failed to update\"\n \"rules.\")\n else:\n params = dict(columns=dict(ip=ip_addr))\n self.update_vm_db(vm.port_id, **params)\n\n # Send update to the agent.\n vm_info = dict(status=vm.status, vm_mac=vm.mac,\n segmentation_id=vm.segmentation_id,\n host=vm.host, port_uuid=vm.port_id,\n net_uuid=vm.network_id,\n oui=dict(ip_addr=ip_addr,\n vm_name=vm.name,\n vm_uuid=vm.instance_id,\n gw_mac=vm.gw_mac,\n fwd_mod=vm.fwd_mod,\n oui_id='cisco'))\n try:\n self.neutron_event.send_vm_info(vm.host,\n str(vm_info))\n except (rpc.MessagingTimeout, rpc.RPCException,\n rpc.RemoteError):\n LOG.error('Failed to send VM info to '\n 'agent.')"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nsending VM info to the compute host.", "response": "def send_vm_info(self, vm_info):\n \"\"\"Send vm info to the compute host.\n it will return True/False\n \"\"\"\n agent_host = vm_info.get('host')\n if not agent_host:\n LOG.info(\"vm/port is not bound to host, not sending vm info\")\n return True\n try:\n self.neutron_event.send_vm_info(agent_host,\n str(vm_info))\n except (rpc.MessagingTimeout, rpc.RPCException,\n rpc.RemoteError):\n # Failed to send info to the agent. 
Keep the data in the\n # database as failure to send it later.\n LOG.error('Failed to send VM info to agent %s', agent_host)\n return False\n else:\n return True"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef request_vms_info(self, payload):\n\n # This request is received from an agent when it runs for the first\n # time and uplink is detected.\n agent = payload.get('agent')\n LOG.debug('request_vms_info: Getting VMs info for %s', agent)\n req = dict(host=payload.get('agent'))\n instances = self.get_vms_for_this_req(**req)\n vm_info = []\n for vm in instances:\n vm_info.append(dict(status=vm.status,\n vm_mac=vm.mac,\n segmentation_id=vm.segmentation_id,\n host=vm.host,\n port_uuid=vm.port_id,\n net_uuid=vm.network_id,\n oui=dict(ip_addr=vm.ip,\n vm_name=vm.name,\n vm_uuid=vm.instance_id,\n gw_mac=vm.gw_mac,\n fwd_mod=vm.fwd_mod,\n oui_id='cisco')))\n try:\n self.neutron_event.send_vm_info(agent, str(vm_info))\n except (rpc.MessagingTimeout, rpc.RPCException, rpc.RemoteError):\n LOG.error('Failed to send VM info to agent.')", "response": "Get the VMs from the database and send the info to the agent."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef request_uplink_info(self, payload):\n\n # This request is received from an agent when it runs for the first\n # time. Send the uplink name (physical port name that connects compute\n # node and switch fabric),\n agent = payload.get('agent')\n config_res = self.get_agent_configurations(agent)\n LOG.debug('configurations on %(agent)s is %(cfg)s', (\n {'agent': agent, 'cfg': config_res}))\n try:\n self.neutron_event.send_msg_to_agent(agent,\n constants.UPLINK_NAME,\n config_res)\n except (rpc.MessagingTimeout, rpc.RPCException, rpc.RemoteError):\n LOG.error(\"RPC error: Failed to send uplink name to agent.\")", "response": "Get the uplink from the database and send the info to the agent."} {"SOURCE": 
"codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef set_static_ip_address(self, payload):\n\n # This request is received from CLI for setting ip address of an\n # instance.\n macaddr = payload.get('mac')\n ipaddr = payload.get('ip')\n\n # Find the entry associated with the mac in the database.\n req = dict(mac=macaddr)\n instances = self.get_vms_for_this_req(**req)\n for vm in instances:\n LOG.info('Updating IP address: %(ip)s %(mac)s.',\n {'ip': ipaddr, 'mac': macaddr})\n # Send request to update the rule.\n try:\n rule_info = dict(ip=ipaddr, mac=macaddr,\n port=vm.port_id,\n status='up')\n self.neutron_event.update_ip_rule(str(vm.host),\n str(rule_info))\n except (rpc.MessagingTimeout, rpc.RPCException,\n rpc.RemoteError):\n LOG.error(\"RPC error: Failed to update rules.\")\n else:\n # Update the database.\n params = dict(columns=dict(ip=ipaddr))\n self.update_vm_db(vm.port_id, **params)\n\n # Send update to the agent.\n vm_info = dict(status=vm.status, vm_mac=vm.mac,\n segmentation_id=vm.segmentation_id,\n host=vm.host, port_uuid=vm.port_id,\n net_uuid=vm.network_id,\n oui=dict(ip_addr=ipaddr,\n vm_name=vm.name,\n vm_uuid=vm.instance_id,\n gw_mac=vm.gw_mac,\n fwd_mod=vm.fwd_mod,\n oui_id='cisco'))\n try:\n self.neutron_event.send_vm_info(vm.host,\n str(vm_info))\n except (rpc.MessagingTimeout, rpc.RPCException,\n rpc.RemoteError):\n LOG.error('Failed to send VM info to agent.')", "response": "Set the static ip address for a VM."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef vm_result_update(self, payload):\n\n port_id = payload.get('port_id')\n result = payload.get('result')\n\n if port_id and result:\n # Update the VM's result field.\n params = dict(columns=dict(result=result))\n self.update_vm_db(port_id, **params)", "response": "Update the result field in VM database."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that 
can\nadd a lbaas port to the agent.", "response": "def add_lbaas_port(self, port_id, lb_id):\n \"\"\"Give port id, get port info and send vm info to agent.\n\n :param port_id: port id of vip port\n :param lb_id: vip id for v1 and lbaas_id for v2\n \"\"\"\n port_info = self.neutronclient.show_port(port_id)\n port = port_info.get('port')\n if not port:\n LOG.error(\"Can not retrieve port info for port %s\" % port_id)\n return\n LOG.debug(\"lbaas add port, %s\", port)\n if not port['binding:host_id']:\n LOG.info(\"No host bind for lbaas port, octavia case\")\n return\n port[\"device_id\"] = lb_id\n\n vm_info = self._make_vm_info(port, 'up', constants.LBAAS_PREFIX)\n self.port[port_id] = vm_info\n if self.send_vm_info(vm_info):\n self.add_vms_db(vm_info, constants.RESULT_SUCCESS)\n else:\n self.add_vms_db(vm_info, constants.CREATE_FAIL)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef delete_lbaas_port(self, lb_id):\n lb_id = lb_id.replace('-', '')\n req = dict(instance_id=lb_id)\n instances = self.get_vms_for_this_req(**req)\n for vm in instances:\n LOG.info(\"deleting lbaas vm %s \" % vm.name)\n self.delete_vm_function(vm.port_id, vm)", "response": "send vm down event and delete db."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef vip_create_event(self, vip_info):\n vip_data = vip_info.get('vip')\n port_id = vip_data.get('port_id')\n vip_id = vip_data.get('id')\n self.add_lbaas_port(port_id, vip_id)", "response": "Process vip create event."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef listener_create_event(self, listener_info):\n listener_data = listener_info.get('listener')\n lb_list = listener_data.get('loadbalancers')\n for lb in lb_list:\n lb_id = lb.get('id')\n req = dict(instance_id=(lb_id.replace('-', '')))\n instances = self.get_vms_for_this_req(**req)\n if not instances:\n 
lb_info = self.neutronclient.show_loadbalancer(lb_id)\n if lb_info:\n port_id = lb_info[\"loadbalancer\"][\"vip_port_id\"]\n self.add_lbaas_port(port_id, lb_id)\n else:\n LOG.info(\"lbaas port for lb %s already added\" % lb_id)", "response": "Process listener create event."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef listener_delete_event(self, listener_info):\n lb_list = self.neutronclient.list_loadbalancers()\n for lb in lb_list.get('loadbalancers'):\n if not lb.get(\"listeners\"):\n lb_id = lb.get('id')\n LOG.info(\"Deleting lb %s port\" % lb_id)\n self.delete_lbaas_port(lb_id)", "response": "Process listener delete event."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef pool_create_event(self, pool_info):\n pool_data = pool_info.get('pool')\n listeners = pool_data.get('listeners')\n for listener in listeners:\n l_id = listener.get('id')\n l_info = self.neutronclient.show_listener(l_id)\n self.listener_create_event(l_info)", "response": "Process pool create event."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ncreates threads on server.", "response": "def create_threads(self):\n \"\"\"Create threads on server.\"\"\"\n\n # Create thread for neutron notifications.\n neutron_thrd = utils.EventProcessingThread('Neutron_Event',\n self.neutron_event,\n 'event_handler',\n self._excpq)\n self.dfa_threads.append(neutron_thrd)\n\n # Create thread for processing notification events.\n qp_thrd = utils.EventProcessingThread('Event_Queue', self,\n 'process_queue', self._excpq)\n self.dfa_threads.append(qp_thrd)\n\n # Create thread for keystone notifications.\n keys_thrd = utils.EventProcessingThread('Keystone_Event',\n self.keystone_event,\n 'event_handler', self._excpq)\n self.dfa_threads.append(keys_thrd)\n\n # Create thread to process RPC calls.\n hb_thrd = utils.EventProcessingThread('RPC_Server', self, 
'start_rpc',\n self._excpq)\n self.dfa_threads.append(hb_thrd)\n\n # Create thread to listen to dcnm network events.\n if self.dcnm_event is not None:\n dcnmL_thrd = utils.EventProcessingThread('DcnmListener',\n self.dcnm_event,\n 'process_amqp_msgs',\n self._excpq)\n self.dfa_threads.append(dcnmL_thrd)\n\n # Create periodic task to process failure cases in create/delete\n # networks and projects.\n fr_thrd = utils.PeriodicTask(interval=constants.FAIL_REC_INTERVAL,\n func=self.add_events,\n event_queue=self.pqueue,\n priority=self.PRI_LOW_START + 10,\n excq=self._excpq)\n\n # Start all the threads.\n for t in self.dfa_threads:\n t.start()\n\n # Run the periodic tasks.\n fr_thrd.run()"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _allocate_specified_segment(self, session, seg_id, source):\n try:\n with session.begin(subtransactions=True):\n alloc = (session.query(self.model).filter_by(\n segmentation_id=seg_id).first())\n if alloc:\n if alloc.allocated:\n # Segment already allocated\n return\n else:\n # Segment not allocated\n count = (session.query(self.model).\n filter_by(allocated=False,\n segmentation_id=seg_id).update(\n {\"allocated\": True}))\n if count:\n return alloc\n\n # Segment to create or already allocated\n alloc = self.model(segmentation_id=seg_id,\n allocated=True, source=source)\n session.add(alloc)\n\n except db_exc.DBDuplicateEntry:\n # Segment already allocated (insert failure)\n alloc = None\n\n return alloc", "response": "Allocate specified segment.\n\n If segment exists, then try to allocate it and return db object\n If segment does not exist, then try to create it and return db object\n If allocation/creation failed (duplicates), then return None"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nallocate a segment from pool. 
", "response": "def _allocate_segment(self, session, net_id, source):\n \"\"\"Allocate segment from pool.\n\n Return allocated db object or None.\n \"\"\"\n\n with session.begin(subtransactions=True):\n hour_lapse = utils.utc_time_lapse(self.seg_timeout)\n count = (session.query(self.model).filter(\n self.model.delete_time < hour_lapse).update(\n {\"delete_time\": None}))\n\n select = (session.query(self.model).filter_by(allocated=False,\n delete_time=None))\n\n # Selected segment can be allocated before update by someone else,\n # We retry until update success or DB_MAX_RETRIES retries\n for attempt in range(DB_MAX_RETRIES + 1):\n alloc = select.first()\n if not alloc:\n LOG.info(\"No segment resource available\")\n # No resource available\n return\n\n count = (session.query(self.model).\n filter_by(segmentation_id=alloc.segmentation_id,\n allocated=False).update({\"allocated\": True,\n \"network_id\": net_id,\n \"source\": source}))\n if count:\n return alloc\n\n LOG.error(\"ERROR: Failed to allocate segment for net %(net)s\"\n \" source %(src)s\",\n {'net': net_id, 'src': source})"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nallocate subnet from pool. 
Return allocated db object or None.", "response": "def allocate_subnet(self, subnet_lst, net_id=None):\n \"\"\"Allocate subnet from pool.\n\n Return allocated db object or None.\n \"\"\"\n\n session = db.get_session()\n query_str = None\n for sub in subnet_lst:\n sub_que = (self.model.subnet_address != sub)\n if query_str is not None:\n query_str = query_str & sub_que\n else:\n query_str = sub_que\n with session.begin(subtransactions=True):\n select = (session.query(self.model).filter(\n (self.model.allocated == 0) & query_str))\n\n # Selected segment can be allocated before update by someone else,\n # We retry until update success or DB_MAX_RETRIES retries\n for attempt in range(DB_MAX_RETRIES + 1):\n alloc = select.first()\n if not alloc:\n LOG.info(\"No subnet resource available\")\n return\n count = (session.query(self.model).\n filter_by(subnet_address=alloc.subnet_address,\n allocated=False).update({\"allocated\": True,\n \"network_id\": net_id}))\n if count:\n return alloc.subnet_address\n\n LOG.error(\"ERROR: Failed to allocate subnet for net %(net)s\",\n {'net': net_id})\n return None"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nadd or update an entry to the topology DB.", "response": "def add_update_topology_db(self, **params):\n \"\"\"Add or update an entry to the topology DB. 
\"\"\"\n topo_dict = params.get('columns')\n session = db.get_session()\n host = topo_dict.get('host')\n protocol_interface = topo_dict.get('protocol_interface')\n with session.begin(subtransactions=True):\n try:\n # Check if entry exists.\n session.query(DfaTopologyDb).filter_by(\n host=host, protocol_interface=protocol_interface).one()\n session.query(DfaTopologyDb).filter_by(\n host=host, protocol_interface=protocol_interface).update(\n topo_dict)\n except orm_exc.NoResultFound:\n LOG.info(\"Creating new topology entry for host \"\n \"%(host)s on Interface %(intf)s\",\n {'host': host, 'intf': protocol_interface})\n topo_disc = DfaTopologyDb(\n host=host, protocol_interface=protocol_interface,\n phy_interface=topo_dict.get('phy_interface'),\n created=topo_dict.get('created'),\n heartbeat=topo_dict.get('heartbeat'),\n remote_mgmt_addr=topo_dict.get('remote_mgmt_addr'),\n remote_system_name=topo_dict.get('remote_system_name'),\n remote_system_desc=topo_dict.get('remote_system_desc'),\n remote_port_id_mac=topo_dict.get('remote_port_id_mac'),\n remote_chassis_id_mac=topo_dict.get(\n 'remote_chassis_id_mac'),\n remote_port=topo_dict.get('remote_port'),\n remote_evb_cfgd=topo_dict.get('remote_evb_cfgd'),\n remote_evb_mode=topo_dict.get('remote_evb_mode'),\n configurations=topo_dict.get('configurations'))\n session.add(topo_disc)\n except orm_exc.MultipleResultsFound:\n LOG.error(\"More than one entry found for agent %(host)s. \"\n \"Interface %(intf)s\",\n {'host': host, 'intf': protocol_interface})\n except Exception as exc:\n LOG.error(\"Exception in add_update_topology_db %s\", exc)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nconvert a list of topology objects to a list of dict.", "response": "def _convert_topo_obj_dict(self, topology_objs):\n \"\"\"Convert topology object to dict. 
\"\"\"\n topo_lst = []\n for topo_obj in topology_objs:\n topo_dct = {\n 'host': topo_obj.host,\n 'protocol_interface': topo_obj.protocol_interface,\n 'phy_interface': topo_obj.phy_interface,\n 'created': topo_obj.created, 'heartbeat': topo_obj.heartbeat,\n 'remote_mgmt_addr': topo_obj.remote_mgmt_addr,\n 'remote_system_name': topo_obj.remote_system_name,\n 'remote_system_desc': topo_obj.remote_system_desc,\n 'remote_port_id_mac': topo_obj.remote_port_id_mac,\n 'remote_chassis_id_mac': topo_obj.remote_chassis_id_mac,\n 'remote_port': topo_obj.remote_port,\n 'remote_evb_cfgd': topo_obj.remote_evb_cfgd,\n 'remote_evb_mode': topo_obj.remote_evb_mode,\n 'configurations': topo_obj.configurations}\n topo_lst.append(topo_dct)\n return topo_lst"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nquerying an entry in the topology DB.", "response": "def query_topology_db(self, dict_convert=False, **req):\n \"\"\"Query an entry in the topology DB. \"\"\"\n session = db.get_session()\n with session.begin(subtransactions=True):\n try:\n # Check if entry exists.\n topo_disc = session.query(DfaTopologyDb).filter_by(**req).all()\n except orm_exc.NoResultFound:\n LOG.info(\"No Topology results found for %s\", req)\n return None\n if dict_convert:\n return self._convert_topo_obj_dict(topo_disc)\n return topo_disc"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef delete_topology_entry(self, **req):\n session = db.get_session()\n with session.begin(subtransactions=True):\n try:\n rows = session.query(DfaTopologyDb).filter_by(**req).all()\n except orm_exc.NoResultFound:\n LOG.info(\"No Topology results found for %s\", req)\n return\n try:\n for row in rows:\n session.delete(row)\n except Exception as exc:\n LOG.error(\"Exception raised %s\", str(exc))", "response": "Delete the entries from the topology DB."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nenabling 
LLDP on the interface.", "response": "def enable_lldp(self, port_name, is_ncb=True, is_nb=False):\n \"\"\"Function to enable LLDP on the interface. \"\"\"\n reply = None\n if is_ncb:\n reply = self.run_lldptool([\"-L\", \"-i\", port_name, \"-g\", \"ncb\",\n \"adminStatus=rxtx\"])\n elif is_nb:\n reply = self.run_lldptool([\"-L\", \"-i\", port_name, \"-g\", \"nb\",\n \"adminStatus=rxtx\"])\n else:\n LOG.error(\"Both NCB and NB are not selected to \"\n \"enable LLDP\")\n return False\n if reply is None:\n return False\n exp_str = \"adminstatus=rxtx\"\n if exp_str in reply.replace(\" \", \"\").lower():\n return True\n else:\n return False"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_lldp_tlv(self, port_name, is_ncb=True, is_nb=False):\n reply = None\n if is_ncb:\n reply = self.run_lldptool([\"get-tlv\", \"-n\", \"-i\", port_name,\n \"-g\", \"ncb\"])\n elif is_nb:\n reply = self.run_lldptool([\"get-tlv\", \"-n\", \"-i\", port_name,\n \"-g\", \"nb\"])\n else:\n LOG.error(\"Both NCB and NB are not selected to \"\n \"query LLDP\")\n return reply", "response": "Function to Query LLDP TLV on the interface."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nfunctioning for invoking the lldptool utility.", "response": "def run_lldptool(self, args):\n \"\"\"Function for invoking the lldptool utility. \"\"\"\n full_args = ['lldptool'] + args\n try:\n return utils.execute(full_args, root_helper=self.root_helper)\n except Exception as exc:\n LOG.error(\"Unable to execute %(cmd)s. \"\n \"Exception: %(exception)s\",\n {'cmd': full_args, 'exception': str(exc)})"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nchecking if the common TLV format is valid.", "response": "def _check_common_tlv_format(self, tlv_complete_data, tlv_data_pattern,\n tlv_string):\n \"\"\"Check for the common TLV format. 
\"\"\"\n if tlv_complete_data is None:\n return False, None\n tlv_string_split = tlv_complete_data.split(tlv_string)\n if len(tlv_string_split) < 2:\n return False, None\n next_tlv_list = tlv_string_split[1].split('TLV')[0]\n tlv_val_set = next_tlv_list.split(tlv_data_pattern)\n if len(tlv_val_set) < 2:\n return False, None\n return True, tlv_val_set"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_remote_evb_mode(self, tlv_data):\n ret, parsed_val = self._check_common_tlv_format(\n tlv_data, \"mode:\", \"EVB Configuration TLV\")\n if not ret:\n return None\n mode_val = parsed_val[1].split()[0].strip()\n return mode_val", "response": "Returns the EVB mode in the TLV."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get_remote_mgmt_addr(self, tlv_data):\n ret, parsed_val = self._check_common_tlv_format(\n tlv_data, \"IPv4:\", \"Management Address TLV\")\n if not ret:\n return None\n addr_fam = 'IPv4:'\n addr = parsed_val[1].split('\\n')[0].strip()\n return addr_fam + addr", "response": "Returns Remote Mgmt Addr from the TLV."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning Remote Sys Desc from the TLV.", "response": "def get_remote_sys_desc(self, tlv_data):\n \"\"\"Returns Remote Sys Desc from the TLV. 
\"\"\"\n ret, parsed_val = self._check_common_tlv_format(\n tlv_data, \"\\n\", \"System Description TLV\")\n if not ret:\n return None\n return parsed_val[1].strip()"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef get_remote_sys_name(self, tlv_data):\n ret, parsed_val = self._check_common_tlv_format(\n tlv_data, \"\\n\", \"System Name TLV\")\n if not ret:\n return None\n return parsed_val[1].strip()", "response": "Returns Remote Sys Name from the TLV."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_remote_port(self, tlv_data):\n ret, parsed_val = self._check_common_tlv_format(\n tlv_data, \"\\n\", \"Port Description TLV\")\n if not ret:\n return None\n return parsed_val[1].strip()", "response": "Returns Remote Port from the TLV."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning Remote Chassis ID MAC from the TLV.", "response": "def get_remote_chassis_id_mac(self, tlv_data):\n \"\"\"Returns Remote Chassis ID MAC from the TLV. 
\"\"\"\n ret, parsed_val = self._check_common_tlv_format(\n tlv_data, \"MAC:\", \"Chassis ID TLV\")\n if not ret:\n return None\n mac = parsed_val[1].split('\\n')\n return mac[0].strip()"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_remote_port_id_local(self, tlv_data):\n ret, parsed_val = self._check_common_tlv_format(\n tlv_data, \"Local:\", \"Port ID TLV\")\n if not ret:\n return None\n local = parsed_val[1].split('\\n')\n return local[0].strip()", "response": "Returns Remote Port ID Local from the TLV."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef format_interface_name(intf_type, port, ch_grp=0):\n if ch_grp > 0:\n return 'port-channel:%s' % str(ch_grp)\n\n return '%s:%s' % (intf_type.lower(), port)", "response": "Method to format an interface name given type, port and channel-group."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef split_interface_name(interface, ch_grp=0):\n\n interface = interface.lower()\n if ch_grp != 0:\n intf_type = 'port-channel'\n port = str(ch_grp)\n elif ':' in interface:\n intf_type, port = interface.split(':')\n elif interface.startswith('ethernet'):\n interface = interface.replace(\" \", \"\")\n _, intf_type, port = interface.partition('ethernet')\n elif interface.startswith('port-channel'):\n interface = interface.replace(\" \", \"\")\n _, intf_type, port = interface.partition('port-channel')\n else:\n intf_type, port = 'ethernet', interface\n\n return intf_type, port", "response": "Method to split interface name into type and id."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nnotifies the cfg agent that is handling the hosting device.", "response": "def _host_notification(self, context, method, payload, host):\n \"\"\"Notify the cfg agent that is handling the hosting device.\"\"\"\n 
LOG.debug('Notify Cisco cfg agent at %(host)s the message '\n '%(method)s', {'host': host, 'method': method})\n cctxt = self.client.prepare(server=host)\n cctxt.cast(context, method, payload=payload)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _agent_notification(self, context, method, hosting_devices, operation):\n admin_context = context.is_admin and context or context.elevated()\n for hosting_device in hosting_devices:\n agents = self._dmplugin.get_cfg_agents_for_hosting_devices(\n admin_context, hosting_device['id'], admin_state_up=True,\n schedule=True)\n for agent in agents:\n LOG.debug('Notify %(agent_type)s at %(topic)s.%(host)s the '\n 'message %(method)s',\n {'agent_type': agent.agent_type,\n 'topic': agent.topic,\n 'host': agent.host,\n 'method': method})\n cctxt = self.client.prepare(server=agent.host)\n cctxt.cast(context, method)", "response": "Notify all cfg agents that have been approved."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nupdates cfg agent on to enable or disable it.", "response": "def agent_updated(self, context, admin_state_up, host):\n \"\"\"Updates cfg agent on to enable or disable it.\"\"\"\n self._host_notification(context, 'agent_updated',\n {'admin_state_up': admin_state_up}, host)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nnotify cfg agent that some hosting devices are unassigned from the cfg agent.", "response": "def hosting_devices_unassigned_from_cfg_agent(self, context, ids, host):\n \"\"\"Notify cfg agent to no longer handle some hosting devices.\n\n This notification relieves the cfg agent in of responsibility\n to monitor and configure hosting devices with id specified in .\n \"\"\"\n self._host_notification(context,\n 'hosting_devices_unassigned_from_cfg_agent',\n {'hosting_device_ids': ids}, host)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the 
following Python 3 function doing\ndef hosting_devices_assigned_to_cfg_agent(self, context, ids, host):\n self._host_notification(context,\n 'hosting_devices_assigned_to_cfg_agent',\n {'hosting_device_ids': ids}, host)", "response": "Notify cfg agent that some hosting devices have been assigned to a cfg agent."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nnotifies cfg agent that some hosting devices have been removed from the hosting device pool.", "response": "def hosting_devices_removed(self, context, hosting_data, deconfigure,\n host):\n \"\"\"Notify cfg agent that some hosting devices have been removed.\n\n This notification informs the cfg agent in that the\n hosting devices in the dictionary have been removed\n from the hosting device pool. The dictionary also\n contains the ids of the affected logical resources for each hosting\n devices::\n\n {'hd_id1': {'routers': [id1, id2, ...],\n 'fw': [id1, ...],\n ...},\n 'hd_id2': {'routers': [id3, id4, ...]},\n 'fw': [id1, ...],\n ...},\n ...}\n\n The argument is True if any configurations for the\n logical resources should be removed from the hosting devices\n \"\"\"\n if hosting_data:\n self._host_notification(context, 'hosting_devices_removed',\n {'hosting_data': hosting_data,\n 'deconfigure': deconfigure}, host)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef get_hosting_device_configuration(self, context, id):\n admin_context = context.is_admin and context or context.elevated()\n agents = self._dmplugin.get_cfg_agents_for_hosting_devices(\n admin_context, [id], admin_state_up=True, schedule=True)\n if agents:\n cctxt = self.client.prepare(server=agents[0].host)\n return cctxt.call(context, 'get_hosting_device_configuration',\n payload={'hosting_device_id': id})", "response": "Fetch configuration of hosting device with id."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in 
Python 3 where it\nstores the policy. Policy is maintained as a dictionary of pol ID.", "response": "def store_policy(self, pol_id, policy):\n \"\"\"Store the policy.\n\n Policy is maintained as a dictionary of pol ID.\n \"\"\"\n if pol_id not in self.policies:\n self.policies[pol_id] = policy\n self.policy_cnt += 1"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nstoring the rules. Rules are maintained as a dictionary of rule ID.", "response": "def store_rule(self, rule_id, rule):\n \"\"\"Store the rules.\n\n Rules are maintained as a dictionary of rule ID.\n \"\"\"\n if rule_id not in self.rules:\n self.rules[rule_id] = rule\n self.rule_cnt += 1"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef delete_rule(self, rule_id):\n if rule_id not in self.rules:\n LOG.error(\"No Rule id present for deleting %s\", rule_id)\n return\n del self.rules[rule_id]\n self.rule_cnt -= 1", "response": "Delete the specific Rule from the dictionary indexed by rule id."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nreturn if firewall indexed by ID is present in dictionary.", "response": "def is_fw_present(self, fw_id):\n \"\"\"Returns if firewall indexed by ID is present in dictionary. 
\"\"\"\n if self.fw_id is None or self.fw_id != fw_id:\n return False\n else:\n return True"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef create_fw(self, proj_name, pol_id, fw_id, fw_name, fw_type, rtr_id):\n self.tenant_name = proj_name\n self.fw_id = fw_id\n self.fw_name = fw_name\n self.fw_created = True\n self.active_pol_id = pol_id\n self.fw_type = fw_type\n self.router_id = rtr_id", "response": "Fills up the local attributes when FW is created."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ndeletes the FW local attributes.", "response": "def delete_fw(self, fw_id):\n \"\"\"Deletes the FW local attributes. \"\"\"\n self.fw_id = None\n self.fw_name = None\n self.fw_created = False\n self.active_pol_id = None"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef delete_policy(self, pol_id):\n if pol_id not in self.policies:\n LOG.error(\"Invalid policy %s\", pol_id)\n return\n del self.policies[pol_id]\n self.policy_cnt -= 1", "response": "Deletes the policy from the local dictionary."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef one_rule_present(self, pol_id):\n pol_dict = self.policies[pol_id]\n for rule in pol_dict['rule_dict']:\n if self.is_rule_present(rule):\n return True\n return False", "response": "Returns if at least one rule is present in the policy."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef update_fw_params(self, rtr_id=-1, fw_type=-1):\n if rtr_id != -1:\n self.router_id = rtr_id\n if fw_type != -1:\n self.fw_type = fw_type", "response": "Updates the FW parameters."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef populate_cfg_dcnm(self, cfg, dcnm_obj):\n if not self.fw_init:\n return\n self.dcnm_obj = 
dcnm_obj\n self.fabric.store_dcnm(dcnm_obj)\n self.populate_dcnm_obj(dcnm_obj)", "response": "This routine stores the DCNM object and populates the cfg with the object."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _create_fw_fab_dev_te(self, tenant_id, drvr_name, fw_dict):\n is_fw_virt = self.is_device_virtual()\n ret = self.fabric.prepare_fabric_fw(tenant_id, fw_dict, is_fw_virt,\n fw_constants.RESULT_FW_CREATE_INIT)\n\n if not ret:\n LOG.error(\"Prepare Fabric failed\")\n return\n else:\n self.update_fw_db_final_result(fw_dict.get('fw_id'), (\n fw_constants.RESULT_FW_CREATE_DONE))\n ret = self.create_fw_device(tenant_id, fw_dict.get('fw_id'),\n fw_dict)\n if ret:\n self.fwid_attr[tenant_id].fw_drvr_created(True)\n self.update_fw_db_dev_status(fw_dict.get('fw_id'), 'SUCCESS')\n LOG.info(\"FW device create returned success for tenant %s\",\n tenant_id)\n else:\n LOG.error(\"FW device create returned failure for tenant %s\",\n tenant_id)", "response": "Prepares the Fabric and creates the device and updates the database with the final result."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncreate the Firewall and creates the fabric and device.", "response": "def _check_create_fw(self, tenant_id, drvr_name):\n \"\"\"Creates the Firewall, if all conditions are met.\n\n This function first checks if all the configuration are done\n for a FW to be launched. After that it creates the FW entry in the\n DB. 
After that, it calls the routine to prepare the fabric and\n configure the device.\n \"\"\"\n if self.fwid_attr[tenant_id].is_fw_drvr_create_needed():\n fw_dict = self.fwid_attr[tenant_id].get_fw_dict()\n try:\n with self.fwid_attr[tenant_id].mutex_lock:\n ret = self.add_fw_db(fw_dict.get('fw_id'), fw_dict,\n fw_constants.RESULT_FW_CREATE_INIT)\n if not ret:\n LOG.error(\"Adding FW DB failed for tenant %s\",\n tenant_id)\n return\n self._create_fw_fab_dev(tenant_id, drvr_name, fw_dict)\n except Exception as exc:\n LOG.error(\"Exception raised in create fw %s\", str(exc))"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _delete_fw_fab_dev(self, tenant_id, drvr_name, fw_dict):\n is_fw_virt = self.is_device_virtual()\n if self.fwid_attr[tenant_id].is_fw_drvr_created():\n ret = self.delete_fw_device(tenant_id, fw_dict.get('fw_id'),\n fw_dict)\n if not ret:\n LOG.error(\"Error in delete_fabric_fw device for tenant \"\n \"%s\", tenant_id)\n return False\n else:\n self.fwid_attr[tenant_id].fw_drvr_created(False)\n self.update_fw_db_dev_status(fw_dict.get('fw_id'), '')\n ret = self.fabric.delete_fabric_fw(tenant_id, fw_dict, is_fw_virt,\n fw_constants.RESULT_FW_DELETE_INIT)\n if not ret:\n LOG.error(\"Error in delete_fabric_fw for tenant %s\",\n tenant_id)\n return False\n self.update_fw_db_final_result(fw_dict.get('fw_id'), (\n fw_constants.RESULT_FW_DELETE_DONE))\n self.delete_fw(fw_dict.get('fw_id'))\n return True", "response": "Deletes the Firewall.\n\n This routine calls the fabric class to delete the fabric when\n a firewall is deleted. It also calls the device manager to\n unconfigure the device. 
It updates the database with the final\n result."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _check_delete_fw(self, tenant_id, drvr_name):\n fw_dict = self.fwid_attr[tenant_id].get_fw_dict()\n ret = False\n try:\n with self.fwid_attr[tenant_id].mutex_lock:\n self.update_fw_db_final_result(fw_dict.get('fw_id'), (\n fw_constants.RESULT_FW_DELETE_INIT))\n ret = self._delete_fw_fab_dev(tenant_id, drvr_name, fw_dict)\n except Exception as exc:\n LOG.error(\"Exception raised in delete fw %s\", str(exc))\n return ret", "response": "Deletes the Firewall if all conditions are met."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _check_update_fw(self, tenant_id, drvr_name):\n if self.fwid_attr[tenant_id].is_fw_complete():\n fw_dict = self.fwid_attr[tenant_id].get_fw_dict()\n self.modify_fw_device(tenant_id, fw_dict.get('fw_id'), fw_dict)", "response": "Update the Firewall cfg by calling the driver."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nimplements the top level FW create function.", "response": "def fw_create(self, data, fw_name=None, cache=False):\n \"\"\"Top level FW create function. 
\"\"\"\n LOG.debug(\"FW create %s\", data)\n try:\n self._fw_create(fw_name, data, cache)\n except Exception as exc:\n LOG.error(\"Exception in fw_create %s\", str(exc))"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nupdates routine for the Firewall.", "response": "def _fw_update(self, drvr_name, data):\n \"\"\"Update routine for the Firewall.\n\n Check if FW is already cfgd using the below function\n if self.fwid_attr[tenant_id].is_fw_complete() or\n is_fw_drvr_create_needed():\n The above two functions will take care of whether FW is already\n cfgd or about to be cfgd in case of error.\n If yes, this may be a change in policies attached to FW.\n If no, do a check, create after storing the parameters like\n rtr_id.\n \"\"\"\n fw = data.get('firewall')\n tenant_id = fw.get('tenant_id')\n if self.fwid_attr[tenant_id].is_fw_complete() or \\\n self.fwid_attr[tenant_id].is_fw_drvr_create_needed():\n prev_info_complete = True\n else:\n prev_info_complete = False\n\n tenant_obj = self.fwid_attr[tenant_id]\n if 'router_ids' in fw and len(fw.get('router_ids')) != 0:\n rtr_id = fw.get('router_ids')[0]\n name = dfa_dbm.DfaDBMixin.get_project_name(self, tenant_id)\n rtr_name = '_'.join([fw_constants.TENANT_EDGE_RTR, name])\n\n fw_rtr_name = self.os_helper.get_rtr_name(rtr_id)\n fw_type = None\n if fw_rtr_name == rtr_name:\n fw_type = fw_constants.FW_TENANT_EDGE\n tenant_obj.update_fw_params(rtr_id, fw_type)\n\n if not prev_info_complete:\n self._check_create_fw(tenant_id, drvr_name)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef fw_update(self, data, fw_name=None):\n LOG.debug(\"FW Update %s\", data)\n self._fw_update(fw_name, data)", "response": "Top level FW update function."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _fw_rule_decode_store(self, data):\n fw_rule = data.get('firewall_rule')\n rule = 
{'protocol': fw_rule.get('protocol'),\n 'source_ip_address': fw_rule.get('source_ip_address'),\n 'destination_ip_address': fw_rule.get(\n 'destination_ip_address'),\n 'source_port': fw_rule.get('source_port'),\n 'destination_port': fw_rule.get('destination_port'),\n 'action': fw_rule.get('action'),\n 'enabled': fw_rule.get('enabled'),\n 'name': fw_rule.get('name')}\n return rule", "response": "Misc function to decode the firewall rule from Openstack."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef fw_rule_create(self, data, fw_name=None, cache=False):\n LOG.debug(\"FW Rule create %s\", data)\n self._fw_rule_create(fw_name, data, cache)", "response": "Top level rule creation routine."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _fw_rule_delete(self, drvr_name, data):\n rule_id = data.get('firewall_rule_id')\n tenant_id = self.tenant_db.get_rule_tenant(rule_id)\n\n if tenant_id not in self.fwid_attr:\n LOG.error(\"Invalid tenant id for FW delete %s\", tenant_id)\n return\n tenant_obj = self.fwid_attr[tenant_id]\n # Guess actual FW/policy need not be deleted if this is the active\n # rule, Openstack does not allow it to be deleted\n tenant_obj.delete_rule(rule_id)\n self.tenant_db.del_rule_tenant(rule_id)", "response": "Function that deletes the firewall rule from the local cache."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef fw_rule_delete(self, data, fw_name=None):\n LOG.debug(\"FW Rule delete %s\", data)\n self._fw_rule_delete(fw_name, data)", "response": "Top level rule delete function."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nfunctions to update the firewall rules.", "response": "def _fw_rule_update(self, drvr_name, data):\n \"\"\"Firewall Rule update routine.\n\n Function to decode the updated rules 
and call routines that\n in turn calls the device routines to update rules.\n \"\"\"\n LOG.debug(\"FW Update %s\", data)\n tenant_id = data.get('firewall_rule').get('tenant_id')\n fw_rule = data.get('firewall_rule')\n rule = self._fw_rule_decode_store(data)\n rule_id = fw_rule.get('id')\n if tenant_id not in self.fwid_attr or not (\n self.fwid_attr[tenant_id].is_rule_present(rule_id)):\n LOG.error(\"Incorrect update info for tenant %s\", tenant_id)\n return\n self.fwid_attr[tenant_id].rule_update(rule_id, rule)\n self._check_update_fw(tenant_id, drvr_name)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef fw_rule_update(self, data, fw_name=None):\n LOG.debug(\"FW Update Debug\")\n self._fw_rule_update(fw_name, data)", "response": "Top level rule update routine."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _fw_policy_delete(self, drvr_name, data):\n policy_id = data.get('firewall_policy_id')\n tenant_id = self.tenant_db.get_policy_tenant(policy_id)\n\n if tenant_id not in self.fwid_attr:\n LOG.error(\"Invalid tenant id for FW delete %s\", tenant_id)\n return\n tenant_obj = self.fwid_attr[tenant_id]\n # Guess actual FW need not be deleted since if this is the active\n # policy, Openstack does not allow it to be deleted\n tenant_obj.delete_policy(policy_id)\n self.tenant_db.del_policy_tenant(policy_id)", "response": "Routine to delete the firewall policy from local cache."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ntopping level policy delete routine.", "response": "def fw_policy_delete(self, data, fw_name=None):\n \"\"\"Top level policy delete routine. 
\"\"\"\n LOG.debug(\"FW Policy Debug\")\n self._fw_policy_delete(fw_name, data)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _fw_policy_create(self, drvr_name, data, cache):\n policy = {}\n fw_policy = data.get('firewall_policy')\n tenant_id = fw_policy.get('tenant_id')\n LOG.info(\"Creating policy for tenant %s\", tenant_id)\n policy_id = fw_policy.get('id')\n policy_name = fw_policy.get('name')\n pol_rule_dict = fw_policy.get('firewall_rules')\n if tenant_id not in self.fwid_attr:\n self.fwid_attr[tenant_id] = FwMapAttr(tenant_id)\n policy['name'] = policy_name\n policy['rule_dict'] = pol_rule_dict\n self.fwid_attr[tenant_id].store_policy(policy_id, policy)\n if not cache:\n self._check_create_fw(tenant_id, drvr_name)\n self.tenant_db.store_policy_tenant(policy_id, tenant_id)\n for rule in pol_rule_dict:\n rule_id = rule\n if not self.fwid_attr[tenant_id].is_rule_present(rule_id):\n rule_data = self.os_helper.get_fw_rule(rule_id)\n if rule_data is not None:\n self.fw_rule_create(rule_data, cache=cache)", "response": "This routine creates the firewall policy and rule objects."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ntop level policy create routine.", "response": "def fw_policy_create(self, data, fw_name=None, cache=False):\n \"\"\"Top level policy create routine. 
\"\"\"\n LOG.debug(\"FW Policy Debug\")\n self._fw_policy_create(fw_name, data, cache)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nconverting the Firewall DB to an event message format.", "response": "def convert_fwdb_event_msg(self, rule, tenant_id, rule_id, policy_id):\n \"\"\"Convert the Firewall DB to a event message format.\n\n From inputs from DB, this will create a FW rule dictionary that\n resembles the actual data from Openstack when a rule is created.\n This is usually called after restart, in order to populate local\n cache.\n \"\"\"\n rule.update({'tenant_id': tenant_id, 'id': rule_id,\n 'firewall_policy_id': policy_id})\n fw_rule_data = {'firewall_rule': rule}\n return fw_rule_data"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef convert_fwdb(self, tenant_id, name, policy_id, fw_id):\n fw_dict = {'tenant_id': tenant_id, 'name': name, 'id': fw_id,\n 'firewall_policy_id': policy_id,\n 'admin_state_up': True}\n fw_data = {'firewall': fw_dict}\n return fw_data", "response": "Convert the Firewall DB to a query response."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef populate_local_cache(self):\n fw_dict = self.get_all_fw_db()\n LOG.info(\"Populating FW Mgr Local Cache\")\n for fw_id in fw_dict:\n fw_data = fw_dict.get(fw_id)\n tenant_id = fw_data.get('tenant_id')\n rule_dict = fw_data.get('rules').get('rules')\n policy_id = fw_data.get('rules').get('firewall_policy_id')\n for rule in rule_dict:\n fw_evt_data = self.convert_fwdb_event_msg(rule_dict.get(rule),\n tenant_id, rule,\n policy_id)\n LOG.info(\"Populating Rules for tenant %s\", tenant_id)\n self.fw_rule_create(fw_evt_data, cache=True)\n fw_os_data = self.os_helper.get_fw(fw_id)\n # If enabler is stopped and FW is deleted, then the above routine\n # will fail.\n if fw_os_data is None:\n fw_os_data = self.convert_fwdb(tenant_id, 
fw_data.get('name'),\n                                               policy_id, fw_id)\n            LOG.info(\"Populating FW for tenant %s\", tenant_id)\n            self.fw_create(fw_os_data, cache=True)\n            if fw_data.get('device_status') == 'SUCCESS':\n                self.fwid_attr[tenant_id].fw_drvr_created(True)\n            else:\n                self.fwid_attr[tenant_id].fw_drvr_created(False)\n        return fw_dict", "response": "This method populates the local cache with the data from the database."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef retry_failure_fab_dev_create(self, tenant_id, fw_data, fw_dict):\n        result = fw_data.get('result').split('(')[0]\n        is_fw_virt = self.is_device_virtual()\n        # Fabric portion\n        if result == fw_constants.RESULT_FW_CREATE_INIT:\n            name = dfa_dbm.DfaDBMixin.get_project_name(self, tenant_id)\n            ret = self.fabric.retry_failure(tenant_id, name, fw_dict,\n                                            is_fw_virt, result)\n            if not ret:\n                LOG.error(\"Retry failure returned fail for tenant %s\",\n                          tenant_id)\n                return\n            else:\n                result = fw_constants.RESULT_FW_CREATE_DONE\n                self.update_fw_db_final_result(fw_dict.get('fw_id'), result)\n        # Device portion\n        if result == fw_constants.RESULT_FW_CREATE_DONE:\n            if fw_data.get('device_status') != 'SUCCESS':\n                ret = self.create_fw_device(tenant_id, fw_dict.get('fw_id'),\n                                            fw_dict)\n                if ret:\n                    self.fwid_attr[tenant_id].fw_drvr_created(True)\n                    self.update_fw_db_dev_status(fw_dict.get('fw_id'),\n                                                 'SUCCESS')\n                    LOG.info(\"Retry failure return success for create\"\n                             \" tenant %s\", tenant_id)", "response": "This routine retries the failure of fabric device create."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef retry_failure_fab_dev_delete(self, tenant_id, fw_data, fw_dict):\n        result = fw_data.get('result').split('(')[0]\n        name = dfa_dbm.DfaDBMixin.get_project_name(self, tenant_id)\n        fw_dict['tenant_name'] = name\n        is_fw_virt = self.is_device_virtual()\n        if result == fw_constants.RESULT_FW_DELETE_INIT:\n            if self.fwid_attr[tenant_id].is_fw_drvr_created():\n                ret = self.delete_fw_device(tenant_id, fw_dict.get('fw_id'),\n                                            fw_dict)\n                if ret:\n                    # Device portion\n                    self.update_fw_db_dev_status(fw_dict.get('fw_id'),\n                                                 '')\n                    self.fwid_attr[tenant_id].fw_drvr_created(False)\n                    LOG.info(\"Retry failure dev return success for delete\"\n                             \" tenant %s\", tenant_id)\n                else:\n                    return\n        name = dfa_dbm.DfaDBMixin.get_project_name(self, tenant_id)\n        ret = self.fabric.retry_failure(tenant_id, name, fw_dict,\n                                        is_fw_virt, result)\n        if not ret:\n            LOG.error(\"Retry failure returned fail for tenant %s\",\n                      tenant_id)\n            return\n        result = fw_constants.RESULT_FW_DELETE_DONE\n        self.update_fw_db_final_result(fw_dict.get('fw_id'), result)\n        self.delete_fw(fw_dict.get('fw_id'))\n        self.fwid_attr[tenant_id].delete_fw(fw_dict.get('fw_id'))\n        self.tenant_db.del_fw_tenant(fw_dict.get('fw_id'))", "response": "This routine retries the failure cases for delete."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef fw_retry_failures_create(self):\n        for tenant_id in self.fwid_attr:\n            try:\n                with self.fwid_attr[tenant_id].mutex_lock:\n                    if self.fwid_attr[tenant_id].is_fw_drvr_create_needed():\n                        fw_dict = self.fwid_attr[tenant_id].get_fw_dict()\n                        if fw_dict:\n                            fw_obj, fw_data = self.get_fw(fw_dict.get('fw_id'))\n                            self.retry_failure_fab_dev_create(tenant_id,\n                                                              fw_data,\n                                                              fw_dict)\n                        else:\n                            LOG.error(\"FW data not found for tenant %s\",\n                                      tenant_id)\n            except Exception as exc:\n                LOG.error(\"Exception in retry failure create %s\",\n                          str(exc))", "response": "This routine is called to retry failed firewall create attempts."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef fill_fw_dict_from_db(self, fw_data):\n        rule_dict = fw_data.get('rules').get('rules')\n        fw_dict = {'fw_id': fw_data.get('fw_id'),\n                   'fw_name': fw_data.get('name'),\n                   'firewall_policy_id': fw_data.get('firewall_policy_id'),\n                   'fw_type': fw_data.get('fw_type'),
'router_id': fw_data.get('router_id'), 'rules': {}}\n for rule in rule_dict:\n fw_dict['rules'][rule] = rule_dict.get(rule)\n return fw_dict", "response": "This routine fills a local fw_dict with data from DB."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef fw_retry_failures(self):\n if not self.fw_init:\n return\n try:\n self.fw_retry_failures_create()\n self.fw_retry_failures_delete()\n except Exception as exc:\n LOG.error(\"Exception in retry failures %s\", str(exc))", "response": "Top level retry routine called."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nfinds the start points matching the linespec regex", "response": "def _find_starts(self, linespec):\n \"\"\"\n Finds the start points.\n\n Start points matching the linespec regex are returned as list in the\n following format:\n [(item, index), (item, index).....\n \"\"\"\n linespec += \".*\"\n start_points = []\n for item in self._indent_list:\n match = re.search(linespec, item[1])\n if match:\n entry = (item, self._indent_list.index(item))\n start_points.append(entry)\n return start_points"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nfind lines that match the linespec regex.", "response": "def find_lines(self, linespec):\n \"\"\"Find lines that match the linespec regex.\"\"\"\n res = []\n linespec += \".*\"\n for line in self.cfg:\n match = re.search(linespec, line)\n if match:\n res.append(match.group(0))\n return res"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef find_objects(self, linespec):\n # Note(asr1kteam): In this code we are only adding children one-level\n # deep to a given parent (linespec), as that satisfies the IOS conf\n # parsing.\n # Note(asr1kteam): Not tested with tabs in the config. 
Currently used\n # with IOS config where we haven't seen tabs, but may be needed for a\n # more general case.\n res = []\n self._build_indent_based_list()\n for item, index in self._find_starts(linespec):\n parent = LineItem(item[1])\n next_ident_level = self._find_next_indent_level(index)\n if next_ident_level:\n # We start iterating from the next element\n for item in self._indent_list[(index + 1):]:\n if item[0] == next_ident_level:\n parent.add_children(LineItem(item[1]))\n elif item[0] > next_ident_level: # We skip higher indent\n continue\n else: # Indent level is same or lesser than item\n break\n res.append(parent)\n return res", "response": "Find lines that match the linespec regex."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef find_children(self, linespec):\n res = []\n for parent in self.find_objects(linespec):\n res.append(parent.line)\n res.extend([child.line for child in parent.children])\n return res", "response": "Find lines and immediate children that match the linespec regex."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nfunction to enable LLDP on the interface.", "response": "def enable_lldp(self, port_name, is_ncb=True, is_nb=False):\n \"\"\"Function to enable LLDP on the interface. \"\"\"\n\n if is_ncb:\n self.run_lldptool([\"-L\", \"-i\", port_name, \"-g\", \"ncb\",\n \"adminStatus=rxtx\"])\n if is_nb:\n self.run_lldptool([\"-L\", \"-i\", port_name, \"-g\", \"nb\",\n \"adminStatus=rxtx\"])"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nfunctions to enable LLDP on the interface.", "response": "def enable_lldp(self):\n \"\"\"Function to enable LLDP on the interface. 
\"\"\"\n if self.is_ncb:\n self.run_lldptool([\"-L\", \"-i\", self.port_name, \"-g\", \"ncb\",\n \"adminStatus=rxtx\"])\n if self.is_nb:\n self.run_lldptool([\"-L\", \"-i\", self.port_name, \"-g\", \"nb\",\n \"adminStatus=rxtx\"])"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nfunctions to enable EVB on the interface.", "response": "def enable_evb(self):\n \"\"\"Function to enable EVB on the interface. \"\"\"\n if self.is_ncb:\n self.run_lldptool([\"-T\", \"-i\", self.port_name, \"-g\", \"ncb\",\n \"-V\", \"evb\", \"enableTx=yes\"])\n ret = self.enable_gpid()\n return ret\n else:\n LOG.error(\"EVB cannot be set on NB\")\n return False"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nfunctions to enable Group ID on the interface.", "response": "def enable_gpid(self):\n \"\"\"Function to enable Group ID on the interface.\n\n This is needed to use the MAC, GID, VID Filter.\n \"\"\"\n if self.is_ncb:\n self.run_lldptool([\"-T\", \"-i\", self.port_name, \"-g\", \"ncb\",\n \"-V\", \"evb\", \"-c\", \"evbgpid=yes\"])\n return True\n else:\n LOG.error(\"GPID cannot be set on NB\")\n return False"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _vdp_refrsh_hndlr(self):\n LOG.debug(\"Refresh handler\")\n try:\n if not self.vdp_vif_map:\n LOG.debug(\"vdp_vif_map not created, returning\")\n return\n vdp_vif_map = dict.copy(self.vdp_vif_map)\n oui_vif_map = dict.copy(self.oui_vif_map)\n for key in six.iterkeys(vdp_vif_map):\n lvdp_dict = vdp_vif_map.get(key)\n loui_dict = oui_vif_map.get(key)\n if not lvdp_dict:\n return\n if not loui_dict:\n oui_id = \"\"\n oui_data = \"\"\n else:\n oui_id = loui_dict.get('oui_id')\n oui_data = loui_dict.get('oui_data')\n with self.mutex_lock:\n if key in self.vdp_vif_map:\n LOG.debug(\"Sending Refresh for VSI %s\", lvdp_dict)\n vdp_vlan, fail_reason = self.send_vdp_assoc(\n vsiid=lvdp_dict.get('vsiid'),\n 
mgrid=lvdp_dict.get('mgrid'),\n typeid=lvdp_dict.get('typeid'),\n typeid_ver=lvdp_dict.get('typeid_ver'),\n vsiid_frmt=lvdp_dict.get('vsiid_frmt'),\n filter_frmt=lvdp_dict.get('filter_frmt'),\n gid=lvdp_dict.get('gid'),\n mac=lvdp_dict.get('mac'),\n vlan=0, oui_id=oui_id, oui_data=oui_data,\n sw_resp=True)\n # check validity.\n if not utils.is_valid_vlan_tag(vdp_vlan):\n LOG.error(\"Returned vlan %(vlan)s is invalid.\",\n {'vlan': vdp_vlan})\n # Need to invoke CB. So no return here.\n vdp_vlan = 0\n exist_vdp_vlan = lvdp_dict.get('vdp_vlan')\n exist_fail_reason = lvdp_dict.get('fail_reason')\n callback_count = lvdp_dict.get('callback_count')\n # Condition will be hit only during error cases when switch\n # reloads or when compute reloads\n if vdp_vlan != exist_vdp_vlan or (\n fail_reason != exist_fail_reason or\n callback_count > vdp_const.CALLBACK_THRESHOLD):\n # Invoke the CB Function\n cb_fn = lvdp_dict.get('vsw_cb_fn')\n cb_data = lvdp_dict.get('vsw_cb_data')\n if cb_fn:\n cb_fn(cb_data, vdp_vlan, fail_reason)\n lvdp_dict['vdp_vlan'] = vdp_vlan\n lvdp_dict['fail_reason'] = fail_reason\n lvdp_dict['callback_count'] = 0\n else:\n lvdp_dict['callback_count'] += 1\n except Exception as e:\n LOG.error(\"Exception in Refrsh %s\", str(e))", "response": "Periodic refresh of vNIC events to VDP."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef run_lldptool(self, args):\n full_args = ['lldptool'] + args\n try:\n utils.execute(full_args, root_helper=self.root_helper)\n except Exception as e:\n LOG.error(\"Unable to execute %(cmd)s. 
\"\n \"Exception: %(exception)s\",\n {'cmd': full_args, 'exception': e})", "response": "Function for invoking the lldptool utility."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef store_oui(self, port_uuid, oui_type, oui_data):\n self.oui_vif_map[port_uuid] = {'oui_id': oui_type,\n 'oui_data': oui_data}", "response": "Function for storing the OUI ID and Opaque Data for a given port"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nstores the VDP VSI information for the given port in the VDP VSI store.", "response": "def store_vdp_vsi(self, port_uuid, mgrid, typeid, typeid_ver,\n vsiid_frmt, vsiid, filter_frmt, gid, mac, vlan,\n new_network, reply, oui_id, oui_data, vsw_cb_fn,\n vsw_cb_data, reason):\n \"\"\"Stores the vNIC specific info for VDP Refresh.\n\n :param uuid: vNIC UUID\n :param mgrid: MGR ID\n :param typeid: Type ID\n :param typeid_ver: Version of the Type ID\n :param vsiid_frmt: Format of the following VSI argument\n :param vsiid: VSI value\n :param filter_frmt: Filter Format\n :param gid: Group ID the vNIC belongs to\n :param mac: MAC Address of the vNIC\n :param vlan: VLAN of the vNIC\n :param new_network: Is this the first vNIC of this network\n :param reply: Response from the switch\n :param oui_id: OUI Type\n :param oui_data: OUI Data\n :param vsw_cb_fn: Callback function from the app.\n :param vsw_cb_data: Callback data for the app.\n :param reason: Failure Reason\n \"\"\"\n if port_uuid in self.vdp_vif_map:\n LOG.debug(\"Not Storing VDP VSI MAC %(mac)s UUID %(uuid)s\",\n {'mac': mac, 'uuid': vsiid})\n if new_network:\n vdp_vlan = reply\n else:\n vdp_vlan = vlan\n vdp_dict = {'vdp_vlan': vdp_vlan,\n 'mgrid': mgrid,\n 'typeid': typeid,\n 'typeid_ver': typeid_ver,\n 'vsiid_frmt': vsiid_frmt,\n 'vsiid': vsiid,\n 'filter_frmt': filter_frmt,\n 'mac': mac,\n 'gid': gid,\n 'vsw_cb_fn': vsw_cb_fn,\n 'vsw_cb_data': vsw_cb_data,\n 'fail_reason': 
reason,\n 'callback_count': 0}\n self.vdp_vif_map[port_uuid] = vdp_dict\n LOG.debug(\"Storing VDP VSI MAC %(mac)s UUID %(uuid)s VDP VLAN \"\n \"%(vlan)s\", {'mac': mac, 'uuid': vsiid, 'vlan': vdp_vlan})\n if oui_id:\n self.store_oui(port_uuid, oui_id, oui_data)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef clear_oui(self, port_uuid):\n if port_uuid in self.oui_vif_map:\n del self.oui_vif_map[port_uuid]\n else:\n LOG.debug(\"OUI does not exist\")", "response": "Clears the OUI specific info."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nclear the VDP VSI information for the given port.", "response": "def clear_vdp_vsi(self, port_uuid):\n \"\"\"Stores the vNIC specific info for VDP Refresh.\n\n :param uuid: vNIC UUID\n \"\"\"\n try:\n LOG.debug(\"Clearing VDP VSI MAC %(mac)s UUID %(uuid)s\",\n {'mac': self.vdp_vif_map[port_uuid].get('mac'),\n 'uuid': self.vdp_vif_map[port_uuid].get('vsiid')})\n del self.vdp_vif_map[port_uuid]\n except Exception:\n LOG.error(\"VSI does not exist\")\n self.clear_oui(port_uuid)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ngenerate a list of OUI arguments for a Cisco VDP VM.", "response": "def gen_cisco_vdp_oui(self, oui_id, oui_data):\n \"\"\"Cisco specific handler for constructing OUI arguments. 
\"\"\"\n oui_list = []\n vm_name = oui_data.get('vm_name')\n if vm_name is not None:\n oui_str = \"oui=%s,\" % oui_id\n oui_name_str = oui_str + \"vm_name=\" + vm_name\n oui_list.append(oui_name_str)\n ip_addr = oui_data.get('ip_addr')\n if ip_addr is not None:\n oui_str = \"oui=%s,\" % oui_id\n ip_addr_str = oui_str + \"ipv4_addr=\" + ip_addr\n oui_list.append(ip_addr_str)\n vm_uuid = oui_data.get('vm_uuid')\n if vm_uuid is not None:\n oui_str = \"oui=%s,\" % oui_id\n vm_uuid_str = oui_str + \"vm_uuid=\" + vm_uuid\n oui_list.append(vm_uuid_str)\n return oui_list"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef gen_oui_str(self, oui_list):\n oui_str = []\n for oui in oui_list:\n oui_str.append('-c')\n oui_str.append(oui)\n return oui_str", "response": "Generate the OUI string for vdptool."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef construct_vdp_dict(self, mode, mgrid, typeid, typeid_ver, vsiid_frmt,\n vsiid, filter_frmt, gid, mac, vlan, oui_id,\n oui_data):\n \"\"\"Constructs the VDP Message.\n\n Please refer http://www.ieee802.org/1/pages/802.1bg.html VDP\n Section for more detailed information\n :param mode: Associate or De-associate\n :param mgrid: MGR ID\n :param typeid: Type ID\n :param typeid_ver: Version of the Type ID\n :param vsiid_frmt: Format of the following VSI argument\n :param vsiid: VSI value\n :param filter_frmt: Filter Format\n :param gid: Group ID the vNIC belongs to\n :param mac: MAC Address of the vNIC\n :param vlan: VLAN of the vNIC\n :param oui_id: OUI Type\n :param oui_data: OUI Data\n :return vdp_keyword_str: Dictionary of VDP arguments and values\n \"\"\"\n vdp_keyword_str = {}\n if mgrid is None:\n mgrid = self.vdp_opts.get('mgrid')\n mgrid_str = \"mgrid2=%s\" % mgrid\n if typeid is None:\n typeid = self.vdp_opts.get('typeid')\n typeid_str = \"typeid=%s\" % typeid\n if typeid_ver is None:\n typeid_ver = 
self.vdp_opts.get('typeidver')\n typeid_ver_str = \"typeidver=%s\" % typeid_ver\n if int(vsiid_frmt) == int(self.vdp_opts.get('vsiidfrmt')):\n vsiid_str = \"uuid=%s\" % vsiid\n else:\n # Only format supported for now\n LOG.error(\"Unsupported VSIID Format1\")\n return vdp_keyword_str\n if vlan == constants.INVALID_VLAN:\n vlan = 0\n if int(filter_frmt) == vdp_const.VDP_FILTER_GIDMACVID:\n if not mac or gid == 0:\n LOG.error(\"Incorrect Filter Format Specified\")\n return vdp_keyword_str\n else:\n f = \"filter=%s-%s-%s\"\n filter_str = f % (vlan, mac, gid)\n elif int(filter_frmt) == vdp_const.VDP_FILTER_GIDVID:\n if gid == 0:\n LOG.error(\"NULL GID Specified\")\n return vdp_keyword_str\n else:\n filter_str = \"filter=\" + '%d' % vlan + \"--\" + '%ld' % gid\n elif int(filter_frmt) == vdp_const.VDP_FILTER_MACVID:\n if not mac:\n LOG.error(\"NULL MAC Specified\")\n return vdp_keyword_str\n else:\n filter_str = \"filter=\" + '%d' % vlan + \"-\" + mac\n elif int(filter_frmt) == vdp_const.VDP_FILTER_VID:\n filter_str = \"filter=\" + '%d' % vlan\n else:\n LOG.error(\"Incorrect Filter Format Specified\")\n return vdp_keyword_str\n oui_list = []\n if oui_id is not None and oui_data is not None:\n if oui_id is 'cisco':\n oui_list = self.gen_cisco_vdp_oui(oui_id, oui_data)\n mode_str = \"mode=\" + mode\n vdp_keyword_str = dict(mode=mode_str, mgrid=mgrid_str,\n typeid=typeid_str, typeid_ver=typeid_ver_str,\n vsiid=vsiid_str, filter=filter_str,\n oui_list=oui_list)\n return vdp_keyword_str", "response": "Constructs the VDP Message."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nconstruct and sends a VDP Query Message.", "response": "def send_vdp_query_msg(self, mode, mgrid, typeid, typeid_ver, vsiid_frmt,\n vsiid, filter_frmt, gid, mac, vlan, oui_id,\n oui_data):\n \"\"\"Constructs and Sends the VDP Query Message.\n\n Please refer http://www.ieee802.org/1/pages/802.1bg.html VDP\n Section for more detailed information\n :param mode: Associate or 
De-associate\n :param mgrid: MGR ID\n :param typeid: Type ID\n :param typeid_ver: Version of the Type ID\n :param vsiid_frmt: Format of the following VSI argument\n :param vsiid: VSI value\n :param filter_frmt: Filter Format\n :param gid: Group ID the vNIC belongs to\n :param mac: MAC Address of the vNIC\n :param vlan: VLAN of the vNIC\n :param oui_id: OUI Type\n :param oui_data: OUI Data\n :param sw_resp: Flag indicating if response is required from the daemon\n :return reply: Reply from vdptool\n \"\"\"\n if not self.is_ncb:\n LOG.error(\"EVB cannot be set on NB\")\n return\n vdp_key_str = self.construct_vdp_dict(mode, mgrid, typeid,\n typeid_ver, vsiid_frmt, vsiid,\n filter_frmt, gid, mac, vlan,\n None, None)\n if len(vdp_key_str) == 0:\n LOG.error(\"NULL List\")\n return\n reply = self.run_vdptool([\"-t\", \"-i\", self.port_name, \"-R\", \"-V\", mode,\n \"-c\", vdp_key_str['mode'],\n \"-c\", vdp_key_str['mgrid'],\n \"-c\", vdp_key_str['typeid'],\n \"-c\", vdp_key_str['typeid_ver'],\n \"-c\", vdp_key_str['vsiid']])\n return reply"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nconstruct and sends a VDP Message.", "response": "def send_vdp_msg(self, mode, mgrid, typeid, typeid_ver, vsiid_frmt,\n vsiid, filter_frmt, gid, mac, vlan, oui_id, oui_data,\n sw_resp):\n \"\"\"Constructs and Sends the VDP Message.\n\n Please refer http://www.ieee802.org/1/pages/802.1bg.html VDP\n Section for more detailed information\n :param mode: Associate or De-associate\n :param mgrid: MGR ID\n :param typeid: Type ID\n :param typeid_ver: Version of the Type ID\n :param vsiid_frmt: Format of the following VSI argument\n :param vsiid: VSI value\n :param filter_frmt: Filter Format\n :param gid: Group ID the vNIC belongs to\n :param mac: MAC Address of the vNIC\n :param vlan: VLAN of the vNIC\n :param oui_id: OUI Type\n :param oui_data: OUI Data\n :param sw_resp: Flag indicating if response is required from the daemon\n :return reply: Reply from vdptool\n 
\"\"\"\n if not self.is_ncb:\n LOG.error(\"EVB cannot be set on NB\")\n return\n vdp_key_str = self.construct_vdp_dict(mode, mgrid, typeid,\n typeid_ver, vsiid_frmt, vsiid,\n filter_frmt, gid, mac, vlan,\n oui_id, oui_data)\n if len(vdp_key_str) == 0:\n LOG.error(\"NULL List\")\n return\n oui_cmd_str = self.gen_oui_str(vdp_key_str['oui_list'])\n if sw_resp:\n # If filter is not VID and if VLAN is 0, Query for the TLV first,\n # if found VDP will return the VLAN. Add support for this once\n # vdptool has the support for querying exact VSI filters\n # fixme(padkrish)\n\n reply = self.run_vdptool([\"-T\", \"-i\", self.port_name, \"-W\",\n \"-V\", mode, \"-c\", vdp_key_str['mode'],\n \"-c\", vdp_key_str['mgrid'], \"-c\",\n vdp_key_str['typeid'],\n \"-c\", vdp_key_str['typeid_ver'], \"-c\",\n vdp_key_str['vsiid'], \"-c\",\n \"hints=none\", \"-c\",\n vdp_key_str['filter']],\n oui_args=oui_cmd_str)\n else:\n reply = self.run_vdptool([\"-T\", \"-i\", self.port_name,\n \"-V\", mode, \"-c\", vdp_key_str['mode'],\n \"-c\", vdp_key_str['mgrid'], \"-c\",\n vdp_key_str['typeid'],\n \"-c\", vdp_key_str['typeid_ver'], \"-c\",\n vdp_key_str['vsiid'], \"-c\",\n \"hints=none\", \"-c\",\n vdp_key_str['filter']],\n oui_args=oui_cmd_str)\n return reply"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ncrossing Check the reply against the input vsiid and mac for get query.", "response": "def crosscheck_query_vsiid_mac(self, reply, vsiid, mac):\n \"\"\"Cross Check the reply against the input vsiid,mac for get query. 
\"\"\"\n vsiid_reply = reply.partition(\"uuid\")[2].split()[0][4:]\n if vsiid != vsiid_reply:\n fail_reason = vdp_const.vsi_mismatch_failure_reason % (\n vsiid, vsiid_reply)\n LOG.error(\"%s\", fail_reason)\n return False, fail_reason\n mac_reply = reply.partition(\"filter\")[2].split('-')[1]\n if mac != mac_reply:\n fail_reason = vdp_const.mac_mismatch_failure_reason % (\n mac, mac_reply)\n LOG.error(\"%s\", fail_reason)\n return False, fail_reason\n return True, None"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nparsing the failure reason from VDP.", "response": "def get_vdp_failure_reason(self, reply):\n \"\"\"Parse the failure reason from VDP. \"\"\"\n try:\n fail_reason = reply.partition(\n \"filter\")[0].replace('\\t', '').split('\\n')[-2]\n if len(fail_reason) == 0:\n fail_reason = vdp_const.retrieve_failure_reason % (reply)\n except Exception:\n fail_reason = vdp_const.retrieve_failure_reason % (reply)\n return fail_reason"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nchecks if the filter is valid.", "response": "def check_filter_validity(self, reply, filter_str):\n \"\"\"Check for the validify of the filter. 
\"\"\"\n try:\n f_ind = reply.index(filter_str)\n l_ind = reply.rindex(filter_str)\n except Exception:\n fail_reason = vdp_const.filter_failure_reason % (reply)\n LOG.error(\"%s\", fail_reason)\n return False, fail_reason\n if f_ind != l_ind:\n # Currently not supported if reply contains a filter keyword\n fail_reason = vdp_const.multiple_filter_failure_reason % (reply)\n LOG.error(\"%s\", fail_reason)\n return False, fail_reason\n return True, None"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef get_vlan_from_associate_reply(self, reply, vsiid, mac):\n try:\n verify_flag, fail_reason = self.crosscheck_reply_vsiid_mac(\n reply, vsiid, mac)\n if not verify_flag:\n return constants.INVALID_VLAN, fail_reason\n mode_str = reply.partition(\"mode = \")[2].split()[0]\n if mode_str != \"assoc\":\n fail_reason = self.get_vdp_failure_reason(reply)\n return constants.INVALID_VLAN, fail_reason\n except Exception:\n fail_reason = vdp_const.mode_failure_reason % (reply)\n LOG.error(\"%s\", fail_reason)\n return constants.INVALID_VLAN, fail_reason\n check_filter, fail_reason = self.check_filter_validity(\n reply, \"filter = \")\n if not check_filter:\n return constants.INVALID_VLAN, fail_reason\n try:\n vlan_val = reply.partition(\"filter = \")[2].split('-')[0]\n vlan = int(vlan_val)\n except ValueError:\n fail_reason = vdp_const.format_failure_reason % (reply)\n LOG.error(\"%s\", fail_reason)\n return constants.INVALID_VLAN, fail_reason\n return vlan, None", "response": "Parse the associate reply from VDP daemon to get the VLAN value."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef check_hints(self, reply):\n try:\n f_ind = reply.index(\"hints\")\n l_ind = reply.rindex(\"hints\")\n except Exception:\n fail_reason = vdp_const.hints_failure_reason % (reply)\n LOG.error(\"%s\", fail_reason)\n return False, fail_reason\n if f_ind != l_ind:\n # 
Currently not supported if reply contains a filter keyword\n fail_reason = vdp_const.multiple_hints_failure_reason % (reply)\n LOG.error(\"%s\", fail_reason)\n return False, fail_reason\n try:\n hints_compl = reply.partition(\"hints\")[2]\n hints_val = reply.partition(\"hints\")[2][0:4]\n len_hints = int(hints_val)\n hints_val = hints_compl[4:4 + len_hints]\n hints = int(hints_val)\n if hints != 0:\n fail_reason = vdp_const.nonzero_hints_failure % (hints)\n return False, fail_reason\n except ValueError:\n fail_reason = vdp_const.format_failure_reason % (reply)\n LOG.error(\"%s\", fail_reason)\n return False, fail_reason\n return True, None", "response": "Parse the hints to check for errors."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nparses the query reply from VDP daemon to get the VLAN value.", "response": "def get_vlan_from_query_reply(self, reply, vsiid, mac):\n \"\"\"Parse the query reply from VDP daemon to get the VLAN value. \"\"\"\n hints_ret, fail_reason = self.check_hints(reply)\n if not hints_ret:\n LOG.error(\"Incorrect hints found %s\", reply)\n return constants.INVALID_VLAN, fail_reason\n check_filter, fail_reason = self.check_filter_validity(reply, \"filter\")\n if not check_filter:\n return constants.INVALID_VLAN, fail_reason\n try:\n verify_flag, fail_reason = self.crosscheck_query_vsiid_mac(\n reply, vsiid, mac)\n if not verify_flag:\n return constants.INVALID_VLAN, fail_reason\n filter_val = reply.partition(\"filter\")[2]\n len_fil = len(filter_val)\n vlan_val = filter_val[4:len_fil].split('-')[0]\n vlan = int(vlan_val)\n except ValueError:\n fail_reason = vdp_const.format_failure_reason % (reply)\n LOG.error(\"%s\", fail_reason)\n return constants.INVALID_VLAN, fail_reason\n return vlan, None"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nsend the VDP Associate Message.", "response": "def send_vdp_assoc(self, vsiid=None, mgrid=None, typeid=None,\n typeid_ver=None, 
vsiid_frmt=vdp_const.VDP_VSIFRMT_UUID,\n filter_frmt=vdp_const.VDP_FILTER_GIDMACVID, gid=0,\n mac=\"\", vlan=0, oui_id=\"\", oui_data=\"\", sw_resp=False):\n \"\"\"Sends the VDP Associate Message.\n\n Please refer http://www.ieee802.org/1/pages/802.1bg.html VDP\n Section for more detailed information\n :param vsiid: VSI value, Only UUID supported for now\n :param mgrid: MGR ID\n :param typeid: Type ID\n :param typeid_ver: Version of the Type ID\n :param vsiid_frmt: Format of the following VSI argument\n :param filter_frmt: Filter Format. Only supported for now\n :param gid: Group ID the vNIC belongs to\n :param mac: MAC Address of the vNIC\n :param vlan: VLAN of the vNIC\n :param oui_id: OUI Type\n :param oui_data: OUI Data\n :param sw_resp: Flag indicating if response is required from the daemon\n :return vlan: VLAN value returned by vdptool which in turn is given\n : by Switch\n \"\"\"\n\n if sw_resp and filter_frmt == vdp_const.VDP_FILTER_GIDMACVID:\n reply = self.send_vdp_query_msg(\"assoc\", mgrid, typeid, typeid_ver,\n vsiid_frmt, vsiid, filter_frmt,\n gid, mac, vlan, oui_id, oui_data)\n vlan_resp, fail_reason = self.get_vlan_from_query_reply(\n reply, vsiid, mac)\n if vlan_resp != constants.INVALID_VLAN:\n return vlan_resp, fail_reason\n reply = self.send_vdp_msg(\"assoc\", mgrid, typeid, typeid_ver,\n vsiid_frmt, vsiid, filter_frmt, gid, mac,\n vlan, oui_id, oui_data, sw_resp)\n if sw_resp:\n vlan, fail_reason = self.get_vlan_from_associate_reply(\n reply, vsiid, mac)\n return vlan, fail_reason\n return None, None"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef send_vdp_vnic_up(self, port_uuid=None, vsiid=None,\n mgrid=None, typeid=None, typeid_ver=None,\n vsiid_frmt=vdp_const.VDP_VSIFRMT_UUID,\n filter_frmt=vdp_const.VDP_FILTER_GIDMACVID,\n gid=0, mac=\"\", vlan=0, oui=None,\n new_network=False, vsw_cb_fn=None, vsw_cb_data=None):\n \"\"\"Interface function to apps, called for a vNIC UP.\n\n This 
currently sends an VDP associate message.\n Please refer http://www.ieee802.org/1/pages/802.1bg.html VDP\n Section for more detailed information\n :param uuid: uuid of the vNIC\n :param vsiid: VSI value, Only UUID supported for now\n :param mgrid: MGR ID\n :param typeid: Type ID\n :param typeid_ver: Version of the Type ID\n :param vsiid_frmt: Format of the following VSI argument\n :param filter_frmt: Filter Format. Only supported for now\n :param gid: Group ID the vNIC belongs to\n :param mac: MAC Address of the vNIC\n :param vlan: VLAN of the vNIC\n :param oui_id: OUI Type\n :param oui_data: OUI Data\n :param sw_resp: Flag indicating if response is required from the daemon\n :return reply: VLAN reply from vdptool\n \"\"\"\n if oui is None:\n oui = {}\n oui_id = None\n oui_data = None\n if 'oui_id' in oui:\n oui_id = oui['oui_id']\n oui_data = oui\n reply, fail_reason = self.send_vdp_assoc(\n vsiid=vsiid, mgrid=mgrid, typeid=typeid, typeid_ver=typeid_ver,\n vsiid_frmt=vsiid_frmt, filter_frmt=filter_frmt, gid=gid, mac=mac,\n vlan=vlan, oui_id=oui_id, oui_data=oui_data, sw_resp=new_network)\n self.store_vdp_vsi(port_uuid, mgrid, typeid, typeid_ver,\n vsiid_frmt, vsiid, filter_frmt, gid, mac, vlan,\n new_network, reply, oui_id, oui_data, vsw_cb_fn,\n vsw_cb_data, fail_reason)\n return reply, fail_reason", "response": "This interface is called for a vNIC UP message. 
It is invoked by apps when a vNIC comes up and sends a VDP associate message to register it."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef send_vdp_vnic_down(self, port_uuid=None, vsiid=None, mgrid=None,\n typeid=None, typeid_ver=None,\n vsiid_frmt=vdp_const.VDP_VSIFRMT_UUID,\n filter_frmt=vdp_const.VDP_FILTER_GIDMACVID,\n gid=0, mac=\"\", vlan=0, oui=\"\"):\n \"\"\"Interface function to apps, called for a vNIC DOWN.\n\n This currently sends a VDP dis-associate message.\n Please refer http://www.ieee802.org/1/pages/802.1bg.html VDP\n Section for more detailed information\n :param port_uuid: uuid of the vNIC\n :param vsiid: VSI value, Only UUID supported for now\n :param mgrid: MGR ID\n :param typeid: Type ID\n :param typeid_ver: Version of the Type ID\n :param vsiid_frmt: Format of the following VSI argument\n :param filter_frmt: Filter Format. Only supported for now\n :param gid: Group ID the vNIC belongs to\n :param mac: MAC Address of the vNIC\n :param vlan: VLAN of the vNIC\n :param oui_id: OUI Type\n :param oui_data: OUI Data\n :param sw_resp: Flag indicating if response is required from the daemon\n \"\"\"\n # Correct non-zero VLAN needs to be specified\n try:\n with self.mutex_lock:\n self.send_vdp_deassoc(vsiid=vsiid, mgrid=mgrid, typeid=typeid,\n typeid_ver=typeid_ver,\n vsiid_frmt=vsiid_frmt,\n filter_frmt=filter_frmt, gid=gid,\n mac=mac, vlan=vlan)\n self.clear_vdp_vsi(port_uuid)\n except Exception as e:\n LOG.error(\"VNIC Down exception %s\", e)", "response": "This function is called by apps when a vNIC Down is received. 
It sends a VDP dis-associate message to deregister the vNIC."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef run_vdptool(self, args, oui_args=None):\n if oui_args is None:\n oui_args = []\n full_args = ['vdptool'] + args + oui_args\n try:\n return utils.execute(full_args, root_helper=self.root_helper)\n except Exception as e:\n LOG.error(\"Unable to execute %(cmd)s. \"\n \"Exception: %(exception)s\",\n {'cmd': full_args, 'exception': e})", "response": "Function that runs the vdptool utility."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ngets the routers for a specific context.", "response": "def get_routers(self, context, router_ids=None, hd_ids=None):\n \"\"\"Make a remote process call to retrieve the sync data for routers.\n\n :param context: session context\n :param router_ids: list of routers to fetch\n :param hd_ids: hosting device ids, only routers assigned to these\n hosting devices will be returned.\n \"\"\"\n cctxt = self.client.prepare(version='1.1')\n return cctxt.call(context, 'cfg_sync_routers', host=self.host,\n router_ids=router_ids, hosting_device_ids=hd_ids)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nupdates the operational status for one or several floating IPs.", "response": "def update_floatingip_statuses(self, context, router_id, fip_statuses):\n \"\"\"Make a remote process call to update operational status for one or\n several floating IPs.\n\n @param context: contains user information\n @param router_id: id of router associated with the floatingips\n @param fip_statuses: dict with floatingip_id as key and status as value\n \"\"\"\n cctxt = self.client.prepare(version='1.1')\n return cctxt.call(context, 'update_floatingip_statuses_cfg',\n router_id=router_id, fip_statuses=fip_statuses)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nsend the status 
update to the DB", "response": "def send_update_port_statuses(self, context, port_ids, status):\n \"\"\"Call the plugin to update the port status which updates the DB.\n\n :param context: contains user information\n :param port_ids: list of ids of the ports associated with the status\n :param status: value of the status for the given port list (port_ids)\n \"\"\"\n cctxt = self.client.prepare(version='1.1')\n return cctxt.call(context, 'update_port_statuses_cfg',\n port_ids=port_ids, status=status)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ndealing with router deletion RPC message.", "response": "def router_deleted(self, context, routers):\n \"\"\"Deal with router deletion RPC message.\"\"\"\n LOG.debug('Got router deleted notification for %s', routers)\n self._update_removed_routers_cache(routers)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ndeal with routers modification and creation RPC message.", "response": "def routers_updated(self, context, routers):\n \"\"\"Deal with routers modification and creation RPC message.\"\"\"\n LOG.debug('Got routers updated notification :%s', routers)\n if routers:\n # This is needed for backward compatibility\n if isinstance(routers[0], dict):\n routers = [router['id'] for router in routers]\n self._update_updated_routers_cache(routers)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef collect_state(self, configurations):\n num_ex_gw_ports = 0\n num_interfaces = 0\n num_floating_ips = 0\n router_infos = self.router_info.values()\n num_routers = len(router_infos)\n num_hd_routers = collections.defaultdict(int)\n for ri in router_infos:\n ex_gw_port = ri.router.get('gw_port')\n if ex_gw_port:\n num_ex_gw_ports += 1\n num_interfaces += len(ri.router.get(\n bc.constants.INTERFACE_KEY, []))\n num_floating_ips += len(ri.router.get(\n bc.constants.FLOATINGIP_KEY, []))\n hd = ri.router['hosting_device']\n if 
hd:\n num_hd_routers[hd['id']] += 1\n routers_per_hd = dict((hd_id, {'routers': num})\n for hd_id, num in num_hd_routers.items())\n non_responding = self._dev_status.get_backlogged_hosting_devices()\n configurations['total routers'] = num_routers\n configurations['total ex_gw_ports'] = num_ex_gw_ports\n configurations['total interfaces'] = num_interfaces\n configurations['total floating_ips'] = num_floating_ips\n configurations['hosting_devices'] = routers_per_hd\n configurations['non_responding_hosting_devices'] = non_responding\n return configurations", "response": "Collect state from this helper."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nfetches router dict from the routing plugin.", "response": "def _fetch_router_info(self, router_ids=None, device_ids=None,\n all_routers=False):\n \"\"\"Fetch router dict from the routing plugin.\n\n :param router_ids: List of router_ids of routers to fetch\n :param device_ids: List of device_ids whose routers to fetch\n :param all_routers: If True fetch all the routers for this agent.\n :return: List of router dicts of format:\n [ {router_dict1}, {router_dict2},.....]\n \"\"\"\n try:\n if all_routers:\n LOG.debug('Fetching all routers')\n router_ids = self.plugin_rpc.get_router_ids(self.context)\n routers = self._fetch_router_chunk_data(router_ids)\n\n elif router_ids:\n routers = self._fetch_router_chunk_data(router_ids)\n\n elif device_ids:\n return self.plugin_rpc.get_routers(self.context,\n hd_ids=device_ids)\n except oslo_messaging.MessagingTimeout:\n if self.sync_routers_chunk_size > SYNC_ROUTERS_MIN_CHUNK_SIZE:\n self.sync_routers_chunk_size = max(\n int(round(self.sync_routers_chunk_size / 2)),\n SYNC_ROUTERS_MIN_CHUNK_SIZE)\n LOG.warning('Server failed to return info for routers in '\n 'required time, decreasing chunk size to: %s',\n self.sync_routers_chunk_size)\n else:\n LOG.warning('Server failed to return info for routers in '\n 'required time even with min chunk 
size: %s. '\n 'It might be under very high load or just '\n 'inoperable',\n self.sync_routers_chunk_size)\n raise\n except oslo_messaging.MessagingException:\n LOG.exception(\"RPC Error in fetching routers from plugin\")\n self.fullsync = True\n raise n_exc.AbortSyncRouters()\n\n LOG.debug(\"Periodic_sync_routers_task successfully completed\")\n # adjust chunk size after successful sync\n if (self.sync_routers_chunk_size <\n cfg.CONF.cfg_agent.max_device_sync_batch_size):\n self.sync_routers_chunk_size = min(\n self.sync_routers_chunk_size + SYNC_ROUTERS_MIN_CHUNK_SIZE,\n cfg.CONF.cfg_agent.max_device_sync_batch_size)\n return routers"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nfetches router data from the routing plugin in chunks.", "response": "def _fetch_router_chunk_data(self, router_ids=None):\n\n \"\"\"Fetch router data from the routing plugin in chunks.\n\n :param router_ids: List of router_ids of routers to fetch\n :return: List of router dicts of format:\n [ {router_dict1}, {router_dict2},.....]\n \"\"\"\n\n curr_router = []\n if len(router_ids) > self.sync_routers_chunk_size:\n # fetch routers by chunks to reduce the load on server and\n # to start router processing earlier\n for i in range(0, len(router_ids),\n self.sync_routers_chunk_size):\n routers = self.plugin_rpc.get_routers(\n self.context, (router_ids[i:i +\n self.sync_routers_chunk_size]))\n LOG.debug('Processing :%r', routers)\n for r in routers:\n curr_router.append(r)\n else:\n curr_router = self.plugin_rpc.get_routers(\n self.context, router_ids=router_ids)\n return curr_router"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _handle_sync_devices(self, routers):\n sync_devices_list = list(self.sync_devices)\n LOG.debug(\"Fetching routers on:%s\", sync_devices_list)\n fetched_routers = self._fetch_router_info(device_ids=sync_devices_list)\n\n if fetched_routers:\n 
LOG.debug(\"[sync_devices] Fetched routers :%s\",\n pp.pformat(fetched_routers))\n\n # clear router_config cache\n for router_dict in fetched_routers:\n self._del_from_updated_routers_cache(router_dict['id'])\n self._del_from_removed_routers_cache(router_dict['id'])\n LOG.debug(\"[sync_devices] invoking \"\n \"_router_removed(%s)\",\n router_dict['id'])\n self._router_removed(router_dict['id'],\n deconfigure=False)\n\n self._cleanup_invalid_cfg(fetched_routers)\n routers.extend(fetched_routers)\n self.sync_devices.clear()\n LOG.debug(\"[sync_devices] %s finished\",\n sync_devices_list)\n else:\n # If the initial attempt to sync a device\n # failed, retry again (by not clearing sync_devices)\n # Normal updated_routers processing is still allowed\n # to happen\n self.sync_devices_attempts += 1\n\n if (self.sync_devices_attempts >=\n cfg.CONF.cfg_agent.max_device_sync_attempts):\n\n LOG.debug(\"Max number [%d / %d ] of sync_devices \"\n \"attempted. No further retries will \"\n \"be attempted.\" %\n (self.sync_devices_attempts,\n cfg.CONF.cfg_agent.max_device_sync_attempts))\n self.sync_devices.clear()\n self.sync_devices_attempts = 0\n else:\n LOG.debug(\"Fetched routers was blank for sync attempt \"\n \"[%d / %d], will attempt resync of %s devices \"\n \"again in the next iteration\" %\n (self.sync_devices_attempts,\n cfg.CONF.cfg_agent.max_device_sync_attempts,\n pp.pformat(self.sync_devices)))", "response": "This method handles the actual sync_devices operation."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nextract router ids from the removed devices info dict.", "response": "def _get_router_ids_from_removed_devices_info(removed_devices_info):\n \"\"\"Extract router_ids from the removed devices info dict.\n\n :param removed_devices_info: Dict of removed devices and their\n associated resources.\n Format:\n {\n 'hosting_data': {'hd_id1': {'routers': [id1, id2, ...]},\n 'hd_id2': {'routers': [id3, id4, ...]},\n ...\n },\n 
'deconfigure': True/False\n }\n :return removed_router_ids: List of removed router ids\n \"\"\"\n removed_router_ids = []\n for hd_id, resources in removed_devices_info['hosting_data'].items():\n removed_router_ids += resources.get('routers', [])\n return removed_router_ids"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _sort_resources_per_hosting_device(resources):\n hosting_devices = {}\n for key in resources.keys():\n for r in resources.get(key) or []:\n if r.get('hosting_device') is None:\n continue\n hd_id = r['hosting_device']['id']\n hosting_devices.setdefault(hd_id, {})\n hosting_devices[hd_id].setdefault(key, []).append(r)\n return hosting_devices", "response": "This function will sort the resources on hosting device."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nadjust the router list for global routers.", "response": "def _adjust_router_list_for_global_router(self, routers):\n \"\"\"\n Pushes 'Global' routers to the end of the router list, so that\n deleting default route occurs before deletion of external nw subintf\n \"\"\"\n #ToDo(Hareesh): Simplify if possible\n for r in routers:\n if r[ROUTER_ROLE_ATTR] == c_constants.ROUTER_ROLE_GLOBAL:\n LOG.debug(\"Global router:%s found. Moved to the end of list \"\n \"for processing\", r['id'])\n routers.remove(r)\n routers.append(r)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _process_routers(self, routers, removed_routers,\n device_id=None, all_routers=False):\n \"\"\"Process the set of routers.\n\n Iterating on the set of routers received and comparing it with the\n set of routers already in the routing service helper, new routers\n which are added are identified. 
Before processing check the\n reachability (via ping) of hosting device where the router is hosted.\n If device is not reachable it is backlogged.\n\n For routers which are only updated, call `_process_router()` on them.\n\n When all_routers is set to True (because of a full sync),\n this will result in the detection and deletion of routers which\n have been removed.\n\n Whether the router can only be assigned to a particular hosting device\n is decided and enforced by the plugin. No checks are done here.\n\n :param routers: The set of routers to be processed\n :param removed_routers: the set of routers which where removed\n :param device_id: Id of the hosting device\n :param all_routers: Flag for specifying a partial list of routers\n :return: None\n \"\"\"\n try:\n ids_previously_hosted_routers = (\n set(r_id for r_id, rdata in self.router_info.items()\n if rdata.router.get('hosting_device',\n {}).get('id') == device_id))\n\n if all_routers:\n prev_router_ids = ids_previously_hosted_routers\n else:\n prev_router_ids = (ids_previously_hosted_routers &\n set([router['id'] for router in routers]))\n cur_router_ids = set()\n deleted_routerids_list = []\n\n for r in routers:\n if not r['admin_state_up']:\n continue\n cur_router_ids.add(r['id'])\n\n # identify list of routers(ids) that no longer exist\n for router_id in prev_router_ids - cur_router_ids:\n deleted_routerids_list.append(router_id)\n if removed_routers:\n self._adjust_router_list_for_global_router(removed_routers)\n for router in removed_routers:\n deleted_routerids_list.append(router['id'])\n\n self._adjust_router_list_for_global_router(routers)\n # First process create/updated routers\n for r in routers:\n LOG.debug(\"Processing router[id:%(id)s, role:%(role)s]\",\n {'id': r['id'], 'role': r[ROUTER_ROLE_ATTR]})\n if r['id'] in deleted_routerids_list:\n continue\n if r['status'] == c_constants.ROUTER_INFO_INCOMPLETE:\n # The plugin could not fill in all the info due to\n # timing and db settling down. 
So put this router\n # back in updated_routers, we will pull again on the\n # sync time.\n LOG.debug(\"Router: %(id)s INFO_INCOMPLETE\",\n {'id': r['id']})\n self._update_updated_routers_cache([r['id']])\n continue\n try:\n if not r['admin_state_up']:\n continue\n cur_router_ids.add(r['id'])\n hd = r['hosting_device']\n if not self._dev_status.is_hosting_device_reachable(hd):\n LOG.info(\"Router: %(id)s is on an unreachable \"\n \"hosting device. \", {'id': r['id']})\n continue\n if r['id'] not in self.router_info:\n self._router_added(r['id'], r)\n ri = self.router_info[r['id']]\n ri.router = r\n self._process_router(ri)\n except ncc_errors.SessionCloseError as e:\n LOG.exception(\n \"ncclient Unexpected session close %s\", e)\n if not self._dev_status.is_hosting_device_reachable(\n r['hosting_device']):\n LOG.debug(\"Lost connectivity to hosting device %s\" %\n r['hosting_device']['id'])\n # Will rely on heartbeat to detect hd state\n # and schedule resync when hd comes back\n else:\n # retry the router update on the next pass\n self._update_updated_routers_cache([r['id']])\n LOG.debug(\"RETRY_RTR_UPDATE %s\" % (r['id']))\n\n continue\n except KeyError as e:\n LOG.exception(\"Key Error, missing key: %s\", e)\n self._update_updated_routers_cache([r['id']])\n continue\n except cfg_exceptions.DriverException as e:\n LOG.exception(\"Driver Exception on router:%(id)s. 
\"\n \"Error is %(e)s\", {'id': r['id'],\n 'e': e})\n self._update_updated_routers_cache([r['id']])\n continue\n LOG.debug(\"Done processing router[id:%(id)s, role:%(role)s]\",\n {'id': r['id'], 'role': r[ROUTER_ROLE_ATTR]})\n # Finally process removed routers\n for router_id in deleted_routerids_list:\n LOG.debug(\"Processing deleted router:%s\", router_id)\n self._router_removed(router_id)\n except Exception:\n LOG.exception(\"Exception in processing routers on device:%s\",\n device_id)\n self.sync_devices.add(device_id)", "response": "Process the set of routers and remove routers which are not in the routing service helper."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nsend update notifications to set the operational status of the router ports provided.", "response": "def _send_update_port_statuses(self, port_ids, status):\n \"\"\"Sends update notifications to set the operational status of the\n list of router ports provided. To make each notification doesn't exceed\n the RPC length, each message contains a maximum of MAX_PORTS_IN_BATCH\n port ids.\n\n :param port_ids: List of ports to update the status\n :param status: operational status to update\n (ex: bc.constants.PORT_STATUS_ACTIVE)\n \"\"\"\n if not port_ids:\n return\n\n MAX_PORTS_IN_BATCH = 50\n list_chunks_ports = [port_ids[i:i + MAX_PORTS_IN_BATCH]\n for i in six.moves.range(0, len(port_ids), MAX_PORTS_IN_BATCH)]\n for chunk_ports in list_chunks_ports:\n self.plugin_rpc.send_update_port_statuses(self.context,\n chunk_ports, status)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nprocess a router and update router_info.", "response": "def _process_router(self, ri):\n \"\"\"Process a router, apply latest configuration and update router_info.\n\n Get the router dict from RouterInfo and proceed to detect changes\n from the last known state. 
When new ports or deleted ports are\n detected, `internal_network_added()` or `internal_networks_removed()`\n are called accordingly. Similarly changes in ex_gw_port causes\n `external_gateway_added()` or `external_gateway_removed()` calls.\n Next, floating_ips and routes are processed. Also, latest state is\n stored in ri.internal_ports and ri.ex_gw_port for future comparisons.\n\n :param ri : RouterInfo object of the router being processed.\n :return:None\n :raises: networking_cisco.plugins.cisco.cfg_agent.cfg_exceptions.\n DriverException if the configuration operation fails.\n \"\"\"\n try:\n ex_gw_port = ri.router.get('gw_port')\n ri.ha_info = ri.router.get('ha_info', None)\n gateway_set = ex_gw_port and not ri.ex_gw_port\n gateway_cleared = not ex_gw_port and ri.ex_gw_port\n internal_ports = ri.router.get(bc.constants.INTERFACE_KEY, [])\n # Once the gateway is set, then we know which VRF this router\n # belongs to. Keep track of it in our lists of routers, organized\n # as a dictionary by VRF name\n if gateway_set:\n self._add_rid_to_vrf_list(ri)\n\n new_ports, old_ports, change_details = (\n self._get_internal_port_changes(ri, internal_ports))\n\n list_port_ids_up = []\n\n non_global_router_roles = [None,\n c_constants.ROUTER_ROLE_HA_REDUNDANCY]\n if ri.router[ROUTER_ROLE_ATTR] in non_global_router_roles:\n self._process_new_ports(ri, new_ports, ex_gw_port,\n list_port_ids_up, change_details)\n self._process_old_ports(ri, old_ports, ex_gw_port,\n change_details)\n else:\n self._process_new_ports_global(ri, new_ports, ex_gw_port,\n list_port_ids_up)\n self._process_old_ports_global(ri, old_ports, ex_gw_port)\n\n if gateway_set:\n self._process_gateway_set(ri, ex_gw_port,\n list_port_ids_up)\n elif gateway_cleared:\n self._process_gateway_cleared(ri, ri.ex_gw_port)\n\n self._send_update_port_statuses(list_port_ids_up,\n bc.constants.PORT_STATUS_ACTIVE)\n if ex_gw_port:\n self._process_router_floating_ips(ri, ex_gw_port)\n\n global_router_roles = 
[c_constants.ROUTER_ROLE_GLOBAL,\n c_constants.ROUTER_ROLE_LOGICAL_GLOBAL]\n if ri.router[ROUTER_ROLE_ATTR] not in global_router_roles:\n self._enable_disable_ports(ri, ex_gw_port, internal_ports)\n\n if gateway_cleared:\n # Remove this router from the list of routers by VRF\n self._remove_rid_from_vrf_list(ri)\n\n ri.ex_gw_port = ex_gw_port\n self._routes_updated(ri)\n except cfg_exceptions.HAParamsMissingException as e:\n self._update_updated_routers_cache([ri.router_id])\n LOG.warning(e)\n except cfg_exceptions.DriverException as e:\n with excutils.save_and_reraise_exception():\n self._update_updated_routers_cache([ri.router_id])\n LOG.error(e)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nprocess a router s floating ips and update the plugin s internal cache with the result of the operation.", "response": "def _process_router_floating_ips(self, ri, ex_gw_port):\n \"\"\"Process a router's floating ips.\n\n Compare floatingips configured in device (i.e., those fips in\n the ri.floating_ips \"cache\") with the router's updated floating ips\n (in ri.router.floating_ips) and determine floating_ips which were\n added or removed. Notify driver of the change via\n `floating_ip_added()` or `floating_ip_removed()`. 
Also update plugin\n with status of fips.\n\n :param ri: RouterInfo object of the router being processed.\n :param ex_gw_port: Port dict of the external gateway port.\n :return: None\n :raises: networking_cisco.plugins.cisco.cfg_agent.cfg_exceptions.\n DriverException if the configuration operation fails.\n \"\"\"\n\n # fips that exist in neutron db (i.e., the desired \"truth\")\n current_fips = ri.router.get(bc.constants.FLOATINGIP_KEY, [])\n # ids of fips that exist in neutron db\n current_fip_ids = {fip['id'] for fip in current_fips}\n # ids of fips that are configured in device\n configured_fip_ids = {fip['id'] for fip in ri.floating_ips}\n\n id_to_current_fip_map = {}\n\n fips_to_add = []\n # iterate of fips that exist in neutron db\n for configured_fip in current_fips:\n if configured_fip['port_id']:\n # store to later check if this fip has been remapped\n id_to_current_fip_map[configured_fip['id']] = configured_fip\n if configured_fip['id'] not in configured_fip_ids:\n # Ensure that we add only after remove, in case same\n # fixed_ip is mapped to different floating_ip within\n # the same loop cycle. 
If add occurs before first,\n # cfg will fail because of existing entry with\n # identical fixed_ip\n fips_to_add.append(configured_fip)\n\n fip_ids_to_remove = configured_fip_ids - current_fip_ids\n LOG.debug(\"fip_ids_to_add: %s\" % fips_to_add)\n LOG.debug(\"fip_ids_to_remove: %s\" % fip_ids_to_remove)\n\n fips_to_remove = []\n fip_statuses = {}\n # iterate over fips that are configured in device\n for configured_fip in ri.floating_ips:\n if configured_fip['id'] in fip_ids_to_remove:\n fips_to_remove.append(configured_fip)\n self._floating_ip_removed(\n ri, ri.ex_gw_port, configured_fip['floating_ip_address'],\n configured_fip['fixed_ip_address'])\n fip_statuses[configured_fip['id']] = (\n bc.constants.FLOATINGIP_STATUS_DOWN)\n LOG.debug(\"Add to fip_statuses DOWN id:%s fl_ip:%s fx_ip:%s\",\n configured_fip['id'],\n configured_fip['floating_ip_address'],\n configured_fip['fixed_ip_address'])\n else:\n # handle possibly required remapping of a fip\n # ip address that fip currently is configured for\n configured_fixed_ip = configured_fip['fixed_ip_address']\n new_fip = id_to_current_fip_map[configured_fip['id']]\n # ip address that fip should be configured for\n current_fixed_ip = new_fip['fixed_ip_address']\n if (current_fixed_ip and configured_fixed_ip and\n current_fixed_ip != configured_fixed_ip):\n floating_ip = configured_fip['floating_ip_address']\n self._floating_ip_removed(ri, ri.ex_gw_port,\n floating_ip, configured_fixed_ip)\n fip_statuses[configured_fip['id']] = (\n bc.constants.FLOATINGIP_STATUS_DOWN)\n fips_to_remove.append(configured_fip)\n fips_to_add.append(new_fip)\n\n for configured_fip in fips_to_remove:\n # remove fip from \"cache\" of fips configured in device\n ri.floating_ips.remove(configured_fip)\n\n for configured_fip in fips_to_add:\n self._floating_ip_added(ri, ex_gw_port,\n configured_fip['floating_ip_address'],\n configured_fip['fixed_ip_address'])\n # add fip to \"cache\" of fips configured in device\n 
ri.floating_ips.append(configured_fip)\n fip_statuses[configured_fip['id']] = (\n bc.constants.FLOATINGIP_STATUS_ACTIVE)\n LOG.debug(\"Add to fip_statuses ACTIVE id:%s fl_ip:%s fx_ip:%s\",\n configured_fip['id'],\n configured_fip['floating_ip_address'],\n configured_fip['fixed_ip_address'])\n\n if fip_statuses:\n LOG.debug(\"Sending floatingip_statuses_update: %s\", fip_statuses)\n self.plugin_rpc.update_floatingip_statuses(\n self.context, ri.router_id, fip_statuses)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _router_added(self, router_id, router):\n ri = RouterInfo(router_id, router)\n driver = self.driver_manager.set_driver(router)\n if router[ROUTER_ROLE_ATTR] in [\n c_constants.ROUTER_ROLE_GLOBAL,\n c_constants.ROUTER_ROLE_LOGICAL_GLOBAL]:\n # No need to create a vrf for Global or logical global routers\n LOG.debug(\"Skipping router_added device processing for %(id)s as \"\n \"its role is %(role)s\",\n {'id': router_id, 'role': router[ROUTER_ROLE_ATTR]})\n else:\n driver.router_added(ri)\n self.router_info[router_id] = ri", "response": "Operations when a router is added."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nremove a router from the service.", "response": "def _router_removed(self, router_id, deconfigure=True):\n \"\"\"Operations when a router is removed.\n\n Get the RouterInfo object corresponding to the router in the service\n helpers's router_info dict. If deconfigure is set to True,\n remove this router's configuration from the hosting device.\n :param router_id: id of the router\n :param deconfigure: if True, the router's configuration is deleted from\n the hosting device.\n :return: None\n \"\"\"\n ri = self.router_info.get(router_id)\n if ri is None:\n LOG.warning(\"Info for router %s was not found. 
\"\n \"Skipping router removal.\", router_id)\n return\n ri.router['gw_port'] = None\n ri.router[bc.constants.INTERFACE_KEY] = []\n ri.router[bc.constants.FLOATINGIP_KEY] = []\n try:\n hd = ri.router['hosting_device']\n # We proceed to removing the configuration from the device\n # only if (a) deconfigure is set to True (default)\n # (b) the router's hosting device is reachable.\n if (deconfigure and\n self._dev_status.is_hosting_device_reachable(hd)):\n self._process_router(ri)\n driver = self.driver_manager.get_driver(router_id)\n driver.router_removed(ri)\n self.driver_manager.remove_driver(router_id)\n del self.router_info[router_id]\n self._del_from_removed_routers_cache(router_id)\n except cfg_exceptions.DriverException:\n LOG.warning(\"Router remove for router_id: %s was incomplete. \"\n \"Adding the router to removed_routers list\",\n router_id)\n self._update_removed_routers_cache([router_id])\n # remove this router from updated_routers if it is there. It might\n # end up there too if exception was thrown earlier inside\n # `_process_router()`\n self._del_from_updated_routers_cache(router_id)\n except ncc_errors.SessionCloseError as e:\n LOG.exception(\"ncclient Unexpected session close %s\"\n \" while attempting to remove router\", e)\n if not self._dev_status.is_hosting_device_reachable(hd):\n LOG.debug(\"Lost connectivity to Hosting Device %s\" % hd['id'])\n # rely on heartbeat to detect HD state\n # and schedule resync when the device comes back\n else:\n # retry the router removal on the next pass\n self._update_removed_routers_cache([router_id])\n LOG.debug(\"Interim connectivity lost to hosting device %s, \"\n \"enqueuing router %s in removed_routers set\",\n pp.pformat(hd), router_id)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _routes_updated(self, ri):\n new_routes = ri.router['routes']\n old_routes = ri.routes\n adds, removes = bc.common_utils.diff_list_of_dict(old_routes,\n 
new_routes)\n for route in adds:\n LOG.debug(\"Added route entry is '%s'\", route)\n # remove replaced route from deleted route\n for del_route in removes:\n if route['destination'] == del_route['destination']:\n removes.remove(del_route)\n driver = self.driver_manager.get_driver(ri.id)\n driver.routes_updated(ri, 'replace', route)\n\n for route in removes:\n LOG.debug(\"Removed route entry is '%s'\", route)\n driver = self.driver_manager.get_driver(ri.id)\n driver.routes_updated(ri, 'delete', route)\n ri.routes = new_routes", "response": "Updates the state of routes in the router."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get_running_config(self, conn):\n config = conn.get_config(source=\"running\")\n if config:\n root = ET.fromstring(config._raw)\n running_config = root[0][0]\n rgx = re.compile(\"\\r*\\n+\")\n ioscfg = rgx.split(running_config.text)\n return ioscfg", "response": "Get the ASR1k's current running config as a multiline string."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nchecks running-cfg derived ip_addr and netmask against neutron-db gw_port.", "response": "def subintf_real_ip_check_gw_port(self, gw_port, ip_addr, netmask):\n \"\"\"\n checks running-cfg derived ip_addr and netmask against neutron-db\n gw_port\n \"\"\"\n if gw_port is not None:\n found = False\n for i in range(len(gw_port['fixed_ips'])):\n target_ip = gw_port['fixed_ips'][i]['ip_address']\n if ip_addr == target_ip:\n found = True\n break\n if found is False:\n LOG.info(\"Subintf real IP is incorrect, deleting\")\n return False\n subnet_id = gw_port['fixed_ips'][i]['subnet_id']\n subnet = next(\n sn for sn in gw_port['subnets'] if sn['id'] == subnet_id)\n target_net = netaddr.IPNetwork(subnet['cidr'])\n if netmask != str(target_net.netmask):\n LOG.info(\"Subintf has incorrect netmask, deleting\")\n return False\n\n return True\n\n return False"} {"SOURCE": 
"codesearchnet", "instruction": "Implement a Python 3 function for\nretrying calling the decorated function using an exponential backoff. Reference: http://www.saltycrane.com/blog/2009/11/trying-out-retry -decorator-python/ :param ExceptionToCheck: the exception to check. may be a tuple of exceptions to check :param tries: number of times to try (not retry) before giving up :param delay: initial delay between retries in seconds :param backoff: backoff multiplier e.g. value of 2 will double the delay each retry", "response": "def retry(ExceptionToCheck, tries=4, delay=3, backoff=2):\n \"\"\"Retry calling the decorated function using an exponential backoff.\n\n Reference: http://www.saltycrane.com/blog/2009/11/trying-out-retry\n -decorator-python/\n\n :param ExceptionToCheck: the exception to check. may be a tuple of\n exceptions to check\n :param tries: number of times to try (not retry) before giving up\n :param delay: initial delay between retries in seconds\n :param backoff: backoff multiplier e.g. value of 2 will double the delay\n each retry\n \"\"\"\n def deco_retry(f):\n @wraps(f)\n def f_retry(*args, **kwargs):\n mtries, mdelay = tries, delay\n while mtries > 1:\n try:\n return f(*args, **kwargs)\n except ExceptionToCheck as e:\n LOG.debug(\"%(err_mess)s. Retry calling function \"\n \"'%(f_name)s' in %(delta)d seconds.\",\n {'err_mess': str(e), 'f_name': f.__name__,\n 'delta': mdelay})\n time.sleep(mdelay)\n mtries -= 1\n mdelay *= backoff\n LOG.debug(\"Last retry calling function '%s'.\", f.__name__)\n return f(*args, **kwargs)\n\n return f_retry # true decorator\n\n return deco_retry"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nmake SSH connection to the IOS XE device.", "response": "def _get_connection(self):\n \"\"\"Make SSH connection to the IOS XE device.\n\n The external ncclient library is used for creating this connection.\n This method keeps state of any existing connections and reuses them if\n already connected. 
Also interfaces (except management) are typically\n disabled by default when it is booted. So if connecting for the first\n time, driver will enable all other interfaces and keep that status in\n the `_itfcs_enabled` flag.\n \"\"\"\n try:\n if self._ncc_connection and self._ncc_connection.connected:\n return self._ncc_connection\n else:\n # ncclient needs 'name' to be 'csr' in order to communicate\n # with the device in the correct way.\n self._ncc_connection = manager.connect(\n host=self._host_ip, port=self._host_ssh_port,\n username=self._username, password=self._password,\n device_params={'name': \"csr\"}, timeout=self._timeout)\n if not self._itfcs_enabled:\n self._itfcs_enabled = self._enable_itfcs(\n self._ncc_connection)\n return self._ncc_connection\n except Exception as e:\n conn_params = {'host': self._host_ip, 'port': self._host_ssh_port,\n 'user': self._username,\n 'timeout': self._timeout, 'reason': str(e)}\n raise cfg_exc.ConnectionException(**conn_params)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ngetting a list of interfaces on this hosting device.", "response": "def _get_interfaces(self):\n \"\"\"Get a list of interfaces on this hosting device.\n\n :return: List of the interfaces\n \"\"\"\n ios_cfg = self._get_running_config()\n parse = HTParser(ios_cfg)\n itfcs_raw = parse.find_lines(\"^interface GigabitEthernet\")\n itfcs = [raw_if.strip().split(' ')[1] for raw_if in itfcs_raw]\n LOG.debug(\"Interfaces on hosting device: %s\", itfcs)\n return itfcs"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ngetting the ip address of an interface.", "response": "def _get_interface_ip(self, interface_name):\n \"\"\"Get the ip address for an interface.\n\n :param interface_name: interface_name as a string\n :return: ip address of interface as a string\n \"\"\"\n ios_cfg = self._get_running_config()\n parse = HTParser(ios_cfg)\n children = parse.find_children(\"^interface %s\" % 
interface_name)\n for line in children:\n if 'ip address' in line:\n ip_address = line.strip().split(' ')[2]\n LOG.debug(\"IP Address:%s\", ip_address)\n return ip_address\n LOG.warning(\"Cannot find interface: %s\", interface_name)\n return None"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _interface_exists(self, interface):\n ios_cfg = self._get_running_config()\n parse = HTParser(ios_cfg)\n itfcs_raw = parse.find_lines(\"^interface \" + interface)\n return len(itfcs_raw) > 0", "response": "Check whether an interface exists."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nget the current VRFs configured in the device.", "response": "def _get_vrfs(self):\n \"\"\"Get the current VRFs configured in the device.\n\n :return: A list of vrf names as string\n \"\"\"\n vrfs = []\n ios_cfg = self._get_running_config()\n parse = HTParser(ios_cfg)\n vrfs_raw = parse.find_lines(\"^vrf definition\")\n for line in vrfs_raw:\n # raw format ['ip vrf ',....]\n vrf_name = line.strip().split(' ')[2]\n vrfs.append(vrf_name)\n LOG.info(\"VRFs:%s\", vrfs)\n return vrfs"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _get_capabilities(self):\n conn = self._get_connection()\n capabilities = []\n for c in conn.server_capabilities:\n capabilities.append(c)\n LOG.debug(\"Server capabilities: %s\", capabilities)\n return capabilities", "response": "Get the servers NETCONF capabilities."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _get_running_config(self, split=True):\n conn = self._get_connection()\n config = conn.get_config(source=\"running\")\n if config:\n root = ET.fromstring(config._raw)\n running_config = root[0][0]\n if split is True:\n rgx = re.compile(\"\\r*\\n+\")\n ioscfg = rgx.split(running_config.text)\n else:\n ioscfg = running_config.text\n 
return ioscfg", "response": "Get the IOS XE device's current running config as a multiline string."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncheck if an ACL exists in the running config.", "response": "def _check_acl(self, acl_no, network, netmask):\n \"\"\"Check an ACL config exists in the running config.\n\n :param acl_no: access control list (ACL) number\n :param network: network which this ACL permits\n :param netmask: netmask of the network\n :return:\n \"\"\"\n exp_cfg_lines = ['ip access-list standard ' + str(acl_no),\n ' permit ' + str(network) + ' ' + str(netmask)]\n ios_cfg = self._get_running_config()\n parse = HTParser(ios_cfg)\n acls_raw = parse.find_children(exp_cfg_lines[0])\n if acls_raw:\n if exp_cfg_lines[1] in acls_raw:\n return True\n LOG.error(\"Mismatch in ACL configuration for %s\", acl_no)\n return False\n LOG.debug(\"%s is not present in config\", acl_no)\n return False"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nchecks if a partial config string exists in the running config.", "response": "def _cfg_exists(self, cfg_str):\n \"\"\"Check a partial config string exists in the running config.\n\n :param cfg_str: config string to check\n :return: True or False\n \"\"\"\n ios_cfg = self._get_running_config()\n parse = HTParser(ios_cfg)\n cfg_raw = parse.find_lines(\"^\" + cfg_str)\n LOG.debug(\"_cfg_exists(): Found lines %s\", cfg_raw)\n return len(cfg_raw) > 0"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef caller_name(self, skip=2):\n stack = inspect.stack()\n start = 0 + skip\n if len(stack) < start + 1:\n return ''\n parentframe = stack[start][0]\n\n name = []\n module = inspect.getmodule(parentframe)\n # `modname` can be None when frame is executed directly in console\n # TODO(asr1kteam): consider using __main__\n if module:\n name.append(module.__name__)\n # detect classname\n if 'self' in 
parentframe.f_locals:\n # I don't know any way to detect call from the object method\n # XXX: there seems to be no way to detect static method call,\n # it will be just a function call\n name.append(parentframe.f_locals['self'].__class__.__name__)\n codename = parentframe.f_code.co_name\n if codename != '': # top level usually\n name.append(codename) # function or a method\n del parentframe\n return \".\".join(name)", "response": "Get the name of the caller in the format module.class.method."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _get_instances_for_project(self, project_id):\n search_opts = {'marker': None,\n 'all_tenants': True,\n 'project_id': project_id}\n try:\n servers = self._novaclnt.servers.list(True, search_opts)\n LOG.debug('_get_instances_for_project: servers=%s', servers)\n return servers\n except nexc.Unauthorized:\n emsg = (_LE('Failed to get novaclient:Unauthorised '\n 'project_id=%(proj)s user=%(user)s'),\n {'proj': self._project_id, 'user': self._user_name})\n LOG.exception(emsg)\n raise nexc.ClientException(emsg)\n except nexc.AuthorizationFailure as err:\n emsg = (_LE(\"Failed to get novaclient %s\"))\n LOG.exception(emsg, err)\n raise nexc.ClientException(emsg % err)", "response": "Get all instances for a given project."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_instance_for_uuid(self, uuid, project_id):\n instance_name = self._inst_info_cache.get((uuid, project_id))\n if instance_name:\n return instance_name\n instances = self._get_instances_for_project(project_id)\n for inst in instances:\n if inst.id.replace('-', '') == uuid:\n LOG.debug('get_instance_for_uuid: name=%s', inst.name)\n instance_name = inst.name\n self._inst_info_cache[(uuid, project_id)] = instance_name\n return instance_name\n return instance_name", "response": "Return instance name for given uuid of an instance and 
project."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef l3_tenant_id(cls):\n if cls._l3_tenant_uuid is None:\n if hasattr(cfg.CONF.keystone_authtoken, 'project_domain_id'):\n # TODO(sridar): hack for now to determine if keystone v3\n # API is to be used.\n cls._l3_tenant_uuid = cls._get_tenant_id_using_keystone_v3()\n else:\n cls._l3_tenant_uuid = cls._get_tenant_id_using_keystone_v2()\n return cls._l3_tenant_uuid", "response": "Returns id of tenant owning hosting device resources."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning id of the management network.", "response": "def mgmt_nw_id(cls):\n \"\"\"Returns id of the management network.\"\"\"\n if cls._mgmt_nw_uuid is None:\n tenant_id = cls.l3_tenant_id()\n if not tenant_id:\n return\n net = bc.get_plugin().get_networks(\n bc.context.get_admin_context(),\n {'tenant_id': [tenant_id],\n 'name': [cfg.CONF.general.management_network]},\n ['id', 'subnets'])\n if len(net) == 1:\n num_subnets = len(net[0]['subnets'])\n if num_subnets == 0:\n LOG.error('The management network has no subnet. '\n 'Please assign one.')\n return\n elif num_subnets > 1:\n LOG.info('The management network has %d subnets. The '\n 'first one will be used.', num_subnets)\n cls._mgmt_nw_uuid = net[0].get('id')\n cls._mgmt_subnet_uuid = net[0]['subnets'][0]\n elif len(net) > 1:\n # Management network must have a unique name.\n LOG.error('The management network does not have a '\n 'unique name. Please ensure that it does.')\n else:\n # Management network has not been created.\n LOG.error('There is no virtual management network. 
Please '\n 'create one.')\n return cls._mgmt_nw_uuid"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef mgmt_sec_grp_id(cls):\n if not extensions.is_extension_supported(bc.get_plugin(),\n \"security-group\"):\n return\n if cls._mgmt_sec_grp_id is None:\n # Get the id for the _mgmt_security_group_id\n tenant_id = cls.l3_tenant_id()\n res = bc.get_plugin().get_security_groups(\n bc.context.get_admin_context(),\n {'tenant_id': [tenant_id],\n 'name': [cfg.CONF.general.default_security_group]},\n ['id'])\n if len(res) == 1:\n sec_grp_id = res[0].get('id', None)\n cls._mgmt_sec_grp_id = sec_grp_id\n elif len(res) > 1:\n # the mgmt sec group must be unique.\n LOG.error('The security group for the management network '\n 'does not have a unique name. Please ensure '\n 'that it does.')\n else:\n # Service VM Mgmt security group is not present.\n LOG.error('There is no security group for the management '\n 'network. Please create one.')\n return cls._mgmt_sec_grp_id", "response": "Returns id of the security group used by the management network."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get_hosting_device_driver(self, context, id):\n if id is None:\n return\n try:\n return self._hosting_device_drivers[id]\n except KeyError:\n try:\n template = self._get_hosting_device_template(context, id)\n self._hosting_device_drivers[id] = importutils.import_object(\n template['device_driver'])\n except (ImportError, TypeError, n_exc.NeutronException):\n LOG.exception(\"Error loading hosting device driver for \"\n \"hosting device template %s\", id)\n return self._hosting_device_drivers.get(id)", "response": "Returns device driver for hosting device template with id."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get_hosting_device_plugging_driver(self, context, id):\n if id is 
None:\n return\n try:\n return self._plugging_drivers[id]\n except KeyError:\n try:\n template = self._get_hosting_device_template(context, id)\n self._plugging_drivers[id] = importutils.import_object(\n template['plugging_driver'])\n except (ImportError, TypeError, n_exc.NeutronException):\n LOG.exception(\"Error loading plugging driver for hosting \"\n \"device template %s\", id)\n return self._plugging_drivers.get(id)", "response": "Returns the plugging driver for hosting device template with id."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nacquiring num slots in hosting_device.", "response": "def acquire_hosting_device_slots(self, context, hosting_device, resource,\n resource_type, resource_service, num,\n exclusive=False):\n \"\"\"Assign slots in to logical .\n\n If exclusive is True the hosting device is bound to the resource's\n tenant. Otherwise it is not bound to any tenant.\n\n Returns True if allocation was granted, False otherwise.\n \"\"\"\n bound = hosting_device['tenant_bound']\n if ((bound is not None and bound != resource['tenant_id']) or\n (exclusive and not self._exclusively_used(context, hosting_device,\n resource['tenant_id']))):\n LOG.debug(\n 'Rejecting allocation of %(num)d slots in tenant %(bound)s '\n 'hosting device %(device)s to logical resource %(r_id)s due '\n 'to exclusive use conflict.',\n {'num': num,\n 'bound': 'unbound' if bound is None else bound + ' bound',\n 'device': hosting_device['id'], 'r_id': resource['id']})\n return False\n with context.session.begin(subtransactions=True):\n res_info = {'resource': resource, 'type': resource_type,\n 'service': resource_service}\n slot_info, query = self._get_or_create_slot_allocation(\n context, hosting_device, res_info)\n if slot_info is None:\n LOG.debug('Rejecting allocation of %(num)d slots in hosting '\n 'device %(device)s to logical resource %(r_id)s',\n {'num': num, 'device': hosting_device['id'],\n 'r_id': resource['id']})\n return False\n 
new_allocation = num + slot_info.num_allocated\n if hosting_device['template']['slot_capacity'] < new_allocation:\n LOG.debug('Rejecting allocation of %(num)d slots in '\n 'hosting device %(device)s to logical resource '\n '%(r_id)s due to insufficient slot availability.',\n {'num': num, 'device': hosting_device['id'],\n 'r_id': resource['id']})\n self._dispatch_pool_maintenance_job(hosting_device['template'])\n return False\n # handle any changes to exclusive usage by tenant\n if exclusive and bound is None:\n self._update_hosting_device_exclusivity(\n context, hosting_device, resource['tenant_id'])\n bound = resource['tenant_id']\n elif not exclusive and bound is not None:\n self._update_hosting_device_exclusivity(context,\n hosting_device, None)\n bound = None\n slot_info.num_allocated = new_allocation\n context.session.add(slot_info)\n self._dispatch_pool_maintenance_job(hosting_device['template'])\n # report success\n LOG.info('Allocated %(num)d additional slots in tenant %(bound)s'\n 'bound hosting device %(hd_id)s. In total %(total)d '\n 'slots are now allocated in that hosting device for '\n 'logical resource %(r_id)s.',\n {'num': num, 'bound': 'un-' if bound is None else bound + ' ',\n 'total': new_allocation, 'hd_id': hosting_device['id'],\n 'r_id': resource['id']})\n return True"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nfrees the specified number of slots in the hosting device from the given logical resource. Returns True if deallocation was successful.", "response": "def release_hosting_device_slots(self, context, hosting_device, resource,\n num):\n \"\"\"Free slots in from logical resource .\n\n Returns True if deallocation was successful. 
False otherwise.\n \"\"\"\n with context.session.begin(subtransactions=True):\n num_str = str(num) if num >= 0 else \"all\"\n res_info = {'resource': resource}\n slot_info, query = self._get_or_create_slot_allocation(\n context, hosting_device, res_info, create=False)\n if slot_info is None:\n LOG.debug('Rejecting de-allocation of %(num)s slots in '\n 'hosting device %(device)s for logical resource '\n '%(id)s', {'num': num_str,\n 'device': hosting_device['id'],\n 'id': resource['id']})\n return False\n if num >= 0:\n new_allocation = slot_info.num_allocated - num\n else:\n # if a negative num is specified all slot allocations for\n # the logical resource in the hosting device is removed\n new_allocation = 0\n if new_allocation < 0:\n LOG.debug('Rejecting de-allocation of %(num)s slots in '\n 'hosting device %(device)s for logical resource '\n '%(id)s since only %(alloc)d slots are allocated.',\n {'num': num_str, 'device': hosting_device['id'],\n 'id': resource['id'],\n 'alloc': slot_info.num_allocated})\n self._dispatch_pool_maintenance_job(hosting_device['template'])\n return False\n elif new_allocation == 0:\n result = query.delete()\n LOG.info('De-allocated %(num)s slots from hosting device '\n '%(hd_id)s. %(total)d slots are now allocated in '\n 'that hosting device.',\n {'num': num_str, 'total': new_allocation,\n 'hd_id': hosting_device['id']})\n if (hosting_device['tenant_bound'] is not None and\n context.session.query(hd_models.SlotAllocation).filter_by(\n hosting_device_id=hosting_device['id']).first() is\n None):\n # make hosting device tenant unbound if no logical\n # resource use it anymore\n hosting_device['tenant_bound'] = None\n context.session.add(hosting_device)\n LOG.info('Making hosting device %(hd_id)s with no '\n 'allocated slots tenant unbound.',\n {'hd_id': hosting_device['id']})\n self._dispatch_pool_maintenance_job(hosting_device['template'])\n return result == 1\n LOG.info('De-allocated %(num)s slots from hosting device '\n '%(hd_id)s. 
%(total)d slots are now allocated in '\n 'that hosting device.',\n {'num': num_str, 'total': new_allocation,\n 'hd_id': hosting_device['id']})\n slot_info.num_allocated = new_allocation\n context.session.add(slot_info)\n self._dispatch_pool_maintenance_job(hosting_device['template'])\n # report success\n return True"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef get_hosting_devices_qry(self, context, hosting_device_ids,\n load_agent=True):\n \"\"\"Returns hosting devices with .\"\"\"\n query = context.session.query(hd_models.HostingDevice)\n if load_agent:\n query = query.options(joinedload('cfg_agent'))\n if len(hosting_device_ids) > 1:\n query = query.filter(hd_models.HostingDevice.id.in_(\n hosting_device_ids))\n else:\n query = query.filter(hd_models.HostingDevice.id ==\n hosting_device_ids[0])\n return query", "response": "Returns hosting devices with ."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ndeleting all hosting devices.", "response": "def delete_all_hosting_devices(self, context, force_delete=False):\n \"\"\"Deletes all hosting devices.\"\"\"\n for item in self._get_collection_query(\n context, hd_models.HostingDeviceTemplate):\n self.delete_all_hosting_devices_by_template(\n context, template=item, force_delete=force_delete)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef delete_all_hosting_devices_by_template(self, context, template,\n force_delete=False):\n \"\"\"Deletes all hosting devices based on