repository_name | func_path_in_repository | func_name | language | func_code_string | func_documentation_string | split_name | func_code_url | called_functions | enclosing_scope
|---|---|---|---|---|---|---|---|---|---|
fabric/fabric | fabric/connection.py | Connection.local | python | def local(self, *args, **kwargs):
# Superclass run() uses runners.local, so we can literally just call it
# straight.
return super(Connection, self).run(*args, **kwargs) | Execute a shell command on the local system.
This method is effectively a wrapper of `invoke.run`; see its docs for
details and call signature.
.. versionadded:: 2.0 | train | https://github.com/fabric/fabric/blob/e9939d68b734935f0c98d98817912ad7c698238f/fabric/connection.py#L626-L637 | null | class Connection(Context):
"""
A connection to an SSH daemon, with methods for commands and file transfer.
**Basics**
This class inherits from Invoke's `~invoke.context.Context`, as it is a
context within which commands, tasks etc can operate. It also encapsulates
a Paramiko `~paramiko.client.SSHClient` instance, performing useful high
level operations with that `~paramiko.client.SSHClient` and
`~paramiko.channel.Channel` instances generated from it.
.. _connect_kwargs:
.. note::
Many SSH specific options -- such as specifying private keys and
passphrases, timeouts, disabling SSH agents, etc -- are handled
directly by Paramiko and should be specified via the
:ref:`connect_kwargs argument <connect_kwargs-arg>` of the constructor.
**Lifecycle**
`.Connection` has a basic "`create <__init__>`, `connect/open <open>`, `do
work <run>`, `disconnect/close <close>`" lifecycle:
* `Instantiation <__init__>` imprints the object with its connection
parameters (but does **not** actually initiate the network connection).
* Methods like `run`, `get` etc automatically trigger a call to
`open` if the connection is not active; users may of course call `open`
manually if desired.
* Connections do not always need to be explicitly closed; much of the
time, Paramiko's garbage collection hooks or Python's own shutdown
sequence will take care of things. **However**, should you encounter edge
cases (for example, sessions hanging on exit) it's helpful to explicitly
close connections when you're done with them.
This can be accomplished by manually calling `close`, or by using the
object as a contextmanager::
with Connection('host') as c:
c.run('command')
c.put('file')
.. note::
This class rebinds `invoke.context.Context.run` to `.local` so both
remote and local command execution can coexist.
**Configuration**
Most `.Connection` parameters honor :doc:`Invoke-style configuration
</concepts/configuration>` as well as any applicable :ref:`SSH config file
directives <connection-ssh-config>`. For example, to end up with a
connection to ``admin@myhost``, one could:
- Use any built-in config mechanism, such as ``/etc/fabric.yml``,
``~/.fabric.json``, collection-driven configuration, env vars, etc,
stating ``user: admin`` (or ``{"user": "admin"}``, depending on config
format.) Then ``Connection('myhost')`` would implicitly have a ``user``
of ``admin``.
- Use an SSH config file containing ``User admin`` within any applicable
``Host`` header (``Host myhost``, ``Host *``, etc.) Again,
``Connection('myhost')`` will default to an ``admin`` user.
- Leverage host-parameter shorthand (described in `.Config.__init__`), i.e.
``Connection('admin@myhost')``.
- Give the parameter directly: ``Connection('myhost', user='admin')``.
The same applies to agent forwarding, gateways, and so forth.
.. versionadded:: 2.0
"""
# NOTE: these are initialized here to hint to invoke.Config.__setattr__
# that they should be treated as real attributes instead of config proxies.
# (Additionally, we're doing this instead of using invoke.Config._set() so
# we can take advantage of Sphinx's attribute-doc-comment static analysis.)
# Once an instance is created, these values will usually be non-None
# because they default to the default config values.
host = None
original_host = None
user = None
port = None
ssh_config = None
gateway = None
forward_agent = None
connect_timeout = None
connect_kwargs = None
client = None
transport = None
_sftp = None
_agent_handler = None
# TODO: should "reopening" an existing Connection object that has been
# closed, be allowed? (See e.g. how v1 detects closed/semi-closed
# connections & nukes them before creating a new client to the same host.)
# TODO: push some of this into paramiko.client.Client? e.g. expand what
# Client.exec_command does, it already allows configuring a subset of what
# we do / will eventually do / did in 1.x. It's silly to have to do
# .get_transport().open_session().
def __init__(
self,
host,
user=None,
port=None,
config=None,
gateway=None,
forward_agent=None,
connect_timeout=None,
connect_kwargs=None,
):
"""
Set up a new object representing a server connection.
:param str host:
the hostname (or IP address) of this connection.
May include shorthand for the ``user`` and/or ``port`` parameters,
of the form ``user@host``, ``host:port``, or ``user@host:port``.
.. note::
Due to ambiguity, IPv6 host addresses are incompatible with the
``host:port`` shorthand (though ``user@host`` will still work
OK). In other words, the presence of >1 ``:`` character will
prevent any attempt to derive a shorthand port number; use the
explicit ``port`` parameter instead.
.. note::
If ``host`` matches a ``Host`` clause in loaded SSH config
data, and that ``Host`` clause contains a ``Hostname``
directive, the resulting `.Connection` object will behave as if
``host`` is equal to that ``Hostname`` value.
In all cases, the original value of ``host`` is preserved as
the ``original_host`` attribute.
Thus, given SSH config like so::
Host myalias
Hostname realhostname
a call like ``Connection(host='myalias')`` will result in an
object whose ``host`` attribute is ``realhostname``, and whose
``original_host`` attribute is ``myalias``.
:param str user:
the login user for the remote connection. Defaults to
``config.user``.
:param int port:
the remote port. Defaults to ``config.port``.
:param config:
configuration settings to use when executing methods on this
`.Connection` (e.g. default SSH port and so forth).
Should be a `.Config` or an `invoke.config.Config`
(which will be turned into a `.Config`).
Default is an anonymous `.Config` object.
:param gateway:
An object to use as a proxy or gateway for this connection.
This parameter accepts one of the following:
- another `.Connection` (for a ``ProxyJump`` style gateway);
- a shell command string (for a ``ProxyCommand`` style
gateway).
Default: ``None``, meaning no gatewaying will occur (unless
otherwise configured; if one wants to override a configured gateway
at runtime, specify ``gateway=False``.)
.. seealso:: :ref:`ssh-gateways`
:param bool forward_agent:
Whether to enable SSH agent forwarding.
Default: ``config.forward_agent``.
:param int connect_timeout:
Connection timeout, in seconds.
Default: ``config.timeouts.connect``.
.. _connect_kwargs-arg:
:param dict connect_kwargs:
Keyword arguments handed verbatim to
`SSHClient.connect <paramiko.client.SSHClient.connect>` (when
`.open` is called).
`.Connection` tries not to grow additional settings/kwargs of its
own unless it is adding value of some kind; thus,
``connect_kwargs`` is currently the right place to hand in paramiko
connection parameters such as ``pkey`` or ``key_filename``. For
example::
c = Connection(
host="hostname",
user="admin",
connect_kwargs={
"key_filename": "/home/myuser/.ssh/private.key",
},
)
Default: ``config.connect_kwargs``.
:raises ValueError:
if user or port values are given via both ``host`` shorthand *and*
their own arguments. (We `refuse the temptation to guess`_).
.. _refuse the temptation to guess:
http://zen-of-python.info/
in-the-face-of-ambiguity-refuse-the-temptation-to-guess.html#12
"""
# NOTE: parent __init__ sets self._config; for now we simply overwrite
# that below. If it's somehow problematic we would want to break parent
# __init__ up in a manner that is more cleanly overrideable.
super(Connection, self).__init__(config=config)
#: The .Config object referenced when handling default values (for e.g.
#: user or port, when not explicitly given) or deciding how to behave.
if config is None:
config = Config()
# Handle 'vanilla' Invoke config objects, which need cloning 'into' one
# of our own Configs (which grants the new defaults, etc, while not
# squashing them if the Invoke-level config already accounted for them)
elif not isinstance(config, Config):
config = config.clone(into=Config)
self._set(_config=config)
# TODO: when/how to run load_files, merge, load_shell_env, etc?
# TODO: i.e. what is the lib use case here (and honestly in invoke too)
shorthand = self.derive_shorthand(host)
host = shorthand["host"]
err = "You supplied the {} via both shorthand and kwarg! Please pick one." # noqa
if shorthand["user"] is not None:
if user is not None:
raise ValueError(err.format("user"))
user = shorthand["user"]
if shorthand["port"] is not None:
if port is not None:
raise ValueError(err.format("port"))
port = shorthand["port"]
# NOTE: we load SSH config data as early as possible as it has
# potential to affect nearly every other attribute.
#: The per-host SSH config data, if any. (See :ref:`ssh-config`.)
self.ssh_config = self.config.base_ssh_config.lookup(host)
self.original_host = host
#: The hostname of the target server.
self.host = host
if "hostname" in self.ssh_config:
# TODO: log that this occurred?
self.host = self.ssh_config["hostname"]
#: The username this connection will use to connect to the remote end.
self.user = user or self.ssh_config.get("user", self.config.user)
# TODO: is it _ever_ possible to give an empty user value (e.g.
# user='')? E.g. do some SSH server specs allow for that?
#: The network port to connect on.
self.port = port or int(self.ssh_config.get("port", self.config.port))
# Gateway/proxy/bastion/jump setting: non-None values - string,
# Connection, even eg False - get set directly; None triggers seek in
# config/ssh_config
#: The gateway `.Connection` or ``ProxyCommand`` string to be used,
#: if any.
self.gateway = gateway if gateway is not None else self.get_gateway()
# NOTE: we use string above, vs ProxyCommand obj, to avoid spinning up
# the ProxyCommand subprocess at init time, vs open() time.
# TODO: make paramiko.proxy.ProxyCommand lazy instead?
if forward_agent is None:
# Default to config...
forward_agent = self.config.forward_agent
# But if ssh_config is present, it wins
if "forwardagent" in self.ssh_config:
# TODO: SSHConfig really, seriously needs some love here, god
map_ = {"yes": True, "no": False}
forward_agent = map_[self.ssh_config["forwardagent"]]
#: Whether agent forwarding is enabled.
self.forward_agent = forward_agent
if connect_timeout is None:
connect_timeout = self.ssh_config.get(
"connecttimeout", self.config.timeouts.connect
)
if connect_timeout is not None:
connect_timeout = int(connect_timeout)
#: Connection timeout
self.connect_timeout = connect_timeout
#: Keyword arguments given to `paramiko.client.SSHClient.connect` when
#: `open` is called.
self.connect_kwargs = self.resolve_connect_kwargs(connect_kwargs)
#: The `paramiko.client.SSHClient` instance this connection wraps.
client = SSHClient()
client.set_missing_host_key_policy(AutoAddPolicy())
self.client = client
#: A convenience handle onto the return value of
#: ``self.client.get_transport()``.
self.transport = None
def resolve_connect_kwargs(self, connect_kwargs):
# Grab connect_kwargs from config if not explicitly given.
if connect_kwargs is None:
# TODO: is it better to pre-empt conflicts w/ manually-handled
# connect() kwargs (hostname, username, etc) here or in open()?
# We're doing open() for now in case e.g. someone manually modifies
# .connect_kwargs attributewise, but otherwise it feels better to
# do it early instead of late.
connect_kwargs = self.config.connect_kwargs
# Special case: key_filename gets merged instead of overridden.
# TODO: probably want some sorta smart merging generally, special cases
# are bad.
elif "key_filename" in self.config.connect_kwargs:
kwarg_val = connect_kwargs.get("key_filename", [])
conf_val = self.config.connect_kwargs["key_filename"]
# Config value comes before kwarg value (because it may contain
# CLI flag value.)
connect_kwargs["key_filename"] = conf_val + kwarg_val
# SSH config identityfile values come last in the key_filename
# 'hierarchy'.
if "identityfile" in self.ssh_config:
connect_kwargs.setdefault("key_filename", [])
connect_kwargs["key_filename"].extend(
self.ssh_config["identityfile"]
)
return connect_kwargs
def get_gateway(self):
# SSH config wins over Invoke-style config
if "proxyjump" in self.ssh_config:
# Reverse hop1,hop2,hop3 style ProxyJump directive so we start
# with the final (itself non-gatewayed) hop and work up to
# the front (actual, supplied as our own gateway) hop
hops = reversed(self.ssh_config["proxyjump"].split(","))
prev_gw = None
for hop in hops:
# Short-circuit if we appear to be our own proxy, which would
# be a RecursionError. Implies SSH config wildcards.
# TODO: in an ideal world we'd check user/port too in case they
# differ, but...seriously? They can file a PR with those extra
# half dozen test cases in play, E_NOTIME
if self.derive_shorthand(hop)["host"] == self.host:
return None
# Happily, ProxyJump uses identical format to our host
# shorthand...
kwargs = dict(config=self.config.clone())
if prev_gw is not None:
kwargs["gateway"] = prev_gw
cxn = Connection(hop, **kwargs)
prev_gw = cxn
return prev_gw
elif "proxycommand" in self.ssh_config:
# Just a string, which we interpret as a proxy command.
return self.ssh_config["proxycommand"]
# Fallback: config value (may be None).
return self.config.gateway
def __repr__(self):
# Host comes first as it's the most common differentiator by far
bits = [("host", self.host)]
# TODO: maybe always show user regardless? Explicit is good...
if self.user != self.config.user:
bits.append(("user", self.user))
# TODO: harder to make case for 'always show port'; maybe if it's
# non-22 (even if config has overridden the local default)?
if self.port != self.config.port:
bits.append(("port", self.port))
# NOTE: sometimes self.gateway may be eg False if someone wants to
# explicitly override a configured non-None value (as otherwise it's
# impossible for __init__ to tell if a None means "nothing given" or
# "seriously please no gatewaying". So, this must always be a vanilla
# truth test and not eg "is not None".
if self.gateway:
# Displaying type because gw params would probs be too verbose
val = "proxyjump"
if isinstance(self.gateway, string_types):
val = "proxycommand"
bits.append(("gw", val))
return "<Connection {}>".format(
" ".join("{}={}".format(*x) for x in bits)
)
def _identity(self):
# TODO: consider including gateway and maybe even other init kwargs?
# Whether two cxns w/ same user/host/port but different
# gateway/keys/etc, should be considered "the same", is unclear.
return (self.host, self.user, self.port)
def __eq__(self, other):
if not isinstance(other, Connection):
return False
return self._identity() == other._identity()
def __lt__(self, other):
return self._identity() < other._identity()
def __hash__(self):
# NOTE: this departs from Context/DataProxy, which is not usefully
# hashable.
return hash(self._identity())
def derive_shorthand(self, host_string):
user_hostport = host_string.rsplit("@", 1)
hostport = user_hostport.pop()
user = user_hostport[0] if user_hostport and user_hostport[0] else None
# IPv6: can't reliably tell where addr ends and port begins, so don't
# try (and don't bother adding special syntax either, user should avoid
# this situation by using port=).
if hostport.count(":") > 1:
host = hostport
port = None
# IPv4: can split on ':' reliably.
else:
host_port = hostport.rsplit(":", 1)
host = host_port.pop(0) or None
port = host_port[0] if host_port and host_port[0] else None
if port is not None:
port = int(port)
return {"user": user, "host": host, "port": port}
@property
def is_connected(self):
"""
Whether or not this connection is actually open.
.. versionadded:: 2.0
"""
return self.transport.active if self.transport else False
def open(self):
"""
Initiate an SSH connection to the host/port this object is bound to.
This may include activating the configured gateway connection, if one
is set.
Also saves a handle to the now-set Transport object for easier access.
Various connect-time settings (and/or their corresponding :ref:`SSH
config options <ssh-config>`) are utilized here in the call to
`SSHClient.connect <paramiko.client.SSHClient.connect>`. (For details,
see :doc:`the configuration docs </concepts/configuration>`.)
.. versionadded:: 2.0
"""
# Short-circuit
if self.is_connected:
return
err = "Refusing to be ambiguous: connect() kwarg '{}' was given both via regular arg and via connect_kwargs!" # noqa
# These may not be given, period
for key in """
hostname
port
username
""".split():
if key in self.connect_kwargs:
raise ValueError(err.format(key))
# These may be given one way or the other, but not both
if (
"timeout" in self.connect_kwargs
and self.connect_timeout is not None
):
raise ValueError(err.format("timeout"))
# No conflicts -> merge 'em together
kwargs = dict(
self.connect_kwargs,
username=self.user,
hostname=self.host,
port=self.port,
)
if self.gateway:
kwargs["sock"] = self.open_gateway()
if self.connect_timeout:
kwargs["timeout"] = self.connect_timeout
# Strip out empty defaults for less noisy debugging
if "key_filename" in kwargs and not kwargs["key_filename"]:
del kwargs["key_filename"]
# Actually connect!
self.client.connect(**kwargs)
self.transport = self.client.get_transport()
def open_gateway(self):
"""
Obtain a socket-like object from `gateway`.
:returns:
A ``direct-tcpip`` `paramiko.channel.Channel`, if `gateway` was a
`.Connection`; or a `~paramiko.proxy.ProxyCommand`, if `gateway`
was a string.
.. versionadded:: 2.0
"""
# ProxyCommand is faster to set up, so do it first.
if isinstance(self.gateway, string_types):
# Leverage a dummy SSHConfig to ensure %h/%p/etc are parsed.
# TODO: use real SSH config once loading one properly is
# implemented.
ssh_conf = SSHConfig()
dummy = "Host {}\n ProxyCommand {}"
ssh_conf.parse(StringIO(dummy.format(self.host, self.gateway)))
return ProxyCommand(ssh_conf.lookup(self.host)["proxycommand"])
# Handle inner-Connection gateway type here.
# TODO: logging
self.gateway.open()
# TODO: expose the opened channel itself as an attribute? (another
# possible argument for separating the two gateway types...) e.g. if
# someone wanted to piggyback on it for other same-interpreter socket
# needs...
# TODO: and the inverse? allow users to supply their own socket/like
# object they got via $WHEREEVER?
# TODO: how best to expose timeout param? reuse general connection
# timeout from config?
return self.gateway.transport.open_channel(
kind="direct-tcpip",
dest_addr=(self.host, int(self.port)),
# NOTE: src_addr needs to be 'empty but not None' values to
# correctly encode into a network message. Theoretically Paramiko
# could auto-interpret None sometime & save us the trouble.
src_addr=("", 0),
)
def close(self):
"""
Terminate the network connection to the remote end, if open.
If no connection is open, this method does nothing.
.. versionadded:: 2.0
"""
if self.is_connected:
self.client.close()
if self.forward_agent and self._agent_handler is not None:
self._agent_handler.close()
def __enter__(self):
return self
def __exit__(self, *exc):
self.close()
@opens
def create_session(self):
channel = self.transport.open_session()
if self.forward_agent:
self._agent_handler = AgentRequestHandler(channel)
return channel
@opens
def run(self, command, **kwargs):
"""
Execute a shell command on the remote end of this connection.
This method wraps an SSH-capable implementation of
`invoke.runners.Runner.run`; see its documentation for details.
.. warning::
There are a few spots where Fabric departs from Invoke's default
settings/behaviors; they are documented under
`.Config.global_defaults`.
.. versionadded:: 2.0
"""
runner = self.config.runners.remote(self)
return self._run(runner, command, **kwargs)
@opens
def sudo(self, command, **kwargs):
"""
Execute a shell command, via ``sudo``, on the remote end.
This method is identical to `invoke.context.Context.sudo` in every way,
except in that -- like `run` -- it honors per-host/per-connection
configuration overrides in addition to the generic/global ones. Thus,
for example, per-host sudo passwords may be configured.
.. versionadded:: 2.0
"""
runner = self.config.runners.remote(self)
return self._sudo(runner, command, **kwargs)
@opens
def sftp(self):
"""
Return a `~paramiko.sftp_client.SFTPClient` object.
If called more than one time, memoizes the first result; thus, any
given `.Connection` instance will only ever have a single SFTP client,
and state (such as that managed by
`~paramiko.sftp_client.SFTPClient.chdir`) will be preserved.
.. versionadded:: 2.0
"""
if self._sftp is None:
self._sftp = self.client.open_sftp()
return self._sftp
def get(self, *args, **kwargs):
"""
Get a remote file to the local filesystem or file-like object.
Simply a wrapper for `.Transfer.get`. Please see its documentation for
all details.
.. versionadded:: 2.0
"""
return Transfer(self).get(*args, **kwargs)
def put(self, *args, **kwargs):
"""
Put a local file (or file-like object) to the remote filesystem.
Simply a wrapper for `.Transfer.put`. Please see its documentation for
all details.
.. versionadded:: 2.0
"""
return Transfer(self).put(*args, **kwargs)
# TODO: yield the socket for advanced users? Other advanced use cases
# (perhaps factor out socket creation itself)?
# TODO: probably push some of this down into Paramiko
@contextmanager
@opens
def forward_local(
self,
local_port,
remote_port=None,
remote_host="localhost",
local_host="localhost",
):
"""
Open a tunnel connecting ``local_port`` to the server's environment.
For example, say you want to connect to a remote PostgreSQL database
which is locked down and only accessible via the system it's running
on. You have SSH access to this server, so you can temporarily make
port 5432 on your local system act like port 5432 on the server::
import psycopg2
from fabric import Connection
with Connection('my-db-server').forward_local(5432):
db = psycopg2.connect(
host='localhost', port=5432, database='mydb'
)
# Do things with 'db' here
This method is analogous to using the ``-L`` option of OpenSSH's
``ssh`` program.
:param int local_port: The local port number on which to listen.
:param int remote_port:
The remote port number. Defaults to the same value as
``local_port``.
:param str local_host:
The local hostname/interface on which to listen. Default:
``localhost``.
:param str remote_host:
The remote hostname serving the forwarded remote port. Default:
``localhost`` (i.e., the host this `.Connection` is connected to.)
:returns:
Nothing; this method is only useful as a context manager affecting
local operating system state.
.. versionadded:: 2.0
"""
if not remote_port:
remote_port = local_port
# TunnelManager does all of the work, sitting in the background (so we
# can yield) and spawning threads every time somebody connects to our
# local port.
finished = Event()
manager = TunnelManager(
local_port=local_port,
local_host=local_host,
remote_port=remote_port,
remote_host=remote_host,
# TODO: not a huge fan of handing in our transport, but...?
transport=self.transport,
finished=finished,
)
manager.start()
# Return control to caller now that things ought to be operational
try:
yield
# Teardown once user exits block
finally:
# Signal to manager that it should close all open tunnels
finished.set()
# Then wait for it to do so
manager.join()
# Raise threading errors from within the manager, which would be
# one of:
# - an inner ThreadException, which was created by the manager on
# behalf of its Tunnels; this gets directly raised.
# - some other exception, which would thus have occurred in the
# manager itself; we wrap this in a new ThreadException.
# NOTE: in these cases, some of the metadata tracking in
# ExceptionHandlingThread/ExceptionWrapper/ThreadException (which
# is useful when dealing with multiple nearly-identical sibling IO
# threads) is superfluous, but it doesn't feel worth breaking
# things up further; we just ignore it for now.
wrapper = manager.exception()
if wrapper is not None:
if wrapper.type is ThreadException:
raise wrapper.value
else:
raise ThreadException([wrapper])
# TODO: cancel port forward on transport? Does that even make sense
# here (where we used direct-tcpip) vs the opposite method (which
# is what uses forward-tcpip)?
# TODO: probably push some of this down into Paramiko
@contextmanager
@opens
def forward_remote(
self,
remote_port,
local_port=None,
remote_host="127.0.0.1",
local_host="localhost",
):
"""
Open a tunnel connecting ``remote_port`` to the local environment.
For example, say you're running a daemon in development mode on your
workstation at port 8080, and want to funnel traffic to it from a
production or staging environment.
In most situations this isn't possible as your office/home network
probably blocks inbound traffic. But you have SSH access to this
server, so you can temporarily make port 8080 on that server act like
port 8080 on your workstation::
from fabric import Connection
c = Connection('my-remote-server')
with c.forward_remote(8080):
c.run("remote-data-writer --port 8080")
# Assuming remote-data-writer runs until interrupted, this will
# stay open until you Ctrl-C...
This method is analogous to using the ``-R`` option of OpenSSH's
``ssh`` program.
:param int remote_port: The remote port number on which to listen.
:param int local_port:
The local port number. Defaults to the same value as
``remote_port``.
:param str local_host:
The local hostname/interface the forwarded connection talks to.
Default: ``localhost``.
:param str remote_host:
The remote interface address to listen on when forwarding
connections. Default: ``127.0.0.1`` (i.e. only listen on the remote
localhost).
:returns:
Nothing; this method is only useful as a context manager affecting
local operating system state.
.. versionadded:: 2.0
"""
if not local_port:
local_port = remote_port
# Callback executes on each connection to the remote port and is given
# a Channel hooked up to said port. (We don't actually care about the
# source/dest host/port pairs at all; only whether the channel has data
# to read and suchlike.)
# We then pair that channel with a new 'outbound' socket connection to
# the local host/port being forwarded, in a new Tunnel.
# That Tunnel is then added to a shared data structure so we can track
# & close them during shutdown.
#
# TODO: this approach is less than ideal because we have to share state
# between ourselves & the callback handed into the transport's own
# thread handling (which is roughly analogous to our self-controlled
# TunnelManager for local forwarding). See if we can use more of
# Paramiko's API (or improve it and then do so) so that isn't
# necessary.
tunnels = []
def callback(channel, src_addr_tup, dst_addr_tup):
sock = socket.socket()
# TODO: handle connection failure such that channel, etc get closed
sock.connect((local_host, local_port))
# TODO: we don't actually need to generate the Events at our level,
# do we? Just let Tunnel.__init__ do it; all we do is "press its
# button" on shutdown...
tunnel = Tunnel(channel=channel, sock=sock, finished=Event())
tunnel.start()
# Communication between ourselves & the Paramiko handling subthread
tunnels.append(tunnel)
# Ask Paramiko (really, the remote sshd) to call our callback whenever
# connections are established on the remote iface/port.
# transport.request_port_forward(remote_host, remote_port, callback)
try:
self.transport.request_port_forward(
address=remote_host, port=remote_port, handler=callback
)
yield
finally:
# TODO: see above re: lack of a TunnelManager
# TODO: and/or also refactor with TunnelManager re: shutdown logic.
# E.g. maybe have a non-thread TunnelManager-alike with a method
# that acts as the callback? At least then there's a tiny bit more
# encapsulation...meh.
for tunnel in tunnels:
tunnel.finished.set()
tunnel.join()
self.transport.cancel_port_forward(
address=remote_host, port=remote_port
)
|
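The `derive_shorthand` logic in the row above (splitting `user@host:port`, and refusing to guess a port when more than one `:` suggests an IPv6 address) can be sketched as a standalone function:

```python
def derive_shorthand(host_string):
    # Split off an optional "user@" prefix; the rightmost '@' wins.
    user_hostport = host_string.rsplit("@", 1)
    hostport = user_hostport.pop()
    user = user_hostport[0] if user_hostport and user_hostport[0] else None
    # More than one ':' implies an IPv6 address; don't try to derive a
    # port from it (callers should pass port= explicitly instead).
    if hostport.count(":") > 1:
        host, port = hostport, None
    else:
        host_port = hostport.rsplit(":", 1)
        host = host_port.pop(0) or None
        port = int(host_port[0]) if host_port and host_port[0] else None
    return {"user": user, "host": host, "port": port}

print(derive_shorthand("admin@myhost:2222"))
# → {'user': 'admin', 'host': 'myhost', 'port': 2222}
```

Note how a bare IPv6 literal such as `"2001:db8::1"` falls through the first branch untouched, yielding a `None` port rather than a mangled address.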
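The `key_filename` special case in `resolve_connect_kwargs` above merges rather than overrides: config-level values come first (the config copy may carry CLI flag values), then caller-supplied `connect_kwargs` values, with SSH-config `IdentityFile` entries appended last. A minimal sketch of that ordering, with hypothetical paths:

```python
def merge_key_filenames(passed, config, identityfiles):
    """Merge key_filename sources in Connection's precedence order."""
    # Config values precede explicitly passed ones, and SSH config
    # IdentityFile entries come last in the 'hierarchy'.
    merged = list(config) + list(passed)
    merged.extend(identityfiles)
    return merged

paths = merge_key_filenames(
    passed=["~/.ssh/deploy_key"],      # given via connect_kwargs
    config=["~/.ssh/cli_key"],         # from config / CLI flags
    identityfiles=["~/.ssh/id_rsa"],   # from SSH config IdentityFile
)
# paths == ['~/.ssh/cli_key', '~/.ssh/deploy_key', '~/.ssh/id_rsa']
```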
fabric/fabric | fabric/connection.py | Connection.sftp | python | def sftp(self):
if self._sftp is None:
self._sftp = self.client.open_sftp()
return self._sftp | Return a `~paramiko.sftp_client.SFTPClient` object.
If called more than one time, memoizes the first result; thus, any
given `.Connection` instance will only ever have a single SFTP client,
and state (such as that managed by
`~paramiko.sftp_client.SFTPClient.chdir`) will be preserved.
.. versionadded:: 2.0 | train | https://github.com/fabric/fabric/blob/e9939d68b734935f0c98d98817912ad7c698238f/fabric/connection.py#L640-L653 | null | class Connection(Context):
the ``original_host`` attribute.
Thus, given SSH config like so::
Host myalias
Hostname realhostname
a call like ``Connection(host='myalias')`` will result in an
object whose ``host`` attribute is ``realhostname``, and whose
``original_host`` attribute is ``myalias``.
:param str user:
the login user for the remote connection. Defaults to
``config.user``.
:param int port:
the remote port. Defaults to ``config.port``.
:param config:
configuration settings to use when executing methods on this
`.Connection` (e.g. default SSH port and so forth).
Should be a `.Config` or an `invoke.config.Config`
(which will be turned into a `.Config`).
Default is an anonymous `.Config` object.
:param gateway:
An object to use as a proxy or gateway for this connection.
This parameter accepts one of the following:
- another `.Connection` (for a ``ProxyJump`` style gateway);
- a shell command string (for a ``ProxyCommand`` style
gateway).
Default: ``None``, meaning no gatewaying will occur (unless
otherwise configured; if one wants to override a configured gateway
at runtime, specify ``gateway=False``.)
.. seealso:: :ref:`ssh-gateways`
:param bool forward_agent:
Whether to enable SSH agent forwarding.
Default: ``config.forward_agent``.
:param int connect_timeout:
Connection timeout, in seconds.
Default: ``config.timeouts.connect``.
.. _connect_kwargs-arg:
:param dict connect_kwargs:
Keyword arguments handed verbatim to
`SSHClient.connect <paramiko.client.SSHClient.connect>` (when
`.open` is called).
`.Connection` tries not to grow additional settings/kwargs of its
own unless it is adding value of some kind; thus,
``connect_kwargs`` is currently the right place to hand in paramiko
connection parameters such as ``pkey`` or ``key_filename``. For
example::
c = Connection(
host="hostname",
user="admin",
connect_kwargs={
"key_filename": "/home/myuser/.ssh/private.key",
},
)
Default: ``config.connect_kwargs``.
:raises ValueError:
if user or port values are given via both ``host`` shorthand *and*
their own arguments. (We `refuse the temptation to guess`_).
.. _refuse the temptation to guess:
http://zen-of-python.info/
in-the-face-of-ambiguity-refuse-the-temptation-to-guess.html#12
"""
# NOTE: parent __init__ sets self._config; for now we simply overwrite
# that below. If it's somehow problematic we would want to break parent
# __init__ up in a manner that is more cleanly overrideable.
super(Connection, self).__init__(config=config)
#: The .Config object referenced when handling default values (for e.g.
#: user or port, when not explicitly given) or deciding how to behave.
if config is None:
config = Config()
# Handle 'vanilla' Invoke config objects, which need cloning 'into' one
# of our own Configs (which grants the new defaults, etc, while not
# squashing them if the Invoke-level config already accounted for them)
elif not isinstance(config, Config):
config = config.clone(into=Config)
self._set(_config=config)
# TODO: when/how to run load_files, merge, load_shell_env, etc?
# TODO: i.e. what is the lib use case here (and honestly in invoke too)
shorthand = self.derive_shorthand(host)
host = shorthand["host"]
err = "You supplied the {} via both shorthand and kwarg! Please pick one." # noqa
if shorthand["user"] is not None:
if user is not None:
raise ValueError(err.format("user"))
user = shorthand["user"]
if shorthand["port"] is not None:
if port is not None:
raise ValueError(err.format("port"))
port = shorthand["port"]
# NOTE: we load SSH config data as early as possible as it has
# potential to affect nearly every other attribute.
#: The per-host SSH config data, if any. (See :ref:`ssh-config`.)
self.ssh_config = self.config.base_ssh_config.lookup(host)
self.original_host = host
#: The hostname of the target server.
self.host = host
if "hostname" in self.ssh_config:
# TODO: log that this occurred?
self.host = self.ssh_config["hostname"]
#: The username this connection will use to connect to the remote end.
self.user = user or self.ssh_config.get("user", self.config.user)
# TODO: is it _ever_ possible to give an empty user value (e.g.
# user='')? E.g. do some SSH server specs allow for that?
#: The network port to connect on.
self.port = port or int(self.ssh_config.get("port", self.config.port))
# Gateway/proxy/bastion/jump setting: non-None values - string,
# Connection, even eg False - get set directly; None triggers seek in
# config/ssh_config
#: The gateway `.Connection` or ``ProxyCommand`` string to be used,
#: if any.
self.gateway = gateway if gateway is not None else self.get_gateway()
# NOTE: we use string above, vs ProxyCommand obj, to avoid spinning up
# the ProxyCommand subprocess at init time, vs open() time.
# TODO: make paramiko.proxy.ProxyCommand lazy instead?
if forward_agent is None:
# Default to config...
forward_agent = self.config.forward_agent
# But if ssh_config is present, it wins
if "forwardagent" in self.ssh_config:
# TODO: SSHConfig really, seriously needs some love here, god
map_ = {"yes": True, "no": False}
forward_agent = map_[self.ssh_config["forwardagent"]]
#: Whether agent forwarding is enabled.
self.forward_agent = forward_agent
if connect_timeout is None:
connect_timeout = self.ssh_config.get(
"connecttimeout", self.config.timeouts.connect
)
if connect_timeout is not None:
connect_timeout = int(connect_timeout)
#: Connection timeout
self.connect_timeout = connect_timeout
#: Keyword arguments given to `paramiko.client.SSHClient.connect` when
#: `open` is called.
self.connect_kwargs = self.resolve_connect_kwargs(connect_kwargs)
#: The `paramiko.client.SSHClient` instance this connection wraps.
client = SSHClient()
client.set_missing_host_key_policy(AutoAddPolicy())
self.client = client
#: A convenience handle onto the return value of
#: ``self.client.get_transport()``.
self.transport = None
def resolve_connect_kwargs(self, connect_kwargs):
# Grab connect_kwargs from config if not explicitly given.
if connect_kwargs is None:
# TODO: is it better to pre-empt conflicts w/ manually-handled
# connect() kwargs (hostname, username, etc) here or in open()?
# We're doing open() for now in case e.g. someone manually modifies
# .connect_kwargs attributewise, but otherwise it feels better to
# do it early instead of late.
connect_kwargs = self.config.connect_kwargs
# Special case: key_filename gets merged instead of overridden.
# TODO: probably want some sorta smart merging generally, special cases
# are bad.
elif "key_filename" in self.config.connect_kwargs:
kwarg_val = connect_kwargs.get("key_filename", [])
conf_val = self.config.connect_kwargs["key_filename"]
# Config value comes before kwarg value (because it may contain
# CLI flag value.)
connect_kwargs["key_filename"] = conf_val + kwarg_val
# SSH config identityfile values come last in the key_filename
# 'hierarchy'.
if "identityfile" in self.ssh_config:
connect_kwargs.setdefault("key_filename", [])
connect_kwargs["key_filename"].extend(
self.ssh_config["identityfile"]
)
return connect_kwargs
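The merge precedence implemented above (config-level ``key_filename`` first, explicitly-passed values second, SSH-config ``IdentityFile`` entries last) can be sketched as a standalone function. This is an illustrative re-implementation, not fabric's actual method; it copies rather than mutates its inputs.

```python
def merge_key_filenames(config_kwargs, connect_kwargs, ssh_config):
    """Sketch of resolve_connect_kwargs' key_filename precedence: config
    values come first (they may hold CLI flag values), explicit kwarg
    values second, and SSH-config IdentityFile entries are appended last."""
    if connect_kwargs is None:
        merged = dict(config_kwargs)
    else:
        merged = dict(connect_kwargs)
        if "key_filename" in config_kwargs:
            merged["key_filename"] = (
                config_kwargs["key_filename"] + merged.get("key_filename", [])
            )
    if "identityfile" in ssh_config:
        merged.setdefault("key_filename", [])
        merged["key_filename"].extend(ssh_config["identityfile"])
    return merged
```

Given a CLI-supplied key, a kwarg key, and an SSH-config identity, the resulting list preserves that ordering.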
def get_gateway(self):
# SSH config wins over Invoke-style config
if "proxyjump" in self.ssh_config:
# Reverse hop1,hop2,hop3 style ProxyJump directive so we start
# with the final (itself non-gatewayed) hop and work up to
# the front (actual, supplied as our own gateway) hop
hops = reversed(self.ssh_config["proxyjump"].split(","))
prev_gw = None
for hop in hops:
# Short-circuit if we appear to be our own proxy, which would
# be a RecursionError. Implies SSH config wildcards.
# TODO: in an ideal world we'd check user/port too in case they
# differ, but...seriously? They can file a PR with those extra
# half dozen test cases in play, E_NOTIME
if self.derive_shorthand(hop)["host"] == self.host:
return None
# Happily, ProxyJump uses identical format to our host
# shorthand...
kwargs = dict(config=self.config.clone())
if prev_gw is not None:
kwargs["gateway"] = prev_gw
cxn = Connection(hop, **kwargs)
prev_gw = cxn
return prev_gw
elif "proxycommand" in self.ssh_config:
# Just a string, which we interpret as a proxy command.
return self.ssh_config["proxycommand"]
# Fallback: config value (may be None).
return self.config.gateway
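The ``ProxyJump`` loop above builds a chain of nested gateway objects from a comma-separated hop list, with the self-proxy short-circuit guarding against recursion. A toy version using plain dicts in place of `Connection` objects (names here are illustrative, not fabric's API) mirrors that loop:

```python
def build_proxyjump_chain(proxyjump, final_host):
    """Mirror of get_gateway's ProxyJump handling: iterate the hop list
    in reverse, nesting each previously-built hop as the gateway of the
    next. Plain dicts stand in for Connection objects."""
    prev_gw = None
    for hop in reversed(proxyjump.split(",")):
        # Short-circuit if a hop is the target itself (would recurse).
        if hop == final_host:
            return None
        prev_gw = {"host": hop, "gateway": prev_gw}
    return prev_gw

chain = build_proxyjump_chain("hop1,hop2,hop3", "target")
```

As in the method above, the returned object corresponds to the front hop, with the final hop sitting innermost and gateway-less.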
def __repr__(self):
# Host comes first as it's the most common differentiator by far
bits = [("host", self.host)]
# TODO: maybe always show user regardless? Explicit is good...
if self.user != self.config.user:
bits.append(("user", self.user))
# TODO: harder to make case for 'always show port'; maybe if it's
# non-22 (even if config has overridden the local default)?
if self.port != self.config.port:
bits.append(("port", self.port))
# NOTE: sometimes self.gateway may be eg False if someone wants to
# explicitly override a configured non-None value (as otherwise it's
# impossible for __init__ to tell if a None means "nothing given" or
# "seriously please no gatewaying". So, this must always be a vanilla
# truth test and not eg "is not None".
if self.gateway:
# Displaying type because gw params would probs be too verbose
val = "proxyjump"
if isinstance(self.gateway, string_types):
val = "proxycommand"
bits.append(("gw", val))
return "<Connection {}>".format(
" ".join("{}={}".format(*x) for x in bits)
)
def _identity(self):
# TODO: consider including gateway and maybe even other init kwargs?
# Whether two cxns w/ same user/host/port but different
# gateway/keys/etc, should be considered "the same", is unclear.
return (self.host, self.user, self.port)
def __eq__(self, other):
if not isinstance(other, Connection):
return False
return self._identity() == other._identity()
def __lt__(self, other):
return self._identity() < other._identity()
def __hash__(self):
# NOTE: this departs from Context/DataProxy, which is not usefully
# hashable.
return hash(self._identity())
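Because equality, ordering, and hashing all reduce to the ``(host, user, port)`` identity tuple, two connection objects to the same endpoint collapse in sets and dicts even when other attributes (gateway, keys, etc.) differ. A minimal stand-in class (hypothetical, not part of fabric) demonstrates the semantics:

```python
class Endpoint:
    """Sketch of Connection's identity semantics: __eq__, __lt__ and
    __hash__ consider only (host, user, port); attributes like gateway
    are deliberately ignored."""

    def __init__(self, host, user="root", port=22, gateway=None):
        self.host, self.user, self.port = host, user, port
        self.gateway = gateway

    def _identity(self):
        return (self.host, self.user, self.port)

    def __eq__(self, other):
        return isinstance(other, Endpoint) and self._identity() == other._identity()

    def __lt__(self, other):
        return self._identity() < other._identity()

    def __hash__(self):
        return hash(self._identity())
```

So a gatewayed and a direct object for the same host compare equal, and a set of endpoints dedupes by identity.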
def derive_shorthand(self, host_string):
user_hostport = host_string.rsplit("@", 1)
hostport = user_hostport.pop()
user = user_hostport[0] if user_hostport and user_hostport[0] else None
# IPv6: can't reliably tell where addr ends and port begins, so don't
# try (and don't bother adding special syntax either, user should avoid
# this situation by using port=).
if hostport.count(":") > 1:
host = hostport
port = None
# IPv4: can split on ':' reliably.
else:
host_port = hostport.rsplit(":", 1)
host = host_port.pop(0) or None
port = host_port[0] if host_port and host_port[0] else None
if port is not None:
port = int(port)
return {"user": user, "host": host, "port": port}
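The shorthand rules (``user@host``, ``host:port``, and the >1-colon IPv6 escape hatch) can be exercised with a standalone copy of the parsing logic above, renamed here to avoid confusion with the method:

```python
def parse_host_shorthand(host_string):
    """Standalone copy of derive_shorthand's parsing, for illustration."""
    user_hostport = host_string.rsplit("@", 1)
    hostport = user_hostport.pop()
    user = user_hostport[0] if user_hostport and user_hostport[0] else None
    # More than one ':' implies IPv6: don't try to split off a port.
    if hostport.count(":") > 1:
        host, port = hostport, None
    else:
        host_port = hostport.rsplit(":", 1)
        host = host_port.pop(0) or None
        port = int(host_port[0]) if host_port and host_port[0] else None
    return {"user": user, "host": host, "port": port}
```

For example, ``admin@myhost:2202`` yields all three fields, while an IPv6 literal keeps its colons intact and derives no port.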
@property
def is_connected(self):
"""
Whether or not this connection is actually open.
.. versionadded:: 2.0
"""
return self.transport.active if self.transport else False
def open(self):
"""
Initiate an SSH connection to the host/port this object is bound to.
This may include activating the configured gateway connection, if one
is set.
Also saves a handle to the now-set Transport object for easier access.
Various connect-time settings (and/or their corresponding :ref:`SSH
config options <ssh-config>`) are utilized here in the call to
`SSHClient.connect <paramiko.client.SSHClient.connect>`. (For details,
see :doc:`the configuration docs </concepts/configuration>`.)
.. versionadded:: 2.0
"""
# Short-circuit
if self.is_connected:
return
err = "Refusing to be ambiguous: connect() kwarg '{}' was given both via regular arg and via connect_kwargs!" # noqa
# These may not be given, period
for key in """
hostname
port
username
""".split():
if key in self.connect_kwargs:
raise ValueError(err.format(key))
# These may be given one way or the other, but not both
if (
"timeout" in self.connect_kwargs
and self.connect_timeout is not None
):
raise ValueError(err.format("timeout"))
# No conflicts -> merge 'em together
kwargs = dict(
self.connect_kwargs,
username=self.user,
hostname=self.host,
port=self.port,
)
if self.gateway:
kwargs["sock"] = self.open_gateway()
if self.connect_timeout:
kwargs["timeout"] = self.connect_timeout
# Strip out empty defaults for less noisy debugging
if "key_filename" in kwargs and not kwargs["key_filename"]:
del kwargs["key_filename"]
# Actually connect!
self.client.connect(**kwargs)
self.transport = self.client.get_transport()
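The kwarg-merging and ambiguity checks in ``open()`` can be sketched as a pure function: ``hostname``/``port``/``username`` may never appear in ``connect_kwargs``, ``timeout`` may come from only one place, and empty ``key_filename`` defaults are stripped. This is a simplified illustration, not fabric's method.

```python
def build_connect_args(connect_kwargs, user, host, port, timeout=None):
    """Sketch of open()'s merge of connect_kwargs with explicit params."""
    err = "connect() kwarg {!r} was given both directly and via connect_kwargs"
    # These are always supplied by the connection itself.
    for key in ("hostname", "port", "username"):
        if key in connect_kwargs:
            raise ValueError(err.format(key))
    # Timeout may be given one way or the other, but not both.
    if "timeout" in connect_kwargs and timeout is not None:
        raise ValueError(err.format("timeout"))
    kwargs = dict(connect_kwargs, username=user, hostname=host, port=port)
    if timeout:
        kwargs["timeout"] = timeout
    # Strip empty defaults for less noisy debugging.
    if "key_filename" in kwargs and not kwargs["key_filename"]:
        del kwargs["key_filename"]
    return kwargs
```

A duplicate ``hostname`` raises; a clean call returns only the populated keys.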
def open_gateway(self):
"""
Obtain a socket-like object from `gateway`.
:returns:
A ``direct-tcpip`` `paramiko.channel.Channel`, if `gateway` was a
`.Connection`; or a `~paramiko.proxy.ProxyCommand`, if `gateway`
was a string.
.. versionadded:: 2.0
"""
# ProxyCommand is faster to set up, so do it first.
if isinstance(self.gateway, string_types):
# Leverage a dummy SSHConfig to ensure %h/%p/etc are parsed.
# TODO: use real SSH config once loading one properly is
# implemented.
ssh_conf = SSHConfig()
dummy = "Host {}\n ProxyCommand {}"
ssh_conf.parse(StringIO(dummy.format(self.host, self.gateway)))
return ProxyCommand(ssh_conf.lookup(self.host)["proxycommand"])
# Handle inner-Connection gateway type here.
# TODO: logging
self.gateway.open()
# TODO: expose the opened channel itself as an attribute? (another
# possible argument for separating the two gateway types...) e.g. if
# someone wanted to piggyback on it for other same-interpreter socket
# needs...
# TODO: and the inverse? allow users to supply their own socket/like
# object they got via $WHEREEVER?
# TODO: how best to expose timeout param? reuse general connection
# timeout from config?
return self.gateway.transport.open_channel(
kind="direct-tcpip",
dest_addr=(self.host, int(self.port)),
# NOTE: src_addr needs to be 'empty but not None' values to
# correctly encode into a network message. Theoretically Paramiko
# could auto-interpret None sometime & save us the trouble.
src_addr=("", 0),
)
def close(self):
"""
Terminate the network connection to the remote end, if open.
If no connection is open, this method does nothing.
.. versionadded:: 2.0
"""
if self.is_connected:
self.client.close()
if self.forward_agent and self._agent_handler is not None:
self._agent_handler.close()
def __enter__(self):
return self
def __exit__(self, *exc):
self.close()
@opens
def create_session(self):
channel = self.transport.open_session()
if self.forward_agent:
self._agent_handler = AgentRequestHandler(channel)
return channel
@opens
def run(self, command, **kwargs):
"""
Execute a shell command on the remote end of this connection.
This method wraps an SSH-capable implementation of
`invoke.runners.Runner.run`; see its documentation for details.
.. warning::
There are a few spots where Fabric departs from Invoke's default
settings/behaviors; they are documented under
`.Config.global_defaults`.
.. versionadded:: 2.0
"""
runner = self.config.runners.remote(self)
return self._run(runner, command, **kwargs)
@opens
def sudo(self, command, **kwargs):
"""
Execute a shell command, via ``sudo``, on the remote end.
This method is identical to `invoke.context.Context.sudo` in every way,
except in that -- like `run` -- it honors per-host/per-connection
configuration overrides in addition to the generic/global ones. Thus,
for example, per-host sudo passwords may be configured.
.. versionadded:: 2.0
"""
runner = self.config.runners.remote(self)
return self._sudo(runner, command, **kwargs)
def local(self, *args, **kwargs):
"""
Execute a shell command on the local system.
This method is effectively a wrapper of `invoke.run`; see its docs for
details and call signature.
.. versionadded:: 2.0
"""
# Superclass run() uses runners.local, so we can literally just call it
# straight.
return super(Connection, self).run(*args, **kwargs)
@opens
def get(self, *args, **kwargs):
"""
Get a remote file to the local filesystem or file-like object.
Simply a wrapper for `.Transfer.get`. Please see its documentation for
all details.
.. versionadded:: 2.0
"""
return Transfer(self).get(*args, **kwargs)
def put(self, *args, **kwargs):
"""
Put a remote file (or file-like object) to the remote filesystem.
Simply a wrapper for `.Transfer.put`. Please see its documentation for
all details.
.. versionadded:: 2.0
"""
return Transfer(self).put(*args, **kwargs)
# TODO: yield the socket for advanced users? Other advanced use cases
# (perhaps factor out socket creation itself)?
# TODO: probably push some of this down into Paramiko
@contextmanager
@opens
def forward_local(
self,
local_port,
remote_port=None,
remote_host="localhost",
local_host="localhost",
):
"""
Open a tunnel connecting ``local_port`` to the server's environment.
For example, say you want to connect to a remote PostgreSQL database
which is locked down and only accessible via the system it's running
on. You have SSH access to this server, so you can temporarily make
port 5432 on your local system act like port 5432 on the server::
import psycopg2
from fabric import Connection
with Connection('my-db-server').forward_local(5432):
db = psycopg2.connect(
host='localhost', port=5432, database='mydb'
)
# Do things with 'db' here
This method is analogous to using the ``-L`` option of OpenSSH's
``ssh`` program.
:param int local_port: The local port number on which to listen.
:param int remote_port:
The remote port number. Defaults to the same value as
``local_port``.
:param str local_host:
The local hostname/interface on which to listen. Default:
``localhost``.
:param str remote_host:
The remote hostname serving the forwarded remote port. Default:
``localhost`` (i.e., the host this `.Connection` is connected to.)
:returns:
Nothing; this method is only useful as a context manager affecting
local operating system state.
.. versionadded:: 2.0
"""
if not remote_port:
remote_port = local_port
# TunnelManager does all of the work, sitting in the background (so we
# can yield) and spawning threads every time somebody connects to our
# local port.
finished = Event()
manager = TunnelManager(
local_port=local_port,
local_host=local_host,
remote_port=remote_port,
remote_host=remote_host,
# TODO: not a huge fan of handing in our transport, but...?
transport=self.transport,
finished=finished,
)
manager.start()
# Return control to caller now that things ought to be operational
try:
yield
# Teardown once user exits block
finally:
# Signal to manager that it should close all open tunnels
finished.set()
# Then wait for it to do so
manager.join()
# Raise threading errors from within the manager, which would be
# one of:
# - an inner ThreadException, which was created by the manager on
# behalf of its Tunnels; this gets directly raised.
# - some other exception, which would thus have occurred in the
# manager itself; we wrap this in a new ThreadException.
# NOTE: in these cases, some of the metadata tracking in
# ExceptionHandlingThread/ExceptionWrapper/ThreadException (which
# is useful when dealing with multiple nearly-identical sibling IO
# threads) is superfluous, but it doesn't feel worth breaking
# things up further; we just ignore it for now.
wrapper = manager.exception()
if wrapper is not None:
if wrapper.type is ThreadException:
raise wrapper.value
else:
raise ThreadException([wrapper])
# TODO: cancel port forward on transport? Does that even make sense
# here (where we used direct-tcpip) vs the opposite method (which
# is what uses forward-tcpip)?
# TODO: probably push some of this down into Paramiko
@contextmanager
@opens
def forward_remote(
self,
remote_port,
local_port=None,
remote_host="127.0.0.1",
local_host="localhost",
):
"""
Open a tunnel connecting ``remote_port`` to the local environment.
For example, say you're running a daemon in development mode on your
workstation at port 8080, and want to funnel traffic to it from a
production or staging environment.
In most situations this isn't possible as your office/home network
probably blocks inbound traffic. But you have SSH access to this
server, so you can temporarily make port 8080 on that server act like
port 8080 on your workstation::
from fabric import Connection
c = Connection('my-remote-server')
with c.forward_remote(8080):
c.run("remote-data-writer --port 8080")
# Assuming remote-data-writer runs until interrupted, this will
# stay open until you Ctrl-C...
This method is analogous to using the ``-R`` option of OpenSSH's
``ssh`` program.
:param int remote_port: The remote port number on which to listen.
:param int local_port:
The local port number. Defaults to the same value as
``remote_port``.
:param str local_host:
The local hostname/interface the forwarded connection talks to.
Default: ``localhost``.
:param str remote_host:
The remote interface address to listen on when forwarding
connections. Default: ``127.0.0.1`` (i.e. only listen on the remote
localhost).
:returns:
Nothing; this method is only useful as a context manager affecting
local operating system state.
.. versionadded:: 2.0
"""
if not local_port:
local_port = remote_port
# Callback executes on each connection to the remote port and is given
# a Channel hooked up to said port. (We don't actually care about the
# source/dest host/port pairs at all; only whether the channel has data
# to read and suchlike.)
# We then pair that channel with a new 'outbound' socket connection to
# the local host/port being forwarded, in a new Tunnel.
# That Tunnel is then added to a shared data structure so we can track
# & close them during shutdown.
#
# TODO: this approach is less than ideal because we have to share state
# between ourselves & the callback handed into the transport's own
# thread handling (which is roughly analogous to our self-controlled
# TunnelManager for local forwarding). See if we can use more of
# Paramiko's API (or improve it and then do so) so that isn't
# necessary.
tunnels = []
def callback(channel, src_addr_tup, dst_addr_tup):
sock = socket.socket()
# TODO: handle connection failure such that channel, etc get closed
sock.connect((local_host, local_port))
# TODO: we don't actually need to generate the Events at our level,
# do we? Just let Tunnel.__init__ do it; all we do is "press its
# button" on shutdown...
tunnel = Tunnel(channel=channel, sock=sock, finished=Event())
tunnel.start()
# Communication between ourselves & the Paramiko handling subthread
tunnels.append(tunnel)
# Ask Paramiko (really, the remote sshd) to call our callback whenever
# connections are established on the remote iface/port.
# transport.request_port_forward(remote_host, remote_port, callback)
try:
self.transport.request_port_forward(
address=remote_host, port=remote_port, handler=callback
)
yield
finally:
# TODO: see above re: lack of a TunnelManager
# TODO: and/or also refactor with TunnelManager re: shutdown logic.
# E.g. maybe have a non-thread TunnelManager-alike with a method
# that acts as the callback? At least then there's a tiny bit more
# encapsulation...meh.
for tunnel in tunnels:
tunnel.finished.set()
tunnel.join()
self.transport.cancel_port_forward(
address=remote_host, port=remote_port
)
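Both forwarding methods rely on the same shutdown pattern: worker threads shuttle bytes between two socket-like endpoints until a shared ``threading.Event`` is set, then get joined. A toy version of that pattern (the real logic lives in fabric's tunnel machinery; the names here are illustrative) can be run entirely locally with ``socket.socketpair``:

```python
import socket
import threading

def shuttle(a, b, finished, chunk=1024):
    """Toy tunnel thread: copy bytes between two socket-like objects
    until the 'finished' Event is set. Short timeouts keep the loop
    responsive to shutdown, as in the forwarding methods above."""
    for s in (a, b):
        s.settimeout(0.05)
    while not finished.is_set():
        for src, dst in ((a, b), (b, a)):
            try:
                data = src.recv(chunk)
            except socket.timeout:
                continue
            if data:
                dst.sendall(data)

client, chan = socket.socketpair()  # stands in for the SSH channel side
sock, server = socket.socketpair()  # stands in for the local service side
finished = threading.Event()
t = threading.Thread(target=shuttle, args=(chan, sock, finished))
t.start()
client.sendall(b"ping")
reply = server.recv(16)  # arrives via the shuttle thread
finished.set()           # signal shutdown, as the finally: blocks do
t.join()
```

Setting the Event and joining is exactly the teardown sequence both context managers perform in their ``finally`` blocks.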
|
fabric/fabric | fabric/connection.py | Connection.forward_local | python | def forward_local(
self,
local_port,
remote_port=None,
remote_host="localhost",
local_host="localhost",
):
if not remote_port:
remote_port = local_port
# TunnelManager does all of the work, sitting in the background (so we
# can yield) and spawning threads every time somebody connects to our
# local port.
finished = Event()
manager = TunnelManager(
local_port=local_port,
local_host=local_host,
remote_port=remote_port,
remote_host=remote_host,
# TODO: not a huge fan of handing in our transport, but...?
transport=self.transport,
finished=finished,
)
manager.start()
# Return control to caller now that things ought to be operational
try:
yield
# Teardown once user exits block
finally:
# Signal to manager that it should close all open tunnels
finished.set()
# Then wait for it to do so
manager.join()
# Raise threading errors from within the manager, which would be
# one of:
# - an inner ThreadException, which was created by the manager on
# behalf of its Tunnels; this gets directly raised.
# - some other exception, which would thus have occurred in the
# manager itself; we wrap this in a new ThreadException.
# NOTE: in these cases, some of the metadata tracking in
# ExceptionHandlingThread/ExceptionWrapper/ThreadException (which
# is useful when dealing with multiple nearly-identical sibling IO
# threads) is superfluous, but it doesn't feel worth breaking
# things up further; we just ignore it for now.
wrapper = manager.exception()
if wrapper is not None:
if wrapper.type is ThreadException:
raise wrapper.value
else:
raise ThreadException([wrapper]) | Open a tunnel connecting ``local_port`` to the server's environment.
For example, say you want to connect to a remote PostgreSQL database
which is locked down and only accessible via the system it's running
on. You have SSH access to this server, so you can temporarily make
port 5432 on your local system act like port 5432 on the server::
import psycopg2
from fabric import Connection
with Connection('my-db-server').forward_local(5432):
db = psycopg2.connect(
host='localhost', port=5432, database='mydb'
)
# Do things with 'db' here
This method is analogous to using the ``-L`` option of OpenSSH's
``ssh`` program.
:param int local_port: The local port number on which to listen.
:param int remote_port:
The remote port number. Defaults to the same value as
``local_port``.
:param str local_host:
The local hostname/interface on which to listen. Default:
``localhost``.
:param str remote_host:
The remote hostname serving the forwarded remote port. Default:
``localhost`` (i.e., the host this `.Connection` is connected to.)
:returns:
Nothing; this method is only useful as a context manager affecting
local operating system state.
.. versionadded:: 2.0 | train | https://github.com/fabric/fabric/blob/e9939d68b734935f0c98d98817912ad7c698238f/fabric/connection.py#L682-L772 | null | class Connection(Context):
"""
A connection to an SSH daemon, with methods for commands and file transfer.
**Basics**
This class inherits from Invoke's `~invoke.context.Context`, as it is a
context within which commands, tasks etc can operate. It also encapsulates
a Paramiko `~paramiko.client.SSHClient` instance, performing useful high
level operations with that `~paramiko.client.SSHClient` and
`~paramiko.channel.Channel` instances generated from it.
.. _connect_kwargs:
.. note::
Many SSH specific options -- such as specifying private keys and
passphrases, timeouts, disabling SSH agents, etc -- are handled
directly by Paramiko and should be specified via the
:ref:`connect_kwargs argument <connect_kwargs-arg>` of the constructor.
**Lifecycle**
`.Connection` has a basic "`create <__init__>`, `connect/open <open>`, `do
work <run>`, `disconnect/close <close>`" lifecycle:
* `Instantiation <__init__>` imprints the object with its connection
parameters (but does **not** actually initiate the network connection).
* Methods like `run`, `get` etc automatically trigger a call to
`open` if the connection is not active; users may of course call `open`
manually if desired.
* Connections do not always need to be explicitly closed; much of the
time, Paramiko's garbage collection hooks or Python's own shutdown
sequence will take care of things. **However**, should you encounter edge
cases (for example, sessions hanging on exit) it's helpful to explicitly
close connections when you're done with them.
This can be accomplished by manually calling `close`, or by using the
object as a contextmanager::
with Connection('host') as c:
c.run('command')
c.put('file')
.. note::
This class rebinds `invoke.context.Context.run` to `.local` so both
remote and local command execution can coexist.
**Configuration**
Most `.Connection` parameters honor :doc:`Invoke-style configuration
</concepts/configuration>` as well as any applicable :ref:`SSH config file
directives <connection-ssh-config>`. For example, to end up with a
connection to ``admin@myhost``, one could:
- Use any built-in config mechanism, such as ``/etc/fabric.yml``,
``~/.fabric.json``, collection-driven configuration, env vars, etc,
stating ``user: admin`` (or ``{"user": "admin"}``, depending on config
format.) Then ``Connection('myhost')`` would implicitly have a ``user``
of ``admin``.
- Use an SSH config file containing ``User admin`` within any applicable
``Host`` header (``Host myhost``, ``Host *``, etc.) Again,
``Connection('myhost')`` will default to an ``admin`` user.
- Leverage host-parameter shorthand (described in `.Config.__init__`), i.e.
``Connection('admin@myhost')``.
- Give the parameter directly: ``Connection('myhost', user='admin')``.
The same applies to agent forwarding, gateways, and so forth.
.. versionadded:: 2.0
"""
# NOTE: these are initialized here to hint to invoke.Config.__setattr__
# that they should be treated as real attributes instead of config proxies.
# (Additionally, we're doing this instead of using invoke.Config._set() so
# we can take advantage of Sphinx's attribute-doc-comment static analysis.)
# Once an instance is created, these values will usually be non-None
# because they default to the default config values.
host = None
original_host = None
user = None
port = None
ssh_config = None
gateway = None
forward_agent = None
connect_timeout = None
connect_kwargs = None
client = None
transport = None
_sftp = None
_agent_handler = None
# TODO: should "reopening" an existing Connection object that has been
# closed, be allowed? (See e.g. how v1 detects closed/semi-closed
# connections & nukes them before creating a new client to the same host.)
# TODO: push some of this into paramiko.client.Client? e.g. expand what
# Client.exec_command does, it already allows configuring a subset of what
# we do / will eventually do / did in 1.x. It's silly to have to do
# .get_transport().open_session().
def __init__(
self,
host,
user=None,
port=None,
config=None,
gateway=None,
forward_agent=None,
connect_timeout=None,
connect_kwargs=None,
):
"""
Set up a new object representing a server connection.
:param str host:
the hostname (or IP address) of this connection.
May include shorthand for the ``user`` and/or ``port`` parameters,
of the form ``user@host``, ``host:port``, or ``user@host:port``.
.. note::
Due to ambiguity, IPv6 host addresses are incompatible with the
``host:port`` shorthand (though ``user@host`` will still work
OK). In other words, the presence of >1 ``:`` character will
prevent any attempt to derive a shorthand port number; use the
explicit ``port`` parameter instead.
.. note::
If ``host`` matches a ``Host`` clause in loaded SSH config
data, and that ``Host`` clause contains a ``Hostname``
directive, the resulting `.Connection` object will behave as if
``host`` is equal to that ``Hostname`` value.
In all cases, the original value of ``host`` is preserved as
the ``original_host`` attribute.
Thus, given SSH config like so::
Host myalias
Hostname realhostname
a call like ``Connection(host='myalias')`` will result in an
object whose ``host`` attribute is ``realhostname``, and whose
``original_host`` attribute is ``myalias``.
:param str user:
the login user for the remote connection. Defaults to
``config.user``.
:param int port:
the remote port. Defaults to ``config.port``.
:param config:
configuration settings to use when executing methods on this
`.Connection` (e.g. default SSH port and so forth).
Should be a `.Config` or an `invoke.config.Config`
(which will be turned into a `.Config`).
Default is an anonymous `.Config` object.
:param gateway:
An object to use as a proxy or gateway for this connection.
This parameter accepts one of the following:
- another `.Connection` (for a ``ProxyJump`` style gateway);
- a shell command string (for a ``ProxyCommand`` style
gateway).
Default: ``None``, meaning no gatewaying will occur (unless
otherwise configured; if one wants to override a configured gateway
at runtime, specify ``gateway=False``.)
.. seealso:: :ref:`ssh-gateways`
:param bool forward_agent:
Whether to enable SSH agent forwarding.
Default: ``config.forward_agent``.
:param int connect_timeout:
Connection timeout, in seconds.
Default: ``config.timeouts.connect``.
.. _connect_kwargs-arg:
:param dict connect_kwargs:
Keyword arguments handed verbatim to
`SSHClient.connect <paramiko.client.SSHClient.connect>` (when
`.open` is called).
`.Connection` tries not to grow additional settings/kwargs of its
own unless it is adding value of some kind; thus,
``connect_kwargs`` is currently the right place to hand in paramiko
connection parameters such as ``pkey`` or ``key_filename``. For
example::
c = Connection(
host="hostname",
user="admin",
connect_kwargs={
"key_filename": "/home/myuser/.ssh/private.key",
},
)
Default: ``config.connect_kwargs``.
:raises ValueError:
if user or port values are given via both ``host`` shorthand *and*
their own arguments. (We `refuse the temptation to guess`_).
.. _refuse the temptation to guess:
http://zen-of-python.info/
in-the-face-of-ambiguity-refuse-the-temptation-to-guess.html#12
"""
# NOTE: parent __init__ sets self._config; for now we simply overwrite
# that below. If it's somehow problematic we would want to break parent
# __init__ up in a manner that is more cleanly overrideable.
super(Connection, self).__init__(config=config)
#: The .Config object referenced when handling default values (for e.g.
#: user or port, when not explicitly given) or deciding how to behave.
if config is None:
config = Config()
# Handle 'vanilla' Invoke config objects, which need cloning 'into' one
# of our own Configs (which grants the new defaults, etc, while not
# squashing them if the Invoke-level config already accounted for them)
elif not isinstance(config, Config):
config = config.clone(into=Config)
self._set(_config=config)
# TODO: when/how to run load_files, merge, load_shell_env, etc?
# TODO: i.e. what is the lib use case here (and honestly in invoke too)
shorthand = self.derive_shorthand(host)
host = shorthand["host"]
err = "You supplied the {} via both shorthand and kwarg! Please pick one." # noqa
if shorthand["user"] is not None:
if user is not None:
raise ValueError(err.format("user"))
user = shorthand["user"]
if shorthand["port"] is not None:
if port is not None:
raise ValueError(err.format("port"))
port = shorthand["port"]
# NOTE: we load SSH config data as early as possible as it has
# potential to affect nearly every other attribute.
#: The per-host SSH config data, if any. (See :ref:`ssh-config`.)
self.ssh_config = self.config.base_ssh_config.lookup(host)
self.original_host = host
#: The hostname of the target server.
self.host = host
if "hostname" in self.ssh_config:
# TODO: log that this occurred?
self.host = self.ssh_config["hostname"]
#: The username this connection will use to connect to the remote end.
self.user = user or self.ssh_config.get("user", self.config.user)
# TODO: is it _ever_ possible to give an empty user value (e.g.
# user='')? E.g. do some SSH server specs allow for that?
#: The network port to connect on.
self.port = port or int(self.ssh_config.get("port", self.config.port))
# Gateway/proxy/bastion/jump setting: non-None values - string,
# Connection, even eg False - get set directly; None triggers seek in
# config/ssh_config
#: The gateway `.Connection` or ``ProxyCommand`` string to be used,
#: if any.
self.gateway = gateway if gateway is not None else self.get_gateway()
# NOTE: we use string above, vs ProxyCommand obj, to avoid spinning up
# the ProxyCommand subprocess at init time, vs open() time.
# TODO: make paramiko.proxy.ProxyCommand lazy instead?
if forward_agent is None:
# Default to config...
forward_agent = self.config.forward_agent
# But if ssh_config is present, it wins
if "forwardagent" in self.ssh_config:
# TODO: SSHConfig really, seriously needs some love here, god
map_ = {"yes": True, "no": False}
forward_agent = map_[self.ssh_config["forwardagent"]]
#: Whether agent forwarding is enabled.
self.forward_agent = forward_agent
if connect_timeout is None:
connect_timeout = self.ssh_config.get(
"connecttimeout", self.config.timeouts.connect
)
if connect_timeout is not None:
connect_timeout = int(connect_timeout)
#: Connection timeout
self.connect_timeout = connect_timeout
#: Keyword arguments given to `paramiko.client.SSHClient.connect` when
#: `open` is called.
self.connect_kwargs = self.resolve_connect_kwargs(connect_kwargs)
#: The `paramiko.client.SSHClient` instance this connection wraps.
client = SSHClient()
client.set_missing_host_key_policy(AutoAddPolicy())
self.client = client
#: A convenience handle onto the return value of
#: ``self.client.get_transport()``.
self.transport = None
def resolve_connect_kwargs(self, connect_kwargs):
# Grab connect_kwargs from config if not explicitly given.
if connect_kwargs is None:
# TODO: is it better to pre-empt conflicts w/ manually-handled
# connect() kwargs (hostname, username, etc) here or in open()?
# We're doing open() for now in case e.g. someone manually modifies
# .connect_kwargs attributewise, but otherwise it feels better to
# do it early instead of late.
connect_kwargs = self.config.connect_kwargs
# Special case: key_filename gets merged instead of overridden.
# TODO: probably want some sorta smart merging generally, special cases
# are bad.
elif "key_filename" in self.config.connect_kwargs:
kwarg_val = connect_kwargs.get("key_filename", [])
conf_val = self.config.connect_kwargs["key_filename"]
# Config value comes before kwarg value (because it may contain
# CLI flag value.)
connect_kwargs["key_filename"] = conf_val + kwarg_val
# SSH config identityfile values come last in the key_filename
# 'hierarchy'.
if "identityfile" in self.ssh_config:
connect_kwargs.setdefault("key_filename", [])
connect_kwargs["key_filename"].extend(
self.ssh_config["identityfile"]
)
return connect_kwargs
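The merge order above can be restated as a tiny standalone sketch (a hypothetical helper, not part of Fabric's API): config-level ``key_filename`` paths come first because they may carry CLI flag values, explicit ``connect_kwargs`` paths follow, and SSH-config ``IdentityFile`` entries land last.

```python
# Hypothetical helper mirroring resolve_connect_kwargs' precedence for
# key_filename: config paths, then kwarg paths, then IdentityFile entries.
def merge_key_filenames(kwarg_paths, config_paths, identityfiles):
    merged = list(config_paths) + list(kwarg_paths)
    merged.extend(identityfiles)
    return merged

order = merge_key_filenames(
    kwarg_paths=["kwarg.key"],
    config_paths=["config.key"],
    identityfiles=["~/.ssh/id_rsa"],
)
# order == ["config.key", "kwarg.key", "~/.ssh/id_rsa"]
```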
def get_gateway(self):
# SSH config wins over Invoke-style config
if "proxyjump" in self.ssh_config:
# Reverse hop1,hop2,hop3 style ProxyJump directive so we start
# with the final (itself non-gatewayed) hop and work up to
# the front (actual, supplied as our own gateway) hop
hops = reversed(self.ssh_config["proxyjump"].split(","))
prev_gw = None
for hop in hops:
# Short-circuit if we appear to be our own proxy, which would
# be a RecursionError. Implies SSH config wildcards.
# TODO: in an ideal world we'd check user/port too in case they
# differ, but...seriously? They can file a PR with those extra
# half dozen test cases in play, E_NOTIME
if self.derive_shorthand(hop)["host"] == self.host:
return None
# Happily, ProxyJump uses identical format to our host
# shorthand...
kwargs = dict(config=self.config.clone())
if prev_gw is not None:
kwargs["gateway"] = prev_gw
cxn = Connection(hop, **kwargs)
prev_gw = cxn
return prev_gw
elif "proxycommand" in self.ssh_config:
# Just a string, which we interpret as a proxy command.
return self.ssh_config["proxycommand"]
# Fallback: config value (may be None).
return self.config.gateway
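The hop reversal above is easier to see in isolation. This sketch (plain dicts standing in for `.Connection` objects, purely illustrative) unwinds a ``ProxyJump`` directive so the final hop has no gateway and each earlier hop tunnels through the chain built so far:

```python
# Build a gateway chain from "hop1,hop2,hop3": reverse the hops so the
# final hop is gateway-less, then wrap each earlier hop around it.
def build_gateway_chain(proxyjump):
    prev = None
    for hop in reversed(proxyjump.split(",")):
        prev = {"host": hop, "gateway": prev}
    return prev  # the first hop, used as this connection's own gateway

chain = build_gateway_chain("hop1,hop2,hop3")
# chain["host"] == "hop1"; the innermost hop "hop3" has gateway None
```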
def __repr__(self):
# Host comes first as it's the most common differentiator by far
bits = [("host", self.host)]
# TODO: maybe always show user regardless? Explicit is good...
if self.user != self.config.user:
bits.append(("user", self.user))
# TODO: harder to make case for 'always show port'; maybe if it's
# non-22 (even if config has overridden the local default)?
if self.port != self.config.port:
bits.append(("port", self.port))
# NOTE: sometimes self.gateway may be eg False if someone wants to
# explicitly override a configured non-None value (as otherwise it's
# impossible for __init__ to tell if a None means "nothing given" or
# "seriously please no gatewaying". So, this must always be a vanilla
# truth test and not eg "is not None".
if self.gateway:
# Displaying type because gw params would probs be too verbose
val = "proxyjump"
if isinstance(self.gateway, string_types):
val = "proxycommand"
bits.append(("gw", val))
return "<Connection {}>".format(
" ".join("{}={}".format(*x) for x in bits)
)
def _identity(self):
# TODO: consider including gateway and maybe even other init kwargs?
# Whether two cxns w/ same user/host/port but different
# gateway/keys/etc, should be considered "the same", is unclear.
return (self.host, self.user, self.port)
def __eq__(self, other):
if not isinstance(other, Connection):
return False
return self._identity() == other._identity()
def __lt__(self, other):
return self._identity() < other._identity()
def __hash__(self):
# NOTE: this departs from Context/DataProxy, which is not usefully
# hashable.
return hash(self._identity())
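One consequence of the identity methods above: connections can serve as dict keys or set members, deduplicating on the (host, user, port) triple. A minimal stand-in model (not the real class) shows the behavior:

```python
# Minimal stand-in demonstrating (host, user, port) identity semantics.
class Ident:
    def __init__(self, host, user, port):
        self.host, self.user, self.port = host, user, port

    def _identity(self):
        return (self.host, self.user, self.port)

    def __eq__(self, other):
        return self._identity() == other._identity()

    def __hash__(self):
        return hash(self._identity())

pool = {Ident("web1", "deploy", 22), Ident("web1", "deploy", 22)}
# len(pool) == 1: equal identities collapse in a set
```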
def derive_shorthand(self, host_string):
user_hostport = host_string.rsplit("@", 1)
hostport = user_hostport.pop()
user = user_hostport[0] if user_hostport and user_hostport[0] else None
# IPv6: can't reliably tell where addr ends and port begins, so don't
# try (and don't bother adding special syntax either, user should avoid
# this situation by using port=).
if hostport.count(":") > 1:
host = hostport
port = None
# IPv4: can split on ':' reliably.
else:
host_port = hostport.rsplit(":", 1)
host = host_port.pop(0) or None
port = host_port[0] if host_port and host_port[0] else None
if port is not None:
port = int(port)
return {"user": user, "host": host, "port": port}
@property
def is_connected(self):
"""
Whether or not this connection is actually open.
.. versionadded:: 2.0
"""
return self.transport.active if self.transport else False
def open(self):
"""
Initiate an SSH connection to the host/port this object is bound to.
This may include activating the configured gateway connection, if one
is set.
Also saves a handle to the now-set Transport object for easier access.
Various connect-time settings (and/or their corresponding :ref:`SSH
config options <ssh-config>`) are utilized here in the call to
`SSHClient.connect <paramiko.client.SSHClient.connect>`. (For details,
see :doc:`the configuration docs </concepts/configuration>`.)
.. versionadded:: 2.0
"""
# Short-circuit
if self.is_connected:
return
err = "Refusing to be ambiguous: connect() kwarg '{}' was given both via regular arg and via connect_kwargs!" # noqa
# These may not be given, period
for key in """
hostname
port
username
""".split():
if key in self.connect_kwargs:
raise ValueError(err.format(key))
# These may be given one way or the other, but not both
if (
"timeout" in self.connect_kwargs
and self.connect_timeout is not None
):
raise ValueError(err.format("timeout"))
# No conflicts -> merge 'em together
kwargs = dict(
self.connect_kwargs,
username=self.user,
hostname=self.host,
port=self.port,
)
if self.gateway:
kwargs["sock"] = self.open_gateway()
if self.connect_timeout:
kwargs["timeout"] = self.connect_timeout
# Strip out empty defaults for less noisy debugging
if "key_filename" in kwargs and not kwargs["key_filename"]:
del kwargs["key_filename"]
# Actually connect!
self.client.connect(**kwargs)
self.transport = self.client.get_transport()
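The guard-then-merge dance in `open` can be condensed into a hypothetical helper (illustrative only; the real method also wires in gateways and strips empty defaults):

```python
# Merge caller-managed connect() kwargs with user-supplied connect_kwargs,
# refusing any key that would otherwise be specified twice.
def merge_connect_kwargs(connect_kwargs, user, host, port, timeout=None):
    for key in ("hostname", "port", "username"):
        if key in connect_kwargs:
            raise ValueError("connect() kwarg {!r} given twice".format(key))
    if "timeout" in connect_kwargs and timeout is not None:
        raise ValueError("connect() kwarg 'timeout' given twice")
    merged = dict(connect_kwargs, username=user, hostname=host, port=port)
    if timeout is not None:
        merged["timeout"] = timeout
    return merged

merged = merge_connect_kwargs({"key_filename": ["a.key"]}, "admin", "web1", 22, timeout=5)
# {'key_filename': ['a.key'], 'username': 'admin', 'hostname': 'web1',
#  'port': 22, 'timeout': 5}
```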
def open_gateway(self):
"""
Obtain a socket-like object from `gateway`.
:returns:
A ``direct-tcpip`` `paramiko.channel.Channel`, if `gateway` was a
`.Connection`; or a `~paramiko.proxy.ProxyCommand`, if `gateway`
was a string.
.. versionadded:: 2.0
"""
# ProxyCommand is faster to set up, so do it first.
if isinstance(self.gateway, string_types):
# Leverage a dummy SSHConfig to ensure %h/%p/etc are parsed.
# TODO: use real SSH config once loading one properly is
# implemented.
ssh_conf = SSHConfig()
dummy = "Host {}\n ProxyCommand {}"
ssh_conf.parse(StringIO(dummy.format(self.host, self.gateway)))
return ProxyCommand(ssh_conf.lookup(self.host)["proxycommand"])
# Handle inner-Connection gateway type here.
# TODO: logging
self.gateway.open()
# TODO: expose the opened channel itself as an attribute? (another
# possible argument for separating the two gateway types...) e.g. if
# someone wanted to piggyback on it for other same-interpreter socket
# needs...
# TODO: and the inverse? allow users to supply their own socket/like
# object they got via $WHEREEVER?
# TODO: how best to expose timeout param? reuse general connection
# timeout from config?
return self.gateway.transport.open_channel(
kind="direct-tcpip",
dest_addr=(self.host, int(self.port)),
# NOTE: src_addr needs to be 'empty but not None' values to
# correctly encode into a network message. Theoretically Paramiko
# could auto-interpret None sometime & save us the trouble.
src_addr=("", 0),
)
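The dummy-SSHConfig parse above exists mainly to expand ``ProxyCommand`` tokens such as ``%h`` and ``%p``. Stripped of paramiko, the substitution amounts to the following hand-rolled sketch (assumption: only these two tokens matter; real SSH config handles more, e.g. ``%%``):

```python
# Expand just the %h (host) and %p (port) tokens of a ProxyCommand string.
def expand_proxycommand(command, host, port):
    return command.replace("%h", host).replace("%p", str(port))

cmd = expand_proxycommand("ssh -W %h:%p jumphost", "target", 22)
# 'ssh -W target:22 jumphost'
```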
def close(self):
"""
Terminate the network connection to the remote end, if open.
If no connection is open, this method does nothing.
.. versionadded:: 2.0
"""
if self.is_connected:
self.client.close()
if self.forward_agent and self._agent_handler is not None:
self._agent_handler.close()
def __enter__(self):
return self
def __exit__(self, *exc):
self.close()
@opens
def create_session(self):
channel = self.transport.open_session()
if self.forward_agent:
self._agent_handler = AgentRequestHandler(channel)
return channel
@opens
def run(self, command, **kwargs):
"""
Execute a shell command on the remote end of this connection.
This method wraps an SSH-capable implementation of
`invoke.runners.Runner.run`; see its documentation for details.
.. warning::
There are a few spots where Fabric departs from Invoke's default
settings/behaviors; they are documented under
`.Config.global_defaults`.
.. versionadded:: 2.0
"""
runner = self.config.runners.remote(self)
return self._run(runner, command, **kwargs)
@opens
def sudo(self, command, **kwargs):
"""
Execute a shell command, via ``sudo``, on the remote end.
This method is identical to `invoke.context.Context.sudo` in every way,
except in that -- like `run` -- it honors per-host/per-connection
configuration overrides in addition to the generic/global ones. Thus,
for example, per-host sudo passwords may be configured.
.. versionadded:: 2.0
"""
runner = self.config.runners.remote(self)
return self._sudo(runner, command, **kwargs)
def local(self, *args, **kwargs):
"""
Execute a shell command on the local system.
This method is effectively a wrapper of `invoke.run`; see its docs for
details and call signature.
.. versionadded:: 2.0
"""
# Superclass run() uses runners.local, so we can literally just call it
# straight.
return super(Connection, self).run(*args, **kwargs)
@opens
def sftp(self):
"""
Return a `~paramiko.sftp_client.SFTPClient` object.
If called more than one time, memoizes the first result; thus, any
given `.Connection` instance will only ever have a single SFTP client,
and state (such as that managed by
`~paramiko.sftp_client.SFTPClient.chdir`) will be preserved.
.. versionadded:: 2.0
"""
if self._sftp is None:
self._sftp = self.client.open_sftp()
return self._sftp
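The memoization used by `sftp`, in miniature (a generic sketch, not Fabric code): the factory runs once, later calls return the cached object, which is why SFTP state such as the remote working directory survives across calls.

```python
# Run the factory once; hand back the cached result on every later call.
class Memo:
    def __init__(self, factory):
        self._factory = factory
        self._obj = None

    def get(self):
        if self._obj is None:
            self._obj = self._factory()
        return self._obj

calls = []
memo = Memo(lambda: calls.append(1) or object())
first, second = memo.get(), memo.get()
# first is second; the factory ran exactly once (len(calls) == 1)
```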
def get(self, *args, **kwargs):
"""
Get a remote file to the local filesystem or file-like object.
Simply a wrapper for `.Transfer.get`. Please see its documentation for
all details.
.. versionadded:: 2.0
"""
return Transfer(self).get(*args, **kwargs)
def put(self, *args, **kwargs):
"""
Put a remote file (or file-like object) to the remote filesystem.
Simply a wrapper for `.Transfer.put`. Please see its documentation for
all details.
.. versionadded:: 2.0
"""
return Transfer(self).put(*args, **kwargs)
# TODO: yield the socket for advanced users? Other advanced use cases
# (perhaps factor out socket creation itself)?
# TODO: probably push some of this down into Paramiko

# TODO: cancel port forward on transport? Does that even make sense
# here (where we used direct-tcpip) vs the opposite method (which
# is what uses forward-tcpip)?
# TODO: probably push some of this down into Paramiko
@contextmanager
@opens
def forward_remote(
self,
remote_port,
local_port=None,
remote_host="127.0.0.1",
local_host="localhost",
):
"""
Open a tunnel connecting ``remote_port`` to the local environment.
For example, say you're running a daemon in development mode on your
workstation at port 8080, and want to funnel traffic to it from a
production or staging environment.
In most situations this isn't possible as your office/home network
probably blocks inbound traffic. But you have SSH access to this
server, so you can temporarily make port 8080 on that server act like
port 8080 on your workstation::
from fabric import Connection
c = Connection('my-remote-server')
with c.forward_remote(8080):
c.run("remote-data-writer --port 8080")
# Assuming remote-data-writer runs until interrupted, this will
# stay open until you Ctrl-C...
This method is analogous to using the ``-R`` option of OpenSSH's
``ssh`` program.
:param int remote_port: The remote port number on which to listen.
:param int local_port:
The local port number. Defaults to the same value as
``remote_port``.
:param str local_host:
The local hostname/interface the forwarded connection talks to.
Default: ``localhost``.
:param str remote_host:
The remote interface address to listen on when forwarding
connections. Default: ``127.0.0.1`` (i.e. only listen on the remote
localhost).
:returns:
Nothing; this method is only useful as a context manager affecting
local operating system state.
.. versionadded:: 2.0
"""
if not local_port:
local_port = remote_port
# Callback executes on each connection to the remote port and is given
# a Channel hooked up to said port. (We don't actually care about the
# source/dest host/port pairs at all; only whether the channel has data
# to read and suchlike.)
# We then pair that channel with a new 'outbound' socket connection to
# the local host/port being forwarded, in a new Tunnel.
# That Tunnel is then added to a shared data structure so we can track
# & close them during shutdown.
#
# TODO: this approach is less than ideal because we have to share state
# between ourselves & the callback handed into the transport's own
# thread handling (which is roughly analogous to our self-controlled
# TunnelManager for local forwarding). See if we can use more of
# Paramiko's API (or improve it and then do so) so that isn't
# necessary.
tunnels = []
def callback(channel, src_addr_tup, dst_addr_tup):
sock = socket.socket()
# TODO: handle connection failure such that channel, etc get closed
sock.connect((local_host, local_port))
# TODO: we don't actually need to generate the Events at our level,
# do we? Just let Tunnel.__init__ do it; all we do is "press its
# button" on shutdown...
tunnel = Tunnel(channel=channel, sock=sock, finished=Event())
tunnel.start()
# Communication between ourselves & the Paramiko handling subthread
tunnels.append(tunnel)
# Ask Paramiko (really, the remote sshd) to call our callback whenever
# connections are established on the remote iface/port.
# transport.request_port_forward(remote_host, remote_port, callback)
try:
self.transport.request_port_forward(
address=remote_host, port=remote_port, handler=callback
)
yield
finally:
# TODO: see above re: lack of a TunnelManager
# TODO: and/or also refactor with TunnelManager re: shutdown logic.
# E.g. maybe have a non-thread TunnelManager-alike with a method
# that acts as the callback? At least then there's a tiny bit more
# encapsulation...meh.
for tunnel in tunnels:
tunnel.finished.set()
tunnel.join()
self.transport.cancel_port_forward(
address=remote_host, port=remote_port
)
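What each ``Tunnel`` actually does — shuttle bytes between the paramiko channel and a freshly connected local socket — can be mimicked with stdlib pieces alone. A toy sketch using ``socketpair`` stand-ins (Unix-only) for both endpoints:

```python
import socket
import threading

# Pump nbytes from src to dst, as a Tunnel pumps channel <-> local socket.
def pump(src, dst, nbytes):
    data = b""
    while len(data) < nbytes:
        data += src.recv(4096)
    dst.sendall(data)

a, b = socket.socketpair()  # stands in for the remote-side Channel
c, d = socket.socketpair()  # stands in for the local service socket
worker = threading.Thread(target=pump, args=(b, c, 5))
worker.start()
a.sendall(b"hello")
worker.join()
received = d.recv(1024)
# received == b"hello"
```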
|
fabric/fabric | fabric/connection.py | Connection.forward_remote | python | def forward_remote(
self,
remote_port,
local_port=None,
remote_host="127.0.0.1",
local_host="localhost",
):
if not local_port:
local_port = remote_port
# Callback executes on each connection to the remote port and is given
# a Channel hooked up to said port. (We don't actually care about the
# source/dest host/port pairs at all; only whether the channel has data
# to read and suchlike.)
# We then pair that channel with a new 'outbound' socket connection to
# the local host/port being forwarded, in a new Tunnel.
# That Tunnel is then added to a shared data structure so we can track
# & close them during shutdown.
#
# TODO: this approach is less than ideal because we have to share state
# between ourselves & the callback handed into the transport's own
# thread handling (which is roughly analogous to our self-controlled
# TunnelManager for local forwarding). See if we can use more of
# Paramiko's API (or improve it and then do so) so that isn't
# necessary.
tunnels = []
def callback(channel, src_addr_tup, dst_addr_tup):
sock = socket.socket()
# TODO: handle connection failure such that channel, etc get closed
sock.connect((local_host, local_port))
# TODO: we don't actually need to generate the Events at our level,
# do we? Just let Tunnel.__init__ do it; all we do is "press its
# button" on shutdown...
tunnel = Tunnel(channel=channel, sock=sock, finished=Event())
tunnel.start()
# Communication between ourselves & the Paramiko handling subthread
tunnels.append(tunnel)
# Ask Paramiko (really, the remote sshd) to call our callback whenever
# connections are established on the remote iface/port.
# transport.request_port_forward(remote_host, remote_port, callback)
try:
self.transport.request_port_forward(
address=remote_host, port=remote_port, handler=callback
)
yield
finally:
# TODO: see above re: lack of a TunnelManager
# TODO: and/or also refactor with TunnelManager re: shutdown logic.
# E.g. maybe have a non-thread TunnelManager-alike with a method
# that acts as the callback? At least then there's a tiny bit more
# encapsulation...meh.
for tunnel in tunnels:
tunnel.finished.set()
tunnel.join()
self.transport.cancel_port_forward(
address=remote_host, port=remote_port
) | Open a tunnel connecting ``remote_port`` to the local environment.
For example, say you're running a daemon in development mode on your
workstation at port 8080, and want to funnel traffic to it from a
production or staging environment.
In most situations this isn't possible as your office/home network
probably blocks inbound traffic. But you have SSH access to this
server, so you can temporarily make port 8080 on that server act like
port 8080 on your workstation::
from fabric import Connection
c = Connection('my-remote-server')
with c.forward_remote(8080):
c.run("remote-data-writer --port 8080")
# Assuming remote-data-writer runs until interrupted, this will
# stay open until you Ctrl-C...
This method is analogous to using the ``-R`` option of OpenSSH's
``ssh`` program.
:param int remote_port: The remote port number on which to listen.
:param int local_port:
The local port number. Defaults to the same value as
``remote_port``.
:param str local_host:
The local hostname/interface the forwarded connection talks to.
Default: ``localhost``.
:param str remote_host:
The remote interface address to listen on when forwarding
connections. Default: ``127.0.0.1`` (i.e. only listen on the remote
localhost).
:returns:
Nothing; this method is only useful as a context manager affecting
local operating system state.
.. versionadded:: 2.0 | train | https://github.com/fabric/fabric/blob/e9939d68b734935f0c98d98817912ad7c698238f/fabric/connection.py#L781-L882 | null | class Connection(Context):
"""
A connection to an SSH daemon, with methods for commands and file transfer.
**Basics**
This class inherits from Invoke's `~invoke.context.Context`, as it is a
context within which commands, tasks etc can operate. It also encapsulates
a Paramiko `~paramiko.client.SSHClient` instance, performing useful high
level operations with that `~paramiko.client.SSHClient` and
`~paramiko.channel.Channel` instances generated from it.
.. _connect_kwargs:
.. note::
Many SSH specific options -- such as specifying private keys and
passphrases, timeouts, disabling SSH agents, etc -- are handled
directly by Paramiko and should be specified via the
:ref:`connect_kwargs argument <connect_kwargs-arg>` of the constructor.
**Lifecycle**
`.Connection` has a basic "`create <__init__>`, `connect/open <open>`, `do
work <run>`, `disconnect/close <close>`" lifecycle:
* `Instantiation <__init__>` imprints the object with its connection
parameters (but does **not** actually initiate the network connection).
* Methods like `run`, `get` etc automatically trigger a call to
`open` if the connection is not active; users may of course call `open`
manually if desired.
* Connections do not always need to be explicitly closed; much of the
time, Paramiko's garbage collection hooks or Python's own shutdown
sequence will take care of things. **However**, should you encounter edge
cases (for example, sessions hanging on exit) it's helpful to explicitly
close connections when you're done with them.
This can be accomplished by manually calling `close`, or by using the
object as a contextmanager::
with Connection('host') as c:
c.run('command')
c.put('file')
.. note::
This class rebinds `invoke.context.Context.run` to `.local` so both
remote and local command execution can coexist.
**Configuration**
Most `.Connection` parameters honor :doc:`Invoke-style configuration
</concepts/configuration>` as well as any applicable :ref:`SSH config file
directives <connection-ssh-config>`. For example, to end up with a
connection to ``admin@myhost``, one could:
- Use any built-in config mechanism, such as ``/etc/fabric.yml``,
``~/.fabric.json``, collection-driven configuration, env vars, etc,
stating ``user: admin`` (or ``{"user": "admin"}``, depending on config
format.) Then ``Connection('myhost')`` would implicitly have a ``user``
of ``admin``.
- Use an SSH config file containing ``User admin`` within any applicable
``Host`` header (``Host myhost``, ``Host *``, etc.) Again,
``Connection('myhost')`` will default to an ``admin`` user.
- Leverage host-parameter shorthand (described in `.Config.__init__`), i.e.
``Connection('admin@myhost')``.
- Give the parameter directly: ``Connection('myhost', user='admin')``.
The same applies to agent forwarding, gateways, and so forth.
.. versionadded:: 2.0
"""
# NOTE: these are initialized here to hint to invoke.Config.__setattr__
# that they should be treated as real attributes instead of config proxies.
# (Additionally, we're doing this instead of using invoke.Config._set() so
# we can take advantage of Sphinx's attribute-doc-comment static analysis.)
# Once an instance is created, these values will usually be non-None
# because they default to the default config values.
host = None
original_host = None
user = None
port = None
ssh_config = None
gateway = None
forward_agent = None
connect_timeout = None
connect_kwargs = None
client = None
transport = None
_sftp = None
_agent_handler = None
# TODO: should "reopening" an existing Connection object that has been
# closed, be allowed? (See e.g. how v1 detects closed/semi-closed
# connections & nukes them before creating a new client to the same host.)
# TODO: push some of this into paramiko.client.Client? e.g. expand what
# Client.exec_command does, it already allows configuring a subset of what
# we do / will eventually do / did in 1.x. It's silly to have to do
# .get_transport().open_session().
def __init__(
self,
host,
user=None,
port=None,
config=None,
gateway=None,
forward_agent=None,
connect_timeout=None,
connect_kwargs=None,
):
"""
Set up a new object representing a server connection.
:param str host:
the hostname (or IP address) of this connection.
May include shorthand for the ``user`` and/or ``port`` parameters,
of the form ``user@host``, ``host:port``, or ``user@host:port``.
.. note::
Due to ambiguity, IPv6 host addresses are incompatible with the
``host:port`` shorthand (though ``user@host`` will still work
OK). In other words, the presence of >1 ``:`` character will
prevent any attempt to derive a shorthand port number; use the
explicit ``port`` parameter instead.
.. note::
If ``host`` matches a ``Host`` clause in loaded SSH config
data, and that ``Host`` clause contains a ``Hostname``
directive, the resulting `.Connection` object will behave as if
``host`` is equal to that ``Hostname`` value.
In all cases, the original value of ``host`` is preserved as
the ``original_host`` attribute.
Thus, given SSH config like so::
Host myalias
Hostname realhostname
a call like ``Connection(host='myalias')`` will result in an
object whose ``host`` attribute is ``realhostname``, and whose
``original_host`` attribute is ``myalias``.
:param str user:
the login user for the remote connection. Defaults to
``config.user``.
:param int port:
the remote port. Defaults to ``config.port``.
:param config:
configuration settings to use when executing methods on this
`.Connection` (e.g. default SSH port and so forth).
Should be a `.Config` or an `invoke.config.Config`
(which will be turned into a `.Config`).
Default is an anonymous `.Config` object.
:param gateway:
An object to use as a proxy or gateway for this connection.
This parameter accepts one of the following:
- another `.Connection` (for a ``ProxyJump`` style gateway);
- a shell command string (for a ``ProxyCommand`` style style
gateway).
Default: ``None``, meaning no gatewaying will occur (unless
otherwise configured; if one wants to override a configured gateway
at runtime, specify ``gateway=False``.)
.. seealso:: :ref:`ssh-gateways`
:param bool forward_agent:
Whether to enable SSH agent forwarding.
Default: ``config.forward_agent``.
:param int connect_timeout:
Connection timeout, in seconds.
Default: ``config.timeouts.connect``.
.. _connect_kwargs-arg:
:param dict connect_kwargs:
Keyword arguments handed verbatim to
`SSHClient.connect <paramiko.client.SSHClient.connect>` (when
`.open` is called).
`.Connection` tries not to grow additional settings/kwargs of its
own unless it is adding value of some kind; thus,
``connect_kwargs`` is currently the right place to hand in paramiko
connection parameters such as ``pkey`` or ``key_filename``. For
example::
c = Connection(
host="hostname",
user="admin",
connect_kwargs={
"key_filename": "/home/myuser/.ssh/private.key",
},
)
Default: ``config.connect_kwargs``.
:raises ValueError:
if user or port values are given via both ``host`` shorthand *and*
their own arguments. (We `refuse the temptation to guess`_).
.. _refuse the temptation to guess:
http://zen-of-python.info/
in-the-face-of-ambiguity-refuse-the-temptation-to-guess.html#12
"""
        # NOTE: parent __init__ sets self._config; for now we simply overwrite
        # that below. If it's somehow problematic we would want to break
        # parent __init__ up in a manner that is more cleanly overrideable.
        super(Connection, self).__init__(config=config)

        #: The .Config object referenced when handling default values (for
        #: e.g. user or port, when not explicitly given) or deciding how to
        #: behave.
        if config is None:
            config = Config()
        # Handle 'vanilla' Invoke config objects, which need cloning 'into'
        # one of our own Configs (which grants the new defaults, etc, while
        # not squashing them if the Invoke-level config already accounted for
        # them)
        elif not isinstance(config, Config):
            config = config.clone(into=Config)
        self._set(_config=config)
        # TODO: when/how to run load_files, merge, load_shell_env, etc?
        # TODO: i.e. what is the lib use case here (and honestly in invoke too)

        shorthand = self.derive_shorthand(host)
        host = shorthand["host"]
        err = "You supplied the {} via both shorthand and kwarg! Please pick one."  # noqa
        if shorthand["user"] is not None:
            if user is not None:
                raise ValueError(err.format("user"))
            user = shorthand["user"]
        if shorthand["port"] is not None:
            if port is not None:
                raise ValueError(err.format("port"))
            port = shorthand["port"]

        # NOTE: we load SSH config data as early as possible as it has
        # potential to affect nearly every other attribute.
        #: The per-host SSH config data, if any. (See :ref:`ssh-config`.)
        self.ssh_config = self.config.base_ssh_config.lookup(host)

        self.original_host = host
        #: The hostname of the target server.
        self.host = host
        if "hostname" in self.ssh_config:
            # TODO: log that this occurred?
            self.host = self.ssh_config["hostname"]

        #: The username this connection will use to connect to the remote end.
        self.user = user or self.ssh_config.get("user", self.config.user)
        # TODO: is it _ever_ possible to give an empty user value (e.g.
        # user='')? E.g. do some SSH server specs allow for that?

        #: The network port to connect on.
        self.port = port or int(self.ssh_config.get("port", self.config.port))

        # Gateway/proxy/bastion/jump setting: non-None values - string,
        # Connection, even eg False - get set directly; None triggers seek in
        # config/ssh_config
        #: The gateway `.Connection` or ``ProxyCommand`` string to be used,
        #: if any.
        self.gateway = gateway if gateway is not None else self.get_gateway()
        # NOTE: we use string above, vs ProxyCommand obj, to avoid spinning up
        # the ProxyCommand subprocess at init time, vs open() time.
        # TODO: make paramiko.proxy.ProxyCommand lazy instead?

        if forward_agent is None:
            # Default to config...
            forward_agent = self.config.forward_agent
            # But if ssh_config is present, it wins
            if "forwardagent" in self.ssh_config:
                # TODO: SSHConfig really, seriously needs some love here, god
                map_ = {"yes": True, "no": False}
                forward_agent = map_[self.ssh_config["forwardagent"]]
        #: Whether agent forwarding is enabled.
        self.forward_agent = forward_agent

        if connect_timeout is None:
            connect_timeout = self.ssh_config.get(
                "connecttimeout", self.config.timeouts.connect
            )
        if connect_timeout is not None:
            connect_timeout = int(connect_timeout)
        #: Connection timeout
        self.connect_timeout = connect_timeout

        #: Keyword arguments given to `paramiko.client.SSHClient.connect` when
        #: `open` is called.
        self.connect_kwargs = self.resolve_connect_kwargs(connect_kwargs)

        #: The `paramiko.client.SSHClient` instance this connection wraps.
        client = SSHClient()
        client.set_missing_host_key_policy(AutoAddPolicy())
        self.client = client

        #: A convenience handle onto the return value of
        #: ``self.client.get_transport()``.
        self.transport = None
    def resolve_connect_kwargs(self, connect_kwargs):
        # Grab connect_kwargs from config if not explicitly given.
        if connect_kwargs is None:
            # TODO: is it better to pre-empt conflicts w/ manually-handled
            # connect() kwargs (hostname, username, etc) here or in open()?
            # We're doing open() for now in case e.g. someone manually
            # modifies .connect_kwargs attributewise, but otherwise it feels
            # better to do it early instead of late.
            connect_kwargs = self.config.connect_kwargs
        # Special case: key_filename gets merged instead of overridden.
        # TODO: probably want some sorta smart merging generally, special
        # cases are bad.
        elif "key_filename" in self.config.connect_kwargs:
            kwarg_val = connect_kwargs.get("key_filename", [])
            conf_val = self.config.connect_kwargs["key_filename"]
            # Config value comes before kwarg value (because it may contain
            # CLI flag value.)
            connect_kwargs["key_filename"] = conf_val + kwarg_val
        # SSH config identityfile values come last in the key_filename
        # 'hierarchy'.
        if "identityfile" in self.ssh_config:
            connect_kwargs.setdefault("key_filename", [])
            connect_kwargs["key_filename"].extend(
                self.ssh_config["identityfile"]
            )
        return connect_kwargs

    def get_gateway(self):
        # SSH config wins over Invoke-style config
        if "proxyjump" in self.ssh_config:
            # Reverse hop1,hop2,hop3 style ProxyJump directive so we start
            # with the final (itself non-gatewayed) hop and work up to
            # the front (actual, supplied as our own gateway) hop
            hops = reversed(self.ssh_config["proxyjump"].split(","))
            prev_gw = None
            for hop in hops:
                # Short-circuit if we appear to be our own proxy, which would
                # be a RecursionError. Implies SSH config wildcards.
                # TODO: in an ideal world we'd check user/port too in case
                # they differ, but...seriously? They can file a PR with those
                # extra half dozen test cases in play, E_NOTIME
                if self.derive_shorthand(hop)["host"] == self.host:
                    return None
                # Happily, ProxyJump uses identical format to our host
                # shorthand...
                kwargs = dict(config=self.config.clone())
                if prev_gw is not None:
                    kwargs["gateway"] = prev_gw
                cxn = Connection(hop, **kwargs)
                prev_gw = cxn
            return prev_gw
        elif "proxycommand" in self.ssh_config:
            # Just a string, which we interpret as a proxy command..
            return self.ssh_config["proxycommand"]
        # Fallback: config value (may be None).
        return self.config.gateway
    def __repr__(self):
        # Host comes first as it's the most common differentiator by far
        bits = [("host", self.host)]
        # TODO: maybe always show user regardless? Explicit is good...
        if self.user != self.config.user:
            bits.append(("user", self.user))
        # TODO: harder to make case for 'always show port'; maybe if it's
        # non-22 (even if config has overridden the local default)?
        if self.port != self.config.port:
            bits.append(("port", self.port))
        # NOTE: sometimes self.gateway may be eg False if someone wants to
        # explicitly override a configured non-None value (as otherwise it's
        # impossible for __init__ to tell if a None means "nothing given" or
        # "seriously please no gatewaying". So, this must always be a vanilla
        # truth test and not eg "is not None".
        if self.gateway:
            # Displaying type because gw params would probs be too verbose
            val = "proxyjump"
            if isinstance(self.gateway, string_types):
                val = "proxycommand"
            bits.append(("gw", val))
        return "<Connection {}>".format(
            " ".join("{}={}".format(*x) for x in bits)
        )

    def _identity(self):
        # TODO: consider including gateway and maybe even other init kwargs?
        # Whether two cxns w/ same user/host/port but different
        # gateway/keys/etc, should be considered "the same", is unclear.
        return (self.host, self.user, self.port)

    def __eq__(self, other):
        if not isinstance(other, Connection):
            return False
        return self._identity() == other._identity()

    def __lt__(self, other):
        return self._identity() < other._identity()

    def __hash__(self):
        # NOTE: this departs from Context/DataProxy, which is not usefully
        # hashable.
        return hash(self._identity())

    def derive_shorthand(self, host_string):
        user_hostport = host_string.rsplit("@", 1)
        hostport = user_hostport.pop()
        user = user_hostport[0] if user_hostport and user_hostport[0] else None

        # IPv6: can't reliably tell where addr ends and port begins, so don't
        # try (and don't bother adding special syntax either, user should
        # avoid this situation by using port=).
        if hostport.count(":") > 1:
            host = hostport
            port = None
        # IPv4: can split on ':' reliably.
        else:
            host_port = hostport.rsplit(":", 1)
            host = host_port.pop(0) or None
            port = host_port[0] if host_port and host_port[0] else None
        if port is not None:
            port = int(port)

        return {"user": user, "host": host, "port": port}

    @property
    def is_connected(self):
        """
        Whether or not this connection is actually open.

        .. versionadded:: 2.0
        """
        return self.transport.active if self.transport else False
    def open(self):
        """
        Initiate an SSH connection to the host/port this object is bound to.

        This may include activating the configured gateway connection, if one
        is set.

        Also saves a handle to the now-set Transport object for easier access.

        Various connect-time settings (and/or their corresponding :ref:`SSH
        config options <ssh-config>`) are utilized here in the call to
        `SSHClient.connect <paramiko.client.SSHClient.connect>`. (For
        details, see :doc:`the configuration docs </concepts/configuration>`.)

        .. versionadded:: 2.0
        """
        # Short-circuit
        if self.is_connected:
            return
        err = "Refusing to be ambiguous: connect() kwarg '{}' was given both via regular arg and via connect_kwargs!"  # noqa
        # These may not be given, period
        for key in """
            hostname
            port
            username
        """.split():
            if key in self.connect_kwargs:
                raise ValueError(err.format(key))
        # These may be given one way or the other, but not both
        if (
            "timeout" in self.connect_kwargs
            and self.connect_timeout is not None
        ):
            raise ValueError(err.format("timeout"))
        # No conflicts -> merge 'em together
        kwargs = dict(
            self.connect_kwargs,
            username=self.user,
            hostname=self.host,
            port=self.port,
        )
        if self.gateway:
            kwargs["sock"] = self.open_gateway()
        if self.connect_timeout:
            kwargs["timeout"] = self.connect_timeout
        # Strip out empty defaults for less noisy debugging
        if "key_filename" in kwargs and not kwargs["key_filename"]:
            del kwargs["key_filename"]
        # Actually connect!
        self.client.connect(**kwargs)
        self.transport = self.client.get_transport()

    def open_gateway(self):
        """
        Obtain a socket-like object from `gateway`.

        :returns:
            A ``direct-tcpip`` `paramiko.channel.Channel`, if `gateway` was a
            `.Connection`; or a `~paramiko.proxy.ProxyCommand`, if `gateway`
            was a string.

        .. versionadded:: 2.0
        """
        # ProxyCommand is faster to set up, so do it first.
        if isinstance(self.gateway, string_types):
            # Leverage a dummy SSHConfig to ensure %h/%p/etc are parsed.
            # TODO: use real SSH config once loading one properly is
            # implemented.
            ssh_conf = SSHConfig()
            dummy = "Host {}\n    ProxyCommand {}"
            ssh_conf.parse(StringIO(dummy.format(self.host, self.gateway)))
            return ProxyCommand(ssh_conf.lookup(self.host)["proxycommand"])
        # Handle inner-Connection gateway type here.
        # TODO: logging
        self.gateway.open()
        # TODO: expose the opened channel itself as an attribute? (another
        # possible argument for separating the two gateway types...) e.g. if
        # someone wanted to piggyback on it for other same-interpreter socket
        # needs...
        # TODO: and the inverse? allow users to supply their own socket/like
        # object they got via $WHEREEVER?
        # TODO: how best to expose timeout param? reuse general connection
        # timeout from config?
        return self.gateway.transport.open_channel(
            kind="direct-tcpip",
            dest_addr=(self.host, int(self.port)),
            # NOTE: src_addr needs to be 'empty but not None' values to
            # correctly encode into a network message. Theoretically Paramiko
            # could auto-interpret None sometime & save us the trouble.
            src_addr=("", 0),
        )
    def close(self):
        """
        Terminate the network connection to the remote end, if open.

        If no connection is open, this method does nothing.

        .. versionadded:: 2.0
        """
        if self.is_connected:
            self.client.close()
            if self.forward_agent and self._agent_handler is not None:
                self._agent_handler.close()

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        self.close()

    @opens
    def create_session(self):
        channel = self.transport.open_session()
        if self.forward_agent:
            self._agent_handler = AgentRequestHandler(channel)
        return channel

    @opens
    def run(self, command, **kwargs):
        """
        Execute a shell command on the remote end of this connection.

        This method wraps an SSH-capable implementation of
        `invoke.runners.Runner.run`; see its documentation for details.

        .. warning::
            There are a few spots where Fabric departs from Invoke's default
            settings/behaviors; they are documented under
            `.Config.global_defaults`.

        .. versionadded:: 2.0
        """
        runner = self.config.runners.remote(self)
        return self._run(runner, command, **kwargs)

    @opens
    def sudo(self, command, **kwargs):
        """
        Execute a shell command, via ``sudo``, on the remote end.

        This method is identical to `invoke.context.Context.sudo` in every
        way, except in that -- like `run` -- it honors
        per-host/per-connection configuration overrides in addition to the
        generic/global ones. Thus, for example, per-host sudo passwords may
        be configured.

        .. versionadded:: 2.0
        """
        runner = self.config.runners.remote(self)
        return self._sudo(runner, command, **kwargs)

    def local(self, *args, **kwargs):
        """
        Execute a shell command on the local system.

        This method is effectively a wrapper of `invoke.run`; see its docs
        for details and call signature.

        .. versionadded:: 2.0
        """
        # Superclass run() uses runners.local, so we can literally just call
        # it straight.
        return super(Connection, self).run(*args, **kwargs)

    @opens
    def sftp(self):
        """
        Return a `~paramiko.sftp_client.SFTPClient` object.

        If called more than one time, memoizes the first result; thus, any
        given `.Connection` instance will only ever have a single SFTP
        client, and state (such as that managed by
        `~paramiko.sftp_client.SFTPClient.chdir`) will be preserved.

        .. versionadded:: 2.0
        """
        if self._sftp is None:
            self._sftp = self.client.open_sftp()
        return self._sftp

    def get(self, *args, **kwargs):
        """
        Get a remote file to the local filesystem or file-like object.

        Simply a wrapper for `.Transfer.get`. Please see its documentation
        for all details.

        .. versionadded:: 2.0
        """
        return Transfer(self).get(*args, **kwargs)

    def put(self, *args, **kwargs):
        """
        Put a local file (or file-like object) to the remote filesystem.

        Simply a wrapper for `.Transfer.put`. Please see its documentation
        for all details.

        .. versionadded:: 2.0
        """
        return Transfer(self).put(*args, **kwargs)
    # TODO: yield the socket for advanced users? Other advanced use cases
    # (perhaps factor out socket creation itself)?
    # TODO: probably push some of this down into Paramiko
    @contextmanager
    @opens
    def forward_local(
        self,
        local_port,
        remote_port=None,
        remote_host="localhost",
        local_host="localhost",
    ):
        """
        Open a tunnel connecting ``local_port`` to the server's environment.

        For example, say you want to connect to a remote PostgreSQL database
        which is locked down and only accessible via the system it's running
        on. You have SSH access to this server, so you can temporarily make
        port 5432 on your local system act like port 5432 on the server::

            import psycopg2
            from fabric import Connection

            with Connection('my-db-server').forward_local(5432):
                db = psycopg2.connect(
                    host='localhost', port=5432, database='mydb'
                )
                # Do things with 'db' here

        This method is analogous to using the ``-L`` option of OpenSSH's
        ``ssh`` program.

        :param int local_port: The local port number on which to listen.

        :param int remote_port:
            The remote port number. Defaults to the same value as
            ``local_port``.

        :param str local_host:
            The local hostname/interface on which to listen. Default:
            ``localhost``.

        :param str remote_host:
            The remote hostname serving the forwarded remote port. Default:
            ``localhost`` (i.e., the host this `.Connection` is connected
            to.)

        :returns:
            Nothing; this method is only useful as a context manager
            affecting local operating system state.

        .. versionadded:: 2.0
        """
        if not remote_port:
            remote_port = local_port

        # TunnelManager does all of the work, sitting in the background (so
        # we can yield) and spawning threads every time somebody connects to
        # our local port.
        finished = Event()
        manager = TunnelManager(
            local_port=local_port,
            local_host=local_host,
            remote_port=remote_port,
            remote_host=remote_host,
            # TODO: not a huge fan of handing in our transport, but...?
            transport=self.transport,
            finished=finished,
        )
        manager.start()

        # Return control to caller now that things ought to be operational
        try:
            yield
        # Teardown once user exits block
        finally:
            # Signal to manager that it should close all open tunnels
            finished.set()
            # Then wait for it to do so
            manager.join()
            # Raise threading errors from within the manager, which would be
            # one of:
            # - an inner ThreadException, which was created by the manager on
            # behalf of its Tunnels; this gets directly raised.
            # - some other exception, which would thus have occurred in the
            # manager itself; we wrap this in a new ThreadException.
            # NOTE: in these cases, some of the metadata tracking in
            # ExceptionHandlingThread/ExceptionWrapper/ThreadException (which
            # is useful when dealing with multiple nearly-identical sibling
            # IO threads) is superfluous, but it doesn't feel worth breaking
            # things up further; we just ignore it for now.
            wrapper = manager.exception()
            if wrapper is not None:
                if wrapper.type is ThreadException:
                    raise wrapper.value
                else:
                    raise ThreadException([wrapper])
            # TODO: cancel port forward on transport? Does that even make
            # sense here (where we used direct-tcpip) vs the opposite method
            # (which is what uses forward-tcpip)?

    # TODO: probably push some of this down into Paramiko
    @contextmanager
    @opens
yaml/pyyaml | lib/yaml/__init__.py | scan | python | train | https://github.com/yaml/pyyaml/blob/e471e86bf6dabdad45a1438c20a4a5c033eb9034/lib/yaml/__init__.py#L58-L67

def scan(stream, Loader=Loader):
    """
    Scan a YAML stream and produce scanning tokens.
    """
    loader = Loader(stream)
    try:
        while loader.check_token():
            yield loader.get_token()
    finally:
        loader.dispose()
from error import *

from tokens import *
from events import *
from nodes import *

from loader import *
from dumper import *

__version__ = '5.1'
try:
    from cyaml import *
    __with_libyaml__ = True
except ImportError:
    __with_libyaml__ = False

#------------------------------------------------------------------------------
# Warnings control
#------------------------------------------------------------------------------

# 'Global' warnings state:
_warnings_enabled = {
    'YAMLLoadWarning': True,
}

# Get or set global warnings' state
def warnings(settings=None):
    if settings is None:
        return _warnings_enabled
    if type(settings) is dict:
        for key in settings:
            if key in _warnings_enabled:
                _warnings_enabled[key] = settings[key]

# Warn when load() is called without Loader=...
class YAMLLoadWarning(RuntimeWarning):
    pass

def load_warning(method):
    if _warnings_enabled['YAMLLoadWarning'] is False:
        return

    import warnings

    message = (
        "calling yaml.%s() without Loader=... is deprecated, as the "
        "default Loader is unsafe. Please read "
        "https://msg.pyyaml.org/load for full details."
    ) % method

    warnings.warn(message, YAMLLoadWarning, stacklevel=3)

#------------------------------------------------------------------------------

def parse(stream, Loader=Loader):
    """
    Parse a YAML stream and produce parsing events.
    """
    loader = Loader(stream)
    try:
        while loader.check_event():
            yield loader.get_event()
    finally:
        loader.dispose()
def compose(stream, Loader=Loader):
    """
    Parse the first YAML document in a stream
    and produce the corresponding representation tree.
    """
    loader = Loader(stream)
    try:
        return loader.get_single_node()
    finally:
        loader.dispose()

def compose_all(stream, Loader=Loader):
    """
    Parse all YAML documents in a stream
    and produce corresponding representation trees.
    """
    loader = Loader(stream)
    try:
        while loader.check_node():
            yield loader.get_node()
    finally:
        loader.dispose()

def load(stream, Loader=None):
    """
    Parse the first YAML document in a stream
    and produce the corresponding Python object.
    """
    if Loader is None:
        load_warning('load')
        Loader = FullLoader

    loader = Loader(stream)
    try:
        return loader.get_single_data()
    finally:
        loader.dispose()

def load_all(stream, Loader=None):
    """
    Parse all YAML documents in a stream
    and produce corresponding Python objects.
    """
    if Loader is None:
        load_warning('load_all')
        Loader = FullLoader

    loader = Loader(stream)
    try:
        while loader.check_data():
            yield loader.get_data()
    finally:
        loader.dispose()
def full_load(stream):
    """
    Parse the first YAML document in a stream
    and produce the corresponding Python object.

    Resolve all tags except those known to be
    unsafe on untrusted input.
    """
    return load(stream, FullLoader)

def full_load_all(stream):
    """
    Parse all YAML documents in a stream
    and produce corresponding Python objects.

    Resolve all tags except those known to be
    unsafe on untrusted input.
    """
    return load_all(stream, FullLoader)

def safe_load(stream):
    """
    Parse the first YAML document in a stream
    and produce the corresponding Python object.

    Resolve only basic YAML tags. This is known
    to be safe for untrusted input.
    """
    return load(stream, SafeLoader)

def safe_load_all(stream):
    """
    Parse all YAML documents in a stream
    and produce corresponding Python objects.

    Resolve only basic YAML tags. This is known
    to be safe for untrusted input.
    """
    return load_all(stream, SafeLoader)

def unsafe_load(stream):
    """
    Parse the first YAML document in a stream
    and produce the corresponding Python object.

    Resolve all tags, even those known to be
    unsafe on untrusted input.
    """
    return load(stream, UnsafeLoader)

def unsafe_load_all(stream):
    """
    Parse all YAML documents in a stream
    and produce corresponding Python objects.

    Resolve all tags, even those known to be
    unsafe on untrusted input.
    """
    return load_all(stream, UnsafeLoader)
def emit(events, stream=None, Dumper=Dumper,
        canonical=None, indent=None, width=None,
        allow_unicode=None, line_break=None):
    """
    Emit YAML parsing events into a stream.
    If stream is None, return the produced string instead.
    """
    getvalue = None
    if stream is None:
        from StringIO import StringIO
        stream = StringIO()
        getvalue = stream.getvalue
    dumper = Dumper(stream, canonical=canonical, indent=indent, width=width,
            allow_unicode=allow_unicode, line_break=line_break)
    try:
        for event in events:
            dumper.emit(event)
    finally:
        dumper.dispose()
    if getvalue:
        return getvalue()

def serialize_all(nodes, stream=None, Dumper=Dumper,
        canonical=None, indent=None, width=None,
        allow_unicode=None, line_break=None,
        encoding='utf-8', explicit_start=None, explicit_end=None,
        version=None, tags=None):
    """
    Serialize a sequence of representation trees into a YAML stream.
    If stream is None, return the produced string instead.
    """
    getvalue = None
    if stream is None:
        if encoding is None:
            from StringIO import StringIO
        else:
            from cStringIO import StringIO
        stream = StringIO()
        getvalue = stream.getvalue
    dumper = Dumper(stream, canonical=canonical, indent=indent, width=width,
            allow_unicode=allow_unicode, line_break=line_break,
            encoding=encoding, version=version, tags=tags,
            explicit_start=explicit_start, explicit_end=explicit_end)
    try:
        dumper.open()
        for node in nodes:
            dumper.serialize(node)
        dumper.close()
    finally:
        dumper.dispose()
    if getvalue:
        return getvalue()

def serialize(node, stream=None, Dumper=Dumper, **kwds):
    """
    Serialize a representation tree into a YAML stream.
    If stream is None, return the produced string instead.
    """
    return serialize_all([node], stream, Dumper=Dumper, **kwds)
def dump_all(documents, stream=None, Dumper=Dumper,
        default_style=None, default_flow_style=False,
        canonical=None, indent=None, width=None,
        allow_unicode=None, line_break=None,
        encoding='utf-8', explicit_start=None, explicit_end=None,
        version=None, tags=None, sort_keys=True):
    """
    Serialize a sequence of Python objects into a YAML stream.
    If stream is None, return the produced string instead.
    """
    getvalue = None
    if stream is None:
        if encoding is None:
            from StringIO import StringIO
        else:
            from cStringIO import StringIO
        stream = StringIO()
        getvalue = stream.getvalue
    dumper = Dumper(stream, default_style=default_style,
            default_flow_style=default_flow_style,
            canonical=canonical, indent=indent, width=width,
            allow_unicode=allow_unicode, line_break=line_break,
            encoding=encoding, version=version, tags=tags,
            explicit_start=explicit_start, explicit_end=explicit_end,
            sort_keys=sort_keys)
    try:
        dumper.open()
        for data in documents:
            dumper.represent(data)
        dumper.close()
    finally:
        dumper.dispose()
    if getvalue:
        return getvalue()

def dump(data, stream=None, Dumper=Dumper, **kwds):
    """
    Serialize a Python object into a YAML stream.
    If stream is None, return the produced string instead.
    """
    return dump_all([data], stream, Dumper=Dumper, **kwds)

def safe_dump_all(documents, stream=None, **kwds):
    """
    Serialize a sequence of Python objects into a YAML stream.
    Produce only basic YAML tags.
    If stream is None, return the produced string instead.
    """
    return dump_all(documents, stream, Dumper=SafeDumper, **kwds)

def safe_dump(data, stream=None, **kwds):
    """
    Serialize a Python object into a YAML stream.
    Produce only basic YAML tags.
    If stream is None, return the produced string instead.
    """
    return dump_all([data], stream, Dumper=SafeDumper, **kwds)
def add_implicit_resolver(tag, regexp, first=None,
        Loader=Loader, Dumper=Dumper):
    """
    Add an implicit scalar detector.
    If an implicit scalar value matches the given regexp,
    the corresponding tag is assigned to the scalar.
    first is a sequence of possible initial characters or None.
    """
    Loader.add_implicit_resolver(tag, regexp, first)
    Dumper.add_implicit_resolver(tag, regexp, first)

def add_path_resolver(tag, path, kind=None, Loader=Loader, Dumper=Dumper):
    """
    Add a path based resolver for the given tag.
    A path is a list of keys that forms a path
    to a node in the representation tree.
    Keys can be string values, integers, or None.
    """
    Loader.add_path_resolver(tag, path, kind)
    Dumper.add_path_resolver(tag, path, kind)

def add_constructor(tag, constructor, Loader=Loader):
    """
    Add a constructor for the given tag.
    Constructor is a function that accepts a Loader instance
    and a node object and produces the corresponding Python object.
    """
    Loader.add_constructor(tag, constructor)

def add_multi_constructor(tag_prefix, multi_constructor, Loader=Loader):
    """
    Add a multi-constructor for the given tag prefix.
    Multi-constructor is called for a node if its tag starts with tag_prefix.
    Multi-constructor accepts a Loader instance, a tag suffix,
    and a node object and produces the corresponding Python object.
    """
    Loader.add_multi_constructor(tag_prefix, multi_constructor)

def add_representer(data_type, representer, Dumper=Dumper):
    """
    Add a representer for the given type.
    Representer is a function accepting a Dumper instance
    and an instance of the given data type
    and producing the corresponding representation node.
    """
    Dumper.add_representer(data_type, representer)

def add_multi_representer(data_type, multi_representer, Dumper=Dumper):
    """
    Add a representer for the given type.
    Multi-representer is a function accepting a Dumper instance
    and an instance of the given data type or subtype
    and producing the corresponding representation node.
    """
    Dumper.add_multi_representer(data_type, multi_representer)
class YAMLObjectMetaclass(type):
    """
    The metaclass for YAMLObject.
    """
    def __init__(cls, name, bases, kwds):
        super(YAMLObjectMetaclass, cls).__init__(name, bases, kwds)
        if 'yaml_tag' in kwds and kwds['yaml_tag'] is not None:
            cls.yaml_loader.add_constructor(cls.yaml_tag, cls.from_yaml)
            cls.yaml_dumper.add_representer(cls, cls.to_yaml)

class YAMLObject(object):
    """
    An object that can dump itself to a YAML stream
    and load itself from a YAML stream.
    """

    __metaclass__ = YAMLObjectMetaclass
    __slots__ = ()  # no direct instantiation, so allow immutable subclasses

    yaml_loader = Loader
    yaml_dumper = Dumper

    yaml_tag = None
    yaml_flow_style = None

    def from_yaml(cls, loader, node):
        """
        Convert a representation node to a Python object.
        """
        return loader.construct_yaml_object(node, cls)
    from_yaml = classmethod(from_yaml)

    def to_yaml(cls, dumper, data):
        """
        Convert a Python object to a representation node.
        """
        return dumper.represent_yaml_object(cls.yaml_tag, data, cls,
                flow_style=cls.yaml_flow_style)
    to_yaml = classmethod(to_yaml)
|
yaml/pyyaml | lib/yaml/__init__.py | parse | python | train | https://github.com/yaml/pyyaml/blob/e471e86bf6dabdad45a1438c20a4a5c033eb9034/lib/yaml/__init__.py#L69-L78

def parse(stream, Loader=Loader):
    """
    Parse a YAML stream and produce parsing events.
    """
    loader = Loader(stream)
    try:
        while loader.check_event():
            yield loader.get_event()
    finally:
        loader.dispose()
def serialize(node, stream=None, Dumper=Dumper, **kwds):
"""
Serialize a representation tree into a YAML stream.
If stream is None, return the produced string instead.
"""
return serialize_all([node], stream, Dumper=Dumper, **kwds)
def dump_all(documents, stream=None, Dumper=Dumper,
default_style=None, default_flow_style=False,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None,
encoding='utf-8', explicit_start=None, explicit_end=None,
version=None, tags=None, sort_keys=True):
"""
Serialize a sequence of Python objects into a YAML stream.
If stream is None, return the produced string instead.
"""
getvalue = None
if stream is None:
if encoding is None:
from StringIO import StringIO
else:
from cStringIO import StringIO
stream = StringIO()
getvalue = stream.getvalue
dumper = Dumper(stream, default_style=default_style,
default_flow_style=default_flow_style,
canonical=canonical, indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break,
encoding=encoding, version=version, tags=tags,
explicit_start=explicit_start, explicit_end=explicit_end, sort_keys=sort_keys)
try:
dumper.open()
for data in documents:
dumper.represent(data)
dumper.close()
finally:
dumper.dispose()
if getvalue:
return getvalue()
def dump(data, stream=None, Dumper=Dumper, **kwds):
"""
Serialize a Python object into a YAML stream.
If stream is None, return the produced string instead.
"""
return dump_all([data], stream, Dumper=Dumper, **kwds)
def safe_dump_all(documents, stream=None, **kwds):
"""
Serialize a sequence of Python objects into a YAML stream.
Produce only basic YAML tags.
If stream is None, return the produced string instead.
"""
return dump_all(documents, stream, Dumper=SafeDumper, **kwds)
def safe_dump(data, stream=None, **kwds):
"""
Serialize a Python object into a YAML stream.
Produce only basic YAML tags.
If stream is None, return the produced string instead.
"""
return dump_all([data], stream, Dumper=SafeDumper, **kwds)
def add_implicit_resolver(tag, regexp, first=None,
Loader=Loader, Dumper=Dumper):
"""
Add an implicit scalar detector.
If an implicit scalar value matches the given regexp,
the corresponding tag is assigned to the scalar.
first is a sequence of possible initial characters or None.
"""
Loader.add_implicit_resolver(tag, regexp, first)
Dumper.add_implicit_resolver(tag, regexp, first)
def add_path_resolver(tag, path, kind=None, Loader=Loader, Dumper=Dumper):
"""
Add a path based resolver for the given tag.
A path is a list of keys that forms a path
to a node in the representation tree.
Keys can be string values, integers, or None.
"""
Loader.add_path_resolver(tag, path, kind)
Dumper.add_path_resolver(tag, path, kind)
def add_constructor(tag, constructor, Loader=Loader):
"""
Add a constructor for the given tag.
Constructor is a function that accepts a Loader instance
and a node object and produces the corresponding Python object.
"""
Loader.add_constructor(tag, constructor)
def add_multi_constructor(tag_prefix, multi_constructor, Loader=Loader):
"""
Add a multi-constructor for the given tag prefix.
Multi-constructor is called for a node if its tag starts with tag_prefix.
Multi-constructor accepts a Loader instance, a tag suffix,
and a node object and produces the corresponding Python object.
"""
Loader.add_multi_constructor(tag_prefix, multi_constructor)
def add_representer(data_type, representer, Dumper=Dumper):
"""
Add a representer for the given type.
Representer is a function accepting a Dumper instance
and an instance of the given data type
and producing the corresponding representation node.
"""
Dumper.add_representer(data_type, representer)
def add_multi_representer(data_type, multi_representer, Dumper=Dumper):
"""
Add a representer for the given type.
Multi-representer is a function accepting a Dumper instance
and an instance of the given data type or subtype
and producing the corresponding representation node.
"""
Dumper.add_multi_representer(data_type, multi_representer)
class YAMLObjectMetaclass(type):
"""
The metaclass for YAMLObject.
"""
def __init__(cls, name, bases, kwds):
super(YAMLObjectMetaclass, cls).__init__(name, bases, kwds)
if 'yaml_tag' in kwds and kwds['yaml_tag'] is not None:
cls.yaml_loader.add_constructor(cls.yaml_tag, cls.from_yaml)
cls.yaml_dumper.add_representer(cls, cls.to_yaml)
class YAMLObject(object):
"""
An object that can dump itself to a YAML stream
and load itself from a YAML stream.
"""
__metaclass__ = YAMLObjectMetaclass
__slots__ = () # no direct instantiation, so allow immutable subclasses
yaml_loader = Loader
yaml_dumper = Dumper
yaml_tag = None
yaml_flow_style = None
def from_yaml(cls, loader, node):
"""
Convert a representation node to a Python object.
"""
return loader.construct_yaml_object(node, cls)
from_yaml = classmethod(from_yaml)
def to_yaml(cls, dumper, data):
"""
Convert a Python object to a representation node.
"""
return dumper.represent_yaml_object(cls.yaml_tag, data, cls,
flow_style=cls.yaml_flow_style)
to_yaml = classmethod(to_yaml)
|
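The module surface above pairs each loader entry point with a dumper counterpart, with the `safe_*` variants restricted to basic YAML tags. As a quick orientation (an editorial sketch, not part of the dataset record; assumes PyYAML is importable), the `safe_*` pair round-trips plain Python data:

```python
# Minimal sketch of the safe_* entry points defined in the scope above.
# Assumes a current PyYAML install; not part of the original dataset record.
import yaml

data = {"name": "example", "count": 3}
text = yaml.safe_dump(data)           # emits only basic YAML tags
assert yaml.safe_load(text) == data   # safe for untrusted input, too
```

Note that in the Python 2 branch shown here, `dump_all` defaults to `encoding='utf-8'` and so returns a byte string; current releases default to `encoding=None` and return `str`.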
yaml/pyyaml | lib/yaml/__init__.py | compose | python | def compose(stream, Loader=Loader):
loader = Loader(stream)
try:
return loader.get_single_node()
finally:
        loader.dispose() | Parse the first YAML document in a stream and produce the corresponding representation tree. | train | https://github.com/yaml/pyyaml/blob/e471e86bf6dabdad45a1438c20a4a5c033eb9034/lib/yaml/__init__.py#L80-L89 | [
"def dispose(self):\n pass\n",
"def dispose(self):\n pass\n",
"def get_single_node(self):\n # Drop the STREAM-START event.\n self.get_event()\n\n # Compose a document if the stream is not empty.\n document = None\n if not self.check_event(StreamEndEvent):\n document = self.compose_document()\n\n # Ensure that the stream contains no more documents.\n if not self.check_event(StreamEndEvent):\n event = self.get_event()\n raise ComposerError(\"expected a single document in the stream\",\n document.start_mark, \"but found another document\",\n event.start_mark)\n\n # Drop the STREAM-END event.\n self.get_event()\n\n return document\n"
] |
from error import *
from tokens import *
from events import *
from nodes import *
from loader import *
from dumper import *
__version__ = '5.1'
try:
from cyaml import *
__with_libyaml__ = True
except ImportError:
__with_libyaml__ = False
#------------------------------------------------------------------------------
# Warnings control
#------------------------------------------------------------------------------
# 'Global' warnings state:
_warnings_enabled = {
'YAMLLoadWarning': True,
}
# Get or set global warnings' state
def warnings(settings=None):
if settings is None:
return _warnings_enabled
if type(settings) is dict:
for key in settings:
if key in _warnings_enabled:
_warnings_enabled[key] = settings[key]
# Warn when load() is called without Loader=...
class YAMLLoadWarning(RuntimeWarning):
pass
def load_warning(method):
if _warnings_enabled['YAMLLoadWarning'] is False:
return
import warnings
message = (
"calling yaml.%s() without Loader=... is deprecated, as the "
"default Loader is unsafe. Please read "
"https://msg.pyyaml.org/load for full details."
) % method
warnings.warn(message, YAMLLoadWarning, stacklevel=3)
#------------------------------------------------------------------------------
def scan(stream, Loader=Loader):
"""
Scan a YAML stream and produce scanning tokens.
"""
loader = Loader(stream)
try:
while loader.check_token():
yield loader.get_token()
finally:
loader.dispose()
def parse(stream, Loader=Loader):
"""
Parse a YAML stream and produce parsing events.
"""
loader = Loader(stream)
try:
while loader.check_event():
yield loader.get_event()
finally:
loader.dispose()
def compose_all(stream, Loader=Loader):
"""
Parse all YAML documents in a stream
and produce corresponding representation trees.
"""
loader = Loader(stream)
try:
while loader.check_node():
yield loader.get_node()
finally:
loader.dispose()
def load(stream, Loader=None):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
"""
if Loader is None:
load_warning('load')
Loader = FullLoader
loader = Loader(stream)
try:
return loader.get_single_data()
finally:
loader.dispose()
def load_all(stream, Loader=None):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
"""
if Loader is None:
load_warning('load_all')
Loader = FullLoader
loader = Loader(stream)
try:
while loader.check_data():
yield loader.get_data()
finally:
loader.dispose()
def full_load(stream):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
Resolve all tags except those known to be
unsafe on untrusted input.
"""
return load(stream, FullLoader)
def full_load_all(stream):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
Resolve all tags except those known to be
unsafe on untrusted input.
"""
return load_all(stream, FullLoader)
def safe_load(stream):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
Resolve only basic YAML tags. This is known
to be safe for untrusted input.
"""
return load(stream, SafeLoader)
def safe_load_all(stream):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
Resolve only basic YAML tags. This is known
to be safe for untrusted input.
"""
return load_all(stream, SafeLoader)
def unsafe_load(stream):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
Resolve all tags, even those known to be
unsafe on untrusted input.
"""
return load(stream, UnsafeLoader)
def unsafe_load_all(stream):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
Resolve all tags, even those known to be
unsafe on untrusted input.
"""
return load_all(stream, UnsafeLoader)
def emit(events, stream=None, Dumper=Dumper,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None):
"""
Emit YAML parsing events into a stream.
If stream is None, return the produced string instead.
"""
getvalue = None
if stream is None:
from StringIO import StringIO
stream = StringIO()
getvalue = stream.getvalue
dumper = Dumper(stream, canonical=canonical, indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break)
try:
for event in events:
dumper.emit(event)
finally:
dumper.dispose()
if getvalue:
return getvalue()
def serialize_all(nodes, stream=None, Dumper=Dumper,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None,
encoding='utf-8', explicit_start=None, explicit_end=None,
version=None, tags=None):
"""
Serialize a sequence of representation trees into a YAML stream.
If stream is None, return the produced string instead.
"""
getvalue = None
if stream is None:
if encoding is None:
from StringIO import StringIO
else:
from cStringIO import StringIO
stream = StringIO()
getvalue = stream.getvalue
dumper = Dumper(stream, canonical=canonical, indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break,
encoding=encoding, version=version, tags=tags,
explicit_start=explicit_start, explicit_end=explicit_end)
try:
dumper.open()
for node in nodes:
dumper.serialize(node)
dumper.close()
finally:
dumper.dispose()
if getvalue:
return getvalue()
def serialize(node, stream=None, Dumper=Dumper, **kwds):
"""
Serialize a representation tree into a YAML stream.
If stream is None, return the produced string instead.
"""
return serialize_all([node], stream, Dumper=Dumper, **kwds)
def dump_all(documents, stream=None, Dumper=Dumper,
default_style=None, default_flow_style=False,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None,
encoding='utf-8', explicit_start=None, explicit_end=None,
version=None, tags=None, sort_keys=True):
"""
Serialize a sequence of Python objects into a YAML stream.
If stream is None, return the produced string instead.
"""
getvalue = None
if stream is None:
if encoding is None:
from StringIO import StringIO
else:
from cStringIO import StringIO
stream = StringIO()
getvalue = stream.getvalue
dumper = Dumper(stream, default_style=default_style,
default_flow_style=default_flow_style,
canonical=canonical, indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break,
encoding=encoding, version=version, tags=tags,
explicit_start=explicit_start, explicit_end=explicit_end, sort_keys=sort_keys)
try:
dumper.open()
for data in documents:
dumper.represent(data)
dumper.close()
finally:
dumper.dispose()
if getvalue:
return getvalue()
def dump(data, stream=None, Dumper=Dumper, **kwds):
"""
Serialize a Python object into a YAML stream.
If stream is None, return the produced string instead.
"""
return dump_all([data], stream, Dumper=Dumper, **kwds)
def safe_dump_all(documents, stream=None, **kwds):
"""
Serialize a sequence of Python objects into a YAML stream.
Produce only basic YAML tags.
If stream is None, return the produced string instead.
"""
return dump_all(documents, stream, Dumper=SafeDumper, **kwds)
def safe_dump(data, stream=None, **kwds):
"""
Serialize a Python object into a YAML stream.
Produce only basic YAML tags.
If stream is None, return the produced string instead.
"""
return dump_all([data], stream, Dumper=SafeDumper, **kwds)
def add_implicit_resolver(tag, regexp, first=None,
Loader=Loader, Dumper=Dumper):
"""
Add an implicit scalar detector.
If an implicit scalar value matches the given regexp,
the corresponding tag is assigned to the scalar.
first is a sequence of possible initial characters or None.
"""
Loader.add_implicit_resolver(tag, regexp, first)
Dumper.add_implicit_resolver(tag, regexp, first)
def add_path_resolver(tag, path, kind=None, Loader=Loader, Dumper=Dumper):
"""
Add a path based resolver for the given tag.
A path is a list of keys that forms a path
to a node in the representation tree.
Keys can be string values, integers, or None.
"""
Loader.add_path_resolver(tag, path, kind)
Dumper.add_path_resolver(tag, path, kind)
def add_constructor(tag, constructor, Loader=Loader):
"""
Add a constructor for the given tag.
Constructor is a function that accepts a Loader instance
and a node object and produces the corresponding Python object.
"""
Loader.add_constructor(tag, constructor)
def add_multi_constructor(tag_prefix, multi_constructor, Loader=Loader):
"""
Add a multi-constructor for the given tag prefix.
Multi-constructor is called for a node if its tag starts with tag_prefix.
Multi-constructor accepts a Loader instance, a tag suffix,
and a node object and produces the corresponding Python object.
"""
Loader.add_multi_constructor(tag_prefix, multi_constructor)
def add_representer(data_type, representer, Dumper=Dumper):
"""
Add a representer for the given type.
Representer is a function accepting a Dumper instance
and an instance of the given data type
and producing the corresponding representation node.
"""
Dumper.add_representer(data_type, representer)
def add_multi_representer(data_type, multi_representer, Dumper=Dumper):
"""
Add a representer for the given type.
Multi-representer is a function accepting a Dumper instance
and an instance of the given data type or subtype
and producing the corresponding representation node.
"""
Dumper.add_multi_representer(data_type, multi_representer)
class YAMLObjectMetaclass(type):
"""
The metaclass for YAMLObject.
"""
def __init__(cls, name, bases, kwds):
super(YAMLObjectMetaclass, cls).__init__(name, bases, kwds)
if 'yaml_tag' in kwds and kwds['yaml_tag'] is not None:
cls.yaml_loader.add_constructor(cls.yaml_tag, cls.from_yaml)
cls.yaml_dumper.add_representer(cls, cls.to_yaml)
class YAMLObject(object):
"""
An object that can dump itself to a YAML stream
and load itself from a YAML stream.
"""
__metaclass__ = YAMLObjectMetaclass
__slots__ = () # no direct instantiation, so allow immutable subclasses
yaml_loader = Loader
yaml_dumper = Dumper
yaml_tag = None
yaml_flow_style = None
def from_yaml(cls, loader, node):
"""
Convert a representation node to a Python object.
"""
return loader.construct_yaml_object(node, cls)
from_yaml = classmethod(from_yaml)
def to_yaml(cls, dumper, data):
"""
Convert a Python object to a representation node.
"""
return dumper.represent_yaml_object(cls.yaml_tag, data, cls,
flow_style=cls.yaml_flow_style)
to_yaml = classmethod(to_yaml)
|
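The `compose` record above returns a representation-tree node rather than a native Python object — it stops one stage before construction. A hedged sketch of what that tree looks like (assumes PyYAML is importable; the node attributes come from the `nodes` module star-imported in the scope above):

```python
# Sketch: compose() yields yaml.nodes.* objects, not Python dicts/lists.
import yaml

node = yaml.compose("a: 1\nb: [2, 3]\n")
assert node.tag == "tag:yaml.org,2002:map"   # a MappingNode
key_node, value_node = node.value[0]         # list of (key, value) node pairs
assert key_node.value == "a"                 # scalar values stay as strings
```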
yaml/pyyaml | lib/yaml/__init__.py | compose_all | python | def compose_all(stream, Loader=Loader):
loader = Loader(stream)
try:
while loader.check_node():
yield loader.get_node()
finally:
        loader.dispose() | Parse all YAML documents in a stream and produce corresponding representation trees. | train | https://github.com/yaml/pyyaml/blob/e471e86bf6dabdad45a1438c20a4a5c033eb9034/lib/yaml/__init__.py#L91-L101 | [
"def dispose(self):\n pass\n",
"def dispose(self):\n pass\n",
"def check_node(self):\n # Drop the STREAM-START event.\n if self.check_event(StreamStartEvent):\n self.get_event()\n\n # If there are more documents available?\n return not self.check_event(StreamEndEvent)\n"
] |
from error import *
from tokens import *
from events import *
from nodes import *
from loader import *
from dumper import *
__version__ = '5.1'
try:
from cyaml import *
__with_libyaml__ = True
except ImportError:
__with_libyaml__ = False
#------------------------------------------------------------------------------
# Warnings control
#------------------------------------------------------------------------------
# 'Global' warnings state:
_warnings_enabled = {
'YAMLLoadWarning': True,
}
# Get or set global warnings' state
def warnings(settings=None):
if settings is None:
return _warnings_enabled
if type(settings) is dict:
for key in settings:
if key in _warnings_enabled:
_warnings_enabled[key] = settings[key]
# Warn when load() is called without Loader=...
class YAMLLoadWarning(RuntimeWarning):
pass
def load_warning(method):
if _warnings_enabled['YAMLLoadWarning'] is False:
return
import warnings
message = (
"calling yaml.%s() without Loader=... is deprecated, as the "
"default Loader is unsafe. Please read "
"https://msg.pyyaml.org/load for full details."
) % method
warnings.warn(message, YAMLLoadWarning, stacklevel=3)
#------------------------------------------------------------------------------
def scan(stream, Loader=Loader):
"""
Scan a YAML stream and produce scanning tokens.
"""
loader = Loader(stream)
try:
while loader.check_token():
yield loader.get_token()
finally:
loader.dispose()
def parse(stream, Loader=Loader):
"""
Parse a YAML stream and produce parsing events.
"""
loader = Loader(stream)
try:
while loader.check_event():
yield loader.get_event()
finally:
loader.dispose()
def compose(stream, Loader=Loader):
"""
Parse the first YAML document in a stream
and produce the corresponding representation tree.
"""
loader = Loader(stream)
try:
return loader.get_single_node()
finally:
loader.dispose()
def load(stream, Loader=None):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
"""
if Loader is None:
load_warning('load')
Loader = FullLoader
loader = Loader(stream)
try:
return loader.get_single_data()
finally:
loader.dispose()
def load_all(stream, Loader=None):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
"""
if Loader is None:
load_warning('load_all')
Loader = FullLoader
loader = Loader(stream)
try:
while loader.check_data():
yield loader.get_data()
finally:
loader.dispose()
def full_load(stream):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
Resolve all tags except those known to be
unsafe on untrusted input.
"""
return load(stream, FullLoader)
def full_load_all(stream):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
Resolve all tags except those known to be
unsafe on untrusted input.
"""
return load_all(stream, FullLoader)
def safe_load(stream):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
Resolve only basic YAML tags. This is known
to be safe for untrusted input.
"""
return load(stream, SafeLoader)
def safe_load_all(stream):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
Resolve only basic YAML tags. This is known
to be safe for untrusted input.
"""
return load_all(stream, SafeLoader)
def unsafe_load(stream):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
Resolve all tags, even those known to be
unsafe on untrusted input.
"""
return load(stream, UnsafeLoader)
def unsafe_load_all(stream):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
Resolve all tags, even those known to be
unsafe on untrusted input.
"""
return load_all(stream, UnsafeLoader)
def emit(events, stream=None, Dumper=Dumper,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None):
"""
Emit YAML parsing events into a stream.
If stream is None, return the produced string instead.
"""
getvalue = None
if stream is None:
from StringIO import StringIO
stream = StringIO()
getvalue = stream.getvalue
dumper = Dumper(stream, canonical=canonical, indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break)
try:
for event in events:
dumper.emit(event)
finally:
dumper.dispose()
if getvalue:
return getvalue()
def serialize_all(nodes, stream=None, Dumper=Dumper,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None,
encoding='utf-8', explicit_start=None, explicit_end=None,
version=None, tags=None):
"""
Serialize a sequence of representation trees into a YAML stream.
If stream is None, return the produced string instead.
"""
getvalue = None
if stream is None:
if encoding is None:
from StringIO import StringIO
else:
from cStringIO import StringIO
stream = StringIO()
getvalue = stream.getvalue
dumper = Dumper(stream, canonical=canonical, indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break,
encoding=encoding, version=version, tags=tags,
explicit_start=explicit_start, explicit_end=explicit_end)
try:
dumper.open()
for node in nodes:
dumper.serialize(node)
dumper.close()
finally:
dumper.dispose()
if getvalue:
return getvalue()
def serialize(node, stream=None, Dumper=Dumper, **kwds):
"""
Serialize a representation tree into a YAML stream.
If stream is None, return the produced string instead.
"""
return serialize_all([node], stream, Dumper=Dumper, **kwds)
def dump_all(documents, stream=None, Dumper=Dumper,
default_style=None, default_flow_style=False,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None,
encoding='utf-8', explicit_start=None, explicit_end=None,
version=None, tags=None, sort_keys=True):
"""
Serialize a sequence of Python objects into a YAML stream.
If stream is None, return the produced string instead.
"""
getvalue = None
if stream is None:
if encoding is None:
from StringIO import StringIO
else:
from cStringIO import StringIO
stream = StringIO()
getvalue = stream.getvalue
dumper = Dumper(stream, default_style=default_style,
default_flow_style=default_flow_style,
canonical=canonical, indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break,
encoding=encoding, version=version, tags=tags,
explicit_start=explicit_start, explicit_end=explicit_end, sort_keys=sort_keys)
try:
dumper.open()
for data in documents:
dumper.represent(data)
dumper.close()
finally:
dumper.dispose()
if getvalue:
return getvalue()
def dump(data, stream=None, Dumper=Dumper, **kwds):
"""
Serialize a Python object into a YAML stream.
If stream is None, return the produced string instead.
"""
return dump_all([data], stream, Dumper=Dumper, **kwds)
def safe_dump_all(documents, stream=None, **kwds):
"""
Serialize a sequence of Python objects into a YAML stream.
Produce only basic YAML tags.
If stream is None, return the produced string instead.
"""
return dump_all(documents, stream, Dumper=SafeDumper, **kwds)
def safe_dump(data, stream=None, **kwds):
"""
Serialize a Python object into a YAML stream.
Produce only basic YAML tags.
If stream is None, return the produced string instead.
"""
return dump_all([data], stream, Dumper=SafeDumper, **kwds)
def add_implicit_resolver(tag, regexp, first=None,
Loader=Loader, Dumper=Dumper):
"""
Add an implicit scalar detector.
If an implicit scalar value matches the given regexp,
the corresponding tag is assigned to the scalar.
first is a sequence of possible initial characters or None.
"""
Loader.add_implicit_resolver(tag, regexp, first)
Dumper.add_implicit_resolver(tag, regexp, first)
def add_path_resolver(tag, path, kind=None, Loader=Loader, Dumper=Dumper):
"""
Add a path based resolver for the given tag.
A path is a list of keys that forms a path
to a node in the representation tree.
Keys can be string values, integers, or None.
"""
Loader.add_path_resolver(tag, path, kind)
Dumper.add_path_resolver(tag, path, kind)
def add_constructor(tag, constructor, Loader=Loader):
"""
Add a constructor for the given tag.
Constructor is a function that accepts a Loader instance
and a node object and produces the corresponding Python object.
"""
Loader.add_constructor(tag, constructor)
def add_multi_constructor(tag_prefix, multi_constructor, Loader=Loader):
"""
Add a multi-constructor for the given tag prefix.
Multi-constructor is called for a node if its tag starts with tag_prefix.
Multi-constructor accepts a Loader instance, a tag suffix,
and a node object and produces the corresponding Python object.
"""
Loader.add_multi_constructor(tag_prefix, multi_constructor)
def add_representer(data_type, representer, Dumper=Dumper):
"""
Add a representer for the given type.
Representer is a function accepting a Dumper instance
and an instance of the given data type
and producing the corresponding representation node.
"""
Dumper.add_representer(data_type, representer)
def add_multi_representer(data_type, multi_representer, Dumper=Dumper):
"""
Add a representer for the given type.
Multi-representer is a function accepting a Dumper instance
and an instance of the given data type or subtype
and producing the corresponding representation node.
"""
Dumper.add_multi_representer(data_type, multi_representer)
class YAMLObjectMetaclass(type):
"""
The metaclass for YAMLObject.
"""
def __init__(cls, name, bases, kwds):
super(YAMLObjectMetaclass, cls).__init__(name, bases, kwds)
if 'yaml_tag' in kwds and kwds['yaml_tag'] is not None:
cls.yaml_loader.add_constructor(cls.yaml_tag, cls.from_yaml)
cls.yaml_dumper.add_representer(cls, cls.to_yaml)
class YAMLObject(object):
"""
An object that can dump itself to a YAML stream
and load itself from a YAML stream.
"""
__metaclass__ = YAMLObjectMetaclass
__slots__ = () # no direct instantiation, so allow immutable subclasses
yaml_loader = Loader
yaml_dumper = Dumper
yaml_tag = None
yaml_flow_style = None
def from_yaml(cls, loader, node):
"""
Convert a representation node to a Python object.
"""
return loader.construct_yaml_object(node, cls)
from_yaml = classmethod(from_yaml)
def to_yaml(cls, dumper, data):
"""
Convert a Python object to a representation node.
"""
return dumper.represent_yaml_object(cls.yaml_tag, data, cls,
flow_style=cls.yaml_flow_style)
to_yaml = classmethod(to_yaml)
|
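The `compose_all` record above is the multi-document counterpart of `compose`: it is a generator over every document in one stream, mirroring the `load_all` family. A small sketch (editorial, assumes PyYAML is importable):

```python
# Sketch: the *_all variants iterate over each document in a single stream.
import yaml

stream = "---\nfirst: 1\n---\nsecond: 2\n"
nodes = list(yaml.compose_all(stream))   # one representation tree per document
assert len(nodes) == 2
# safe_load_all constructs the corresponding Python objects instead
assert list(yaml.safe_load_all(stream)) == [{"first": 1}, {"second": 2}]
```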
yaml/pyyaml | lib/yaml/__init__.py | load | python | def load(stream, Loader=None):
if Loader is None:
load_warning('load')
Loader = FullLoader
loader = Loader(stream)
try:
return loader.get_single_data()
finally:
        loader.dispose() | Parse the first YAML document in a stream and produce the corresponding Python object. | train | https://github.com/yaml/pyyaml/blob/e471e86bf6dabdad45a1438c20a4a5c033eb9034/lib/yaml/__init__.py#L103-L116 | [
"def load_warning(method):\n if _warnings_enabled['YAMLLoadWarning'] is False:\n return\n\n import warnings\n\n message = (\n \"calling yaml.%s() without Loader=... is deprecated, as the \"\n \"default Loader is unsafe. Please read \"\n \"https://msg.pyyaml.org/load for full details.\"\n ) % method\n\n warnings.warn(message, YAMLLoadWarning, stacklevel=3)\n",
"def dispose(self):\n pass\n",
"def dispose(self):\n pass\n",
"def get_single_data(self):\n # Ensure that the stream contains a single document and construct it.\n node = self.get_single_node()\n if node is not None:\n return self.construct_document(node)\n return None\n",
"def dispose(self):\n # Reset the state attributes (to clear self-references)\n self.states = []\n self.state = None\n"
] |
from error import *
from tokens import *
from events import *
from nodes import *
from loader import *
from dumper import *
__version__ = '5.1'
try:
from cyaml import *
__with_libyaml__ = True
except ImportError:
__with_libyaml__ = False
#------------------------------------------------------------------------------
# Warnings control
#------------------------------------------------------------------------------
# 'Global' warnings state:
_warnings_enabled = {
'YAMLLoadWarning': True,
}
# Get or set global warnings' state
def warnings(settings=None):
if settings is None:
return _warnings_enabled
if type(settings) is dict:
for key in settings:
if key in _warnings_enabled:
_warnings_enabled[key] = settings[key]
# Warn when load() is called without Loader=...
class YAMLLoadWarning(RuntimeWarning):
pass
def load_warning(method):
if _warnings_enabled['YAMLLoadWarning'] is False:
return
import warnings
message = (
"calling yaml.%s() without Loader=... is deprecated, as the "
"default Loader is unsafe. Please read "
"https://msg.pyyaml.org/load for full details."
) % method
warnings.warn(message, YAMLLoadWarning, stacklevel=3)
#------------------------------------------------------------------------------
def scan(stream, Loader=Loader):
"""
Scan a YAML stream and produce scanning tokens.
"""
loader = Loader(stream)
try:
while loader.check_token():
yield loader.get_token()
finally:
loader.dispose()
def parse(stream, Loader=Loader):
"""
Parse a YAML stream and produce parsing events.
"""
loader = Loader(stream)
try:
while loader.check_event():
yield loader.get_event()
finally:
loader.dispose()
def compose(stream, Loader=Loader):
"""
Parse the first YAML document in a stream
and produce the corresponding representation tree.
"""
loader = Loader(stream)
try:
return loader.get_single_node()
finally:
loader.dispose()
def compose_all(stream, Loader=Loader):
"""
Parse all YAML documents in a stream
and produce corresponding representation trees.
"""
loader = Loader(stream)
try:
while loader.check_node():
yield loader.get_node()
finally:
loader.dispose()
def load(stream, Loader=None):
    """
    Parse the first YAML document in a stream
    and produce the corresponding Python object.
    """
    if Loader is None:
        load_warning('load')
        Loader = FullLoader
    loader = Loader(stream)
    try:
        return loader.get_single_data()
    finally:
        loader.dispose()
def load_all(stream, Loader=None):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
"""
if Loader is None:
load_warning('load_all')
Loader = FullLoader
loader = Loader(stream)
try:
while loader.check_data():
yield loader.get_data()
finally:
loader.dispose()
def full_load(stream):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
Resolve all tags except those known to be
unsafe on untrusted input.
"""
return load(stream, FullLoader)
def full_load_all(stream):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
Resolve all tags except those known to be
unsafe on untrusted input.
"""
return load_all(stream, FullLoader)
def safe_load(stream):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
Resolve only basic YAML tags. This is known
to be safe for untrusted input.
"""
return load(stream, SafeLoader)
def safe_load_all(stream):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
Resolve only basic YAML tags. This is known
to be safe for untrusted input.
"""
return load_all(stream, SafeLoader)
def unsafe_load(stream):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
Resolve all tags, even those known to be
unsafe on untrusted input.
"""
return load(stream, UnsafeLoader)
def unsafe_load_all(stream):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
Resolve all tags, even those known to be
unsafe on untrusted input.
"""
return load_all(stream, UnsafeLoader)
def emit(events, stream=None, Dumper=Dumper,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None):
"""
Emit YAML parsing events into a stream.
If stream is None, return the produced string instead.
"""
getvalue = None
if stream is None:
from StringIO import StringIO
stream = StringIO()
getvalue = stream.getvalue
dumper = Dumper(stream, canonical=canonical, indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break)
try:
for event in events:
dumper.emit(event)
finally:
dumper.dispose()
if getvalue:
return getvalue()
def serialize_all(nodes, stream=None, Dumper=Dumper,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None,
encoding='utf-8', explicit_start=None, explicit_end=None,
version=None, tags=None):
"""
Serialize a sequence of representation trees into a YAML stream.
If stream is None, return the produced string instead.
"""
getvalue = None
if stream is None:
if encoding is None:
from StringIO import StringIO
else:
from cStringIO import StringIO
stream = StringIO()
getvalue = stream.getvalue
dumper = Dumper(stream, canonical=canonical, indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break,
encoding=encoding, version=version, tags=tags,
explicit_start=explicit_start, explicit_end=explicit_end)
try:
dumper.open()
for node in nodes:
dumper.serialize(node)
dumper.close()
finally:
dumper.dispose()
if getvalue:
return getvalue()
def serialize(node, stream=None, Dumper=Dumper, **kwds):
"""
Serialize a representation tree into a YAML stream.
If stream is None, return the produced string instead.
"""
return serialize_all([node], stream, Dumper=Dumper, **kwds)
def dump_all(documents, stream=None, Dumper=Dumper,
default_style=None, default_flow_style=False,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None,
encoding='utf-8', explicit_start=None, explicit_end=None,
version=None, tags=None, sort_keys=True):
"""
Serialize a sequence of Python objects into a YAML stream.
If stream is None, return the produced string instead.
"""
getvalue = None
if stream is None:
if encoding is None:
from StringIO import StringIO
else:
from cStringIO import StringIO
stream = StringIO()
getvalue = stream.getvalue
dumper = Dumper(stream, default_style=default_style,
default_flow_style=default_flow_style,
canonical=canonical, indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break,
encoding=encoding, version=version, tags=tags,
explicit_start=explicit_start, explicit_end=explicit_end, sort_keys=sort_keys)
try:
dumper.open()
for data in documents:
dumper.represent(data)
dumper.close()
finally:
dumper.dispose()
if getvalue:
return getvalue()
def dump(data, stream=None, Dumper=Dumper, **kwds):
"""
Serialize a Python object into a YAML stream.
If stream is None, return the produced string instead.
"""
return dump_all([data], stream, Dumper=Dumper, **kwds)
def safe_dump_all(documents, stream=None, **kwds):
"""
Serialize a sequence of Python objects into a YAML stream.
Produce only basic YAML tags.
If stream is None, return the produced string instead.
"""
return dump_all(documents, stream, Dumper=SafeDumper, **kwds)
def safe_dump(data, stream=None, **kwds):
"""
Serialize a Python object into a YAML stream.
Produce only basic YAML tags.
If stream is None, return the produced string instead.
"""
return dump_all([data], stream, Dumper=SafeDumper, **kwds)
def add_implicit_resolver(tag, regexp, first=None,
Loader=Loader, Dumper=Dumper):
"""
Add an implicit scalar detector.
If an implicit scalar value matches the given regexp,
the corresponding tag is assigned to the scalar.
first is a sequence of possible initial characters or None.
"""
Loader.add_implicit_resolver(tag, regexp, first)
Dumper.add_implicit_resolver(tag, regexp, first)
def add_path_resolver(tag, path, kind=None, Loader=Loader, Dumper=Dumper):
"""
Add a path based resolver for the given tag.
A path is a list of keys that forms a path
to a node in the representation tree.
Keys can be string values, integers, or None.
"""
Loader.add_path_resolver(tag, path, kind)
Dumper.add_path_resolver(tag, path, kind)
def add_constructor(tag, constructor, Loader=Loader):
"""
Add a constructor for the given tag.
Constructor is a function that accepts a Loader instance
and a node object and produces the corresponding Python object.
"""
Loader.add_constructor(tag, constructor)
def add_multi_constructor(tag_prefix, multi_constructor, Loader=Loader):
"""
Add a multi-constructor for the given tag prefix.
Multi-constructor is called for a node if its tag starts with tag_prefix.
Multi-constructor accepts a Loader instance, a tag suffix,
and a node object and produces the corresponding Python object.
"""
Loader.add_multi_constructor(tag_prefix, multi_constructor)
def add_representer(data_type, representer, Dumper=Dumper):
"""
Add a representer for the given type.
Representer is a function accepting a Dumper instance
and an instance of the given data type
and producing the corresponding representation node.
"""
Dumper.add_representer(data_type, representer)
def add_multi_representer(data_type, multi_representer, Dumper=Dumper):
"""
    Add a multi-representer for the given type.
Multi-representer is a function accepting a Dumper instance
and an instance of the given data type or subtype
and producing the corresponding representation node.
"""
Dumper.add_multi_representer(data_type, multi_representer)
class YAMLObjectMetaclass(type):
"""
The metaclass for YAMLObject.
"""
def __init__(cls, name, bases, kwds):
super(YAMLObjectMetaclass, cls).__init__(name, bases, kwds)
if 'yaml_tag' in kwds and kwds['yaml_tag'] is not None:
cls.yaml_loader.add_constructor(cls.yaml_tag, cls.from_yaml)
cls.yaml_dumper.add_representer(cls, cls.to_yaml)
class YAMLObject(object):
"""
An object that can dump itself to a YAML stream
and load itself from a YAML stream.
"""
__metaclass__ = YAMLObjectMetaclass
__slots__ = () # no direct instantiation, so allow immutable subclasses
yaml_loader = Loader
yaml_dumper = Dumper
yaml_tag = None
yaml_flow_style = None
def from_yaml(cls, loader, node):
"""
Convert a representation node to a Python object.
"""
return loader.construct_yaml_object(node, cls)
from_yaml = classmethod(from_yaml)
def to_yaml(cls, dumper, data):
"""
Convert a Python object to a representation node.
"""
return dumper.represent_yaml_object(cls.yaml_tag, data, cls,
flow_style=cls.yaml_flow_style)
to_yaml = classmethod(to_yaml)
|
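The module source above exposes the loader/dumper machinery through a small functional API. A minimal round-trip sketch through `safe_dump`/`safe_load` (assuming PyYAML is installed and importable as `yaml`):

```python
import yaml  # PyYAML; the module whose source is listed above

data = {"b": 2, "a": 1}

# safe_dump resolves only basic YAML tags and, as of 5.1, emits
# block style by default (default_flow_style=False) and sorts keys
# (sort_keys=True).
text = yaml.safe_dump(data)

# safe_load is the untrusted-input counterpart: load(stream, SafeLoader).
assert yaml.safe_load(text) == data
assert text.index("a:") < text.index("b:")  # keys were sorted
```

Using the `safe_*` variants also sidesteps the `YAMLLoadWarning` path that fires when `load()` is called without an explicit `Loader`.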
yaml/pyyaml | lib/yaml/__init__.py | load_all | python | def load_all(stream, Loader=None):
if Loader is None:
load_warning('load_all')
Loader = FullLoader
loader = Loader(stream)
try:
while loader.check_data():
yield loader.get_data()
finally:
loader.dispose() | Parse all YAML documents in a stream
and produce corresponding Python objects. | train | https://github.com/yaml/pyyaml/blob/e471e86bf6dabdad45a1438c20a4a5c033eb9034/lib/yaml/__init__.py#L118-L132 | [
"def load_warning(method):\n if _warnings_enabled['YAMLLoadWarning'] is False:\n return\n\n import warnings\n\n message = (\n \"calling yaml.%s() without Loader=... is deprecated, as the \"\n \"default Loader is unsafe. Please read \"\n \"https://msg.pyyaml.org/load for full details.\"\n ) % method\n\n warnings.warn(message, YAMLLoadWarning, stacklevel=3)\n",
"def dispose(self):\n pass\n",
"def dispose(self):\n pass\n",
"def check_data(self):\n # If there are more documents available?\n return self.check_node()\n",
"def dispose(self):\n # Reset the state attributes (to clear self-references)\n self.states = []\n self.state = None\n"
] |
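The `load_all` row above documents a generator that parses a multi-document stream lazily, one document per `yield`. A small sketch (assuming PyYAML is importable as `yaml`; the explicit `SafeLoader` avoids the deprecation warning and works on releases where a `Loader` is mandatory):

```python
import yaml  # PyYAML

stream = "---\na: 1\n---\nb: 2\n"

# load_all yields one Python object per YAML document in the stream.
docs = list(yaml.load_all(stream, Loader=yaml.SafeLoader))
assert docs == [{"a": 1}, {"b": 2}]
```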
|
yaml/pyyaml | lib/yaml/__init__.py | emit | python | def emit(events, stream=None, Dumper=Dumper,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None):
getvalue = None
if stream is None:
from StringIO import StringIO
stream = StringIO()
getvalue = stream.getvalue
dumper = Dumper(stream, canonical=canonical, indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break)
try:
for event in events:
dumper.emit(event)
finally:
dumper.dispose()
if getvalue:
return getvalue() | Emit YAML parsing events into a stream.
If stream is None, return the produced string instead. | train | https://github.com/yaml/pyyaml/blob/e471e86bf6dabdad45a1438c20a4a5c033eb9034/lib/yaml/__init__.py#L194-L214 | null |
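`emit` is the inverse of `parse`: it consumes a stream of parsing events and renders them back into YAML text, returning the produced string when `stream` is `None`. A hedged round-trip sketch, assuming PyYAML is importable as `yaml`:

```python
import yaml  # PyYAML

# parse() yields StreamStart/DocumentStart/.../StreamEnd events;
# emit() renders that event stream back into YAML text.
events = list(yaml.parse("a: 1\n"))
out = yaml.emit(events)

# The round trip preserves the document's meaning.
assert yaml.safe_load(out) == {"a": 1}
```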
|
yaml/pyyaml | lib/yaml/__init__.py | serialize | python | def serialize(node, stream=None, Dumper=Dumper, **kwds):
return serialize_all([node], stream, Dumper=Dumper, **kwds) | Serialize a representation tree into a YAML stream.
If stream is None, return the produced string instead. | train | https://github.com/yaml/pyyaml/blob/e471e86bf6dabdad45a1438c20a4a5c033eb9034/lib/yaml/__init__.py#L247-L252 | [
"def serialize_all(nodes, stream=None, Dumper=Dumper,\n canonical=None, indent=None, width=None,\n allow_unicode=None, line_break=None,\n encoding='utf-8', explicit_start=None, explicit_end=None,\n version=None, tags=None):\n \"\"\"\n Serialize a sequence of representation trees into a YAML stream.\n If stream is None, return the produced string instead.\n \"\"\"\n getvalue = None\n if stream is None:\n if encoding is None:\n from StringIO import StringIO\n else:\n from cStringIO import StringIO\n stream = StringIO()\n getvalue = stream.getvalue\n dumper = Dumper(stream, canonical=canonical, indent=indent, width=width,\n allow_unicode=allow_unicode, line_break=line_break,\n encoding=encoding, version=version, tags=tags,\n explicit_start=explicit_start, explicit_end=explicit_end)\n try:\n dumper.open()\n for node in nodes:\n dumper.serialize(node)\n dumper.close()\n finally:\n dumper.dispose()\n if getvalue:\n return getvalue()\n"
] |
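`serialize` is the node-level counterpart of `dump`: `compose()` builds a representation tree of `Node` objects, and `serialize()` writes such a tree back out as a YAML stream. A sketch assuming PyYAML (note that the Python 2 sources above default `encoding='utf-8'` and so return bytes, while modern releases return `str`; loading the result sidesteps that difference):

```python
import yaml  # PyYAML

# compose() parses the first document into a representation tree.
node = yaml.compose("a: 1\n")
assert isinstance(node, yaml.MappingNode)

# serialize() turns the tree back into a YAML stream.
text = yaml.serialize(node)
assert yaml.safe_load(text) == {"a": 1}
```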
enclosing_scope (lib/yaml/__init__.py):

from error import *
from tokens import *
from events import *
from nodes import *

from loader import *
from dumper import *

__version__ = '5.1'

try:
    from cyaml import *
    __with_libyaml__ = True
except ImportError:
    __with_libyaml__ = False

#------------------------------------------------------------------------------
# Warnings control
#------------------------------------------------------------------------------

# 'Global' warnings state:
_warnings_enabled = {
    'YAMLLoadWarning': True,
}

# Get or set global warnings' state
def warnings(settings=None):
    if settings is None:
        return _warnings_enabled
    if type(settings) is dict:
        for key in settings:
            if key in _warnings_enabled:
                _warnings_enabled[key] = settings[key]

# Warn when load() is called without Loader=...
class YAMLLoadWarning(RuntimeWarning):
    pass

def load_warning(method):
    if _warnings_enabled['YAMLLoadWarning'] is False:
        return

    import warnings
    message = (
        "calling yaml.%s() without Loader=... is deprecated, as the "
        "default Loader is unsafe. Please read "
        "https://msg.pyyaml.org/load for full details."
    ) % method

    warnings.warn(message, YAMLLoadWarning, stacklevel=3)
#------------------------------------------------------------------------------
def scan(stream, Loader=Loader):
    """
    Scan a YAML stream and produce scanning tokens.
    """
    loader = Loader(stream)
    try:
        while loader.check_token():
            yield loader.get_token()
    finally:
        loader.dispose()

def parse(stream, Loader=Loader):
    """
    Parse a YAML stream and produce parsing events.
    """
    loader = Loader(stream)
    try:
        while loader.check_event():
            yield loader.get_event()
    finally:
        loader.dispose()

def compose(stream, Loader=Loader):
    """
    Parse the first YAML document in a stream
    and produce the corresponding representation tree.
    """
    loader = Loader(stream)
    try:
        return loader.get_single_node()
    finally:
        loader.dispose()

def compose_all(stream, Loader=Loader):
    """
    Parse all YAML documents in a stream
    and produce corresponding representation trees.
    """
    loader = Loader(stream)
    try:
        while loader.check_node():
            yield loader.get_node()
    finally:
        loader.dispose()

def load(stream, Loader=None):
    """
    Parse the first YAML document in a stream
    and produce the corresponding Python object.
    """
    if Loader is None:
        load_warning('load')
        Loader = FullLoader

    loader = Loader(stream)
    try:
        return loader.get_single_data()
    finally:
        loader.dispose()

def load_all(stream, Loader=None):
    """
    Parse all YAML documents in a stream
    and produce corresponding Python objects.
    """
    if Loader is None:
        load_warning('load_all')
        Loader = FullLoader

    loader = Loader(stream)
    try:
        while loader.check_data():
            yield loader.get_data()
    finally:
        loader.dispose()
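The functions above expose each stage of the processing pipeline separately: `scan` yields tokens, `parse` yields events, `compose` builds nodes, and `load` builds native objects. A small sketch of the event stage, assuming a current PyYAML installation:

```python
import yaml

# parse() exposes the event stream that sits between tokens and nodes.
events = list(yaml.parse("a: 1", Loader=yaml.SafeLoader))
names = [type(e).__name__ for e in events]

# A one-pair mapping produces a fixed sequence of eight events:
# stream/document/mapping start, two scalars, then the matching ends.
assert names[0] == "StreamStartEvent"
assert names[-1] == "StreamEndEvent"
```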
def full_load(stream):
    """
    Parse the first YAML document in a stream
    and produce the corresponding Python object.
    Resolve all tags except those known to be
    unsafe on untrusted input.
    """
    return load(stream, FullLoader)

def full_load_all(stream):
    """
    Parse all YAML documents in a stream
    and produce corresponding Python objects.
    Resolve all tags except those known to be
    unsafe on untrusted input.
    """
    return load_all(stream, FullLoader)

def safe_load(stream):
    """
    Parse the first YAML document in a stream
    and produce the corresponding Python object.
    Resolve only basic YAML tags. This is known
    to be safe for untrusted input.
    """
    return load(stream, SafeLoader)

def safe_load_all(stream):
    """
    Parse all YAML documents in a stream
    and produce corresponding Python objects.
    Resolve only basic YAML tags. This is known
    to be safe for untrusted input.
    """
    return load_all(stream, SafeLoader)

def unsafe_load(stream):
    """
    Parse the first YAML document in a stream
    and produce the corresponding Python object.
    Resolve all tags, even those known to be
    unsafe on untrusted input.
    """
    return load(stream, UnsafeLoader)

def unsafe_load_all(stream):
    """
    Parse all YAML documents in a stream
    and produce corresponding Python objects.
    Resolve all tags, even those known to be
    unsafe on untrusted input.
    """
    return load_all(stream, UnsafeLoader)
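The safe/full/unsafe split above is the point of the deprecation warning: only `SafeLoader` restricts documents to basic YAML tags. A minimal sketch of the difference, assuming a current PyYAML installation:

```python
import yaml

# Basic tags (mappings, sequences, scalars) load fine.
assert yaml.safe_load("a: 1") == {"a": 1}

# Python-specific tags are rejected by SafeLoader instead of being
# resolved into calls to arbitrary Python callables.
try:
    yaml.safe_load("!!python/object/apply:os.getcwd []")
    rejected = False
except yaml.YAMLError:
    rejected = True
assert rejected
```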
def emit(events, stream=None, Dumper=Dumper,
        canonical=None, indent=None, width=None,
        allow_unicode=None, line_break=None):
    """
    Emit YAML parsing events into a stream.
    If stream is None, return the produced string instead.
    """
    getvalue = None
    if stream is None:
        from StringIO import StringIO
        stream = StringIO()
        getvalue = stream.getvalue
    dumper = Dumper(stream, canonical=canonical, indent=indent, width=width,
            allow_unicode=allow_unicode, line_break=line_break)
    try:
        for event in events:
            dumper.emit(event)
    finally:
        dumper.dispose()
    if getvalue:
        return getvalue()

def serialize_all(nodes, stream=None, Dumper=Dumper,
        canonical=None, indent=None, width=None,
        allow_unicode=None, line_break=None,
        encoding='utf-8', explicit_start=None, explicit_end=None,
        version=None, tags=None):
    """
    Serialize a sequence of representation trees into a YAML stream.
    If stream is None, return the produced string instead.
    """
    getvalue = None
    if stream is None:
        if encoding is None:
            from StringIO import StringIO
        else:
            from cStringIO import StringIO
        stream = StringIO()
        getvalue = stream.getvalue
    dumper = Dumper(stream, canonical=canonical, indent=indent, width=width,
            allow_unicode=allow_unicode, line_break=line_break,
            encoding=encoding, version=version, tags=tags,
            explicit_start=explicit_start, explicit_end=explicit_end)
    try:
        dumper.open()
        for node in nodes:
            dumper.serialize(node)
        dumper.close()
    finally:
        dumper.dispose()
    if getvalue:
        return getvalue()
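`emit` is the mirror image of `parse`: it consumes the same event objects that `parse` produces, so the two can round-trip a document at the event level. A minimal sketch, assuming a current PyYAML installation:

```python
import yaml

# parse() -> events -> emit() reconstructs equivalent YAML text.
events = list(yaml.parse("a: 1\n", Loader=yaml.SafeLoader))
text = yaml.emit(events)

assert yaml.safe_load(text) == {"a": 1}
```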
def dump_all(documents, stream=None, Dumper=Dumper,
        default_style=None, default_flow_style=False,
        canonical=None, indent=None, width=None,
        allow_unicode=None, line_break=None,
        encoding='utf-8', explicit_start=None, explicit_end=None,
        version=None, tags=None, sort_keys=True):
    """
    Serialize a sequence of Python objects into a YAML stream.
    If stream is None, return the produced string instead.
    """
    getvalue = None
    if stream is None:
        if encoding is None:
            from StringIO import StringIO
        else:
            from cStringIO import StringIO
        stream = StringIO()
        getvalue = stream.getvalue
    dumper = Dumper(stream, default_style=default_style,
            default_flow_style=default_flow_style,
            canonical=canonical, indent=indent, width=width,
            allow_unicode=allow_unicode, line_break=line_break,
            encoding=encoding, version=version, tags=tags,
            explicit_start=explicit_start, explicit_end=explicit_end, sort_keys=sort_keys)
    try:
        dumper.open()
        for data in documents:
            dumper.represent(data)
        dumper.close()
    finally:
        dumper.dispose()
    if getvalue:
        return getvalue()

def dump(data, stream=None, Dumper=Dumper, **kwds):
    """
    Serialize a Python object into a YAML stream.
    If stream is None, return the produced string instead.
    """
    return dump_all([data], stream, Dumper=Dumper, **kwds)

def safe_dump_all(documents, stream=None, **kwds):
    """
    Serialize a sequence of Python objects into a YAML stream.
    Produce only basic YAML tags.
    If stream is None, return the produced string instead.
    """
    return dump_all(documents, stream, Dumper=SafeDumper, **kwds)

def safe_dump(data, stream=None, **kwds):
    """
    Serialize a Python object into a YAML stream.
    Produce only basic YAML tags.
    If stream is None, return the produced string instead.
    """
    return dump_all([data], stream, Dumper=SafeDumper, **kwds)
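The `dump_all` signature above threads several presentation options through to the dumper. A minimal sketch of two of them, assuming PyYAML >= 5.1 (where `sort_keys` was introduced and `default_flow_style` became `False`):

```python
import yaml

data = {"name": "café", "ok": True}

# allow_unicode keeps non-ASCII characters literal instead of escaping
# them; sort_keys=False preserves the mapping's insertion order.
text = yaml.safe_dump(data, allow_unicode=True, sort_keys=False)

assert "café" in text
assert yaml.safe_load(text) == data
```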
def add_implicit_resolver(tag, regexp, first=None,
        Loader=Loader, Dumper=Dumper):
    """
    Add an implicit scalar detector.
    If an implicit scalar value matches the given regexp,
    the corresponding tag is assigned to the scalar.
    first is a sequence of possible initial characters or None.
    """
    Loader.add_implicit_resolver(tag, regexp, first)
    Dumper.add_implicit_resolver(tag, regexp, first)

def add_path_resolver(tag, path, kind=None, Loader=Loader, Dumper=Dumper):
    """
    Add a path based resolver for the given tag.
    A path is a list of keys that forms a path
    to a node in the representation tree.
    Keys can be string values, integers, or None.
    """
    Loader.add_path_resolver(tag, path, kind)
    Dumper.add_path_resolver(tag, path, kind)

def add_constructor(tag, constructor, Loader=Loader):
    """
    Add a constructor for the given tag.
    Constructor is a function that accepts a Loader instance
    and a node object and produces the corresponding Python object.
    """
    Loader.add_constructor(tag, constructor)

def add_multi_constructor(tag_prefix, multi_constructor, Loader=Loader):
    """
    Add a multi-constructor for the given tag prefix.
    Multi-constructor is called for a node if its tag starts with tag_prefix.
    Multi-constructor accepts a Loader instance, a tag suffix,
    and a node object and produces the corresponding Python object.
    """
    Loader.add_multi_constructor(tag_prefix, multi_constructor)

def add_representer(data_type, representer, Dumper=Dumper):
    """
    Add a representer for the given type.
    Representer is a function accepting a Dumper instance
    and an instance of the given data type
    and producing the corresponding representation node.
    """
    Dumper.add_representer(data_type, representer)

def add_multi_representer(data_type, multi_representer, Dumper=Dumper):
    """
    Add a representer for the given type.
    Multi-representer is a function accepting a Dumper instance
    and an instance of the given data type or subtype
    and producing the corresponding representation node.
    """
    Dumper.add_multi_representer(data_type, multi_representer)
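A sketch of `add_constructor`/`add_representer` in use, assuming a current PyYAML installation. The `Point` type and the `!point` tag are purely illustrative; registering on `SafeLoader`/`SafeDumper` subclasses keeps the global loaders untouched.

```python
import yaml

class Point:
    """Hypothetical type used only to illustrate custom tags."""
    def __init__(self, x, y):
        self.x = x
        self.y = y

def point_representer(dumper, data):
    # Map a Point to a tagged mapping node.
    return dumper.represent_mapping("!point", {"x": data.x, "y": data.y})

def point_constructor(loader, node):
    # Rebuild a Point from a tagged mapping node.
    fields = loader.construct_mapping(node)
    return Point(fields["x"], fields["y"])

class PointDumper(yaml.SafeDumper):
    pass

class PointLoader(yaml.SafeLoader):
    pass

yaml.add_representer(Point, point_representer, Dumper=PointDumper)
yaml.add_constructor("!point", point_constructor, Loader=PointLoader)

text = yaml.dump(Point(1, 2), Dumper=PointDumper)
point = yaml.load(text, Loader=PointLoader)
assert (point.x, point.y) == (1, 2)
```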
class YAMLObjectMetaclass(type):
    """
    The metaclass for YAMLObject.
    """
    def __init__(cls, name, bases, kwds):
        super(YAMLObjectMetaclass, cls).__init__(name, bases, kwds)
        if 'yaml_tag' in kwds and kwds['yaml_tag'] is not None:
            cls.yaml_loader.add_constructor(cls.yaml_tag, cls.from_yaml)
            cls.yaml_dumper.add_representer(cls, cls.to_yaml)

class YAMLObject(object):
    """
    An object that can dump itself to a YAML stream
    and load itself from a YAML stream.
    """

    __metaclass__ = YAMLObjectMetaclass
    __slots__ = ()  # no direct instantiation, so allow immutable subclasses

    yaml_loader = Loader
    yaml_dumper = Dumper

    yaml_tag = None
    yaml_flow_style = None

    def from_yaml(cls, loader, node):
        """
        Convert a representation node to a Python object.
        """
        return loader.construct_yaml_object(node, cls)
    from_yaml = classmethod(from_yaml)

    def to_yaml(cls, dumper, data):
        """
        Convert a Python object to a representation node.
        """
        return dumper.represent_yaml_object(cls.yaml_tag, data, cls,
                flow_style=cls.yaml_flow_style)
    to_yaml = classmethod(to_yaml)
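A sketch of subclassing `YAMLObject`, written in Python 3 spelling (the module above is the Python 2 source, which relies on `__metaclass__`). The `Monster` class and `!Monster` tag are purely illustrative; defining `yaml_tag` triggers the metaclass registration shown above.

```python
import yaml

class Monster(yaml.YAMLObject):
    yaml_tag = "!Monster"
    # Restrict registration to the safe loader/dumper pair.
    yaml_loader = yaml.SafeLoader
    yaml_dumper = yaml.SafeDumper

    def __init__(self, name, hp):
        self.name = name
        self.hp = hp

# Loading constructs the object without calling __init__;
# attributes come straight from the mapping.
monster = yaml.safe_load("!Monster {name: Dragon, hp: 16}")
assert (monster.name, monster.hp) == ("Dragon", 16)

# Dumping goes through to_yaml/represent_yaml_object automatically.
round_tripped = yaml.safe_load(yaml.safe_dump(monster))
assert round_tripped.hp == 16
```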
|
repository_name: yaml/pyyaml
func_path_in_repository: lib/yaml/__init__.py
func_name: dump
language: python
split_name: train
func_code_url: https://github.com/yaml/pyyaml/blob/e471e86bf6dabdad45a1438c20a4a5c033eb9034/lib/yaml/__init__.py#L288-L293

func_documentation_string:
    Serialize a Python object into a YAML stream.
    If stream is None, return the produced string instead.

func_code_string:
    def dump(data, stream=None, Dumper=Dumper, **kwds):
        return dump_all([data], stream, Dumper=Dumper, **kwds)

called_functions: [dump_all]
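A minimal sketch of the two calling modes the docstring describes, assuming PyYAML >= 5.1: with `stream=None` the produced string is returned; with a file-like stream the text is written out and nothing is returned.

```python
import io
import yaml

# String mode: stream is None, so dump() returns the text.
assert yaml.dump({"a": 1}) == "a: 1\n"

# Stream mode: the text goes to the stream and dump() returns None.
buf = io.StringIO()
result = yaml.dump({"a": 1}, buf)
assert result is None
assert buf.getvalue() == "a: 1\n"
```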
|
repository_name: yaml/pyyaml
func_path_in_repository: lib/yaml/__init__.py
func_name: safe_dump_all
language: python
split_name: train
func_code_url: https://github.com/yaml/pyyaml/blob/e471e86bf6dabdad45a1438c20a4a5c033eb9034/lib/yaml/__init__.py#L295-L301

func_documentation_string:
    Serialize a sequence of Python objects into a YAML stream.
    Produce only basic YAML tags.
    If stream is None, return the produced string instead.

func_code_string:
    def safe_dump_all(documents, stream=None, **kwds):
        return dump_all(documents, stream, Dumper=SafeDumper, **kwds)

called_functions: [dump_all]
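A minimal sketch of the multi-document behavior, assuming a current PyYAML installation: `safe_dump_all` joins the documents with `---` separators, and `safe_load_all` parses them back lazily as a generator.

```python
import yaml

docs = [{"step": 1}, {"step": 2}]

# Each element of docs becomes one YAML document in the output.
text = yaml.safe_dump_all(docs)
assert "---" in text

# safe_load_all yields one object per document.
assert list(yaml.safe_load_all(text)) == docs
```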
---
repository_name: yaml/pyyaml
func_path_in_repository: lib/yaml/__init__.py
func_name: safe_dump
language: python
func_code_string:
    def safe_dump(data, stream=None, **kwds):
        return dump_all([data], stream, Dumper=SafeDumper, **kwds)
func_documentation_string:
    Serialize a Python object into a YAML stream.
    Produce only basic YAML tags.
    If stream is None, return the produced string instead.
split_name: train
func_code_url: https://github.com/yaml/pyyaml/blob/e471e86bf6dabdad45a1438c20a4a5c033eb9034/lib/yaml/__init__.py#L303-L309
called_functions: [
"def dump_all(documents, stream=None, Dumper=Dumper,\n default_style=None, default_flow_style=False,\n canonical=None, indent=None, width=None,\n allow_unicode=None, line_break=None,\n encoding='utf-8', explicit_start=None, explicit_end=None,\n version=None, tags=None, sort_keys=True):\n \"\"\"\n Serialize a sequence of Python objects into a YAML stream.\n If stream is None, return the produced string instead.\n \"\"\"\n getvalue = None\n if stream is None:\n if encoding is None:\n from StringIO import StringIO\n else:\n from cStringIO import StringIO\n stream = StringIO()\n getvalue = stream.getvalue\n dumper = Dumper(stream, default_style=default_style,\n default_flow_style=default_flow_style,\n canonical=canonical, indent=indent, width=width,\n allow_unicode=allow_unicode, line_break=line_break,\n encoding=encoding, version=version, tags=tags,\n explicit_start=explicit_start, explicit_end=explicit_end, sort_keys=sort_keys)\n try:\n dumper.open()\n for data in documents:\n dumper.represent(data)\n dumper.close()\n finally:\n dumper.dispose()\n if getvalue:\n return getvalue()\n"
]
enclosing_scope:
from error import *
from tokens import *
from events import *
from nodes import *
from loader import *
from dumper import *
__version__ = '5.1'
try:
from cyaml import *
__with_libyaml__ = True
except ImportError:
__with_libyaml__ = False
#------------------------------------------------------------------------------
# Warnings control
#------------------------------------------------------------------------------
# 'Global' warnings state:
_warnings_enabled = {
'YAMLLoadWarning': True,
}
# Get or set global warnings' state
def warnings(settings=None):
if settings is None:
return _warnings_enabled
if type(settings) is dict:
for key in settings:
if key in _warnings_enabled:
_warnings_enabled[key] = settings[key]
# Warn when load() is called without Loader=...
class YAMLLoadWarning(RuntimeWarning):
pass
def load_warning(method):
if _warnings_enabled['YAMLLoadWarning'] is False:
return
import warnings
message = (
"calling yaml.%s() without Loader=... is deprecated, as the "
"default Loader is unsafe. Please read "
"https://msg.pyyaml.org/load for full details."
) % method
warnings.warn(message, YAMLLoadWarning, stacklevel=3)
#------------------------------------------------------------------------------
def scan(stream, Loader=Loader):
"""
Scan a YAML stream and produce scanning tokens.
"""
loader = Loader(stream)
try:
while loader.check_token():
yield loader.get_token()
finally:
loader.dispose()
def parse(stream, Loader=Loader):
"""
Parse a YAML stream and produce parsing events.
"""
loader = Loader(stream)
try:
while loader.check_event():
yield loader.get_event()
finally:
loader.dispose()
def compose(stream, Loader=Loader):
"""
Parse the first YAML document in a stream
and produce the corresponding representation tree.
"""
loader = Loader(stream)
try:
return loader.get_single_node()
finally:
loader.dispose()
def compose_all(stream, Loader=Loader):
"""
Parse all YAML documents in a stream
and produce corresponding representation trees.
"""
loader = Loader(stream)
try:
while loader.check_node():
yield loader.get_node()
finally:
loader.dispose()
def load(stream, Loader=None):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
"""
if Loader is None:
load_warning('load')
Loader = FullLoader
loader = Loader(stream)
try:
return loader.get_single_data()
finally:
loader.dispose()
def load_all(stream, Loader=None):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
"""
if Loader is None:
load_warning('load_all')
Loader = FullLoader
loader = Loader(stream)
try:
while loader.check_data():
yield loader.get_data()
finally:
loader.dispose()
def full_load(stream):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
Resolve all tags except those known to be
unsafe on untrusted input.
"""
return load(stream, FullLoader)
def full_load_all(stream):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
Resolve all tags except those known to be
unsafe on untrusted input.
"""
return load_all(stream, FullLoader)
def safe_load(stream):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
Resolve only basic YAML tags. This is known
to be safe for untrusted input.
"""
return load(stream, SafeLoader)
def safe_load_all(stream):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
Resolve only basic YAML tags. This is known
to be safe for untrusted input.
"""
return load_all(stream, SafeLoader)
def unsafe_load(stream):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
Resolve all tags, even those known to be
unsafe on untrusted input.
"""
return load(stream, UnsafeLoader)
def unsafe_load_all(stream):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
Resolve all tags, even those known to be
unsafe on untrusted input.
"""
return load_all(stream, UnsafeLoader)
def emit(events, stream=None, Dumper=Dumper,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None):
"""
Emit YAML parsing events into a stream.
If stream is None, return the produced string instead.
"""
getvalue = None
if stream is None:
from StringIO import StringIO
stream = StringIO()
getvalue = stream.getvalue
dumper = Dumper(stream, canonical=canonical, indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break)
try:
for event in events:
dumper.emit(event)
finally:
dumper.dispose()
if getvalue:
return getvalue()
def serialize_all(nodes, stream=None, Dumper=Dumper,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None,
encoding='utf-8', explicit_start=None, explicit_end=None,
version=None, tags=None):
"""
Serialize a sequence of representation trees into a YAML stream.
If stream is None, return the produced string instead.
"""
getvalue = None
if stream is None:
if encoding is None:
from StringIO import StringIO
else:
from cStringIO import StringIO
stream = StringIO()
getvalue = stream.getvalue
dumper = Dumper(stream, canonical=canonical, indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break,
encoding=encoding, version=version, tags=tags,
explicit_start=explicit_start, explicit_end=explicit_end)
try:
dumper.open()
for node in nodes:
dumper.serialize(node)
dumper.close()
finally:
dumper.dispose()
if getvalue:
return getvalue()
def serialize(node, stream=None, Dumper=Dumper, **kwds):
"""
Serialize a representation tree into a YAML stream.
If stream is None, return the produced string instead.
"""
return serialize_all([node], stream, Dumper=Dumper, **kwds)
def dump_all(documents, stream=None, Dumper=Dumper,
default_style=None, default_flow_style=False,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None,
encoding='utf-8', explicit_start=None, explicit_end=None,
version=None, tags=None, sort_keys=True):
"""
Serialize a sequence of Python objects into a YAML stream.
If stream is None, return the produced string instead.
"""
getvalue = None
if stream is None:
if encoding is None:
from StringIO import StringIO
else:
from cStringIO import StringIO
stream = StringIO()
getvalue = stream.getvalue
dumper = Dumper(stream, default_style=default_style,
default_flow_style=default_flow_style,
canonical=canonical, indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break,
encoding=encoding, version=version, tags=tags,
explicit_start=explicit_start, explicit_end=explicit_end, sort_keys=sort_keys)
try:
dumper.open()
for data in documents:
dumper.represent(data)
dumper.close()
finally:
dumper.dispose()
if getvalue:
return getvalue()
def dump(data, stream=None, Dumper=Dumper, **kwds):
"""
Serialize a Python object into a YAML stream.
If stream is None, return the produced string instead.
"""
return dump_all([data], stream, Dumper=Dumper, **kwds)
def safe_dump_all(documents, stream=None, **kwds):
"""
Serialize a sequence of Python objects into a YAML stream.
Produce only basic YAML tags.
If stream is None, return the produced string instead.
"""
return dump_all(documents, stream, Dumper=SafeDumper, **kwds)
def add_implicit_resolver(tag, regexp, first=None,
Loader=Loader, Dumper=Dumper):
"""
Add an implicit scalar detector.
If an implicit scalar value matches the given regexp,
the corresponding tag is assigned to the scalar.
first is a sequence of possible initial characters or None.
"""
Loader.add_implicit_resolver(tag, regexp, first)
Dumper.add_implicit_resolver(tag, regexp, first)
def add_path_resolver(tag, path, kind=None, Loader=Loader, Dumper=Dumper):
"""
Add a path based resolver for the given tag.
A path is a list of keys that forms a path
to a node in the representation tree.
Keys can be string values, integers, or None.
"""
Loader.add_path_resolver(tag, path, kind)
Dumper.add_path_resolver(tag, path, kind)
def add_constructor(tag, constructor, Loader=Loader):
"""
Add a constructor for the given tag.
Constructor is a function that accepts a Loader instance
and a node object and produces the corresponding Python object.
"""
Loader.add_constructor(tag, constructor)
def add_multi_constructor(tag_prefix, multi_constructor, Loader=Loader):
"""
Add a multi-constructor for the given tag prefix.
Multi-constructor is called for a node if its tag starts with tag_prefix.
Multi-constructor accepts a Loader instance, a tag suffix,
and a node object and produces the corresponding Python object.
"""
Loader.add_multi_constructor(tag_prefix, multi_constructor)
def add_representer(data_type, representer, Dumper=Dumper):
"""
Add a representer for the given type.
Representer is a function accepting a Dumper instance
and an instance of the given data type
and producing the corresponding representation node.
"""
Dumper.add_representer(data_type, representer)
def add_multi_representer(data_type, multi_representer, Dumper=Dumper):
"""
Add a representer for the given type.
Multi-representer is a function accepting a Dumper instance
and an instance of the given data type or subtype
and producing the corresponding representation node.
"""
Dumper.add_multi_representer(data_type, multi_representer)
class YAMLObjectMetaclass(type):
"""
The metaclass for YAMLObject.
"""
def __init__(cls, name, bases, kwds):
super(YAMLObjectMetaclass, cls).__init__(name, bases, kwds)
if 'yaml_tag' in kwds and kwds['yaml_tag'] is not None:
cls.yaml_loader.add_constructor(cls.yaml_tag, cls.from_yaml)
cls.yaml_dumper.add_representer(cls, cls.to_yaml)
class YAMLObject(object):
"""
An object that can dump itself to a YAML stream
and load itself from a YAML stream.
"""
__metaclass__ = YAMLObjectMetaclass
__slots__ = () # no direct instantiation, so allow immutable subclasses
yaml_loader = Loader
yaml_dumper = Dumper
yaml_tag = None
yaml_flow_style = None
def from_yaml(cls, loader, node):
"""
Convert a representation node to a Python object.
"""
return loader.construct_yaml_object(node, cls)
from_yaml = classmethod(from_yaml)
def to_yaml(cls, dumper, data):
"""
Convert a Python object to a representation node.
"""
return dumper.represent_yaml_object(cls.yaml_tag, data, cls,
flow_style=cls.yaml_flow_style)
to_yaml = classmethod(to_yaml)
---
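The `dump_all` function in the scope above (and `safe_dump`, which delegates to it) follows a stream-or-string contract: write into the caller's stream, or buffer into a `StringIO` and return the produced text when `stream` is None. A minimal sketch of just that contract, using Python 3's `io` module and plain line writes standing in for `dumper.represent()`:

```python
import io

def dump_lines(lines, stream=None):
    # Same contract as dump_all above: if no stream is given, buffer
    # internally and return the produced string; otherwise write into
    # the caller's stream and return None.
    getvalue = None
    if stream is None:
        stream = io.StringIO()
        getvalue = stream.getvalue
    for line in lines:            # stand-in for dumper.represent(data)
        stream.write(line + '\n')
    if getvalue:
        return getvalue()

text = dump_lines(['a: 1', 'b: 2'])   # no stream -> returns the string
```

Capturing `stream.getvalue` before writing is the same trick `dump_all` uses to distinguish its own buffer from a caller-supplied stream.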
repository_name: yaml/pyyaml
func_path_in_repository: lib/yaml/__init__.py
func_name: add_implicit_resolver
language: python
func_code_string:
    def add_implicit_resolver(tag, regexp, first=None,
                              Loader=Loader, Dumper=Dumper):
        Loader.add_implicit_resolver(tag, regexp, first)
        Dumper.add_implicit_resolver(tag, regexp, first)
func_documentation_string:
    Add an implicit scalar detector.
    If an implicit scalar value matches the given regexp,
    the corresponding tag is assigned to the scalar.
    first is a sequence of possible initial characters or None.
split_name: train
func_code_url: https://github.com/yaml/pyyaml/blob/e471e86bf6dabdad45a1438c20a4a5c033eb9034/lib/yaml/__init__.py#L311-L320
called_functions: null
enclosing_scope:
from error import *
from tokens import *
from events import *
from nodes import *
from loader import *
from dumper import *
__version__ = '5.1'
try:
from cyaml import *
__with_libyaml__ = True
except ImportError:
__with_libyaml__ = False
#------------------------------------------------------------------------------
# Warnings control
#------------------------------------------------------------------------------
# 'Global' warnings state:
_warnings_enabled = {
'YAMLLoadWarning': True,
}
# Get or set global warnings' state
def warnings(settings=None):
if settings is None:
return _warnings_enabled
if type(settings) is dict:
for key in settings:
if key in _warnings_enabled:
_warnings_enabled[key] = settings[key]
# Warn when load() is called without Loader=...
class YAMLLoadWarning(RuntimeWarning):
pass
def load_warning(method):
if _warnings_enabled['YAMLLoadWarning'] is False:
return
import warnings
message = (
"calling yaml.%s() without Loader=... is deprecated, as the "
"default Loader is unsafe. Please read "
"https://msg.pyyaml.org/load for full details."
) % method
warnings.warn(message, YAMLLoadWarning, stacklevel=3)
#------------------------------------------------------------------------------
def scan(stream, Loader=Loader):
"""
Scan a YAML stream and produce scanning tokens.
"""
loader = Loader(stream)
try:
while loader.check_token():
yield loader.get_token()
finally:
loader.dispose()
def parse(stream, Loader=Loader):
"""
Parse a YAML stream and produce parsing events.
"""
loader = Loader(stream)
try:
while loader.check_event():
yield loader.get_event()
finally:
loader.dispose()
def compose(stream, Loader=Loader):
"""
Parse the first YAML document in a stream
and produce the corresponding representation tree.
"""
loader = Loader(stream)
try:
return loader.get_single_node()
finally:
loader.dispose()
def compose_all(stream, Loader=Loader):
"""
Parse all YAML documents in a stream
and produce corresponding representation trees.
"""
loader = Loader(stream)
try:
while loader.check_node():
yield loader.get_node()
finally:
loader.dispose()
def load(stream, Loader=None):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
"""
if Loader is None:
load_warning('load')
Loader = FullLoader
loader = Loader(stream)
try:
return loader.get_single_data()
finally:
loader.dispose()
def load_all(stream, Loader=None):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
"""
if Loader is None:
load_warning('load_all')
Loader = FullLoader
loader = Loader(stream)
try:
while loader.check_data():
yield loader.get_data()
finally:
loader.dispose()
def full_load(stream):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
Resolve all tags except those known to be
unsafe on untrusted input.
"""
return load(stream, FullLoader)
def full_load_all(stream):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
Resolve all tags except those known to be
unsafe on untrusted input.
"""
return load_all(stream, FullLoader)
def safe_load(stream):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
Resolve only basic YAML tags. This is known
to be safe for untrusted input.
"""
return load(stream, SafeLoader)
def safe_load_all(stream):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
Resolve only basic YAML tags. This is known
to be safe for untrusted input.
"""
return load_all(stream, SafeLoader)
def unsafe_load(stream):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
Resolve all tags, even those known to be
unsafe on untrusted input.
"""
return load(stream, UnsafeLoader)
def unsafe_load_all(stream):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
Resolve all tags, even those known to be
unsafe on untrusted input.
"""
return load_all(stream, UnsafeLoader)
def emit(events, stream=None, Dumper=Dumper,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None):
"""
Emit YAML parsing events into a stream.
If stream is None, return the produced string instead.
"""
getvalue = None
if stream is None:
from StringIO import StringIO
stream = StringIO()
getvalue = stream.getvalue
dumper = Dumper(stream, canonical=canonical, indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break)
try:
for event in events:
dumper.emit(event)
finally:
dumper.dispose()
if getvalue:
return getvalue()
def serialize_all(nodes, stream=None, Dumper=Dumper,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None,
encoding='utf-8', explicit_start=None, explicit_end=None,
version=None, tags=None):
"""
Serialize a sequence of representation trees into a YAML stream.
If stream is None, return the produced string instead.
"""
getvalue = None
if stream is None:
if encoding is None:
from StringIO import StringIO
else:
from cStringIO import StringIO
stream = StringIO()
getvalue = stream.getvalue
dumper = Dumper(stream, canonical=canonical, indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break,
encoding=encoding, version=version, tags=tags,
explicit_start=explicit_start, explicit_end=explicit_end)
try:
dumper.open()
for node in nodes:
dumper.serialize(node)
dumper.close()
finally:
dumper.dispose()
if getvalue:
return getvalue()
def serialize(node, stream=None, Dumper=Dumper, **kwds):
"""
Serialize a representation tree into a YAML stream.
If stream is None, return the produced string instead.
"""
return serialize_all([node], stream, Dumper=Dumper, **kwds)
def dump_all(documents, stream=None, Dumper=Dumper,
default_style=None, default_flow_style=False,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None,
encoding='utf-8', explicit_start=None, explicit_end=None,
version=None, tags=None, sort_keys=True):
"""
Serialize a sequence of Python objects into a YAML stream.
If stream is None, return the produced string instead.
"""
getvalue = None
if stream is None:
if encoding is None:
from StringIO import StringIO
else:
from cStringIO import StringIO
stream = StringIO()
getvalue = stream.getvalue
dumper = Dumper(stream, default_style=default_style,
default_flow_style=default_flow_style,
canonical=canonical, indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break,
encoding=encoding, version=version, tags=tags,
explicit_start=explicit_start, explicit_end=explicit_end, sort_keys=sort_keys)
try:
dumper.open()
for data in documents:
dumper.represent(data)
dumper.close()
finally:
dumper.dispose()
if getvalue:
return getvalue()
def dump(data, stream=None, Dumper=Dumper, **kwds):
"""
Serialize a Python object into a YAML stream.
If stream is None, return the produced string instead.
"""
return dump_all([data], stream, Dumper=Dumper, **kwds)
def safe_dump_all(documents, stream=None, **kwds):
"""
Serialize a sequence of Python objects into a YAML stream.
Produce only basic YAML tags.
If stream is None, return the produced string instead.
"""
return dump_all(documents, stream, Dumper=SafeDumper, **kwds)
def safe_dump(data, stream=None, **kwds):
"""
Serialize a Python object into a YAML stream.
Produce only basic YAML tags.
If stream is None, return the produced string instead.
"""
return dump_all([data], stream, Dumper=SafeDumper, **kwds)
def add_path_resolver(tag, path, kind=None, Loader=Loader, Dumper=Dumper):
"""
Add a path based resolver for the given tag.
A path is a list of keys that forms a path
to a node in the representation tree.
Keys can be string values, integers, or None.
"""
Loader.add_path_resolver(tag, path, kind)
Dumper.add_path_resolver(tag, path, kind)
def add_constructor(tag, constructor, Loader=Loader):
"""
Add a constructor for the given tag.
Constructor is a function that accepts a Loader instance
and a node object and produces the corresponding Python object.
"""
Loader.add_constructor(tag, constructor)
def add_multi_constructor(tag_prefix, multi_constructor, Loader=Loader):
"""
Add a multi-constructor for the given tag prefix.
Multi-constructor is called for a node if its tag starts with tag_prefix.
Multi-constructor accepts a Loader instance, a tag suffix,
and a node object and produces the corresponding Python object.
"""
Loader.add_multi_constructor(tag_prefix, multi_constructor)
def add_representer(data_type, representer, Dumper=Dumper):
"""
Add a representer for the given type.
Representer is a function accepting a Dumper instance
and an instance of the given data type
and producing the corresponding representation node.
"""
Dumper.add_representer(data_type, representer)
def add_multi_representer(data_type, multi_representer, Dumper=Dumper):
"""
Add a representer for the given type.
Multi-representer is a function accepting a Dumper instance
and an instance of the given data type or subtype
and producing the corresponding representation node.
"""
Dumper.add_multi_representer(data_type, multi_representer)
class YAMLObjectMetaclass(type):
"""
The metaclass for YAMLObject.
"""
def __init__(cls, name, bases, kwds):
super(YAMLObjectMetaclass, cls).__init__(name, bases, kwds)
if 'yaml_tag' in kwds and kwds['yaml_tag'] is not None:
cls.yaml_loader.add_constructor(cls.yaml_tag, cls.from_yaml)
cls.yaml_dumper.add_representer(cls, cls.to_yaml)
class YAMLObject(object):
"""
An object that can dump itself to a YAML stream
and load itself from a YAML stream.
"""
__metaclass__ = YAMLObjectMetaclass
__slots__ = () # no direct instantiation, so allow immutable subclasses
yaml_loader = Loader
yaml_dumper = Dumper
yaml_tag = None
yaml_flow_style = None
def from_yaml(cls, loader, node):
"""
Convert a representation node to a Python object.
"""
return loader.construct_yaml_object(node, cls)
from_yaml = classmethod(from_yaml)
def to_yaml(cls, dumper, data):
"""
Convert a Python object to a representation node.
"""
return dumper.represent_yaml_object(cls.yaml_tag, data, cls,
flow_style=cls.yaml_flow_style)
to_yaml = classmethod(to_yaml)
---
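`add_implicit_resolver` above registers a (tag, regexp) pair under each possible first character, and the resolver then assigns a plain scalar the first registered tag whose regexp matches it. A rough sketch of that lookup (illustrative names; not PyYAML's actual `Resolver` internals, which also handle the None/first-character-missing cases):

```python
import re

# first character -> list of (tag, compiled regexp), as in the docstring:
# "first is a sequence of possible initial characters".
_resolvers = {}

def add_implicit_resolver(tag, regexp, first):
    for ch in first:
        _resolvers.setdefault(ch, []).append((tag, regexp))

def resolve_scalar(value, default='tag:yaml.org,2002:str'):
    # Only resolvers indexed under the scalar's first character are tried;
    # the first matching regexp wins, otherwise the scalar stays a string.
    for tag, regexp in _resolvers.get(value[:1], []):
        if regexp.match(value):
            return tag
    return default

add_implicit_resolver('tag:yaml.org,2002:bool',
                      re.compile(r'^(?:true|false)$'), list('tf'))
add_implicit_resolver('tag:yaml.org,2002:int',
                      re.compile(r'^-?\d+$'), list('-0123456789'))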
yaml/pyyaml | lib/yaml/__init__.py | add_path_resolver | python | def add_path_resolver(tag, path, kind=None, Loader=Loader, Dumper=Dumper):
Loader.add_path_resolver(tag, path, kind)
Dumper.add_path_resolver(tag, path, kind) | Add a path based resolver for the given tag.
A path is a list of keys that forms a path
to a node in the representation tree.
Keys can be string values, integers, or None. | train | https://github.com/yaml/pyyaml/blob/e471e86bf6dabdad45a1438c20a4a5c033eb9034/lib/yaml/__init__.py#L322-L330 | null |
from error import *
from tokens import *
from events import *
from nodes import *
from loader import *
from dumper import *
__version__ = '5.1'
try:
from cyaml import *
__with_libyaml__ = True
except ImportError:
__with_libyaml__ = False
#------------------------------------------------------------------------------
# Warnings control
#------------------------------------------------------------------------------
# 'Global' warnings state:
_warnings_enabled = {
'YAMLLoadWarning': True,
}
# Get or set global warnings' state
def warnings(settings=None):
if settings is None:
return _warnings_enabled
if type(settings) is dict:
for key in settings:
if key in _warnings_enabled:
_warnings_enabled[key] = settings[key]
# Warn when load() is called without Loader=...
class YAMLLoadWarning(RuntimeWarning):
pass
def load_warning(method):
if _warnings_enabled['YAMLLoadWarning'] is False:
return
import warnings
message = (
"calling yaml.%s() without Loader=... is deprecated, as the "
"default Loader is unsafe. Please read "
"https://msg.pyyaml.org/load for full details."
) % method
warnings.warn(message, YAMLLoadWarning, stacklevel=3)
#------------------------------------------------------------------------------
def scan(stream, Loader=Loader):
"""
Scan a YAML stream and produce scanning tokens.
"""
loader = Loader(stream)
try:
while loader.check_token():
yield loader.get_token()
finally:
loader.dispose()
def parse(stream, Loader=Loader):
"""
Parse a YAML stream and produce parsing events.
"""
loader = Loader(stream)
try:
while loader.check_event():
yield loader.get_event()
finally:
loader.dispose()
def compose(stream, Loader=Loader):
"""
Parse the first YAML document in a stream
and produce the corresponding representation tree.
"""
loader = Loader(stream)
try:
return loader.get_single_node()
finally:
loader.dispose()
def compose_all(stream, Loader=Loader):
"""
Parse all YAML documents in a stream
and produce corresponding representation trees.
"""
loader = Loader(stream)
try:
while loader.check_node():
yield loader.get_node()
finally:
loader.dispose()
def load(stream, Loader=None):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
"""
if Loader is None:
load_warning('load')
Loader = FullLoader
loader = Loader(stream)
try:
return loader.get_single_data()
finally:
loader.dispose()
def load_all(stream, Loader=None):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
"""
if Loader is None:
load_warning('load_all')
Loader = FullLoader
loader = Loader(stream)
try:
while loader.check_data():
yield loader.get_data()
finally:
loader.dispose()
def full_load(stream):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
Resolve all tags except those known to be
unsafe on untrusted input.
"""
return load(stream, FullLoader)
def full_load_all(stream):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
Resolve all tags except those known to be
unsafe on untrusted input.
"""
return load_all(stream, FullLoader)
def safe_load(stream):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
Resolve only basic YAML tags. This is known
to be safe for untrusted input.
"""
return load(stream, SafeLoader)
def safe_load_all(stream):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
Resolve only basic YAML tags. This is known
to be safe for untrusted input.
"""
return load_all(stream, SafeLoader)
def unsafe_load(stream):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
Resolve all tags, even those known to be
unsafe on untrusted input.
"""
return load(stream, UnsafeLoader)
def unsafe_load_all(stream):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
Resolve all tags, even those known to be
unsafe on untrusted input.
"""
return load_all(stream, UnsafeLoader)
def emit(events, stream=None, Dumper=Dumper,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None):
"""
Emit YAML parsing events into a stream.
If stream is None, return the produced string instead.
"""
getvalue = None
if stream is None:
from StringIO import StringIO
stream = StringIO()
getvalue = stream.getvalue
dumper = Dumper(stream, canonical=canonical, indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break)
try:
for event in events:
dumper.emit(event)
finally:
dumper.dispose()
if getvalue:
return getvalue()
def serialize_all(nodes, stream=None, Dumper=Dumper,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None,
encoding='utf-8', explicit_start=None, explicit_end=None,
version=None, tags=None):
"""
Serialize a sequence of representation trees into a YAML stream.
If stream is None, return the produced string instead.
"""
getvalue = None
if stream is None:
if encoding is None:
from StringIO import StringIO
else:
from cStringIO import StringIO
stream = StringIO()
getvalue = stream.getvalue
dumper = Dumper(stream, canonical=canonical, indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break,
encoding=encoding, version=version, tags=tags,
explicit_start=explicit_start, explicit_end=explicit_end)
try:
dumper.open()
for node in nodes:
dumper.serialize(node)
dumper.close()
finally:
dumper.dispose()
if getvalue:
return getvalue()
def serialize(node, stream=None, Dumper=Dumper, **kwds):
"""
Serialize a representation tree into a YAML stream.
If stream is None, return the produced string instead.
"""
return serialize_all([node], stream, Dumper=Dumper, **kwds)
def dump_all(documents, stream=None, Dumper=Dumper,
default_style=None, default_flow_style=False,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None,
encoding='utf-8', explicit_start=None, explicit_end=None,
version=None, tags=None, sort_keys=True):
"""
Serialize a sequence of Python objects into a YAML stream.
If stream is None, return the produced string instead.
"""
getvalue = None
if stream is None:
if encoding is None:
from StringIO import StringIO
else:
from cStringIO import StringIO
stream = StringIO()
getvalue = stream.getvalue
dumper = Dumper(stream, default_style=default_style,
default_flow_style=default_flow_style,
canonical=canonical, indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break,
encoding=encoding, version=version, tags=tags,
explicit_start=explicit_start, explicit_end=explicit_end, sort_keys=sort_keys)
try:
dumper.open()
for data in documents:
dumper.represent(data)
dumper.close()
finally:
dumper.dispose()
if getvalue:
return getvalue()
def dump(data, stream=None, Dumper=Dumper, **kwds):
"""
Serialize a Python object into a YAML stream.
If stream is None, return the produced string instead.
"""
return dump_all([data], stream, Dumper=Dumper, **kwds)
def safe_dump_all(documents, stream=None, **kwds):
"""
Serialize a sequence of Python objects into a YAML stream.
Produce only basic YAML tags.
If stream is None, return the produced string instead.
"""
return dump_all(documents, stream, Dumper=SafeDumper, **kwds)
def safe_dump(data, stream=None, **kwds):
"""
Serialize a Python object into a YAML stream.
Produce only basic YAML tags.
If stream is None, return the produced string instead.
"""
return dump_all([data], stream, Dumper=SafeDumper, **kwds)
def add_implicit_resolver(tag, regexp, first=None,
Loader=Loader, Dumper=Dumper):
"""
Add an implicit scalar detector.
If an implicit scalar value matches the given regexp,
the corresponding tag is assigned to the scalar.
first is a sequence of possible initial characters or None.
"""
Loader.add_implicit_resolver(tag, regexp, first)
Dumper.add_implicit_resolver(tag, regexp, first)
def add_constructor(tag, constructor, Loader=Loader):
"""
Add a constructor for the given tag.
Constructor is a function that accepts a Loader instance
and a node object and produces the corresponding Python object.
"""
Loader.add_constructor(tag, constructor)
def add_multi_constructor(tag_prefix, multi_constructor, Loader=Loader):
"""
Add a multi-constructor for the given tag prefix.
Multi-constructor is called for a node if its tag starts with tag_prefix.
Multi-constructor accepts a Loader instance, a tag suffix,
and a node object and produces the corresponding Python object.
"""
Loader.add_multi_constructor(tag_prefix, multi_constructor)
def add_representer(data_type, representer, Dumper=Dumper):
"""
Add a representer for the given type.
Representer is a function accepting a Dumper instance
and an instance of the given data type
and producing the corresponding representation node.
"""
Dumper.add_representer(data_type, representer)
def add_multi_representer(data_type, multi_representer, Dumper=Dumper):
"""
Add a representer for the given type.
Multi-representer is a function accepting a Dumper instance
and an instance of the given data type or subtype
and producing the corresponding representation node.
"""
Dumper.add_multi_representer(data_type, multi_representer)
class YAMLObjectMetaclass(type):
"""
The metaclass for YAMLObject.
"""
def __init__(cls, name, bases, kwds):
super(YAMLObjectMetaclass, cls).__init__(name, bases, kwds)
if 'yaml_tag' in kwds and kwds['yaml_tag'] is not None:
cls.yaml_loader.add_constructor(cls.yaml_tag, cls.from_yaml)
cls.yaml_dumper.add_representer(cls, cls.to_yaml)
class YAMLObject(object):
"""
An object that can dump itself to a YAML stream
and load itself from a YAML stream.
"""
__metaclass__ = YAMLObjectMetaclass
__slots__ = () # no direct instantiation, so allow immutable subclasses
yaml_loader = Loader
yaml_dumper = Dumper
yaml_tag = None
yaml_flow_style = None
def from_yaml(cls, loader, node):
"""
Convert a representation node to a Python object.
"""
return loader.construct_yaml_object(node, cls)
from_yaml = classmethod(from_yaml)
def to_yaml(cls, dumper, data):
"""
Convert a Python object to a representation node.
"""
return dumper.represent_yaml_object(cls.yaml_tag, data, cls,
flow_style=cls.yaml_flow_style)
to_yaml = classmethod(to_yaml)
|
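The stream-or-string convention used by `dump_all` above (when no stream is given, write into an in-memory buffer and return its contents; otherwise write to the caller's stream and return nothing) can be sketched independently of PyYAML. This is a minimal stdlib-only illustration of the pattern, not PyYAML code; `dump_lines` is a hypothetical name.

```python
import io

def dump_lines(lines, stream=None):
    # Mirrors dump_all's convention: with no stream, buffer the
    # output in memory and return it as a string; with a stream,
    # write into it and implicitly return None.
    getvalue = None
    if stream is None:
        stream = io.StringIO()
        getvalue = stream.getvalue
    for line in lines:
        stream.write(line + "\n")
    if getvalue:
        return getvalue()

result = dump_lines(["a: 1"])  # → "a: 1\n"
```

The same shape explains why `yaml.dump(data)` returns a string while `yaml.dump(data, f)` returns `None`.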
yaml/pyyaml | lib/yaml/__init__.py | YAMLObject.to_yaml | python | def to_yaml(cls, dumper, data):
return dumper.represent_yaml_object(cls.yaml_tag, data, cls,
flow_style=cls.yaml_flow_style) | Convert a Python object to a representation node. | train | https://github.com/yaml/pyyaml/blob/e471e86bf6dabdad45a1438c20a4a5c033eb9034/lib/yaml/__init__.py#L399-L404 | null | class YAMLObject(object):
"""
An object that can dump itself to a YAML stream
and load itself from a YAML stream.
"""
__metaclass__ = YAMLObjectMetaclass
__slots__ = () # no direct instantiation, so allow immutable subclasses
yaml_loader = Loader
yaml_dumper = Dumper
yaml_tag = None
yaml_flow_style = None
def from_yaml(cls, loader, node):
"""
Convert a representation node to a Python object.
"""
return loader.construct_yaml_object(node, cls)
from_yaml = classmethod(from_yaml)
to_yaml = classmethod(to_yaml)
|
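The registration trick behind `YAMLObjectMetaclass` above is that class creation itself wires a subclass into the loader/dumper tables whenever it declares a `yaml_tag`. A stdlib-only sketch of that pattern, assuming an illustrative global `REGISTRY` in place of the real `Loader`/`Dumper` tables (`TaggedMeta` and `Monster` are hypothetical names, not PyYAML's):

```python
# Subclasses that declare a non-None yaml_tag are registered
# automatically at class-creation time, just as YAMLObject
# subclasses are added to their yaml_loader/yaml_dumper.
REGISTRY = {}

class TaggedMeta(type):
    def __init__(cls, name, bases, kwds):
        super().__init__(name, bases, kwds)
        if kwds.get('yaml_tag') is not None:
            REGISTRY[kwds['yaml_tag']] = cls

class Tagged(metaclass=TaggedMeta):
    yaml_tag = None  # base class itself is never registered

class Monster(Tagged):
    yaml_tag = '!Monster'
```

After these definitions, `REGISTRY['!Monster'] is Monster` holds with no explicit registration call, which is exactly why defining a `YAMLObject` subclass is enough to make its tag loadable.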
yaml/pyyaml | lib3/yaml/__init__.py | serialize_all | python | def serialize_all(nodes, stream=None, Dumper=Dumper,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None,
encoding=None, explicit_start=None, explicit_end=None,
version=None, tags=None):
getvalue = None
if stream is None:
if encoding is None:
stream = io.StringIO()
else:
stream = io.BytesIO()
getvalue = stream.getvalue
dumper = Dumper(stream, canonical=canonical, indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break,
encoding=encoding, version=version, tags=tags,
explicit_start=explicit_start, explicit_end=explicit_end)
try:
dumper.open()
for node in nodes:
dumper.serialize(node)
dumper.close()
finally:
dumper.dispose()
if getvalue:
return getvalue() | Serialize a sequence of representation trees into a YAML stream.
If stream is None, return the produced string instead. | train | https://github.com/yaml/pyyaml/blob/e471e86bf6dabdad45a1438c20a4a5c033eb9034/lib3/yaml/__init__.py#L215-L243 | [
"def dispose(self):\n # Reset the state attributes (to clear self-references)\n self.states = []\n self.state = None\n",
"def open(self):\n if self.closed is None:\n self.emit(StreamStartEvent(encoding=self.use_encoding))\n self.closed = False\n elif self.closed:\n raise SerializerError(\"serializer is closed\")\n else:\n raise SerializerError(\"serializer is already opened\")\n",
"def close(self):\n if self.closed is None:\n raise SerializerError(\"serializer is not opened\")\n elif not self.closed:\n self.emit(StreamEndEvent())\n self.closed = True\n",
"def serialize(self, node):\n if self.closed is None:\n raise SerializerError(\"serializer is not opened\")\n elif self.closed:\n raise SerializerError(\"serializer is closed\")\n self.emit(DocumentStartEvent(explicit=self.use_explicit_start,\n version=self.use_version, tags=self.use_tags))\n self.anchor_node(node)\n self.serialize_node(node, None, None)\n self.emit(DocumentEndEvent(explicit=self.use_explicit_end))\n self.serialized_nodes = {}\n self.anchors = {}\n self.last_anchor_id = 0\n"
] |
from .error import *
from .tokens import *
from .events import *
from .nodes import *
from .loader import *
from .dumper import *
__version__ = '5.1'
try:
from .cyaml import *
__with_libyaml__ = True
except ImportError:
__with_libyaml__ = False
import io
#------------------------------------------------------------------------------
# Warnings control
#------------------------------------------------------------------------------
# 'Global' warnings state:
_warnings_enabled = {
'YAMLLoadWarning': True,
}
# Get or set global warnings' state
def warnings(settings=None):
if settings is None:
return _warnings_enabled
if type(settings) is dict:
for key in settings:
if key in _warnings_enabled:
_warnings_enabled[key] = settings[key]
# Warn when load() is called without Loader=...
class YAMLLoadWarning(RuntimeWarning):
pass
def load_warning(method):
if _warnings_enabled['YAMLLoadWarning'] is False:
return
import warnings
message = (
"calling yaml.%s() without Loader=... is deprecated, as the "
"default Loader is unsafe. Please read "
"https://msg.pyyaml.org/load for full details."
) % method
warnings.warn(message, YAMLLoadWarning, stacklevel=3)
#------------------------------------------------------------------------------
def scan(stream, Loader=Loader):
"""
Scan a YAML stream and produce scanning tokens.
"""
loader = Loader(stream)
try:
while loader.check_token():
yield loader.get_token()
finally:
loader.dispose()
def parse(stream, Loader=Loader):
"""
Parse a YAML stream and produce parsing events.
"""
loader = Loader(stream)
try:
while loader.check_event():
yield loader.get_event()
finally:
loader.dispose()
def compose(stream, Loader=Loader):
"""
Parse the first YAML document in a stream
and produce the corresponding representation tree.
"""
loader = Loader(stream)
try:
return loader.get_single_node()
finally:
loader.dispose()
def compose_all(stream, Loader=Loader):
"""
Parse all YAML documents in a stream
and produce corresponding representation trees.
"""
loader = Loader(stream)
try:
while loader.check_node():
yield loader.get_node()
finally:
loader.dispose()
def load(stream, Loader=None):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
"""
if Loader is None:
load_warning('load')
Loader = FullLoader
loader = Loader(stream)
try:
return loader.get_single_data()
finally:
loader.dispose()
def load_all(stream, Loader=None):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
"""
if Loader is None:
load_warning('load_all')
Loader = FullLoader
loader = Loader(stream)
try:
while loader.check_data():
yield loader.get_data()
finally:
loader.dispose()
def full_load(stream):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
Resolve all tags except those known to be
unsafe on untrusted input.
"""
return load(stream, FullLoader)
def full_load_all(stream):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
Resolve all tags except those known to be
unsafe on untrusted input.
"""
return load_all(stream, FullLoader)
def safe_load(stream):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
Resolve only basic YAML tags. This is known
to be safe for untrusted input.
"""
return load(stream, SafeLoader)
def safe_load_all(stream):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
Resolve only basic YAML tags. This is known
to be safe for untrusted input.
"""
return load_all(stream, SafeLoader)
def unsafe_load(stream):
"""
Parse the first YAML document in a stream
and produce the corresponding Python object.
Resolve all tags, even those known to be
unsafe on untrusted input.
"""
return load(stream, UnsafeLoader)
def unsafe_load_all(stream):
"""
Parse all YAML documents in a stream
and produce corresponding Python objects.
Resolve all tags, even those known to be
unsafe on untrusted input.
"""
return load_all(stream, UnsafeLoader)
def emit(events, stream=None, Dumper=Dumper,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None):
"""
Emit YAML parsing events into a stream.
If stream is None, return the produced string instead.
"""
getvalue = None
if stream is None:
stream = io.StringIO()
getvalue = stream.getvalue
dumper = Dumper(stream, canonical=canonical, indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break)
try:
for event in events:
dumper.emit(event)
finally:
dumper.dispose()
if getvalue:
return getvalue()
def serialize(node, stream=None, Dumper=Dumper, **kwds):
"""
Serialize a representation tree into a YAML stream.
If stream is None, return the produced string instead.
"""
return serialize_all([node], stream, Dumper=Dumper, **kwds)
def dump_all(documents, stream=None, Dumper=Dumper,
default_style=None, default_flow_style=False,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None,
encoding=None, explicit_start=None, explicit_end=None,
version=None, tags=None, sort_keys=True):
"""
Serialize a sequence of Python objects into a YAML stream.
If stream is None, return the produced string instead.
"""
getvalue = None
if stream is None:
if encoding is None:
stream = io.StringIO()
else:
stream = io.BytesIO()
getvalue = stream.getvalue
dumper = Dumper(stream, default_style=default_style,
default_flow_style=default_flow_style,
canonical=canonical, indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break,
encoding=encoding, version=version, tags=tags,
explicit_start=explicit_start, explicit_end=explicit_end, sort_keys=sort_keys)
try:
dumper.open()
for data in documents:
dumper.represent(data)
dumper.close()
finally:
dumper.dispose()
if getvalue:
return getvalue()
def dump(data, stream=None, Dumper=Dumper, **kwds):
"""
Serialize a Python object into a YAML stream.
If stream is None, return the produced string instead.
"""
return dump_all([data], stream, Dumper=Dumper, **kwds)
def safe_dump_all(documents, stream=None, **kwds):
"""
Serialize a sequence of Python objects into a YAML stream.
Produce only basic YAML tags.
If stream is None, return the produced string instead.
"""
return dump_all(documents, stream, Dumper=SafeDumper, **kwds)
def safe_dump(data, stream=None, **kwds):
"""
Serialize a Python object into a YAML stream.
Produce only basic YAML tags.
If stream is None, return the produced string instead.
"""
return dump_all([data], stream, Dumper=SafeDumper, **kwds)
def add_implicit_resolver(tag, regexp, first=None,
Loader=Loader, Dumper=Dumper):
"""
Add an implicit scalar detector.
If an implicit scalar value matches the given regexp,
the corresponding tag is assigned to the scalar.
first is a sequence of possible initial characters or None.
"""
Loader.add_implicit_resolver(tag, regexp, first)
Dumper.add_implicit_resolver(tag, regexp, first)
def add_path_resolver(tag, path, kind=None, Loader=Loader, Dumper=Dumper):
"""
Add a path based resolver for the given tag.
A path is a list of keys that forms a path
to a node in the representation tree.
Keys can be string values, integers, or None.
"""
Loader.add_path_resolver(tag, path, kind)
Dumper.add_path_resolver(tag, path, kind)
def add_constructor(tag, constructor, Loader=Loader):
"""
Add a constructor for the given tag.
Constructor is a function that accepts a Loader instance
and a node object and produces the corresponding Python object.
"""
Loader.add_constructor(tag, constructor)
def add_multi_constructor(tag_prefix, multi_constructor, Loader=Loader):
"""
Add a multi-constructor for the given tag prefix.
Multi-constructor is called for a node if its tag starts with tag_prefix.
Multi-constructor accepts a Loader instance, a tag suffix,
and a node object and produces the corresponding Python object.
"""
Loader.add_multi_constructor(tag_prefix, multi_constructor)
def add_representer(data_type, representer, Dumper=Dumper):
"""
Add a representer for the given type.
Representer is a function accepting a Dumper instance
and an instance of the given data type
and producing the corresponding representation node.
"""
Dumper.add_representer(data_type, representer)
def add_multi_representer(data_type, multi_representer, Dumper=Dumper):
"""
Add a representer for the given type.
Multi-representer is a function accepting a Dumper instance
and an instance of the given data type or subtype
and producing the corresponding representation node.
"""
Dumper.add_multi_representer(data_type, multi_representer)
class YAMLObjectMetaclass(type):
"""
The metaclass for YAMLObject.
"""
def __init__(cls, name, bases, kwds):
super(YAMLObjectMetaclass, cls).__init__(name, bases, kwds)
if 'yaml_tag' in kwds and kwds['yaml_tag'] is not None:
cls.yaml_loader.add_constructor(cls.yaml_tag, cls.from_yaml)
cls.yaml_dumper.add_representer(cls, cls.to_yaml)
class YAMLObject(metaclass=YAMLObjectMetaclass):
"""
An object that can dump itself to a YAML stream
and load itself from a YAML stream.
"""
__slots__ = () # no direct instantiation, so allow immutable subclasses
yaml_loader = Loader
yaml_dumper = Dumper
yaml_tag = None
yaml_flow_style = None
@classmethod
def from_yaml(cls, loader, node):
"""
Convert a representation node to a Python object.
"""
return loader.construct_yaml_object(node, cls)
@classmethod
def to_yaml(cls, dumper, data):
"""
Convert a Python object to a representation node.
"""
return dumper.represent_yaml_object(cls.yaml_tag, data, cls,
flow_style=cls.yaml_flow_style)
|
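The `add_constructor` / `add_representer` hooks above all feed per-class dispatch tables keyed by tag or type. The lookup side can be sketched as a plain dictionary dispatch; this is a simplified stand-in, assuming hypothetical module-level names rather than PyYAML's `Loader` class attributes:

```python
# Tag -> constructor dispatch, mirroring what Loader.add_constructor
# populates and what node construction consults during load().
_constructors = {}

def add_constructor(tag, fn):
    _constructors[tag] = fn

def construct(tag, value):
    # Real PyYAML falls back to multi-constructors and defaults;
    # this sketch only handles exact-tag matches.
    return _constructors[tag](value)

add_constructor('tag:yaml.org,2002:int', int)
value = construct('tag:yaml.org,2002:int', '42')  # → 42
```

Registering on `SafeLoader` only (as `safe_load` users do) is the dispatch-table view of "resolve only basic YAML tags".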
yaml/pyyaml | examples/pygments-lexer/yaml.py | something | python | def something(TokenClass):
def callback(lexer, match, context):
text = match.group()
if not text:
return
yield match.start(), TokenClass, text
context.pos = match.end()
return callback | Do not produce empty tokens. | train | https://github.com/yaml/pyyaml/blob/e471e86bf6dabdad45a1438c20a4a5c033eb9034/examples/pygments-lexer/yaml.py#L32-L40 | null |
"""
yaml.py
Lexer for YAML, a human-friendly data serialization language
(http://yaml.org/).
Written by Kirill Simonov <xi@resolvent.net>.
License: Whatever suitable for inclusion into the Pygments package.
"""
from pygments.lexer import \
ExtendedRegexLexer, LexerContext, include, bygroups
from pygments.token import \
Text, Comment, Punctuation, Name, Literal
__all__ = ['YAMLLexer']
class YAMLLexerContext(LexerContext):
"""Indentation context for the YAML lexer."""
def __init__(self, *args, **kwds):
super(YAMLLexerContext, self).__init__(*args, **kwds)
self.indent_stack = []
self.indent = -1
self.next_indent = 0
self.block_scalar_indent = None
def reset_indent(TokenClass):
"""Reset the indentation levels."""
def callback(lexer, match, context):
text = match.group()
context.indent_stack = []
context.indent = -1
context.next_indent = 0
context.block_scalar_indent = None
yield match.start(), TokenClass, text
context.pos = match.end()
return callback
def save_indent(TokenClass, start=False):
"""Save a possible indentation level."""
def callback(lexer, match, context):
text = match.group()
extra = ''
if start:
context.next_indent = len(text)
if context.next_indent < context.indent:
while context.next_indent < context.indent:
context.indent = context.indent_stack.pop()
if context.next_indent > context.indent:
extra = text[context.indent:]
text = text[:context.indent]
else:
context.next_indent += len(text)
if text:
yield match.start(), TokenClass, text
if extra:
yield match.start()+len(text), TokenClass.Error, extra
context.pos = match.end()
return callback
def set_indent(TokenClass, implicit=False):
"""Set the previously saved indentation level."""
def callback(lexer, match, context):
text = match.group()
if context.indent < context.next_indent:
context.indent_stack.append(context.indent)
context.indent = context.next_indent
if not implicit:
context.next_indent += len(text)
yield match.start(), TokenClass, text
context.pos = match.end()
return callback
def set_block_scalar_indent(TokenClass):
"""Set an explicit indentation level for a block scalar."""
def callback(lexer, match, context):
text = match.group()
context.block_scalar_indent = None
if not text:
return
increment = match.group(1)
if increment:
current_indent = max(context.indent, 0)
increment = int(increment)
context.block_scalar_indent = current_indent + increment
if text:
yield match.start(), TokenClass, text
context.pos = match.end()
return callback
def parse_block_scalar_empty_line(IndentTokenClass, ContentTokenClass):
"""Process an empty line in a block scalar."""
def callback(lexer, match, context):
text = match.group()
if (context.block_scalar_indent is None or
len(text) <= context.block_scalar_indent):
if text:
yield match.start(), IndentTokenClass, text
else:
indentation = text[:context.block_scalar_indent]
content = text[context.block_scalar_indent:]
yield match.start(), IndentTokenClass, indentation
yield (match.start()+context.block_scalar_indent,
ContentTokenClass, content)
context.pos = match.end()
return callback
def parse_block_scalar_indent(TokenClass):
"""Process indentation spaces in a block scalar."""
def callback(lexer, match, context):
text = match.group()
if context.block_scalar_indent is None:
if len(text) <= max(context.indent, 0):
context.stack.pop()
context.stack.pop()
return
context.block_scalar_indent = len(text)
else:
if len(text) < context.block_scalar_indent:
context.stack.pop()
context.stack.pop()
return
if text:
yield match.start(), TokenClass, text
context.pos = match.end()
return callback
def parse_plain_scalar_indent(TokenClass):
"""Process indentation spaces in a plain scalar."""
def callback(lexer, match, context):
text = match.group()
if len(text) <= context.indent:
context.stack.pop()
context.stack.pop()
return
if text:
yield match.start(), TokenClass, text
context.pos = match.end()
return callback
class YAMLLexer(ExtendedRegexLexer):
"""Lexer for the YAML language."""
name = 'YAML'
aliases = ['yaml']
filenames = ['*.yaml', '*.yml']
mimetypes = ['text/x-yaml']
tokens = {
# the root rules
'root': [
# ignored whitespaces
(r'[ ]+(?=#|$)', Text.Blank),
# line breaks
(r'\n+', Text.Break),
# a comment
(r'#[^\n]*', Comment.Single),
# the '%YAML' directive
(r'^%YAML(?=[ ]|$)', reset_indent(Name.Directive),
'yaml-directive'),
# the %TAG directive
(r'^%TAG(?=[ ]|$)', reset_indent(Name.Directive),
'tag-directive'),
# document start and document end indicators
(r'^(?:---|\.\.\.)(?=[ ]|$)',
reset_indent(Punctuation.Document), 'block-line'),
# indentation spaces
(r'[ ]*(?![ \t\n\r\f\v]|$)',
save_indent(Text.Indent, start=True),
('block-line', 'indentation')),
],
# trailing whitespaces after directives or a block scalar indicator
'ignored-line': [
# ignored whitespaces
(r'[ ]+(?=#|$)', Text.Blank),
# a comment
(r'#[^\n]*', Comment.Single),
# line break
(r'\n', Text.Break, '#pop:2'),
],
# the %YAML directive
'yaml-directive': [
# the version number
(r'([ ]+)([0-9]+\.[0-9]+)',
bygroups(Text.Blank, Literal.Version), 'ignored-line'),
],
# the %TAG directive
'tag-directive': [
# a tag handle and the corresponding prefix
(r'([ ]+)(!|![0-9A-Za-z_-]*!)'
r'([ ]+)(!|!?[0-9A-Za-z;/?:@&=+$,_.!~*\'()\[\]%-]+)',
bygroups(Text.Blank, Name.Type, Text.Blank, Name.Type),
'ignored-line'),
],
# block scalar indicators and indentation spaces
'indentation': [
# trailing whitespaces are ignored
(r'[ ]*$', something(Text.Blank), '#pop:2'),
# whitespaces preceding block collection indicators
(r'[ ]+(?=[?:-](?:[ ]|$))', save_indent(Text.Indent)),
# block collection indicators
(r'[?:-](?=[ ]|$)', set_indent(Punctuation.Indicator)),
# the beginning of a block line
(r'[ ]*', save_indent(Text.Indent), '#pop'),
],
# an indented line in the block context
'block-line': [
# the line end
(r'[ ]*(?=#|$)', something(Text.Blank), '#pop'),
# whitespaces separating tokens
(r'[ ]+', Text.Blank),
# tags, anchors and aliases,
include('descriptors'),
# block collections and scalars
include('block-nodes'),
# flow collections and quoted scalars
include('flow-nodes'),
# a plain scalar
(r'(?=[^ \t\n\r\f\v?:,\[\]{}#&*!|>\'"%@`-]|[?:-][^ \t\n\r\f\v])',
something(Literal.Scalar.Plain),
'plain-scalar-in-block-context'),
],
# tags, anchors, aliases
'descriptors' : [
# a full-form tag
(r'!<[0-9A-Za-z;/?:@&=+$,_.!~*\'()\[\]%-]+>', Name.Type),
# a tag in the form '!', '!suffix' or '!handle!suffix'
(r'!(?:[0-9A-Za-z_-]+)?'
r'(?:![0-9A-Za-z;/?:@&=+$,_.!~*\'()\[\]%-]+)?', Name.Type),
# an anchor
(r'&[0-9A-Za-z_-]+', Name.Anchor),
# an alias
(r'\*[0-9A-Za-z_-]+', Name.Alias),
],
# block collections and scalars
'block-nodes': [
# implicit key
(r':(?=[ ]|$)', set_indent(Punctuation.Indicator, implicit=True)),
# literal and folded scalars
(r'[|>]', Punctuation.Indicator,
('block-scalar-content', 'block-scalar-header')),
],
# flow collections and quoted scalars
'flow-nodes': [
# a flow sequence
(r'\[', Punctuation.Indicator, 'flow-sequence'),
# a flow mapping
(r'\{', Punctuation.Indicator, 'flow-mapping'),
# a single-quoted scalar
(r'\'', Literal.Scalar.Flow.Quote, 'single-quoted-scalar'),
# a double-quoted scalar
(r'\"', Literal.Scalar.Flow.Quote, 'double-quoted-scalar'),
],
# the content of a flow collection
'flow-collection': [
# whitespaces
(r'[ ]+', Text.Blank),
# line breaks
(r'\n+', Text.Break),
# a comment
(r'#[^\n]*', Comment.Single),
# simple indicators
(r'[?:,]', Punctuation.Indicator),
# tags, anchors and aliases
include('descriptors'),
# nested collections and quoted scalars
include('flow-nodes'),
# a plain scalar
(r'(?=[^ \t\n\r\f\v?:,\[\]{}#&*!|>\'"%@`])',
something(Literal.Scalar.Plain),
'plain-scalar-in-flow-context'),
],
# a flow sequence indicated by '[' and ']'
'flow-sequence': [
# include flow collection rules
include('flow-collection'),
# the closing indicator
(r'\]', Punctuation.Indicator, '#pop'),
],
# a flow mapping indicated by '{' and '}'
'flow-mapping': [
# include flow collection rules
include('flow-collection'),
# the closing indicator
(r'\}', Punctuation.Indicator, '#pop'),
],
# block scalar lines
'block-scalar-content': [
# line break
(r'\n', Text.Break),
# empty line
(r'^[ ]+$',
parse_block_scalar_empty_line(Text.Indent,
Literal.Scalar.Block)),
# indentation spaces (we may leave the state here)
(r'^[ ]*', parse_block_scalar_indent(Text.Indent)),
# line content
(r'[^\n\r\f\v]+', Literal.Scalar.Block),
],
# the content of a literal or folded scalar
'block-scalar-header': [
# indentation indicator followed by chomping flag
(r'([1-9])?[+-]?(?=[ ]|$)',
set_block_scalar_indent(Punctuation.Indicator),
'ignored-line'),
# chomping flag followed by indentation indicator
(r'[+-]?([1-9])?(?=[ ]|$)',
set_block_scalar_indent(Punctuation.Indicator),
'ignored-line'),
],
# ignored and regular whitespaces in quoted scalars
'quoted-scalar-whitespaces': [
# leading and trailing whitespaces are ignored
(r'^[ ]+|[ ]+$', Text.Blank),
# line breaks are ignored
(r'\n+', Text.Break),
# other whitespaces are a part of the value
(r'[ ]+', Literal.Scalar.Flow),
],
# single-quoted scalars
'single-quoted-scalar': [
# include whitespace and line break rules
include('quoted-scalar-whitespaces'),
# escaping of the quote character
(r'\'\'', Literal.Scalar.Flow.Escape),
# regular non-whitespace characters
(r'[^ \t\n\r\f\v\']+', Literal.Scalar.Flow),
# the closing quote
(r'\'', Literal.Scalar.Flow.Quote, '#pop'),
],
# double-quoted scalars
'double-quoted-scalar': [
# include whitespace and line break rules
include('quoted-scalar-whitespaces'),
# escaping of special characters
(r'\\[0abt\tn\nvfre "\\N_LP]', Literal.Scalar.Flow.Escape),
# escape codes
(r'\\(?:x[0-9A-Fa-f]{2}|u[0-9A-Fa-f]{4}|U[0-9A-Fa-f]{8})',
Literal.Scalar.Flow.Escape),
# regular non-whitespace characters
(r'[^ \t\n\r\f\v\"\\]+', Literal.Scalar.Flow),
# the closing quote
(r'"', Literal.Scalar.Flow.Quote, '#pop'),
],
# the beginning of a new line while scanning a plain scalar
'plain-scalar-in-block-context-new-line': [
# empty lines
(r'^[ ]+$', Text.Blank),
# line breaks
(r'\n+', Text.Break),
# document start and document end indicators
(r'^(?=---|\.\.\.)', something(Punctuation.Document), '#pop:3'),
# indentation spaces (we may leave the block line state here)
(r'^[ ]*', parse_plain_scalar_indent(Text.Indent), '#pop'),
],
# a plain scalar in the block context
'plain-scalar-in-block-context': [
# the scalar ends with the ':' indicator
(r'[ ]*(?=:[ ]|:$)', something(Text.Blank), '#pop'),
# the scalar ends with whitespaces followed by a comment
(r'[ ]+(?=#)', Text.Blank, '#pop'),
# trailing whitespaces are ignored
(r'[ ]+$', Text.Blank),
# line breaks are ignored
(r'\n+', Text.Break, 'plain-scalar-in-block-context-new-line'),
# other whitespaces are a part of the value
(r'[ ]+', Literal.Scalar.Plain),
# regular non-whitespace characters
(r'(?::(?![ \t\n\r\f\v])|[^ \t\n\r\f\v:])+',
Literal.Scalar.Plain),
],
# a plain scalar is the flow context
'plain-scalar-in-flow-context': [
# the scalar ends with an indicator character
(r'[ ]*(?=[,:?\[\]{}])', something(Text.Blank), '#pop'),
# the scalar ends with a comment
(r'[ ]+(?=#)', Text.Blank, '#pop'),
# leading and trailing whitespaces are ignored
(r'^[ ]+|[ ]+$', Text.Blank),
# line breaks are ignored
(r'\n+', Text.Break),
# other whitespaces are a part of the value
(r'[ ]+', Literal.Scalar.Plain),
# regular non-whitespace characters
(r'[^ \t\n\r\f\v,:?\[\]{}]+', Literal.Scalar.Plain),
],
}
def get_tokens_unprocessed(self, text=None, context=None):
if context is None:
context = YAMLLexerContext(text, 0)
return super(YAMLLexer, self).get_tokens_unprocessed(text, context)
|
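All the helper factories in this lexer (`something`, `reset_indent`, `set_indent`, ...) follow the same `ExtendedRegexLexer` callback protocol: the returned closure receives a regex match and a mutable context, yields `(start, token, text)` tuples, and advances `context.pos`. A self-contained sketch of that protocol, using an illustrative `Ctx` class and a plain string token instead of Pygments classes:

```python
import re

class Ctx:
    # Minimal stand-in for pygments.lexer.LexerContext.
    def __init__(self):
        self.pos = 0

def something(token):
    """Produce no token at all for an empty match."""
    def callback(lexer, match, context):
        text = match.group()
        if not text:          # empty match: yield nothing, keep pos
            return
        yield match.start(), token, text
        context.pos = match.end()
    return callback

ctx = Ctx()
cb = something('TEXT')
tokens = list(cb(None, re.match(r'[ ]*', '   key'), ctx))
# tokens == [(0, 'TEXT', '   ')], ctx.pos == 3
```

The empty-match guard is the whole point of `something`: a bare `[ ]*$` rule would otherwise emit zero-width tokens, and the state transitions attached to it would still fire.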
yaml/pyyaml | examples/pygments-lexer/yaml.py | set_indent | python | def set_indent(TokenClass, implicit=False):
def callback(lexer, match, context):
text = match.group()
if context.indent < context.next_indent:
context.indent_stack.append(context.indent)
context.indent = context.next_indent
if not implicit:
context.next_indent += len(text)
yield match.start(), TokenClass, text
context.pos = match.end()
return callback | Set the previously saved indentation level. | train | https://github.com/yaml/pyyaml/blob/e471e86bf6dabdad45a1438c20a4a5c033eb9034/examples/pygments-lexer/yaml.py#L76-L87 | null |
"""
yaml.py
Lexer for YAML, a human-friendly data serialization language
(http://yaml.org/).
Written by Kirill Simonov <xi@resolvent.net>.
License: Whatever suitable for inclusion into the Pygments package.
"""
from pygments.lexer import \
ExtendedRegexLexer, LexerContext, include, bygroups
from pygments.token import \
Text, Comment, Punctuation, Name, Literal
__all__ = ['YAMLLexer']
class YAMLLexerContext(LexerContext):
"""Indentation context for the YAML lexer."""
def __init__(self, *args, **kwds):
super(YAMLLexerContext, self).__init__(*args, **kwds)
self.indent_stack = []
self.indent = -1
self.next_indent = 0
self.block_scalar_indent = None
def something(TokenClass):
"""Do not produce empty tokens."""
def callback(lexer, match, context):
text = match.group()
if not text:
return
yield match.start(), TokenClass, text
context.pos = match.end()
return callback
def reset_indent(TokenClass):
"""Reset the indentation levels."""
def callback(lexer, match, context):
text = match.group()
context.indent_stack = []
context.indent = -1
context.next_indent = 0
context.block_scalar_indent = None
yield match.start(), TokenClass, text
context.pos = match.end()
return callback
def save_indent(TokenClass, start=False):
"""Save a possible indentation level."""
def callback(lexer, match, context):
text = match.group()
extra = ''
if start:
context.next_indent = len(text)
if context.next_indent < context.indent:
while context.next_indent < context.indent:
context.indent = context.indent_stack.pop()
if context.next_indent > context.indent:
extra = text[context.indent:]
text = text[:context.indent]
else:
context.next_indent += len(text)
if text:
yield match.start(), TokenClass, text
if extra:
yield match.start()+len(text), TokenClass.Error, extra
context.pos = match.end()
return callback
def set_block_scalar_indent(TokenClass):
"""Set an explicit indentation level for a block scalar."""
def callback(lexer, match, context):
text = match.group()
context.block_scalar_indent = None
if not text:
return
increment = match.group(1)
if increment:
current_indent = max(context.indent, 0)
increment = int(increment)
context.block_scalar_indent = current_indent + increment
if text:
yield match.start(), TokenClass, text
context.pos = match.end()
return callback
def parse_block_scalar_empty_line(IndentTokenClass, ContentTokenClass):
"""Process an empty line in a block scalar."""
def callback(lexer, match, context):
text = match.group()
if (context.block_scalar_indent is None or
len(text) <= context.block_scalar_indent):
if text:
yield match.start(), IndentTokenClass, text
else:
indentation = text[:context.block_scalar_indent]
content = text[context.block_scalar_indent:]
yield match.start(), IndentTokenClass, indentation
yield (match.start()+context.block_scalar_indent,
ContentTokenClass, content)
context.pos = match.end()
return callback
def parse_block_scalar_indent(TokenClass):
"""Process indentation spaces in a block scalar."""
def callback(lexer, match, context):
text = match.group()
if context.block_scalar_indent is None:
if len(text) <= max(context.indent, 0):
context.stack.pop()
context.stack.pop()
return
context.block_scalar_indent = len(text)
else:
if len(text) < context.block_scalar_indent:
context.stack.pop()
context.stack.pop()
return
if text:
yield match.start(), TokenClass, text
context.pos = match.end()
return callback
def parse_plain_scalar_indent(TokenClass):
"""Process indentation spaces in a plain scalar."""
def callback(lexer, match, context):
text = match.group()
if len(text) <= context.indent:
context.stack.pop()
context.stack.pop()
return
if text:
yield match.start(), TokenClass, text
context.pos = match.end()
return callback
class YAMLLexer(ExtendedRegexLexer):
"""Lexer for the YAML language."""
name = 'YAML'
aliases = ['yaml']
filenames = ['*.yaml', '*.yml']
mimetypes = ['text/x-yaml']
tokens = {
# the root rules
'root': [
# ignored whitespaces
(r'[ ]+(?=#|$)', Text.Blank),
# line breaks
(r'\n+', Text.Break),
# a comment
(r'#[^\n]*', Comment.Single),
# the '%YAML' directive
(r'^%YAML(?=[ ]|$)', reset_indent(Name.Directive),
'yaml-directive'),
# the %TAG directive
(r'^%TAG(?=[ ]|$)', reset_indent(Name.Directive),
'tag-directive'),
# document start and document end indicators
(r'^(?:---|\.\.\.)(?=[ ]|$)',
reset_indent(Punctuation.Document), 'block-line'),
# indentation spaces
(r'[ ]*(?![ \t\n\r\f\v]|$)',
save_indent(Text.Indent, start=True),
('block-line', 'indentation')),
],
# trailing whitespaces after directives or a block scalar indicator
'ignored-line': [
# ignored whitespaces
(r'[ ]+(?=#|$)', Text.Blank),
# a comment
(r'#[^\n]*', Comment.Single),
# line break
(r'\n', Text.Break, '#pop:2'),
],
# the %YAML directive
'yaml-directive': [
# the version number
(r'([ ]+)([0-9]+\.[0-9]+)',
bygroups(Text.Blank, Literal.Version), 'ignored-line'),
],
        # the %TAG directive
'tag-directive': [
# a tag handle and the corresponding prefix
(r'([ ]+)(!|![0-9A-Za-z_-]*!)'
r'([ ]+)(!|!?[0-9A-Za-z;/?:@&=+$,_.!~*\'()\[\]%-]+)',
bygroups(Text.Blank, Name.Type, Text.Blank, Name.Type),
'ignored-line'),
],
# block scalar indicators and indentation spaces
'indentation': [
# trailing whitespaces are ignored
(r'[ ]*$', something(Text.Blank), '#pop:2'),
# whitespaces preceding block collection indicators
(r'[ ]+(?=[?:-](?:[ ]|$))', save_indent(Text.Indent)),
# block collection indicators
(r'[?:-](?=[ ]|$)', set_indent(Punctuation.Indicator)),
            # the beginning of a block line
(r'[ ]*', save_indent(Text.Indent), '#pop'),
],
# an indented line in the block context
'block-line': [
# the line end
(r'[ ]*(?=#|$)', something(Text.Blank), '#pop'),
# whitespaces separating tokens
(r'[ ]+', Text.Blank),
# tags, anchors and aliases,
include('descriptors'),
# block collections and scalars
include('block-nodes'),
# flow collections and quoted scalars
include('flow-nodes'),
# a plain scalar
(r'(?=[^ \t\n\r\f\v?:,\[\]{}#&*!|>\'"%@`-]|[?:-][^ \t\n\r\f\v])',
something(Literal.Scalar.Plain),
'plain-scalar-in-block-context'),
],
# tags, anchors, aliases
'descriptors' : [
# a full-form tag
(r'!<[0-9A-Za-z;/?:@&=+$,_.!~*\'()\[\]%-]+>', Name.Type),
# a tag in the form '!', '!suffix' or '!handle!suffix'
(r'!(?:[0-9A-Za-z_-]+)?'
r'(?:![0-9A-Za-z;/?:@&=+$,_.!~*\'()\[\]%-]+)?', Name.Type),
# an anchor
(r'&[0-9A-Za-z_-]+', Name.Anchor),
# an alias
(r'\*[0-9A-Za-z_-]+', Name.Alias),
],
# block collections and scalars
'block-nodes': [
# implicit key
(r':(?=[ ]|$)', set_indent(Punctuation.Indicator, implicit=True)),
# literal and folded scalars
(r'[|>]', Punctuation.Indicator,
('block-scalar-content', 'block-scalar-header')),
],
# flow collections and quoted scalars
'flow-nodes': [
# a flow sequence
(r'\[', Punctuation.Indicator, 'flow-sequence'),
# a flow mapping
(r'\{', Punctuation.Indicator, 'flow-mapping'),
# a single-quoted scalar
(r'\'', Literal.Scalar.Flow.Quote, 'single-quoted-scalar'),
# a double-quoted scalar
(r'\"', Literal.Scalar.Flow.Quote, 'double-quoted-scalar'),
],
# the content of a flow collection
'flow-collection': [
# whitespaces
(r'[ ]+', Text.Blank),
# line breaks
(r'\n+', Text.Break),
# a comment
(r'#[^\n]*', Comment.Single),
# simple indicators
(r'[?:,]', Punctuation.Indicator),
# tags, anchors and aliases
include('descriptors'),
# nested collections and quoted scalars
include('flow-nodes'),
# a plain scalar
(r'(?=[^ \t\n\r\f\v?:,\[\]{}#&*!|>\'"%@`])',
something(Literal.Scalar.Plain),
'plain-scalar-in-flow-context'),
],
# a flow sequence indicated by '[' and ']'
'flow-sequence': [
# include flow collection rules
include('flow-collection'),
# the closing indicator
(r'\]', Punctuation.Indicator, '#pop'),
],
# a flow mapping indicated by '{' and '}'
'flow-mapping': [
# include flow collection rules
include('flow-collection'),
# the closing indicator
(r'\}', Punctuation.Indicator, '#pop'),
],
# block scalar lines
'block-scalar-content': [
# line break
(r'\n', Text.Break),
# empty line
(r'^[ ]+$',
parse_block_scalar_empty_line(Text.Indent,
Literal.Scalar.Block)),
# indentation spaces (we may leave the state here)
(r'^[ ]*', parse_block_scalar_indent(Text.Indent)),
# line content
(r'[^\n\r\f\v]+', Literal.Scalar.Block),
],
        # the header of a literal or folded scalar
'block-scalar-header': [
# indentation indicator followed by chomping flag
(r'([1-9])?[+-]?(?=[ ]|$)',
set_block_scalar_indent(Punctuation.Indicator),
'ignored-line'),
# chomping flag followed by indentation indicator
(r'[+-]?([1-9])?(?=[ ]|$)',
set_block_scalar_indent(Punctuation.Indicator),
'ignored-line'),
],
# ignored and regular whitespaces in quoted scalars
'quoted-scalar-whitespaces': [
# leading and trailing whitespaces are ignored
(r'^[ ]+|[ ]+$', Text.Blank),
# line breaks are ignored
(r'\n+', Text.Break),
# other whitespaces are a part of the value
(r'[ ]+', Literal.Scalar.Flow),
],
# single-quoted scalars
'single-quoted-scalar': [
# include whitespace and line break rules
include('quoted-scalar-whitespaces'),
# escaping of the quote character
(r'\'\'', Literal.Scalar.Flow.Escape),
# regular non-whitespace characters
(r'[^ \t\n\r\f\v\']+', Literal.Scalar.Flow),
# the closing quote
(r'\'', Literal.Scalar.Flow.Quote, '#pop'),
],
# double-quoted scalars
'double-quoted-scalar': [
# include whitespace and line break rules
include('quoted-scalar-whitespaces'),
# escaping of special characters
(r'\\[0abt\tn\nvfre "\\N_LP]', Literal.Scalar.Flow.Escape),
# escape codes
(r'\\(?:x[0-9A-Fa-f]{2}|u[0-9A-Fa-f]{4}|U[0-9A-Fa-f]{8})',
Literal.Scalar.Flow.Escape),
# regular non-whitespace characters
(r'[^ \t\n\r\f\v\"\\]+', Literal.Scalar.Flow),
# the closing quote
(r'"', Literal.Scalar.Flow.Quote, '#pop'),
],
# the beginning of a new line while scanning a plain scalar
'plain-scalar-in-block-context-new-line': [
# empty lines
(r'^[ ]+$', Text.Blank),
# line breaks
(r'\n+', Text.Break),
# document start and document end indicators
(r'^(?=---|\.\.\.)', something(Punctuation.Document), '#pop:3'),
# indentation spaces (we may leave the block line state here)
(r'^[ ]*', parse_plain_scalar_indent(Text.Indent), '#pop'),
],
# a plain scalar in the block context
'plain-scalar-in-block-context': [
# the scalar ends with the ':' indicator
(r'[ ]*(?=:[ ]|:$)', something(Text.Blank), '#pop'),
# the scalar ends with whitespaces followed by a comment
(r'[ ]+(?=#)', Text.Blank, '#pop'),
# trailing whitespaces are ignored
(r'[ ]+$', Text.Blank),
# line breaks are ignored
(r'\n+', Text.Break, 'plain-scalar-in-block-context-new-line'),
# other whitespaces are a part of the value
(r'[ ]+', Literal.Scalar.Plain),
# regular non-whitespace characters
(r'(?::(?![ \t\n\r\f\v])|[^ \t\n\r\f\v:])+',
Literal.Scalar.Plain),
],
        # a plain scalar in the flow context
'plain-scalar-in-flow-context': [
# the scalar ends with an indicator character
(r'[ ]*(?=[,:?\[\]{}])', something(Text.Blank), '#pop'),
# the scalar ends with a comment
(r'[ ]+(?=#)', Text.Blank, '#pop'),
# leading and trailing whitespaces are ignored
(r'^[ ]+|[ ]+$', Text.Blank),
# line breaks are ignored
(r'\n+', Text.Break),
# other whitespaces are a part of the value
(r'[ ]+', Literal.Scalar.Plain),
# regular non-whitespace characters
(r'[^ \t\n\r\f\v,:?\[\]{}]+', Literal.Scalar.Plain),
],
}
def get_tokens_unprocessed(self, text=None, context=None):
if context is None:
context = YAMLLexerContext(text, 0)
return super(YAMLLexer, self).get_tokens_unprocessed(text, context)
|
yaml/pyyaml | examples/pygments-lexer/yaml.py | parse_block_scalar_empty_line | python | def parse_block_scalar_empty_line(IndentTokenClass, ContentTokenClass):
def callback(lexer, match, context):
text = match.group()
if (context.block_scalar_indent is None or
len(text) <= context.block_scalar_indent):
if text:
yield match.start(), IndentTokenClass, text
else:
indentation = text[:context.block_scalar_indent]
content = text[context.block_scalar_indent:]
yield match.start(), IndentTokenClass, indentation
yield (match.start()+context.block_scalar_indent,
ContentTokenClass, content)
context.pos = match.end()
return callback | Process an empty line in a block scalar. | train | https://github.com/yaml/pyyaml/blob/e471e86bf6dabdad45a1438c20a4a5c033eb9034/examples/pygments-lexer/yaml.py#L106-L121 | null |
"""
yaml.py
Lexer for YAML, a human-friendly data serialization language
(http://yaml.org/).
Written by Kirill Simonov <xi@resolvent.net>.
License: Whatever suitable for inclusion into the Pygments package.
"""
from pygments.lexer import \
ExtendedRegexLexer, LexerContext, include, bygroups
from pygments.token import \
Text, Comment, Punctuation, Name, Literal
__all__ = ['YAMLLexer']
class YAMLLexerContext(LexerContext):
"""Indentation context for the YAML lexer."""
def __init__(self, *args, **kwds):
super(YAMLLexerContext, self).__init__(*args, **kwds)
self.indent_stack = []
self.indent = -1
self.next_indent = 0
self.block_scalar_indent = None
def something(TokenClass):
"""Do not produce empty tokens."""
def callback(lexer, match, context):
text = match.group()
if not text:
return
yield match.start(), TokenClass, text
context.pos = match.end()
return callback
def reset_indent(TokenClass):
"""Reset the indentation levels."""
def callback(lexer, match, context):
text = match.group()
context.indent_stack = []
context.indent = -1
context.next_indent = 0
context.block_scalar_indent = None
yield match.start(), TokenClass, text
context.pos = match.end()
return callback
def save_indent(TokenClass, start=False):
"""Save a possible indentation level."""
def callback(lexer, match, context):
text = match.group()
extra = ''
if start:
context.next_indent = len(text)
if context.next_indent < context.indent:
while context.next_indent < context.indent:
context.indent = context.indent_stack.pop()
if context.next_indent > context.indent:
extra = text[context.indent:]
text = text[:context.indent]
else:
context.next_indent += len(text)
if text:
yield match.start(), TokenClass, text
if extra:
yield match.start()+len(text), TokenClass.Error, extra
context.pos = match.end()
return callback
def set_indent(TokenClass, implicit=False):
"""Set the previously saved indentation level."""
def callback(lexer, match, context):
text = match.group()
if context.indent < context.next_indent:
context.indent_stack.append(context.indent)
context.indent = context.next_indent
if not implicit:
context.next_indent += len(text)
yield match.start(), TokenClass, text
context.pos = match.end()
return callback
def set_block_scalar_indent(TokenClass):
"""Set an explicit indentation level for a block scalar."""
def callback(lexer, match, context):
text = match.group()
context.block_scalar_indent = None
if not text:
return
increment = match.group(1)
if increment:
current_indent = max(context.indent, 0)
increment = int(increment)
context.block_scalar_indent = current_indent + increment
if text:
yield match.start(), TokenClass, text
context.pos = match.end()
return callback
def parse_block_scalar_indent(TokenClass):
"""Process indentation spaces in a block scalar."""
def callback(lexer, match, context):
text = match.group()
if context.block_scalar_indent is None:
if len(text) <= max(context.indent, 0):
context.stack.pop()
context.stack.pop()
return
context.block_scalar_indent = len(text)
else:
if len(text) < context.block_scalar_indent:
context.stack.pop()
context.stack.pop()
return
if text:
yield match.start(), TokenClass, text
context.pos = match.end()
return callback
def parse_plain_scalar_indent(TokenClass):
"""Process indentation spaces in a plain scalar."""
def callback(lexer, match, context):
text = match.group()
if len(text) <= context.indent:
context.stack.pop()
context.stack.pop()
return
if text:
yield match.start(), TokenClass, text
context.pos = match.end()
return callback
class YAMLLexer(ExtendedRegexLexer):
"""Lexer for the YAML language."""
name = 'YAML'
aliases = ['yaml']
filenames = ['*.yaml', '*.yml']
mimetypes = ['text/x-yaml']
tokens = {
# the root rules
'root': [
# ignored whitespaces
(r'[ ]+(?=#|$)', Text.Blank),
# line breaks
(r'\n+', Text.Break),
# a comment
(r'#[^\n]*', Comment.Single),
# the '%YAML' directive
(r'^%YAML(?=[ ]|$)', reset_indent(Name.Directive),
'yaml-directive'),
# the %TAG directive
(r'^%TAG(?=[ ]|$)', reset_indent(Name.Directive),
'tag-directive'),
# document start and document end indicators
(r'^(?:---|\.\.\.)(?=[ ]|$)',
reset_indent(Punctuation.Document), 'block-line'),
# indentation spaces
(r'[ ]*(?![ \t\n\r\f\v]|$)',
save_indent(Text.Indent, start=True),
('block-line', 'indentation')),
],
# trailing whitespaces after directives or a block scalar indicator
'ignored-line': [
# ignored whitespaces
(r'[ ]+(?=#|$)', Text.Blank),
# a comment
(r'#[^\n]*', Comment.Single),
# line break
(r'\n', Text.Break, '#pop:2'),
],
# the %YAML directive
'yaml-directive': [
# the version number
(r'([ ]+)([0-9]+\.[0-9]+)',
bygroups(Text.Blank, Literal.Version), 'ignored-line'),
],
        # the %TAG directive
'tag-directive': [
# a tag handle and the corresponding prefix
(r'([ ]+)(!|![0-9A-Za-z_-]*!)'
r'([ ]+)(!|!?[0-9A-Za-z;/?:@&=+$,_.!~*\'()\[\]%-]+)',
bygroups(Text.Blank, Name.Type, Text.Blank, Name.Type),
'ignored-line'),
],
# block scalar indicators and indentation spaces
'indentation': [
# trailing whitespaces are ignored
(r'[ ]*$', something(Text.Blank), '#pop:2'),
# whitespaces preceding block collection indicators
(r'[ ]+(?=[?:-](?:[ ]|$))', save_indent(Text.Indent)),
# block collection indicators
(r'[?:-](?=[ ]|$)', set_indent(Punctuation.Indicator)),
            # the beginning of a block line
(r'[ ]*', save_indent(Text.Indent), '#pop'),
],
# an indented line in the block context
'block-line': [
# the line end
(r'[ ]*(?=#|$)', something(Text.Blank), '#pop'),
# whitespaces separating tokens
(r'[ ]+', Text.Blank),
# tags, anchors and aliases,
include('descriptors'),
# block collections and scalars
include('block-nodes'),
# flow collections and quoted scalars
include('flow-nodes'),
# a plain scalar
(r'(?=[^ \t\n\r\f\v?:,\[\]{}#&*!|>\'"%@`-]|[?:-][^ \t\n\r\f\v])',
something(Literal.Scalar.Plain),
'plain-scalar-in-block-context'),
],
# tags, anchors, aliases
'descriptors' : [
# a full-form tag
(r'!<[0-9A-Za-z;/?:@&=+$,_.!~*\'()\[\]%-]+>', Name.Type),
# a tag in the form '!', '!suffix' or '!handle!suffix'
(r'!(?:[0-9A-Za-z_-]+)?'
r'(?:![0-9A-Za-z;/?:@&=+$,_.!~*\'()\[\]%-]+)?', Name.Type),
# an anchor
(r'&[0-9A-Za-z_-]+', Name.Anchor),
# an alias
(r'\*[0-9A-Za-z_-]+', Name.Alias),
],
# block collections and scalars
'block-nodes': [
# implicit key
(r':(?=[ ]|$)', set_indent(Punctuation.Indicator, implicit=True)),
# literal and folded scalars
(r'[|>]', Punctuation.Indicator,
('block-scalar-content', 'block-scalar-header')),
],
# flow collections and quoted scalars
'flow-nodes': [
# a flow sequence
(r'\[', Punctuation.Indicator, 'flow-sequence'),
# a flow mapping
(r'\{', Punctuation.Indicator, 'flow-mapping'),
# a single-quoted scalar
(r'\'', Literal.Scalar.Flow.Quote, 'single-quoted-scalar'),
# a double-quoted scalar
(r'\"', Literal.Scalar.Flow.Quote, 'double-quoted-scalar'),
],
# the content of a flow collection
'flow-collection': [
# whitespaces
(r'[ ]+', Text.Blank),
# line breaks
(r'\n+', Text.Break),
# a comment
(r'#[^\n]*', Comment.Single),
# simple indicators
(r'[?:,]', Punctuation.Indicator),
# tags, anchors and aliases
include('descriptors'),
# nested collections and quoted scalars
include('flow-nodes'),
# a plain scalar
(r'(?=[^ \t\n\r\f\v?:,\[\]{}#&*!|>\'"%@`])',
something(Literal.Scalar.Plain),
'plain-scalar-in-flow-context'),
],
# a flow sequence indicated by '[' and ']'
'flow-sequence': [
# include flow collection rules
include('flow-collection'),
# the closing indicator
(r'\]', Punctuation.Indicator, '#pop'),
],
# a flow mapping indicated by '{' and '}'
'flow-mapping': [
# include flow collection rules
include('flow-collection'),
# the closing indicator
(r'\}', Punctuation.Indicator, '#pop'),
],
# block scalar lines
'block-scalar-content': [
# line break
(r'\n', Text.Break),
# empty line
(r'^[ ]+$',
parse_block_scalar_empty_line(Text.Indent,
Literal.Scalar.Block)),
# indentation spaces (we may leave the state here)
(r'^[ ]*', parse_block_scalar_indent(Text.Indent)),
# line content
(r'[^\n\r\f\v]+', Literal.Scalar.Block),
],
        # the header of a literal or folded scalar
'block-scalar-header': [
# indentation indicator followed by chomping flag
(r'([1-9])?[+-]?(?=[ ]|$)',
set_block_scalar_indent(Punctuation.Indicator),
'ignored-line'),
# chomping flag followed by indentation indicator
(r'[+-]?([1-9])?(?=[ ]|$)',
set_block_scalar_indent(Punctuation.Indicator),
'ignored-line'),
],
# ignored and regular whitespaces in quoted scalars
'quoted-scalar-whitespaces': [
# leading and trailing whitespaces are ignored
(r'^[ ]+|[ ]+$', Text.Blank),
# line breaks are ignored
(r'\n+', Text.Break),
# other whitespaces are a part of the value
(r'[ ]+', Literal.Scalar.Flow),
],
# single-quoted scalars
'single-quoted-scalar': [
# include whitespace and line break rules
include('quoted-scalar-whitespaces'),
# escaping of the quote character
(r'\'\'', Literal.Scalar.Flow.Escape),
# regular non-whitespace characters
(r'[^ \t\n\r\f\v\']+', Literal.Scalar.Flow),
# the closing quote
(r'\'', Literal.Scalar.Flow.Quote, '#pop'),
],
# double-quoted scalars
'double-quoted-scalar': [
# include whitespace and line break rules
include('quoted-scalar-whitespaces'),
# escaping of special characters
(r'\\[0abt\tn\nvfre "\\N_LP]', Literal.Scalar.Flow.Escape),
# escape codes
(r'\\(?:x[0-9A-Fa-f]{2}|u[0-9A-Fa-f]{4}|U[0-9A-Fa-f]{8})',
Literal.Scalar.Flow.Escape),
# regular non-whitespace characters
(r'[^ \t\n\r\f\v\"\\]+', Literal.Scalar.Flow),
# the closing quote
(r'"', Literal.Scalar.Flow.Quote, '#pop'),
],
# the beginning of a new line while scanning a plain scalar
'plain-scalar-in-block-context-new-line': [
# empty lines
(r'^[ ]+$', Text.Blank),
# line breaks
(r'\n+', Text.Break),
# document start and document end indicators
(r'^(?=---|\.\.\.)', something(Punctuation.Document), '#pop:3'),
# indentation spaces (we may leave the block line state here)
(r'^[ ]*', parse_plain_scalar_indent(Text.Indent), '#pop'),
],
# a plain scalar in the block context
'plain-scalar-in-block-context': [
# the scalar ends with the ':' indicator
(r'[ ]*(?=:[ ]|:$)', something(Text.Blank), '#pop'),
# the scalar ends with whitespaces followed by a comment
(r'[ ]+(?=#)', Text.Blank, '#pop'),
# trailing whitespaces are ignored
(r'[ ]+$', Text.Blank),
# line breaks are ignored
(r'\n+', Text.Break, 'plain-scalar-in-block-context-new-line'),
# other whitespaces are a part of the value
(r'[ ]+', Literal.Scalar.Plain),
# regular non-whitespace characters
(r'(?::(?![ \t\n\r\f\v])|[^ \t\n\r\f\v:])+',
Literal.Scalar.Plain),
],
        # a plain scalar in the flow context
'plain-scalar-in-flow-context': [
# the scalar ends with an indicator character
(r'[ ]*(?=[,:?\[\]{}])', something(Text.Blank), '#pop'),
# the scalar ends with a comment
(r'[ ]+(?=#)', Text.Blank, '#pop'),
# leading and trailing whitespaces are ignored
(r'^[ ]+|[ ]+$', Text.Blank),
# line breaks are ignored
(r'\n+', Text.Break),
# other whitespaces are a part of the value
(r'[ ]+', Literal.Scalar.Plain),
# regular non-whitespace characters
(r'[^ \t\n\r\f\v,:?\[\]{}]+', Literal.Scalar.Plain),
],
}
def get_tokens_unprocessed(self, text=None, context=None):
if context is None:
context = YAMLLexerContext(text, 0)
return super(YAMLLexer, self).get_tokens_unprocessed(text, context)
|
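The block-scalar helpers in the record above hinge on one splitting rule: once `block_scalar_indent` is known, an all-space line is divided into an indentation slice and a content slice. A minimal self-contained sketch of that rule (plain Python, no Pygments dependency; the function name is illustrative):

```python
def split_empty_line(text, block_scalar_indent):
    # Mirrors parse_block_scalar_empty_line: if the scalar's indent is
    # still unknown, or the line is no wider than it, the whole line is
    # indentation; otherwise the excess spaces belong to the content.
    if block_scalar_indent is None or len(text) <= block_scalar_indent:
        return text, ''
    return text[:block_scalar_indent], text[block_scalar_indent:]

# A 4-space empty line inside a block scalar indented by 2 columns:
print(split_empty_line('    ', 2))     # ('  ', '  ')
# Indent not yet determined: everything counts as indentation.
print(split_empty_line('    ', None))  # ('    ', '')
```

In the real lexer the two slices are then emitted as separate `Text.Indent` and `Literal.Scalar.Block` tokens.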
yaml/pyyaml | examples/pygments-lexer/yaml.py | parse_plain_scalar_indent | python | def parse_plain_scalar_indent(TokenClass):
def callback(lexer, match, context):
text = match.group()
if len(text) <= context.indent:
context.stack.pop()
context.stack.pop()
return
if text:
yield match.start(), TokenClass, text
context.pos = match.end()
return callback | Process indentation spaces in a plain scalar. | train | https://github.com/yaml/pyyaml/blob/e471e86bf6dabdad45a1438c20a4a5c033eb9034/examples/pygments-lexer/yaml.py#L143-L154 | null |
"""
yaml.py
Lexer for YAML, a human-friendly data serialization language
(http://yaml.org/).
Written by Kirill Simonov <xi@resolvent.net>.
License: Whatever suitable for inclusion into the Pygments package.
"""
from pygments.lexer import \
ExtendedRegexLexer, LexerContext, include, bygroups
from pygments.token import \
Text, Comment, Punctuation, Name, Literal
__all__ = ['YAMLLexer']
class YAMLLexerContext(LexerContext):
"""Indentation context for the YAML lexer."""
def __init__(self, *args, **kwds):
super(YAMLLexerContext, self).__init__(*args, **kwds)
self.indent_stack = []
self.indent = -1
self.next_indent = 0
self.block_scalar_indent = None
def something(TokenClass):
"""Do not produce empty tokens."""
def callback(lexer, match, context):
text = match.group()
if not text:
return
yield match.start(), TokenClass, text
context.pos = match.end()
return callback
def reset_indent(TokenClass):
"""Reset the indentation levels."""
def callback(lexer, match, context):
text = match.group()
context.indent_stack = []
context.indent = -1
context.next_indent = 0
context.block_scalar_indent = None
yield match.start(), TokenClass, text
context.pos = match.end()
return callback
def save_indent(TokenClass, start=False):
"""Save a possible indentation level."""
def callback(lexer, match, context):
text = match.group()
extra = ''
if start:
context.next_indent = len(text)
if context.next_indent < context.indent:
while context.next_indent < context.indent:
context.indent = context.indent_stack.pop()
if context.next_indent > context.indent:
extra = text[context.indent:]
text = text[:context.indent]
else:
context.next_indent += len(text)
if text:
yield match.start(), TokenClass, text
if extra:
yield match.start()+len(text), TokenClass.Error, extra
context.pos = match.end()
return callback
def set_indent(TokenClass, implicit=False):
"""Set the previously saved indentation level."""
def callback(lexer, match, context):
text = match.group()
if context.indent < context.next_indent:
context.indent_stack.append(context.indent)
context.indent = context.next_indent
if not implicit:
context.next_indent += len(text)
yield match.start(), TokenClass, text
context.pos = match.end()
return callback
def set_block_scalar_indent(TokenClass):
"""Set an explicit indentation level for a block scalar."""
def callback(lexer, match, context):
text = match.group()
context.block_scalar_indent = None
if not text:
return
increment = match.group(1)
if increment:
current_indent = max(context.indent, 0)
increment = int(increment)
context.block_scalar_indent = current_indent + increment
if text:
yield match.start(), TokenClass, text
context.pos = match.end()
return callback
def parse_block_scalar_empty_line(IndentTokenClass, ContentTokenClass):
"""Process an empty line in a block scalar."""
def callback(lexer, match, context):
text = match.group()
if (context.block_scalar_indent is None or
len(text) <= context.block_scalar_indent):
if text:
yield match.start(), IndentTokenClass, text
else:
indentation = text[:context.block_scalar_indent]
content = text[context.block_scalar_indent:]
yield match.start(), IndentTokenClass, indentation
yield (match.start()+context.block_scalar_indent,
ContentTokenClass, content)
context.pos = match.end()
return callback
def parse_block_scalar_indent(TokenClass):
"""Process indentation spaces in a block scalar."""
def callback(lexer, match, context):
text = match.group()
if context.block_scalar_indent is None:
if len(text) <= max(context.indent, 0):
context.stack.pop()
context.stack.pop()
return
context.block_scalar_indent = len(text)
else:
if len(text) < context.block_scalar_indent:
context.stack.pop()
context.stack.pop()
return
if text:
yield match.start(), TokenClass, text
context.pos = match.end()
return callback
class YAMLLexer(ExtendedRegexLexer):
"""Lexer for the YAML language."""
name = 'YAML'
aliases = ['yaml']
filenames = ['*.yaml', '*.yml']
mimetypes = ['text/x-yaml']
tokens = {
# the root rules
'root': [
# ignored whitespaces
(r'[ ]+(?=#|$)', Text.Blank),
# line breaks
(r'\n+', Text.Break),
# a comment
(r'#[^\n]*', Comment.Single),
# the '%YAML' directive
(r'^%YAML(?=[ ]|$)', reset_indent(Name.Directive),
'yaml-directive'),
# the %TAG directive
(r'^%TAG(?=[ ]|$)', reset_indent(Name.Directive),
'tag-directive'),
# document start and document end indicators
(r'^(?:---|\.\.\.)(?=[ ]|$)',
reset_indent(Punctuation.Document), 'block-line'),
# indentation spaces
(r'[ ]*(?![ \t\n\r\f\v]|$)',
save_indent(Text.Indent, start=True),
('block-line', 'indentation')),
],
# trailing whitespaces after directives or a block scalar indicator
'ignored-line': [
# ignored whitespaces
(r'[ ]+(?=#|$)', Text.Blank),
# a comment
(r'#[^\n]*', Comment.Single),
# line break
(r'\n', Text.Break, '#pop:2'),
],
# the %YAML directive
'yaml-directive': [
# the version number
(r'([ ]+)([0-9]+\.[0-9]+)',
bygroups(Text.Blank, Literal.Version), 'ignored-line'),
],
# the %YAG directive
'tag-directive': [
# a tag handle and the corresponding prefix
(r'([ ]+)(!|![0-9A-Za-z_-]*!)'
r'([ ]+)(!|!?[0-9A-Za-z;/?:@&=+$,_.!~*\'()\[\]%-]+)',
bygroups(Text.Blank, Name.Type, Text.Blank, Name.Type),
'ignored-line'),
],
# block scalar indicators and indentation spaces
'indentation': [
# trailing whitespaces are ignored
(r'[ ]*$', something(Text.Blank), '#pop:2'),
# whitespaces preceding block collection indicators
(r'[ ]+(?=[?:-](?:[ ]|$))', save_indent(Text.Indent)),
# block collection indicators
(r'[?:-](?=[ ]|$)', set_indent(Punctuation.Indicator)),
# the beginning a block line
(r'[ ]*', save_indent(Text.Indent), '#pop'),
],
# an indented line in the block context
'block-line': [
# the line end
(r'[ ]*(?=#|$)', something(Text.Blank), '#pop'),
# whitespaces separating tokens
(r'[ ]+', Text.Blank),
# tags, anchors and aliases,
include('descriptors'),
# block collections and scalars
include('block-nodes'),
# flow collections and quoted scalars
include('flow-nodes'),
# a plain scalar
(r'(?=[^ \t\n\r\f\v?:,\[\]{}#&*!|>\'"%@`-]|[?:-][^ \t\n\r\f\v])',
something(Literal.Scalar.Plain),
'plain-scalar-in-block-context'),
],
# tags, anchors, aliases
'descriptors' : [
# a full-form tag
(r'!<[0-9A-Za-z;/?:@&=+$,_.!~*\'()\[\]%-]+>', Name.Type),
# a tag in the form '!', '!suffix' or '!handle!suffix'
(r'!(?:[0-9A-Za-z_-]+)?'
r'(?:![0-9A-Za-z;/?:@&=+$,_.!~*\'()\[\]%-]+)?', Name.Type),
# an anchor
(r'&[0-9A-Za-z_-]+', Name.Anchor),
# an alias
(r'\*[0-9A-Za-z_-]+', Name.Alias),
],
# block collections and scalars
'block-nodes': [
# implicit key
(r':(?=[ ]|$)', set_indent(Punctuation.Indicator, implicit=True)),
# literal and folded scalars
(r'[|>]', Punctuation.Indicator,
('block-scalar-content', 'block-scalar-header')),
],
# flow collections and quoted scalars
'flow-nodes': [
# a flow sequence
(r'\[', Punctuation.Indicator, 'flow-sequence'),
# a flow mapping
(r'\{', Punctuation.Indicator, 'flow-mapping'),
# a single-quoted scalar
(r'\'', Literal.Scalar.Flow.Quote, 'single-quoted-scalar'),
# a double-quoted scalar
(r'\"', Literal.Scalar.Flow.Quote, 'double-quoted-scalar'),
],
# the content of a flow collection
'flow-collection': [
# whitespaces
(r'[ ]+', Text.Blank),
# line breaks
(r'\n+', Text.Break),
# a comment
(r'#[^\n]*', Comment.Single),
# simple indicators
(r'[?:,]', Punctuation.Indicator),
# tags, anchors and aliases
include('descriptors'),
# nested collections and quoted scalars
include('flow-nodes'),
# a plain scalar
(r'(?=[^ \t\n\r\f\v?:,\[\]{}#&*!|>\'"%@`])',
something(Literal.Scalar.Plain),
'plain-scalar-in-flow-context'),
],
# a flow sequence indicated by '[' and ']'
'flow-sequence': [
# include flow collection rules
include('flow-collection'),
# the closing indicator
(r'\]', Punctuation.Indicator, '#pop'),
],
# a flow mapping indicated by '{' and '}'
'flow-mapping': [
# include flow collection rules
include('flow-collection'),
# the closing indicator
(r'\}', Punctuation.Indicator, '#pop'),
],
# block scalar lines
'block-scalar-content': [
# line break
(r'\n', Text.Break),
# empty line
(r'^[ ]+$',
parse_block_scalar_empty_line(Text.Indent,
Literal.Scalar.Block)),
# indentation spaces (we may leave the state here)
(r'^[ ]*', parse_block_scalar_indent(Text.Indent)),
# line content
(r'[^\n\r\f\v]+', Literal.Scalar.Block),
],
# the content of a literal or folded scalar
'block-scalar-header': [
# indentation indicator followed by chomping flag
(r'([1-9])?[+-]?(?=[ ]|$)',
set_block_scalar_indent(Punctuation.Indicator),
'ignored-line'),
# chomping flag followed by indentation indicator
(r'[+-]?([1-9])?(?=[ ]|$)',
set_block_scalar_indent(Punctuation.Indicator),
'ignored-line'),
],
# ignored and regular whitespaces in quoted scalars
'quoted-scalar-whitespaces': [
# leading and trailing whitespaces are ignored
(r'^[ ]+|[ ]+$', Text.Blank),
# line breaks are ignored
(r'\n+', Text.Break),
# other whitespaces are a part of the value
(r'[ ]+', Literal.Scalar.Flow),
],
# single-quoted scalars
'single-quoted-scalar': [
# include whitespace and line break rules
include('quoted-scalar-whitespaces'),
# escaping of the quote character
(r'\'\'', Literal.Scalar.Flow.Escape),
# regular non-whitespace characters
(r'[^ \t\n\r\f\v\']+', Literal.Scalar.Flow),
# the closing quote
(r'\'', Literal.Scalar.Flow.Quote, '#pop'),
],
# double-quoted scalars
'double-quoted-scalar': [
# include whitespace and line break rules
include('quoted-scalar-whitespaces'),
# escaping of special characters
(r'\\[0abt\tn\nvfre "\\N_LP]', Literal.Scalar.Flow.Escape),
# escape codes
(r'\\(?:x[0-9A-Fa-f]{2}|u[0-9A-Fa-f]{4}|U[0-9A-Fa-f]{8})',
Literal.Scalar.Flow.Escape),
# regular non-whitespace characters
(r'[^ \t\n\r\f\v\"\\]+', Literal.Scalar.Flow),
# the closing quote
(r'"', Literal.Scalar.Flow.Quote, '#pop'),
],
# the beginning of a new line while scanning a plain scalar
'plain-scalar-in-block-context-new-line': [
# empty lines
(r'^[ ]+$', Text.Blank),
# line breaks
(r'\n+', Text.Break),
# document start and document end indicators
(r'^(?=---|\.\.\.)', something(Punctuation.Document), '#pop:3'),
# indentation spaces (we may leave the block line state here)
(r'^[ ]*', parse_plain_scalar_indent(Text.Indent), '#pop'),
],
# a plain scalar in the block context
'plain-scalar-in-block-context': [
# the scalar ends with the ':' indicator
(r'[ ]*(?=:[ ]|:$)', something(Text.Blank), '#pop'),
# the scalar ends with whitespaces followed by a comment
(r'[ ]+(?=#)', Text.Blank, '#pop'),
# trailing whitespaces are ignored
(r'[ ]+$', Text.Blank),
# line breaks are ignored
(r'\n+', Text.Break, 'plain-scalar-in-block-context-new-line'),
# other whitespaces are a part of the value
(r'[ ]+', Literal.Scalar.Plain),
# regular non-whitespace characters
(r'(?::(?![ \t\n\r\f\v])|[^ \t\n\r\f\v:])+',
Literal.Scalar.Plain),
],
# a plain scalar is the flow context
'plain-scalar-in-flow-context': [
# the scalar ends with an indicator character
(r'[ ]*(?=[,:?\[\]{}])', something(Text.Blank), '#pop'),
# the scalar ends with a comment
(r'[ ]+(?=#)', Text.Blank, '#pop'),
# leading and trailing whitespaces are ignored
(r'^[ ]+|[ ]+$', Text.Blank),
# line breaks are ignored
(r'\n+', Text.Break),
# other whitespaces are a part of the value
(r'[ ]+', Literal.Scalar.Plain),
# regular non-whitespace characters
(r'[^ \t\n\r\f\v,:?\[\]{}]+', Literal.Scalar.Plain),
],
}
def get_tokens_unprocessed(self, text=None, context=None):
if context is None:
context = YAMLLexerContext(text, 0)
return super(YAMLLexer, self).get_tokens_unprocessed(text, context)
|
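`parse_plain_scalar_indent` in the record above ends a multi-line plain scalar as soon as a continuation line is not indented strictly deeper than the enclosing block (it pops two lexer states and re-scans the line). The decision itself reduces to a single comparison; a standalone sketch with illustrative names, stripped of the lexer machinery:

```python
def plain_scalar_continues(indent_text, block_indent):
    # Mirrors the pop condition `len(text) <= context.indent`:
    # the scalar continues only when the new line is indented
    # strictly deeper than the surrounding block context.
    return len(indent_text) > block_indent

print(plain_scalar_continues('    ', 2))  # True: still inside the scalar
print(plain_scalar_continues('  ', 2))    # False: the scalar ends here
```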
ashleysommer/sanicpluginsframework | spf/plugin.py | SanicPlugin.middleware | python | def middleware(self, *args, **kwargs):
kwargs.setdefault('priority', 5)
kwargs.setdefault('relative', None)
kwargs.setdefault('attach_to', None)
kwargs.setdefault('with_context', False)
if len(args) == 1 and callable(args[0]):
middle_f = args[0]
self._middlewares.append(
FutureMiddleware(middle_f, args=tuple(), kwargs=kwargs))
return middle_f
def wrapper(middleware_f):
self._middlewares.append(
FutureMiddleware(middleware_f, args=args, kwargs=kwargs))
return middleware_f
return wrapper | Decorate and register middleware
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The middleware function to use as the decorator
:rtype: fn | train | https://github.com/ashleysommer/sanicpluginsframework/blob/2cb1656d9334f04c30c738074784b0450c1b893e/spf/plugin.py#L37-L60 | null | class SanicPlugin(object):
__slots__ = ('registrations', '_routes', '_ws', '_static',
'_middlewares', '_exceptions', '_listeners', '_initialized',
'__weakref__')
AssociatedTuple = PluginAssociated
# Decorator
def exception(self, *args, **kwargs):
"""Decorate and register an exception handler
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
if len(args) == 1 and callable(args[0]):
if isinstance(args[0], type) and issubclass(args[0], Exception):
pass
else: # pragma: no cover
raise RuntimeError("Cannot use the @exception decorator "
"without arguments")
def wrapper(handler_f):
self._exceptions.append(FutureException(handler_f,
exceptions=args,
kwargs=kwargs))
return handler_f
return wrapper
def listener(self, event, *args, **kwargs):
"""Create a listener from a decorated function.
:param event: Event to listen to.
:type event: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The function to use as the listener
:rtype: fn
"""
if len(args) == 1 and callable(args[0]): # pragma: no cover
raise RuntimeError("Cannot use the @listener decorator without "
"arguments")
def wrapper(listener_f):
if len(kwargs) > 0:
listener_f = (listener_f, kwargs)
self._listeners[event].append(listener_f)
return listener_f
return wrapper
def route(self, uri, *args, **kwargs):
"""Create a plugin route from a decorated function.
:param uri: endpoint at which the route will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The route handler function to use as the decorator
:rtype: fn
"""
if len(args) == 0 and callable(uri): # pragma: no cover
raise RuntimeError("Cannot use the @route decorator without "
"arguments.")
kwargs.setdefault('methods', frozenset({'GET'}))
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', False)
kwargs.setdefault('stream', False)
kwargs.setdefault('name', None)
def wrapper(handler_f):
self._routes.append(FutureRoute(handler_f, uri, args, kwargs))
return handler_f
return wrapper
def websocket(self, uri, *args, **kwargs):
"""Create a websocket route from a decorated function
:param uri: endpoint at which the socket endpoint will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The websocket handler function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
kwargs.setdefault('subprotocols', None)
kwargs.setdefault('name', None)
def wrapper(handler_f):
self._ws.append(FutureWebsocket(handler_f, uri, args, kwargs))
return handler_f
return wrapper
def static(self, uri, file_or_directory, *args, **kwargs):
"""Register a static file route.
:param uri: endpoint at which the static files will be accessible.
:type uri: str
:param file_or_directory: the file or directory to serve at the uri.
:type file_or_directory: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: None
"""
kwargs.setdefault('pattern', r'/?.+')
kwargs.setdefault('use_modified_since', True)
kwargs.setdefault('use_content_range', False)
kwargs.setdefault('stream_large_files', False)
kwargs.setdefault('name', 'static')
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
self._static.append(FutureStatic(uri, file_or_directory, args, kwargs))
def on_before_registered(self, context, *args, **kwargs):
pass
def on_registered(self, context, reg, *args, **kwargs):
pass
def find_plugin_registration(self, spf):
if isinstance(spf, PluginRegistration):
return spf
for reg in self.registrations:
(s, n, u) = reg
if s is not None and s == spf:
return reg
raise KeyError("Not found")
def first_plugin_context(self):
"""Return the context associated with the first app this plugin was
registered on."""
# Note: because registrations are stored in a set, it's not _really_
# the first one, but whichever one the iterator yields first.
first_spf_reg = next(iter(self.registrations))
return self.get_context_from_spf(first_spf_reg)
def get_context_from_spf(self, spf):
rt_err = RuntimeError(
"Cannot use the plugin's Context before it is "
"registered.")
if isinstance(spf, PluginRegistration):
reg = spf
else:
reg = self.find_plugin_registration(spf)
(s, n, u) = reg
try:
return s.get_context(n)
except KeyError as k:
raise k
except AttributeError:
raise rt_err
def get_app_from_spf_context(self, spf):
if isinstance(spf, PluginRegistration):
reg = spf
else:
reg = self.find_plugin_registration(spf)
context = self.get_context_from_spf(reg)
return context.app
def spf_resolve_url_for(self, spf, view_name, *args, **kwargs):
    reg = self.find_plugin_registration(spf)
    (spf, name, url_prefix) = reg
    app = self.get_app_from_spf_context(reg)
    if app is None:
        return None
    if isinstance(app, Blueprint):
        self.warning(reg, "Cannot use url_for when plugin is registered "
                          "on a Blueprint. Use `app.url_for` instead.")
        return None
    constructed_name = "{}.{}".format(name, view_name)
    return app.url_for(constructed_name, *args, **kwargs)
def log(self, spf, level, message, *args, **kwargs):
    reg = self.find_plugin_registration(spf)
    context = self.get_context_from_spf(reg)
    return context.log(level, message, *args, reg=self, **kwargs)
# Convenience wrappers; each forwards the spf handle that log() requires
# as its first parameter.
def debug(self, spf, message, *args, **kwargs):
    return self.log(spf, DEBUG, message, *args, **kwargs)
def info(self, spf, message, *args, **kwargs):
    return self.log(spf, INFO, message, *args, **kwargs)
def warning(self, spf, message, *args, **kwargs):
    return self.log(spf, WARNING, message, *args, **kwargs)
def error(self, spf, message, *args, **kwargs):
    return self.log(spf, ERROR, message, *args, **kwargs)
def critical(self, spf, message, *args, **kwargs):
    return self.log(spf, CRITICAL, message, *args, **kwargs)
@classmethod
def decorate(cls, app, *args, run_middleware=False, with_context=False,
**kwargs):
"""
This is a decorator that can be used to apply this plugin to a specific
route/view on your app, rather than the whole app.
:param app:
:type app: Sanic | Blueprint
:param args:
:type args: tuple(Any)
:param run_middleware:
:type run_middleware: bool
:param with_context:
:type with_context: bool
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: the decorated route/view
:rtype: fn
"""
from spf.framework import SanicPluginsFramework
spf = SanicPluginsFramework(app) # get the singleton from the app
try:
assoc = spf.register_plugin(cls, skip_reg=True)
except ValueError as e:
# this is normal, if this plugin has been registered previously
assert e.args and len(e.args) > 1
assoc = e.args[1]
(plugin, reg) = assoc
inst = spf.get_plugin(plugin) # plugin may not actually be registered
# registered might be True, False or None at this point
regd = True if inst else None
if regd is True:
# middleware will be run on this route anyway, because the plugin
# is registered on the app. Turn it off on the route-level.
run_middleware = False
req_middleware = deque()
resp_middleware = deque()
if run_middleware:
for i, m in enumerate(plugin._middlewares):
attach_to = m.kwargs.pop('attach_to', 'request')
priority = m.kwargs.pop('priority', 5)
with_context = m.kwargs.pop('with_context', False)
mw_handle_fn = m.middleware
if attach_to == 'response':
relative = m.kwargs.pop('relative', 'post')
if relative == "pre":
mw = (0, 0 - priority, 0 - i, mw_handle_fn,
with_context, m.args, m.kwargs)
else: # relative = "post"
mw = (1, 0 - priority, 0 - i, mw_handle_fn,
with_context, m.args, m.kwargs)
resp_middleware.append(mw)
else: # attach_to = "request"
relative = m.kwargs.pop('relative', 'pre')
if relative == "post":
mw = (1, priority, i, mw_handle_fn, with_context,
m.args, m.kwargs)
else: # relative = "pre"
mw = (0, priority, i, mw_handle_fn, with_context,
m.args, m.kwargs)
req_middleware.append(mw)
req_middleware = tuple(sorted(req_middleware))
resp_middleware = tuple(sorted(resp_middleware))
def _decorator(f):
nonlocal spf, plugin, regd, run_middleware, with_context
nonlocal req_middleware, resp_middleware, args, kwargs
async def wrapper(request, *a, **kw):
nonlocal spf, plugin, regd, run_middleware, with_context
nonlocal req_middleware, resp_middleware, f, args, kwargs
# the plugin was not registered on the app, it might be now
if regd is None:
_inst = spf.get_plugin(plugin)
regd = _inst is not None
context = plugin.get_context_from_spf(spf)
if run_middleware and not regd and len(req_middleware) > 0:
for (_a, _p, _i, handler, with_context, args, kwargs) \
in req_middleware:
if with_context:
resp = handler(request, *args, context=context,
**kwargs)
else:
resp = handler(request, *args, **kwargs)
if isawaitable(resp):
resp = await resp
if resp:
return
response = await plugin.route_wrapper(
f, request, context, a, kw, *args,
with_context=with_context, **kwargs)
if isawaitable(response):
response = await response
if run_middleware and not regd and len(resp_middleware) > 0:
for (_a, _p, _i, handler, with_context, args, kwargs) \
in resp_middleware:
if with_context:
_resp = handler(request, response, *args,
context=context, **kwargs)
else:
_resp = handler(request, response, *args, **kwargs)
if isawaitable(_resp):
_resp = await _resp
if _resp:
response = _resp
break
return response
return update_wrapper(wrapper, f)
return _decorator
async def route_wrapper(self, route, request, context, request_args,
request_kw, *decorator_args, with_context=None,
**decorator_kw):
"""This is the function that is called when a route is decorated with
your plugin decorator. Context will normally be None, but the user
can pass with_context=True so the route will get the plugin
context.
"""
# by default, do nothing, just run the wrapped function
if with_context:
resp = route(request, context, *request_args, **request_kw)
else:
resp = route(request, *request_args, **request_kw)
if isawaitable(resp):
resp = await resp
return resp
def __new__(cls, *args, **kwargs):
# making a bold assumption here.
# Assuming that if a sanic plugin is initialized using
# `MyPlugin(app)`, then the user is attempting to do a legacy plugin
# instantiation, aka Flask-Style plugin instantiation.
if args and len(args) > 0 and \
(isinstance(args[0], Sanic) or isinstance(args[0], Blueprint)):
app = args[0]
try:
mod_name = cls.__module__
mod = importlib.import_module(mod_name)
assert mod
except (ImportError, AssertionError):
raise RuntimeError(
"Failed attempting a legacy plugin instantiation. "
"Cannot find the module this plugin belongs to.")
# Get the spf singleton from this app
from spf.framework import SanicPluginsFramework
spf = SanicPluginsFramework(app)
# catch cases like when the module is "__main__" or
# "__call__" or "__init__"
if mod_name.startswith("__"):
# In this case, we cannot use the module to register the
# plugin. Try to use the class method.
assoc = spf.register_plugin(cls, *args, **kwargs)
else:
assoc = spf.register_plugin(mod, *args, **kwargs)
return assoc
self = super(SanicPlugin, cls).__new__(cls)
try:
self._initialized # initialized may be True or Unknown
except AttributeError:
self._initialized = False
return self
def is_registered_on_framework(self, check_spf):
for reg in self.registrations:
(spf, name, url) = reg
if spf is not None and spf == check_spf:
return True
return False
def __init__(self, *args, **kwargs):
# Sometimes __init__ can be called twice.
# Ignore it on subsequent times
if self._initialized:
return
assert len(args) < 1, \
    "Unexpected arguments passed to this SanicPlugin."
assert len(kwargs) < 1, \
    "Unexpected keyword arguments passed to this SanicPlugin."
super(SanicPlugin, self).__init__(*args, **kwargs)
self._routes = []
self._ws = []
self._static = []
self._middlewares = []
self._exceptions = []
self._listeners = defaultdict(list)
self.registrations = set()
self._initialized = True
def __getstate__(self):
state_dict = {}
for s in SanicPlugin.__slots__:
state_dict[s] = getattr(self, s)
return state_dict
def __setstate__(self, state):
for s, v in state.items():
if s == "__weakref__":
if v is None:
continue
else:
raise NotImplementedError("Setting weakrefs on Plugin")
setattr(self, s, v)
def __reduce__(self):
state_dict = self.__getstate__()
return SanicPlugin.__new__, (self.__class__,), state_dict
|
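The `middleware` record above supports both the bare form (`@plugin.middleware`) and the parameterised form (`@plugin.middleware(...)`), deferring registration by recording `FutureMiddleware` entries for later replay. A self-contained sketch of that dual-form decorator pattern (`MiniPlugin` is a stand-in for illustration, not part of the library):

```python
from collections import namedtuple

FutureMiddleware = namedtuple("FutureMiddleware",
                              ["middleware", "args", "kwargs"])

class MiniPlugin:
    """Stripped-down stand-in for SanicPlugin's deferred registration."""

    def __init__(self):
        self._middlewares = []

    def middleware(self, *args, **kwargs):
        kwargs.setdefault("priority", 5)
        kwargs.setdefault("attach_to", None)
        # Bare form: @plugin.middleware — the function is the only argument.
        if len(args) == 1 and callable(args[0]):
            middle_f = args[0]
            self._middlewares.append(
                FutureMiddleware(middle_f, args=tuple(), kwargs=kwargs))
            return middle_f
        # Parameterised form: @plugin.middleware(attach_to='response', ...)
        def wrapper(middleware_f):
            self._middlewares.append(
                FutureMiddleware(middleware_f, args=args, kwargs=kwargs))
            return middleware_f
        return wrapper
```

The bare form is detected by a single callable positional argument; anything else is treated as decorator parameters and a wrapper closure is returned instead.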
ashleysommer/sanicpluginsframework | spf/plugin.py | SanicPlugin.exception | python | def exception(self, *args, **kwargs):
if len(args) == 1 and callable(args[0]):
if isinstance(args[0], type) and issubclass(args[0], Exception):
pass
else: # pragma: no cover
raise RuntimeError("Cannot use the @exception decorator "
"without arguments")
def wrapper(handler_f):
self._exceptions.append(FutureException(handler_f,
exceptions=args,
kwargs=kwargs))
return handler_f
return wrapper | Decorate and register an exception handler
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn | train | https://github.com/ashleysommer/sanicpluginsframework/blob/2cb1656d9334f04c30c738074784b0450c1b893e/spf/plugin.py#L62-L83 | null | class SanicPlugin(object):
|
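The `exception` record above has to distinguish a legitimate call such as `@plugin.exception(KeyError)` from the bare misuse `@plugin.exception` — tricky because an Exception subclass is itself callable. A framework-free sketch of that validation (`exception_decorator` and its `handlers` attribute are illustrative names, not the library API):

```python
def exception_decorator(*exceptions):
    """Sketch of SanicPlugin.exception's argument check: the decorator must
    be called with Exception subclasses, never applied bare to a handler."""
    if len(exceptions) == 1 and callable(exceptions[0]):
        # An Exception subclass is callable too, so check for a class first.
        if isinstance(exceptions[0], type) and issubclass(exceptions[0],
                                                          Exception):
            pass  # legitimate: @exception_decorator(KeyError)
        else:
            raise RuntimeError(
                "Cannot use the @exception decorator without arguments")
    handlers = []

    def wrapper(handler_f):
        handlers.append((handler_f, exceptions))
        return handler_f

    wrapper.handlers = handlers  # exposed for inspection in this sketch
    return wrapper
```

Decorating a plain function directly (no exception classes) trips the `RuntimeError` branch, because the single callable argument is not a type.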
ashleysommer/sanicpluginsframework | spf/plugin.py | SanicPlugin.listener | python | def listener(self, event, *args, **kwargs):
if len(args) == 1 and callable(args[0]): # pragma: no cover
raise RuntimeError("Cannot use the @listener decorator without "
"arguments")
def wrapper(listener_f):
if len(kwargs) > 0:
listener_f = (listener_f, kwargs)
self._listeners[event].append(listener_f)
return listener_f
return wrapper | Create a listener from a decorated function.
:param event: Event to listen to.
:type event: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The function to use as the listener
:rtype: fn | train | https://github.com/ashleysommer/sanicpluginsframework/blob/2cb1656d9334f04c30c738074784b0450c1b893e/spf/plugin.py#L85-L105 | null | class SanicPlugin(object):
__slots__ = ('registrations', '_routes', '_ws', '_static',
'_middlewares', '_exceptions', '_listeners', '_initialized',
'__weakref__')
AssociatedTuple = PluginAssociated
# Decorator
def middleware(self, *args, **kwargs):
"""Decorate and register middleware
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The middleware function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('priority', 5)
kwargs.setdefault('relative', None)
kwargs.setdefault('attach_to', None)
kwargs.setdefault('with_context', False)
if len(args) == 1 and callable(args[0]):
middle_f = args[0]
self._middlewares.append(
FutureMiddleware(middle_f, args=tuple(), kwargs=kwargs))
return middle_f
def wrapper(middleware_f):
self._middlewares.append(
FutureMiddleware(middleware_f, args=args, kwargs=kwargs))
return middleware_f
return wrapper
def exception(self, *args, **kwargs):
"""Decorate and register an exception handler
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
if len(args) == 1 and callable(args[0]):
if isinstance(args[0], type) and issubclass(args[0], Exception):
pass
else: # pragma: no cover
raise RuntimeError("Cannot use the @exception decorator "
"without arguments")
def wrapper(handler_f):
self._exceptions.append(FutureException(handler_f,
exceptions=args,
kwargs=kwargs))
return handler_f
return wrapper
def route(self, uri, *args, **kwargs):
"""Create a plugin route from a decorated function.
:param uri: endpoint at which the route will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
if len(args) == 0 and callable(uri): # pragma: no cover
raise RuntimeError("Cannot use the @route decorator without "
"arguments.")
kwargs.setdefault('methods', frozenset({'GET'}))
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', False)
kwargs.setdefault('stream', False)
kwargs.setdefault('name', None)
def wrapper(handler_f):
self._routes.append(FutureRoute(handler_f, uri, args, kwargs))
return handler_f
return wrapper
def websocket(self, uri, *args, **kwargs):
"""Create a websocket route from a decorated function
:param uri: endpoint at which the socket endpoint will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
kwargs.setdefault('subprotocols', None)
kwargs.setdefault('name', None)
def wrapper(handler_f):
self._ws.append(FutureWebsocket(handler_f, uri, args, kwargs))
return handler_f
return wrapper
def static(self, uri, file_or_directory, *args, **kwargs):
"""Create a websocket route from a decorated function
:param uri: endpoint at which the socket endpoint will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('pattern', r'/?.+')
kwargs.setdefault('use_modified_since', True)
kwargs.setdefault('use_content_range', False)
kwargs.setdefault('stream_large_files', False)
kwargs.setdefault('name', 'static')
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
self._static.append(FutureStatic(uri, file_or_directory, args, kwargs))
def on_before_registered(self, context, *args, **kwargs):
pass
def on_registered(self, context, reg, *args, **kwargs):
pass
def find_plugin_registration(self, spf):
if isinstance(spf, PluginRegistration):
return spf
for reg in self.registrations:
(s, n, u) = reg
if s is not None and s == spf:
return reg
return KeyError("Not found")
def first_plugin_context(self):
"""Returns the context is associated with the first app this plugin was
registered on"""
# Note, because registrations are stored in a set, its not _really_
# the first one, but whichever one it sees first in the set.
first_spf_reg = next(iter(self.registrations))
return self.get_context_from_spf(first_spf_reg)
def get_context_from_spf(self, spf):
rt_err = RuntimeError(
"Cannot use the plugin's Context before it is "
"registered.")
if isinstance(spf, PluginRegistration):
reg = spf
else:
reg = self.find_plugin_registration(spf)
(s, n, u) = reg
try:
return s.get_context(n)
except KeyError as k:
raise k
except AttributeError:
raise rt_err
def get_app_from_spf_context(self, spf):
if isinstance(spf, PluginRegistration):
reg = spf
else:
reg = self.find_plugin_registration(spf)
context = self.get_context_from_spf(reg)
return context.app
def spf_resolve_url_for(self, spf, view_name, *args, **kwargs):
reg = self.find_plugin_registration(spf)
(spf, name, url_prefix) = reg
app = self.get_app_from_spf_context(reg)
if app is None:
return None
if isinstance(app, Blueprint):
self.warning(reg, "Cannot use url_for when plugin is registered "
"on a Blueprint. Use `app.url_for` instead.")
return None
constructed_name = "{}.{}".format(name, view_name)
return app.url_for(constructed_name, *args, **kwargs)
def log(self, spf, level, message, *args, **kwargs):
reg = self.find_plugin_registration(spf)
context = self.get_context_from_spf(reg)
return context.log(level, message, *args, reg=self, **kwargs)
def debug(self, spf, message, *args, **kwargs):
return self.log(spf, DEBUG, message, *args, **kwargs)
def info(self, spf, message, *args, **kwargs):
return self.log(spf, INFO, message, *args, **kwargs)
def warning(self, spf, message, *args, **kwargs):
return self.log(spf, WARNING, message, *args, **kwargs)
def error(self, spf, message, *args, **kwargs):
return self.log(spf, ERROR, message, *args, **kwargs)
def critical(self, spf, message, *args, **kwargs):
return self.log(spf, CRITICAL, message, *args, **kwargs)
@classmethod
def decorate(cls, app, *args, run_middleware=False, with_context=False,
**kwargs):
"""
This is a decorator that can be used to apply this plugin to a specific
route/view on your app, rather than the whole app.
:param app: the Sanic app (or Blueprint) on which to decorate a route
:type app: Sanic | Blueprint
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param run_middleware: whether to run this plugin's middleware on the route
:type run_middleware: bool
:param with_context: whether to pass the plugin context to the route
:type with_context: bool
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: the decorated route/view
:rtype: fn
"""
from spf.framework import SanicPluginsFramework
spf = SanicPluginsFramework(app) # get the singleton from the app
try:
assoc = spf.register_plugin(cls, skip_reg=True)
except ValueError as e:
# this is normal, if this plugin has been registered previously
assert e.args and len(e.args) > 1
assoc = e.args[1]
(plugin, reg) = assoc
inst = spf.get_plugin(plugin) # plugin may not actually be registered
# registered might be True, False or None at this point
regd = True if inst else None
if regd is True:
# middleware will be run on this route anyway, because the plugin
# is registered on the app. Turn it off on the route-level.
run_middleware = False
req_middleware = deque()
resp_middleware = deque()
if run_middleware:
for i, m in enumerate(plugin._middlewares):
attach_to = m.kwargs.pop('attach_to', 'request')
priority = m.kwargs.pop('priority', 5)
with_context = m.kwargs.pop('with_context', False)
mw_handle_fn = m.middleware
if attach_to == 'response':
relative = m.kwargs.pop('relative', 'post')
if relative == "pre":
mw = (0, 0 - priority, 0 - i, mw_handle_fn,
with_context, m.args, m.kwargs)
else: # relative = "post"
mw = (1, 0 - priority, 0 - i, mw_handle_fn,
with_context, m.args, m.kwargs)
resp_middleware.append(mw)
else: # attach_to = "request"
relative = m.kwargs.pop('relative', 'pre')
if relative == "post":
mw = (1, priority, i, mw_handle_fn, with_context,
m.args, m.kwargs)
else: # relative = "pre"
mw = (0, priority, i, mw_handle_fn, with_context,
m.args, m.kwargs)
req_middleware.append(mw)
req_middleware = tuple(sorted(req_middleware))
resp_middleware = tuple(sorted(resp_middleware))
def _decorator(f):
nonlocal spf, plugin, regd, run_middleware, with_context
nonlocal req_middleware, resp_middleware, args, kwargs
async def wrapper(request, *a, **kw):
nonlocal spf, plugin, regd, run_middleware, with_context
nonlocal req_middleware, resp_middleware, f, args, kwargs
# the plugin was not registered on the app, it might be now
if regd is None:
_inst = spf.get_plugin(plugin)
regd = _inst is not None
context = plugin.get_context_from_spf(spf)
if run_middleware and not regd and len(req_middleware) > 0:
for (_a, _p, _i, handler, with_context, args, kwargs) \
in req_middleware:
if with_context:
resp = handler(request, *args, context=context,
**kwargs)
else:
resp = handler(request, *args, **kwargs)
if isawaitable(resp):
resp = await resp
if resp:
return
response = await plugin.route_wrapper(
f, request, context, a, kw, *args,
with_context=with_context, **kwargs)
if isawaitable(response):
response = await response
if run_middleware and not regd and len(resp_middleware) > 0:
for (_a, _p, _i, handler, with_context, args, kwargs) \
in resp_middleware:
if with_context:
_resp = handler(request, response, *args,
context=context, **kwargs)
else:
_resp = handler(request, response, *args, **kwargs)
if isawaitable(_resp):
_resp = await _resp
if _resp:
response = _resp
break
return response
return update_wrapper(wrapper, f)
return _decorator
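The middleware ordering in `decorate` leans on Python's lexicographic tuple sort: each entry's leading elements encode the phase (0 for "pre", 1 for "post"), a priority (negated for response middleware so that higher priority still sorts first), and the registration index as a tie-breaker. A minimal, self-contained sketch of that sort key (the names below are illustrative, not from the library):

```python
# Each entry mimics the (phase, priority, index, handler, ...) tuples built
# in `decorate`: phase 0 = "pre", phase 1 = "post"; smaller numbers run first.
def make_key(phase, priority, index):
    return (phase, priority, index)

entries = [
    (make_key(1, 5, 0), "post-low"),
    (make_key(0, 3, 1), "pre-high"),
    (make_key(0, 5, 2), "pre-low"),
]
# Tuples compare element-by-element, so all "pre" middleware sorts before
# any "post", and within a phase the lower priority number wins.
ordered = [name for _key, name in sorted(entries)]
print(ordered)
```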
async def route_wrapper(self, route, request, context, request_args,
request_kw, *decorator_args, with_context=None,
**decorator_kw):
"""This is the function that is called when a route is decorated with
your plugin decorator. Context will normally be None, but the user
can pass use_context=True so the route will get the plugin
context
"""
# by default, do nothing, just run the wrapped function
if with_context:
resp = route(request, context, *request_args, **request_kw)
else:
resp = route(request, *request_args, **request_kw)
if isawaitable(resp):
resp = await resp
return resp
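`route_wrapper` above accepts both plain and `async def` handlers by calling the handler first and only awaiting the result when `inspect.isawaitable` says it must be awaited. A standalone sketch of that dispatch (the `dispatch` helper is illustrative, not part of the library):

```python
import asyncio
from inspect import isawaitable

# Call a handler that may be sync or async; await the result only if needed.
async def dispatch(handler, *args):
    resp = handler(*args)
    if isawaitable(resp):
        resp = await resp
    return resp

def sync_handler(x):
    return x + 1

async def async_handler(x):
    return x + 2

sync_result = asyncio.run(dispatch(sync_handler, 1))
async_result = asyncio.run(dispatch(async_handler, 1))
print(sync_result, async_result)
</antml>```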
def __new__(cls, *args, **kwargs):
# making a bold assumption here.
# Assuming that if a sanic plugin is initialized using
# `MyPlugin(app)`, then the user is attempting to do a legacy plugin
# instantiation, aka Flask-Style plugin instantiation.
if args and len(args) > 0 and \
(isinstance(args[0], Sanic) or isinstance(args[0], Blueprint)):
app = args[0]
try:
mod_name = cls.__module__
mod = importlib.import_module(mod_name)
assert mod
except (ImportError, AssertionError):
raise RuntimeError(
"Failed attempting a legacy plugin instantiation. "
"Cannot find the module this plugin belongs to.")
# Get the spf singleton from this app
from spf.framework import SanicPluginsFramework
spf = SanicPluginsFramework(app)
# catch cases like when the module is "__main__" or
# "__call__" or "__init__"
if mod_name.startswith("__"):
# In this case, we cannot use the module to register the
# plugin. Try to use the class method.
assoc = spf.register_plugin(cls, *args, **kwargs)
else:
assoc = spf.register_plugin(mod, *args, **kwargs)
return assoc
self = super(SanicPlugin, cls).__new__(cls)
try:
self._initialized # initialized may be True or Unknown
except AttributeError:
self._initialized = False
return self
def is_registered_on_framework(self, check_spf):
for reg in self.registrations:
(spf, name, url) = reg
if spf is not None and spf == check_spf:
return True
return False
def __init__(self, *args, **kwargs):
# Sometimes __init__ can be called twice.
# Ignore it on subsequent times
if self._initialized:
return
assert len(args) < 1,\
"Unexpected arguments passed to this Sanic Plugins."
assert len(kwargs) < 1,\
"Unexpected keyword arguments passed to this Sanic Plugins."
super(SanicPlugin, self).__init__(*args, **kwargs)
self._routes = []
self._ws = []
self._static = []
self._middlewares = []
self._exceptions = []
self._listeners = defaultdict(list)
self.registrations = set()
self._initialized = True
def __getstate__(self):
state_dict = {}
for s in SanicPlugin.__slots__:
state_dict[s] = getattr(self, s)
return state_dict
def __setstate__(self, state):
for s, v in state.items():
if s == "__weakref__":
if v is None:
continue
else:
raise NotImplementedError("Setting weakrefs on Plugin")
setattr(self, s, v)
def __reduce__(self):
state_dict = self.__getstate__()
return SanicPlugin.__new__, (self.__class__,), state_dict
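Because `SanicPlugin` defines `__slots__` (so instances have no `__dict__`), pickling needs the explicit `__getstate__`/`__setstate__` pair shown above. A reduced sketch of the same protocol, with `SlottedExample` as an illustrative stand-in for the plugin class:

```python
import pickle

# A __slots__ class has no __dict__, so state must be collected and restored
# attribute-by-attribute, just as SanicPlugin does above.
class SlottedExample:
    __slots__ = ('registrations', '__weakref__')

    def __getstate__(self):
        return {'registrations': self.registrations}

    def __setstate__(self, state):
        for key, value in state.items():
            setattr(self, key, value)

original = SlottedExample()
original.registrations = {('app', 'name', '/prefix')}
clone = pickle.loads(pickle.dumps(original))
print(clone.registrations)
```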
|
ashleysommer/sanicpluginsframework | spf/plugin.py | SanicPlugin.route | python | def route(self, uri, *args, **kwargs):
if len(args) == 0 and callable(uri): # pragma: no cover
raise RuntimeError("Cannot use the @route decorator without "
"arguments.")
kwargs.setdefault('methods', frozenset({'GET'}))
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', False)
kwargs.setdefault('stream', False)
kwargs.setdefault('name', None)
def wrapper(handler_f):
self._routes.append(FutureRoute(handler_f, uri, args, kwargs))
return handler_f
return wrapper | Create a plugin route from a decorated function.
:param uri: endpoint at which the route will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The function to use as the decorator
:rtype: fn | train | https://github.com/ashleysommer/sanicpluginsframework/blob/2cb1656d9334f04c30c738074784b0450c1b893e/spf/plugin.py#L107-L130 | null | class SanicPlugin(object):
__slots__ = ('registrations', '_routes', '_ws', '_static',
'_middlewares', '_exceptions', '_listeners', '_initialized',
'__weakref__')
AssociatedTuple = PluginAssociated
# Decorator
def middleware(self, *args, **kwargs):
"""Decorate and register middleware
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The middleware function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('priority', 5)
kwargs.setdefault('relative', None)
kwargs.setdefault('attach_to', None)
kwargs.setdefault('with_context', False)
if len(args) == 1 and callable(args[0]):
middle_f = args[0]
self._middlewares.append(
FutureMiddleware(middle_f, args=tuple(), kwargs=kwargs))
return middle_f
def wrapper(middleware_f):
self._middlewares.append(
FutureMiddleware(middleware_f, args=args, kwargs=kwargs))
return middleware_f
return wrapper
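The `middleware` method above is a dual-mode decorator: used bare (`@plugin.middleware`) it receives the function directly, while used with arguments it returns a wrapper that does the recording. A self-contained sketch of that pattern (`Recorder` is illustrative; `FutureMiddleware` is mimicked here, not imported):

```python
from collections import namedtuple

FutureMiddleware = namedtuple('FutureMiddleware', ('middleware', 'args', 'kwargs'))

class Recorder:
    def __init__(self):
        self._middlewares = []

    def middleware(self, *args, **kwargs):
        kwargs.setdefault('priority', 5)
        if len(args) == 1 and callable(args[0]):
            # bare form: @recorder.middleware
            self._middlewares.append(FutureMiddleware(args[0], tuple(), kwargs))
            return args[0]

        def wrapper(middleware_f):
            # parameterized form: @recorder.middleware('response', priority=3)
            self._middlewares.append(FutureMiddleware(middleware_f, args, kwargs))
            return middleware_f
        return wrapper

recorder = Recorder()

@recorder.middleware
def plain(request):
    return None

@recorder.middleware('response', priority=3)
def prioritized(request, response):
    return None

print([m.kwargs['priority'] for m in recorder._middlewares])
```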
def exception(self, *args, **kwargs):
"""Decorate and register an exception handler
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
if len(args) == 1 and callable(args[0]):
if isinstance(args[0], type) and issubclass(args[0], Exception):
pass
else: # pragma: no cover
raise RuntimeError("Cannot use the @exception decorator "
"without arguments")
def wrapper(handler_f):
self._exceptions.append(FutureException(handler_f,
exceptions=args,
kwargs=kwargs))
return handler_f
return wrapper
def listener(self, event, *args, **kwargs):
"""Create a listener from a decorated function.
:param event: Event to listen to.
:type event: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The function to use as the listener
:rtype: fn
"""
if len(args) == 1 and callable(args[0]): # pragma: no cover
raise RuntimeError("Cannot use the @listener decorator without "
"arguments")
def wrapper(listener_f):
if len(kwargs) > 0:
listener_f = (listener_f, kwargs)
self._listeners[event].append(listener_f)
return listener_f
return wrapper
def websocket(self, uri, *args, **kwargs):
"""Create a websocket route from a decorated function
:param uri: endpoint at which the socket endpoint will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
kwargs.setdefault('subprotocols', None)
kwargs.setdefault('name', None)
def wrapper(handler_f):
self._ws.append(FutureWebsocket(handler_f, uri, args, kwargs))
return handler_f
return wrapper
def static(self, uri, file_or_directory, *args, **kwargs):
"""Serve a static file or directory from a plugin route.
:param uri: endpoint at which the static file(s) will be accessible.
:type uri: str
:param file_or_directory: path of the file or directory to serve.
:type file_or_directory: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: None
:rtype: None
"""
kwargs.setdefault('pattern', r'/?.+')
kwargs.setdefault('use_modified_since', True)
kwargs.setdefault('use_content_range', False)
kwargs.setdefault('stream_large_files', False)
kwargs.setdefault('name', 'static')
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
self._static.append(FutureStatic(uri, file_or_directory, args, kwargs))
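The run of `kwargs.setdefault(...)` calls above implements option merging: caller-supplied keyword arguments always win, and everything else falls back to the documented defaults. A tiny sketch of the idea (`with_defaults` is an illustrative helper, not a library function):

```python
# setdefault only writes a key if the caller did not supply it, so explicit
# options override defaults while untouched options keep their fallback.
def with_defaults(**kwargs):
    kwargs.setdefault('use_modified_since', True)
    kwargs.setdefault('stream_large_files', False)
    return kwargs

merged = with_defaults(stream_large_files=True)
print(merged)
```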
def on_before_registered(self, context, *args, **kwargs):
pass
def on_registered(self, context, reg, *args, **kwargs):
pass
def find_plugin_registration(self, spf):
if isinstance(spf, PluginRegistration):
return spf
for reg in self.registrations:
(s, n, u) = reg
if s is not None and s == spf:
return reg
raise KeyError("Not found")
def first_plugin_context(self):
"""Returns the context associated with the first app this plugin was
registered on"""
# Note: because registrations are stored in a set, it's not _really_
# the first one, but whichever one it sees first in the set.
first_spf_reg = next(iter(self.registrations))
return self.get_context_from_spf(first_spf_reg)
def get_context_from_spf(self, spf):
rt_err = RuntimeError(
"Cannot use the plugin's Context before it is "
"registered.")
if isinstance(spf, PluginRegistration):
reg = spf
else:
reg = self.find_plugin_registration(spf)
(s, n, u) = reg
try:
return s.get_context(n)
except KeyError as k:
raise k
except AttributeError:
raise rt_err
def get_app_from_spf_context(self, spf):
if isinstance(spf, PluginRegistration):
reg = spf
else:
reg = self.find_plugin_registration(spf)
context = self.get_context_from_spf(reg)
return context.app
def spf_resolve_url_for(self, spf, view_name, *args, **kwargs):
reg = self.find_plugin_registration(spf)
(spf, name, url_prefix) = reg
app = self.get_app_from_spf_context(reg)
if app is None:
return None
if isinstance(app, Blueprint):
self.warning(reg, "Cannot use url_for when plugin is registered "
"on a Blueprint. Use `app.url_for` instead.")
return None
constructed_name = "{}.{}".format(name, view_name)
return app.url_for(constructed_name, *args, **kwargs)
def log(self, spf, level, message, *args, **kwargs):
reg = self.find_plugin_registration(spf)
context = self.get_context_from_spf(reg)
return context.log(level, message, *args, reg=self, **kwargs)
def debug(self, spf, message, *args, **kwargs):
return self.log(spf, DEBUG, message, *args, **kwargs)
def info(self, spf, message, *args, **kwargs):
return self.log(spf, INFO, message, *args, **kwargs)
def warning(self, spf, message, *args, **kwargs):
return self.log(spf, WARNING, message, *args, **kwargs)
def error(self, spf, message, *args, **kwargs):
return self.log(spf, ERROR, message, *args, **kwargs)
def critical(self, spf, message, *args, **kwargs):
return self.log(spf, CRITICAL, message, *args, **kwargs)
@classmethod
def decorate(cls, app, *args, run_middleware=False, with_context=False,
**kwargs):
"""
This is a decorator that can be used to apply this plugin to a specific
route/view on your app, rather than the whole app.
:param app: the Sanic app (or Blueprint) on which to decorate a route
:type app: Sanic | Blueprint
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param run_middleware: whether to run this plugin's middleware on the route
:type run_middleware: bool
:param with_context: whether to pass the plugin context to the route
:type with_context: bool
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: the decorated route/view
:rtype: fn
"""
from spf.framework import SanicPluginsFramework
spf = SanicPluginsFramework(app) # get the singleton from the app
try:
assoc = spf.register_plugin(cls, skip_reg=True)
except ValueError as e:
# this is normal, if this plugin has been registered previously
assert e.args and len(e.args) > 1
assoc = e.args[1]
(plugin, reg) = assoc
inst = spf.get_plugin(plugin) # plugin may not actually be registered
# registered might be True, False or None at this point
regd = True if inst else None
if regd is True:
# middleware will be run on this route anyway, because the plugin
# is registered on the app. Turn it off on the route-level.
run_middleware = False
req_middleware = deque()
resp_middleware = deque()
if run_middleware:
for i, m in enumerate(plugin._middlewares):
attach_to = m.kwargs.pop('attach_to', 'request')
priority = m.kwargs.pop('priority', 5)
with_context = m.kwargs.pop('with_context', False)
mw_handle_fn = m.middleware
if attach_to == 'response':
relative = m.kwargs.pop('relative', 'post')
if relative == "pre":
mw = (0, 0 - priority, 0 - i, mw_handle_fn,
with_context, m.args, m.kwargs)
else: # relative = "post"
mw = (1, 0 - priority, 0 - i, mw_handle_fn,
with_context, m.args, m.kwargs)
resp_middleware.append(mw)
else: # attach_to = "request"
relative = m.kwargs.pop('relative', 'pre')
if relative == "post":
mw = (1, priority, i, mw_handle_fn, with_context,
m.args, m.kwargs)
else: # relative = "pre"
mw = (0, priority, i, mw_handle_fn, with_context,
m.args, m.kwargs)
req_middleware.append(mw)
req_middleware = tuple(sorted(req_middleware))
resp_middleware = tuple(sorted(resp_middleware))
def _decorator(f):
nonlocal spf, plugin, regd, run_middleware, with_context
nonlocal req_middleware, resp_middleware, args, kwargs
async def wrapper(request, *a, **kw):
nonlocal spf, plugin, regd, run_middleware, with_context
nonlocal req_middleware, resp_middleware, f, args, kwargs
# the plugin was not registered on the app, it might be now
if regd is None:
_inst = spf.get_plugin(plugin)
regd = _inst is not None
context = plugin.get_context_from_spf(spf)
if run_middleware and not regd and len(req_middleware) > 0:
for (_a, _p, _i, handler, with_context, args, kwargs) \
in req_middleware:
if with_context:
resp = handler(request, *args, context=context,
**kwargs)
else:
resp = handler(request, *args, **kwargs)
if isawaitable(resp):
resp = await resp
if resp:
return
response = await plugin.route_wrapper(
f, request, context, a, kw, *args,
with_context=with_context, **kwargs)
if isawaitable(response):
response = await response
if run_middleware and not regd and len(resp_middleware) > 0:
for (_a, _p, _i, handler, with_context, args, kwargs) \
in resp_middleware:
if with_context:
_resp = handler(request, response, *args,
context=context, **kwargs)
else:
_resp = handler(request, response, *args, **kwargs)
if isawaitable(_resp):
_resp = await _resp
if _resp:
response = _resp
break
return response
return update_wrapper(wrapper, f)
return _decorator
async def route_wrapper(self, route, request, context, request_args,
request_kw, *decorator_args, with_context=None,
**decorator_kw):
"""This is the function that is called when a route is decorated with
your plugin decorator. Context will normally be None, but the user
can pass use_context=True so the route will get the plugin
context
"""
# by default, do nothing, just run the wrapped function
if with_context:
resp = route(request, context, *request_args, **request_kw)
else:
resp = route(request, *request_args, **request_kw)
if isawaitable(resp):
resp = await resp
return resp
def __new__(cls, *args, **kwargs):
# making a bold assumption here.
# Assuming that if a sanic plugin is initialized using
# `MyPlugin(app)`, then the user is attempting to do a legacy plugin
# instantiation, aka Flask-Style plugin instantiation.
if args and len(args) > 0 and \
(isinstance(args[0], Sanic) or isinstance(args[0], Blueprint)):
app = args[0]
try:
mod_name = cls.__module__
mod = importlib.import_module(mod_name)
assert mod
except (ImportError, AssertionError):
raise RuntimeError(
"Failed attempting a legacy plugin instantiation. "
"Cannot find the module this plugin belongs to.")
# Get the spf singleton from this app
from spf.framework import SanicPluginsFramework
spf = SanicPluginsFramework(app)
# catch cases like when the module is "__main__" or
# "__call__" or "__init__"
if mod_name.startswith("__"):
# In this case, we cannot use the module to register the
# plugin. Try to use the class method.
assoc = spf.register_plugin(cls, *args, **kwargs)
else:
assoc = spf.register_plugin(mod, *args, **kwargs)
return assoc
self = super(SanicPlugin, cls).__new__(cls)
try:
self._initialized # initialized may be True or Unknown
except AttributeError:
self._initialized = False
return self
def is_registered_on_framework(self, check_spf):
for reg in self.registrations:
(spf, name, url) = reg
if spf is not None and spf == check_spf:
return True
return False
def __init__(self, *args, **kwargs):
# Sometimes __init__ can be called twice.
# Ignore it on subsequent times
if self._initialized:
return
assert len(args) < 1,\
"Unexpected arguments passed to this Sanic Plugin."
assert len(kwargs) < 1,\
"Unexpected keyword arguments passed to this Sanic Plugin."
super(SanicPlugin, self).__init__(*args, **kwargs)
self._routes = []
self._ws = []
self._static = []
self._middlewares = []
self._exceptions = []
self._listeners = defaultdict(list)
self.registrations = set()
self._initialized = True
def __getstate__(self):
state_dict = {}
for s in SanicPlugin.__slots__:
state_dict[s] = getattr(self, s)
return state_dict
def __setstate__(self, state):
for s, v in state.items():
if s == "__weakref__":
if v is None:
continue
else:
raise NotImplementedError("Setting weakrefs on Plugin")
setattr(self, s, v)
def __reduce__(self):
state_dict = self.__getstate__()
return SanicPlugin.__new__, (self.__class__,), state_dict
|
ashleysommer/sanicpluginsframework | spf/plugin.py | SanicPlugin.websocket | python | def websocket(self, uri, *args, **kwargs):
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
kwargs.setdefault('subprotocols', None)
kwargs.setdefault('name', None)
def wrapper(handler_f):
self._ws.append(FutureWebsocket(handler_f, uri, args, kwargs))
return handler_f
return wrapper | Create a websocket route from a decorated function
:param uri: endpoint at which the socket endpoint will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The function to use as the decorator
:rtype: fn | train | https://github.com/ashleysommer/sanicpluginsframework/blob/2cb1656d9334f04c30c738074784b0450c1b893e/spf/plugin.py#L132-L152 | null | class SanicPlugin(object):
__slots__ = ('registrations', '_routes', '_ws', '_static',
'_middlewares', '_exceptions', '_listeners', '_initialized',
'__weakref__')
AssociatedTuple = PluginAssociated
# Decorator
def middleware(self, *args, **kwargs):
"""Decorate and register middleware
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The middleware function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('priority', 5)
kwargs.setdefault('relative', None)
kwargs.setdefault('attach_to', None)
kwargs.setdefault('with_context', False)
if len(args) == 1 and callable(args[0]):
middle_f = args[0]
self._middlewares.append(
FutureMiddleware(middle_f, args=tuple(), kwargs=kwargs))
return middle_f
def wrapper(middleware_f):
self._middlewares.append(
FutureMiddleware(middleware_f, args=args, kwargs=kwargs))
return middleware_f
return wrapper
def exception(self, *args, **kwargs):
"""Decorate and register an exception handler
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
if len(args) == 1 and callable(args[0]):
if isinstance(args[0], type) and issubclass(args[0], Exception):
pass
else: # pragma: no cover
raise RuntimeError("Cannot use the @exception decorator "
"without arguments")
def wrapper(handler_f):
self._exceptions.append(FutureException(handler_f,
exceptions=args,
kwargs=kwargs))
return handler_f
return wrapper
def listener(self, event, *args, **kwargs):
"""Create a listener from a decorated function.
:param event: Event to listen to.
:type event: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The function to use as the listener
:rtype: fn
"""
if len(args) == 1 and callable(args[0]): # pragma: no cover
raise RuntimeError("Cannot use the @listener decorator without "
"arguments")
def wrapper(listener_f):
if len(kwargs) > 0:
listener_f = (listener_f, kwargs)
self._listeners[event].append(listener_f)
return listener_f
return wrapper
def route(self, uri, *args, **kwargs):
"""Create a plugin route from a decorated function.
:param uri: endpoint at which the route will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The function to use as the decorator
:rtype: fn
"""
if len(args) == 0 and callable(uri): # pragma: no cover
raise RuntimeError("Cannot use the @route decorator without "
"arguments.")
kwargs.setdefault('methods', frozenset({'GET'}))
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', False)
kwargs.setdefault('stream', False)
kwargs.setdefault('name', None)
def wrapper(handler_f):
self._routes.append(FutureRoute(handler_f, uri, args, kwargs))
return handler_f
return wrapper
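The `route` decorator above does not touch an app directly: it only appends a `FutureRoute` record, and the framework replays those records later, typically under a URL prefix. A self-contained sketch of that deferred-registration idea (`RouteRecorder` and `apply` are illustrative, not library API):

```python
from collections import namedtuple

# Mimics FutureRoute: the plugin records routes now, an app attaches them later.
FutureRoute = namedtuple('FutureRoute', ('handler', 'uri', 'args', 'kwargs'))

class RouteRecorder:
    def __init__(self):
        self._routes = []

    def route(self, uri, **kwargs):
        kwargs.setdefault('methods', frozenset({'GET'}))

        def wrapper(handler_f):
            self._routes.append(FutureRoute(handler_f, uri, tuple(), kwargs))
            return handler_f
        return wrapper

    def apply(self, url_prefix):
        # Stand-in for what a framework would do at registration time.
        return [(url_prefix + r.uri, r.handler) for r in self._routes]

rec = RouteRecorder()

@rec.route('/hello')
def hello(request):
    return 'hi'

print(rec.apply('/myplugin'))
```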
def static(self, uri, file_or_directory, *args, **kwargs):
"""Serve a static file or directory from a plugin route.
:param uri: endpoint at which the static file(s) will be accessible.
:type uri: str
:param file_or_directory: path of the file or directory to serve.
:type file_or_directory: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: None
:rtype: None
"""
kwargs.setdefault('pattern', r'/?.+')
kwargs.setdefault('use_modified_since', True)
kwargs.setdefault('use_content_range', False)
kwargs.setdefault('stream_large_files', False)
kwargs.setdefault('name', 'static')
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
self._static.append(FutureStatic(uri, file_or_directory, args, kwargs))
def on_before_registered(self, context, *args, **kwargs):
pass
def on_registered(self, context, reg, *args, **kwargs):
pass
def find_plugin_registration(self, spf):
if isinstance(spf, PluginRegistration):
return spf
for reg in self.registrations:
(s, n, u) = reg
if s is not None and s == spf:
return reg
raise KeyError("Not found")
def first_plugin_context(self):
"""Returns the context associated with the first app this plugin was
registered on"""
# Note: because registrations are stored in a set, it's not _really_
# the first one, but whichever one it sees first in the set.
first_spf_reg = next(iter(self.registrations))
return self.get_context_from_spf(first_spf_reg)
def get_context_from_spf(self, spf):
rt_err = RuntimeError(
"Cannot use the plugin's Context before it is "
"registered.")
if isinstance(spf, PluginRegistration):
reg = spf
else:
reg = self.find_plugin_registration(spf)
(s, n, u) = reg
try:
return s.get_context(n)
except KeyError as k:
raise k
except AttributeError:
raise rt_err
def get_app_from_spf_context(self, spf):
if isinstance(spf, PluginRegistration):
reg = spf
else:
reg = self.find_plugin_registration(spf)
context = self.get_context_from_spf(reg)
return context.app
def spf_resolve_url_for(self, spf, view_name, *args, **kwargs):
reg = self.find_plugin_registration(spf)
(spf, name, url_prefix) = reg
app = self.get_app_from_spf_context(reg)
if app is None:
return None
if isinstance(app, Blueprint):
self.warning(reg, "Cannot use url_for when plugin is registered "
"on a Blueprint. Use `app.url_for` instead.")
return None
constructed_name = "{}.{}".format(name, view_name)
return app.url_for(constructed_name, *args, **kwargs)
def log(self, spf, level, message, *args, **kwargs):
reg = self.find_plugin_registration(spf)
context = self.get_context_from_spf(reg)
return context.log(level, message, *args, reg=self, **kwargs)
def debug(self, spf, message, *args, **kwargs):
return self.log(spf, DEBUG, message, *args, **kwargs)
def info(self, spf, message, *args, **kwargs):
return self.log(spf, INFO, message, *args, **kwargs)
def warning(self, spf, message, *args, **kwargs):
return self.log(spf, WARNING, message, *args, **kwargs)
def error(self, spf, message, *args, **kwargs):
return self.log(spf, ERROR, message, *args, **kwargs)
def critical(self, spf, message, *args, **kwargs):
return self.log(spf, CRITICAL, message, *args, **kwargs)
@classmethod
def decorate(cls, app, *args, run_middleware=False, with_context=False,
**kwargs):
"""
This is a decorator that can be used to apply this plugin to a specific
route/view on your app, rather than the whole app.
:param app: the Sanic app (or Blueprint) on which to decorate a route
:type app: Sanic | Blueprint
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param run_middleware: whether to run this plugin's middleware on the route
:type run_middleware: bool
:param with_context: whether to pass the plugin context to the route
:type with_context: bool
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: the decorated route/view
:rtype: fn
"""
from spf.framework import SanicPluginsFramework
spf = SanicPluginsFramework(app) # get the singleton from the app
try:
assoc = spf.register_plugin(cls, skip_reg=True)
except ValueError as e:
# this is normal, if this plugin has been registered previously
assert e.args and len(e.args) > 1
assoc = e.args[1]
(plugin, reg) = assoc
inst = spf.get_plugin(plugin) # plugin may not actually be registered
# registered might be True, False or None at this point
regd = True if inst else None
if regd is True:
# middleware will be run on this route anyway, because the plugin
# is registered on the app. Turn it off on the route-level.
run_middleware = False
req_middleware = deque()
resp_middleware = deque()
if run_middleware:
for i, m in enumerate(plugin._middlewares):
attach_to = m.kwargs.pop('attach_to', 'request')
priority = m.kwargs.pop('priority', 5)
with_context = m.kwargs.pop('with_context', False)
mw_handle_fn = m.middleware
if attach_to == 'response':
relative = m.kwargs.pop('relative', 'post')
if relative == "pre":
mw = (0, 0 - priority, 0 - i, mw_handle_fn,
with_context, m.args, m.kwargs)
else: # relative = "post"
mw = (1, 0 - priority, 0 - i, mw_handle_fn,
with_context, m.args, m.kwargs)
resp_middleware.append(mw)
else: # attach_to = "request"
relative = m.kwargs.pop('relative', 'pre')
if relative == "post":
mw = (1, priority, i, mw_handle_fn, with_context,
m.args, m.kwargs)
else: # relative = "pre"
mw = (0, priority, i, mw_handle_fn, with_context,
m.args, m.kwargs)
req_middleware.append(mw)
req_middleware = tuple(sorted(req_middleware))
resp_middleware = tuple(sorted(resp_middleware))
def _decorator(f):
nonlocal spf, plugin, regd, run_middleware, with_context
nonlocal req_middleware, resp_middleware, args, kwargs
async def wrapper(request, *a, **kw):
nonlocal spf, plugin, regd, run_middleware, with_context
nonlocal req_middleware, resp_middleware, f, args, kwargs
# the plugin was not registered on the app, it might be now
if regd is None:
_inst = spf.get_plugin(plugin)
regd = _inst is not None
context = plugin.get_context_from_spf(spf)
if run_middleware and not regd and len(req_middleware) > 0:
for (_a, _p, _i, handler, with_context, args, kwargs) \
in req_middleware:
if with_context:
resp = handler(request, *args, context=context,
**kwargs)
else:
resp = handler(request, *args, **kwargs)
if isawaitable(resp):
resp = await resp
if resp:
return
response = await plugin.route_wrapper(
f, request, context, a, kw, *args,
with_context=with_context, **kwargs)
if isawaitable(response):
response = await response
if run_middleware and not regd and len(resp_middleware) > 0:
for (_a, _p, _i, handler, with_context, args, kwargs) \
in resp_middleware:
if with_context:
_resp = handler(request, response, *args,
context=context, **kwargs)
else:
_resp = handler(request, response, *args, **kwargs)
if isawaitable(_resp):
_resp = await _resp
if _resp:
response = _resp
break
return response
return update_wrapper(wrapper, f)
return _decorator
async def route_wrapper(self, route, request, context, request_args,
request_kw, *decorator_args, with_context=None,
**decorator_kw):
"""This is the function that is called when a route is decorated with
your plugin decorator. Context will normally be None, but the user
can pass use_context=True so the route will get the plugin
context
"""
# by default, do nothing, just run the wrapped function
if with_context:
resp = route(request, context, *request_args, **request_kw)
else:
resp = route(request, *request_args, **request_kw)
if isawaitable(resp):
resp = await resp
return resp
def __new__(cls, *args, **kwargs):
# making a bold assumption here.
# Assuming that if a sanic plugin is initialized using
# `MyPlugin(app)`, then the user is attempting to do a legacy plugin
# instantiation, aka Flask-Style plugin instantiation.
if args and len(args) > 0 and \
(isinstance(args[0], Sanic) or isinstance(args[0], Blueprint)):
app = args[0]
try:
mod_name = cls.__module__
mod = importlib.import_module(mod_name)
assert mod
except (ImportError, AssertionError):
raise RuntimeError(
"Failed attempting a legacy plugin instantiation. "
"Cannot find the module this plugin belongs to.")
# Get the spf singleton from this app
from spf.framework import SanicPluginsFramework
spf = SanicPluginsFramework(app)
# catch cases like when the module is "__main__" or
# "__call__" or "__init__"
if mod_name.startswith("__"):
# In this case, we cannot use the module to register the
# plugin. Try to use the class method.
assoc = spf.register_plugin(cls, *args, **kwargs)
else:
assoc = spf.register_plugin(mod, *args, **kwargs)
return assoc
self = super(SanicPlugin, cls).__new__(cls)
try:
self._initialized # initialized may be True or Unknown
except AttributeError:
self._initialized = False
return self
def is_registered_on_framework(self, check_spf):
for reg in self.registrations:
(spf, name, url) = reg
if spf is not None and spf == check_spf:
return True
return False
def __init__(self, *args, **kwargs):
# Sometimes __init__ can be called twice.
# Ignore it on subsequent times
if self._initialized:
return
    assert len(args) < 1,\
        "Unexpected arguments passed to this SanicPlugin."
    assert len(kwargs) < 1,\
        "Unexpected keyword arguments passed to this SanicPlugin."
super(SanicPlugin, self).__init__(*args, **kwargs)
self._routes = []
self._ws = []
self._static = []
self._middlewares = []
self._exceptions = []
self._listeners = defaultdict(list)
self.registrations = set()
self._initialized = True
def __getstate__(self):
state_dict = {}
for s in SanicPlugin.__slots__:
state_dict[s] = getattr(self, s)
return state_dict
def __setstate__(self, state):
for s, v in state.items():
if s == "__weakref__":
if v is None:
continue
else:
raise NotImplementedError("Setting weakrefs on Plugin")
setattr(self, s, v)
def __reduce__(self):
state_dict = self.__getstate__()
return SanicPlugin.__new__, (self.__class__,), state_dict
|
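The default `route_wrapper` above accepts both synchronous and async route handlers by awaiting the result only when it is awaitable. A minimal, self-contained sketch of that dispatch pattern (the handler names here are illustrative, not part of the framework):

```python
import asyncio
from inspect import isawaitable

async def call_route(route, *args, **kwargs):
    # Invoke the handler; await only if it returned an awaitable,
    # mirroring the `isawaitable(resp)` check in route_wrapper above.
    resp = route(*args, **kwargs)
    if isawaitable(resp):
        resp = await resp
    return resp

def sync_handler(x):
    return x + 1

async def async_handler(x):
    return x + 2

print(asyncio.run(call_route(sync_handler, 1)))   # 2
print(asyncio.run(call_route(async_handler, 1)))  # 3
```

Because the check happens after the call, the same wrapper serves plain functions, coroutines, and anything else that returns an awaitable.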
ashleysommer/sanicpluginsframework | spf/plugin.py | SanicPlugin.static | python | def static(self, uri, file_or_directory, *args, **kwargs):
kwargs.setdefault('pattern', r'/?.+')
kwargs.setdefault('use_modified_since', True)
kwargs.setdefault('use_content_range', False)
kwargs.setdefault('stream_large_files', False)
kwargs.setdefault('name', 'static')
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
self._static.append(FutureStatic(uri, file_or_directory, args, kwargs)) | Serve a static file or directory at the given uri
:param uri: endpoint at which the static file or directory will be accessible.
:type uri: str
:param file_or_directory: path to the file or directory to serve.
:type file_or_directory: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: Nothing; the static route is queued for registration
:rtype: None | train | https://github.com/ashleysommer/sanicpluginsframework/blob/2cb1656d9334f04c30c738074784b0450c1b893e/spf/plugin.py#L154-L174 | null | class SanicPlugin(object):
__slots__ = ('registrations', '_routes', '_ws', '_static',
'_middlewares', '_exceptions', '_listeners', '_initialized',
'__weakref__')
AssociatedTuple = PluginAssociated
# Decorator
def middleware(self, *args, **kwargs):
"""Decorate and register middleware
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The middleware function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('priority', 5)
kwargs.setdefault('relative', None)
kwargs.setdefault('attach_to', None)
kwargs.setdefault('with_context', False)
if len(args) == 1 and callable(args[0]):
middle_f = args[0]
self._middlewares.append(
FutureMiddleware(middle_f, args=tuple(), kwargs=kwargs))
return middle_f
def wrapper(middleware_f):
self._middlewares.append(
FutureMiddleware(middleware_f, args=args, kwargs=kwargs))
return middleware_f
return wrapper
def exception(self, *args, **kwargs):
"""Decorate and register an exception handler
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
if len(args) == 1 and callable(args[0]):
if isinstance(args[0], type) and issubclass(args[0], Exception):
pass
else: # pragma: no cover
raise RuntimeError("Cannot use the @exception decorator "
"without arguments")
def wrapper(handler_f):
self._exceptions.append(FutureException(handler_f,
exceptions=args,
kwargs=kwargs))
return handler_f
return wrapper
def listener(self, event, *args, **kwargs):
"""Create a listener from a decorated function.
:param event: Event to listen to.
:type event: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The function to use as the listener
:rtype: fn
"""
if len(args) == 1 and callable(args[0]): # pragma: no cover
raise RuntimeError("Cannot use the @listener decorator without "
"arguments")
def wrapper(listener_f):
if len(kwargs) > 0:
listener_f = (listener_f, kwargs)
self._listeners[event].append(listener_f)
return listener_f
return wrapper
def route(self, uri, *args, **kwargs):
"""Create a plugin route from a decorated function.
:param uri: endpoint at which the route will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
        :return: The route handler function to use as the decorator
:rtype: fn
"""
if len(args) == 0 and callable(uri): # pragma: no cover
raise RuntimeError("Cannot use the @route decorator without "
"arguments.")
kwargs.setdefault('methods', frozenset({'GET'}))
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', False)
kwargs.setdefault('stream', False)
kwargs.setdefault('name', None)
def wrapper(handler_f):
self._routes.append(FutureRoute(handler_f, uri, args, kwargs))
return handler_f
return wrapper
def websocket(self, uri, *args, **kwargs):
"""Create a websocket route from a decorated function
:param uri: endpoint at which the socket endpoint will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
        :return: The websocket handler function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
kwargs.setdefault('subprotocols', None)
kwargs.setdefault('name', None)
def wrapper(handler_f):
self._ws.append(FutureWebsocket(handler_f, uri, args, kwargs))
return handler_f
return wrapper
def on_before_registered(self, context, *args, **kwargs):
pass
def on_registered(self, context, reg, *args, **kwargs):
pass
def find_plugin_registration(self, spf):
if isinstance(spf, PluginRegistration):
return spf
for reg in self.registrations:
(s, n, u) = reg
if s is not None and s == spf:
return reg
        raise KeyError("Not found")
def first_plugin_context(self):
"""Returns the context is associated with the first app this plugin was
registered on"""
        # Note: because registrations are stored in a set, it's not _really_
# the first one, but whichever one it sees first in the set.
first_spf_reg = next(iter(self.registrations))
return self.get_context_from_spf(first_spf_reg)
def get_context_from_spf(self, spf):
rt_err = RuntimeError(
"Cannot use the plugin's Context before it is "
"registered.")
if isinstance(spf, PluginRegistration):
reg = spf
else:
reg = self.find_plugin_registration(spf)
(s, n, u) = reg
try:
return s.get_context(n)
except KeyError as k:
raise k
except AttributeError:
raise rt_err
def get_app_from_spf_context(self, spf):
if isinstance(spf, PluginRegistration):
reg = spf
else:
reg = self.find_plugin_registration(spf)
context = self.get_context_from_spf(reg)
return context.app
    def spf_resolve_url_for(self, spf, view_name, *args, **kwargs):
        reg = self.find_plugin_registration(spf)
        (spf, name, url_prefix) = reg
        app = self.get_app_from_spf_context(reg)
        if app is None:
            return None
        if isinstance(app, Blueprint):
            self.warning(spf, "Cannot use url_for when plugin is registered "
                              "on a Blueprint. Use `app.url_for` instead.")
            return None
        constructed_name = "{}.{}".format(name, view_name)
        return app.url_for(constructed_name, *args, **kwargs)
    def log(self, spf, level, message, *args, **kwargs):
        reg = self.find_plugin_registration(spf)
        context = self.get_context_from_spf(reg)
        return context.log(level, message, *args, reg=self, **kwargs)
    # `log` needs the framework (or a registration) to look up the plugin's
    # context, so the level shortcuts must pass it through as well.
    def debug(self, spf, message, *args, **kwargs):
        return self.log(spf, DEBUG, message, *args, **kwargs)
    def info(self, spf, message, *args, **kwargs):
        return self.log(spf, INFO, message, *args, **kwargs)
    def warning(self, spf, message, *args, **kwargs):
        return self.log(spf, WARNING, message, *args, **kwargs)
    def error(self, spf, message, *args, **kwargs):
        return self.log(spf, ERROR, message, *args, **kwargs)
    def critical(self, spf, message, *args, **kwargs):
        return self.log(spf, CRITICAL, message, *args, **kwargs)
@classmethod
def decorate(cls, app, *args, run_middleware=False, with_context=False,
**kwargs):
"""
This is a decorator that can be used to apply this plugin to a specific
route/view on your app, rather than the whole app.
        :param app: the app or blueprint to decorate a route on
        :type app: Sanic | Blueprint
        :param args: captures all of the positional arguments passed in
        :type args: tuple(Any)
        :param run_middleware: also run this plugin's middleware on the route
        :type run_middleware: bool
        :param with_context: pass the plugin context to the wrapped route
        :type with_context: bool
        :param kwargs: captures the keyword arguments passed in
        :type kwargs: dict(Any)
:return: the decorated route/view
:rtype: fn
"""
from spf.framework import SanicPluginsFramework
spf = SanicPluginsFramework(app) # get the singleton from the app
try:
assoc = spf.register_plugin(cls, skip_reg=True)
except ValueError as e:
# this is normal, if this plugin has been registered previously
assert e.args and len(e.args) > 1
assoc = e.args[1]
(plugin, reg) = assoc
inst = spf.get_plugin(plugin) # plugin may not actually be registered
# registered might be True, False or None at this point
regd = True if inst else None
if regd is True:
# middleware will be run on this route anyway, because the plugin
# is registered on the app. Turn it off on the route-level.
run_middleware = False
req_middleware = deque()
resp_middleware = deque()
if run_middleware:
for i, m in enumerate(plugin._middlewares):
attach_to = m.kwargs.pop('attach_to', 'request')
priority = m.kwargs.pop('priority', 5)
with_context = m.kwargs.pop('with_context', False)
mw_handle_fn = m.middleware
if attach_to == 'response':
relative = m.kwargs.pop('relative', 'post')
if relative == "pre":
mw = (0, 0 - priority, 0 - i, mw_handle_fn,
with_context, m.args, m.kwargs)
else: # relative = "post"
mw = (1, 0 - priority, 0 - i, mw_handle_fn,
with_context, m.args, m.kwargs)
resp_middleware.append(mw)
else: # attach_to = "request"
relative = m.kwargs.pop('relative', 'pre')
if relative == "post":
mw = (1, priority, i, mw_handle_fn, with_context,
m.args, m.kwargs)
else: # relative = "pre"
mw = (0, priority, i, mw_handle_fn, with_context,
m.args, m.kwargs)
req_middleware.append(mw)
req_middleware = tuple(sorted(req_middleware))
resp_middleware = tuple(sorted(resp_middleware))
def _decorator(f):
nonlocal spf, plugin, regd, run_middleware, with_context
nonlocal req_middleware, resp_middleware, args, kwargs
async def wrapper(request, *a, **kw):
nonlocal spf, plugin, regd, run_middleware, with_context
nonlocal req_middleware, resp_middleware, f, args, kwargs
# the plugin was not registered on the app, it might be now
if regd is None:
_inst = spf.get_plugin(plugin)
regd = _inst is not None
context = plugin.get_context_from_spf(spf)
if run_middleware and not regd and len(req_middleware) > 0:
for (_a, _p, _i, handler, with_context, args, kwargs) \
in req_middleware:
if with_context:
resp = handler(request, *args, context=context,
**kwargs)
else:
resp = handler(request, *args, **kwargs)
if isawaitable(resp):
resp = await resp
if resp:
return
response = await plugin.route_wrapper(
f, request, context, a, kw, *args,
with_context=with_context, **kwargs)
if isawaitable(response):
response = await response
if run_middleware and not regd and len(resp_middleware) > 0:
for (_a, _p, _i, handler, with_context, args, kwargs) \
in resp_middleware:
if with_context:
_resp = handler(request, response, *args,
context=context, **kwargs)
else:
_resp = handler(request, response, *args, **kwargs)
if isawaitable(_resp):
_resp = await _resp
if _resp:
response = _resp
break
return response
return update_wrapper(wrapper, f)
return _decorator
async def route_wrapper(self, route, request, context, request_args,
request_kw, *decorator_args, with_context=None,
**decorator_kw):
"""This is the function that is called when a route is decorated with
        your plugin decorator. Context will normally be None, but the user
        can pass with_context=True so the route will receive the plugin
        context.
"""
# by default, do nothing, just run the wrapped function
if with_context:
resp = route(request, context, *request_args, **request_kw)
else:
resp = route(request, *request_args, **request_kw)
if isawaitable(resp):
resp = await resp
return resp
def __new__(cls, *args, **kwargs):
# making a bold assumption here.
# Assuming that if a sanic plugin is initialized using
# `MyPlugin(app)`, then the user is attempting to do a legacy plugin
# instantiation, aka Flask-Style plugin instantiation.
if args and len(args) > 0 and \
(isinstance(args[0], Sanic) or isinstance(args[0], Blueprint)):
app = args[0]
try:
mod_name = cls.__module__
mod = importlib.import_module(mod_name)
assert mod
except (ImportError, AssertionError):
raise RuntimeError(
"Failed attempting a legacy plugin instantiation. "
"Cannot find the module this plugin belongs to.")
# Get the spf singleton from this app
from spf.framework import SanicPluginsFramework
spf = SanicPluginsFramework(app)
# catch cases like when the module is "__main__" or
# "__call__" or "__init__"
if mod_name.startswith("__"):
# In this case, we cannot use the module to register the
# plugin. Try to use the class method.
assoc = spf.register_plugin(cls, *args, **kwargs)
else:
assoc = spf.register_plugin(mod, *args, **kwargs)
return assoc
self = super(SanicPlugin, cls).__new__(cls)
try:
self._initialized # initialized may be True or Unknown
except AttributeError:
self._initialized = False
return self
def is_registered_on_framework(self, check_spf):
for reg in self.registrations:
(spf, name, url) = reg
if spf is not None and spf == check_spf:
return True
return False
def __init__(self, *args, **kwargs):
# Sometimes __init__ can be called twice.
# Ignore it on subsequent times
if self._initialized:
return
        assert len(args) < 1,\
            "Unexpected arguments passed to this SanicPlugin."
        assert len(kwargs) < 1,\
            "Unexpected keyword arguments passed to this SanicPlugin."
super(SanicPlugin, self).__init__(*args, **kwargs)
self._routes = []
self._ws = []
self._static = []
self._middlewares = []
self._exceptions = []
self._listeners = defaultdict(list)
self.registrations = set()
self._initialized = True
def __getstate__(self):
state_dict = {}
for s in SanicPlugin.__slots__:
state_dict[s] = getattr(self, s)
return state_dict
def __setstate__(self, state):
for s, v in state.items():
if s == "__weakref__":
if v is None:
continue
else:
raise NotImplementedError("Setting weakrefs on Plugin")
setattr(self, s, v)
def __reduce__(self):
state_dict = self.__getstate__()
return SanicPlugin.__new__, (self.__class__,), state_dict
|
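The `static` record above pre-fills Sanic's static-route options with `dict.setdefault`, so values the caller passes explicitly always win over the defaults. The pattern in isolation (defaults copied from the record; the helper name is ours, not part of the framework):

```python
def static_defaults(**kwargs):
    # Fill in a default only for keys the caller did not supply,
    # exactly as SanicPlugin.static does before queueing the route.
    kwargs.setdefault('pattern', r'/?.+')
    kwargs.setdefault('use_modified_since', True)
    kwargs.setdefault('use_content_range', False)
    kwargs.setdefault('stream_large_files', False)
    kwargs.setdefault('name', 'static')
    kwargs.setdefault('host', None)
    kwargs.setdefault('strict_slashes', None)
    return kwargs

print(static_defaults()['name'])               # static
print(static_defaults(name='assets')['name'])  # assets
```

`setdefault` mutates the kwargs dict in place, which is safe here because `**kwargs` always creates a fresh dict per call.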
ashleysommer/sanicpluginsframework | spf/plugin.py | SanicPlugin.first_plugin_context | python | def first_plugin_context(self):
    # Note: because registrations are stored in a set, it's not _really_
# the first one, but whichever one it sees first in the set.
first_spf_reg = next(iter(self.registrations))
    return self.get_context_from_spf(first_spf_reg) | Returns the context associated with the first app this plugin was
registered on | train | https://github.com/ashleysommer/sanicpluginsframework/blob/2cb1656d9334f04c30c738074784b0450c1b893e/spf/plugin.py#L191-L197 | [
"def get_context_from_spf(self, spf):\n rt_err = RuntimeError(\n \"Cannot use the plugin's Context before it is \"\n \"registered.\")\n if isinstance(spf, PluginRegistration):\n reg = spf\n else:\n reg = self.find_plugin_registration(spf)\n (s, n, u) = reg\n try:\n return s.get_context(n)\n except KeyError as k:\n raise k\n except AttributeError:\n raise rt_err\n"
] | class SanicPlugin(object):
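As the comment inside `first_plugin_context` warns, `registrations` is a `set`, so `next(iter(...))` yields an arbitrary registration rather than the earliest one. A tiny illustration (the registration tuples here are made up):

```python
# Registrations are (framework, name, url_prefix) tuples stored in a set.
registrations = {
    ("spf_a", "PluginA", "/a"),
    ("spf_b", "PluginB", "/b"),
}

# Sets have no insertion order, so this is "whichever one it sees first",
# not necessarily the first app the plugin was registered on.
first = next(iter(registrations))
print(first in registrations)  # True
```

Code relying on `first_plugin_context` should therefore only do so when the plugin is registered on a single app, where the choice is unambiguous.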
__slots__ = ('registrations', '_routes', '_ws', '_static',
'_middlewares', '_exceptions', '_listeners', '_initialized',
'__weakref__')
AssociatedTuple = PluginAssociated
# Decorator
def middleware(self, *args, **kwargs):
"""Decorate and register middleware
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The middleware function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('priority', 5)
kwargs.setdefault('relative', None)
kwargs.setdefault('attach_to', None)
kwargs.setdefault('with_context', False)
if len(args) == 1 and callable(args[0]):
middle_f = args[0]
self._middlewares.append(
FutureMiddleware(middle_f, args=tuple(), kwargs=kwargs))
return middle_f
def wrapper(middleware_f):
self._middlewares.append(
FutureMiddleware(middleware_f, args=args, kwargs=kwargs))
return middleware_f
return wrapper
def exception(self, *args, **kwargs):
"""Decorate and register an exception handler
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
if len(args) == 1 and callable(args[0]):
if isinstance(args[0], type) and issubclass(args[0], Exception):
pass
else: # pragma: no cover
raise RuntimeError("Cannot use the @exception decorator "
"without arguments")
def wrapper(handler_f):
self._exceptions.append(FutureException(handler_f,
exceptions=args,
kwargs=kwargs))
return handler_f
return wrapper
def listener(self, event, *args, **kwargs):
"""Create a listener from a decorated function.
:param event: Event to listen to.
:type event: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The function to use as the listener
:rtype: fn
"""
if len(args) == 1 and callable(args[0]): # pragma: no cover
raise RuntimeError("Cannot use the @listener decorator without "
"arguments")
def wrapper(listener_f):
if len(kwargs) > 0:
listener_f = (listener_f, kwargs)
self._listeners[event].append(listener_f)
return listener_f
return wrapper
def route(self, uri, *args, **kwargs):
"""Create a plugin route from a decorated function.
:param uri: endpoint at which the route will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
        :return: The route handler function to use as the decorator
:rtype: fn
"""
if len(args) == 0 and callable(uri): # pragma: no cover
raise RuntimeError("Cannot use the @route decorator without "
"arguments.")
kwargs.setdefault('methods', frozenset({'GET'}))
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', False)
kwargs.setdefault('stream', False)
kwargs.setdefault('name', None)
def wrapper(handler_f):
self._routes.append(FutureRoute(handler_f, uri, args, kwargs))
return handler_f
return wrapper
def websocket(self, uri, *args, **kwargs):
"""Create a websocket route from a decorated function
:param uri: endpoint at which the socket endpoint will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
        :return: The websocket handler function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
kwargs.setdefault('subprotocols', None)
kwargs.setdefault('name', None)
def wrapper(handler_f):
self._ws.append(FutureWebsocket(handler_f, uri, args, kwargs))
return handler_f
return wrapper
def static(self, uri, file_or_directory, *args, **kwargs):
"""Create a websocket route from a decorated function
:param uri: endpoint at which the socket endpoint will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('pattern', r'/?.+')
kwargs.setdefault('use_modified_since', True)
kwargs.setdefault('use_content_range', False)
kwargs.setdefault('stream_large_files', False)
kwargs.setdefault('name', 'static')
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
self._static.append(FutureStatic(uri, file_or_directory, args, kwargs))
def on_before_registered(self, context, *args, **kwargs):
pass
def on_registered(self, context, reg, *args, **kwargs):
pass
def find_plugin_registration(self, spf):
if isinstance(spf, PluginRegistration):
return spf
for reg in self.registrations:
(s, n, u) = reg
if s is not None and s == spf:
return reg
        raise KeyError("Not found")
def get_context_from_spf(self, spf):
rt_err = RuntimeError(
"Cannot use the plugin's Context before it is "
"registered.")
if isinstance(spf, PluginRegistration):
reg = spf
else:
reg = self.find_plugin_registration(spf)
(s, n, u) = reg
try:
return s.get_context(n)
except KeyError as k:
raise k
except AttributeError:
raise rt_err
def get_app_from_spf_context(self, spf):
if isinstance(spf, PluginRegistration):
reg = spf
else:
reg = self.find_plugin_registration(spf)
context = self.get_context_from_spf(reg)
return context.app
    def spf_resolve_url_for(self, spf, view_name, *args, **kwargs):
        reg = self.find_plugin_registration(spf)
        (spf, name, url_prefix) = reg
        app = self.get_app_from_spf_context(reg)
        if app is None:
            return None
        if isinstance(app, Blueprint):
            self.warning(spf, "Cannot use url_for when plugin is registered "
                              "on a Blueprint. Use `app.url_for` instead.")
            return None
        constructed_name = "{}.{}".format(name, view_name)
        return app.url_for(constructed_name, *args, **kwargs)
    def log(self, spf, level, message, *args, **kwargs):
        reg = self.find_plugin_registration(spf)
        context = self.get_context_from_spf(reg)
        return context.log(level, message, *args, reg=self, **kwargs)
    # `log` needs the framework (or a registration) to look up the plugin's
    # context, so the level shortcuts must pass it through as well.
    def debug(self, spf, message, *args, **kwargs):
        return self.log(spf, DEBUG, message, *args, **kwargs)
    def info(self, spf, message, *args, **kwargs):
        return self.log(spf, INFO, message, *args, **kwargs)
    def warning(self, spf, message, *args, **kwargs):
        return self.log(spf, WARNING, message, *args, **kwargs)
    def error(self, spf, message, *args, **kwargs):
        return self.log(spf, ERROR, message, *args, **kwargs)
    def critical(self, spf, message, *args, **kwargs):
        return self.log(spf, CRITICAL, message, *args, **kwargs)
@classmethod
def decorate(cls, app, *args, run_middleware=False, with_context=False,
**kwargs):
"""
This is a decorator that can be used to apply this plugin to a specific
route/view on your app, rather than the whole app.
        :param app: the app or blueprint to decorate a route on
        :type app: Sanic | Blueprint
        :param args: captures all of the positional arguments passed in
        :type args: tuple(Any)
        :param run_middleware: also run this plugin's middleware on the route
        :type run_middleware: bool
        :param with_context: pass the plugin context to the wrapped route
        :type with_context: bool
        :param kwargs: captures the keyword arguments passed in
        :type kwargs: dict(Any)
:return: the decorated route/view
:rtype: fn
"""
from spf.framework import SanicPluginsFramework
spf = SanicPluginsFramework(app) # get the singleton from the app
try:
assoc = spf.register_plugin(cls, skip_reg=True)
except ValueError as e:
# this is normal, if this plugin has been registered previously
assert e.args and len(e.args) > 1
assoc = e.args[1]
(plugin, reg) = assoc
inst = spf.get_plugin(plugin) # plugin may not actually be registered
# registered might be True, False or None at this point
regd = True if inst else None
if regd is True:
# middleware will be run on this route anyway, because the plugin
# is registered on the app. Turn it off on the route-level.
run_middleware = False
req_middleware = deque()
resp_middleware = deque()
if run_middleware:
for i, m in enumerate(plugin._middlewares):
attach_to = m.kwargs.pop('attach_to', 'request')
priority = m.kwargs.pop('priority', 5)
with_context = m.kwargs.pop('with_context', False)
mw_handle_fn = m.middleware
if attach_to == 'response':
relative = m.kwargs.pop('relative', 'post')
if relative == "pre":
mw = (0, 0 - priority, 0 - i, mw_handle_fn,
with_context, m.args, m.kwargs)
else: # relative = "post"
mw = (1, 0 - priority, 0 - i, mw_handle_fn,
with_context, m.args, m.kwargs)
resp_middleware.append(mw)
else: # attach_to = "request"
relative = m.kwargs.pop('relative', 'pre')
if relative == "post":
mw = (1, priority, i, mw_handle_fn, with_context,
m.args, m.kwargs)
else: # relative = "pre"
mw = (0, priority, i, mw_handle_fn, with_context,
m.args, m.kwargs)
req_middleware.append(mw)
req_middleware = tuple(sorted(req_middleware))
resp_middleware = tuple(sorted(resp_middleware))
def _decorator(f):
nonlocal spf, plugin, regd, run_middleware, with_context
nonlocal req_middleware, resp_middleware, args, kwargs
async def wrapper(request, *a, **kw):
nonlocal spf, plugin, regd, run_middleware, with_context
nonlocal req_middleware, resp_middleware, f, args, kwargs
# the plugin was not registered on the app, it might be now
if regd is None:
_inst = spf.get_plugin(plugin)
regd = _inst is not None
context = plugin.get_context_from_spf(spf)
if run_middleware and not regd and len(req_middleware) > 0:
for (_a, _p, _i, handler, with_context, args, kwargs) \
in req_middleware:
if with_context:
resp = handler(request, *args, context=context,
**kwargs)
else:
resp = handler(request, *args, **kwargs)
if isawaitable(resp):
resp = await resp
if resp:
return
response = await plugin.route_wrapper(
f, request, context, a, kw, *args,
with_context=with_context, **kwargs)
if isawaitable(response):
response = await response
if run_middleware and not regd and len(resp_middleware) > 0:
for (_a, _p, _i, handler, with_context, args, kwargs) \
in resp_middleware:
if with_context:
_resp = handler(request, response, *args,
context=context, **kwargs)
else:
_resp = handler(request, response, *args, **kwargs)
if isawaitable(_resp):
_resp = await _resp
if _resp:
response = _resp
break
return response
return update_wrapper(wrapper, f)
return _decorator
async def route_wrapper(self, route, request, context, request_args,
request_kw, *decorator_args, with_context=None,
**decorator_kw):
"""This is the function that is called when a route is decorated with
        your plugin decorator. Context will normally be None, but the user
        can pass with_context=True so the route will receive the plugin
        context.
"""
# by default, do nothing, just run the wrapped function
if with_context:
resp = route(request, context, *request_args, **request_kw)
else:
resp = route(request, *request_args, **request_kw)
if isawaitable(resp):
resp = await resp
return resp
def __new__(cls, *args, **kwargs):
# making a bold assumption here.
# Assuming that if a sanic plugin is initialized using
# `MyPlugin(app)`, then the user is attempting to do a legacy plugin
# instantiation, aka Flask-Style plugin instantiation.
if args and len(args) > 0 and \
(isinstance(args[0], Sanic) or isinstance(args[0], Blueprint)):
app = args[0]
try:
mod_name = cls.__module__
mod = importlib.import_module(mod_name)
assert mod
except (ImportError, AssertionError):
raise RuntimeError(
"Failed attempting a legacy plugin instantiation. "
"Cannot find the module this plugin belongs to.")
# Get the spf singleton from this app
from spf.framework import SanicPluginsFramework
spf = SanicPluginsFramework(app)
# catch cases like when the module is "__main__" or
# "__call__" or "__init__"
if mod_name.startswith("__"):
# In this case, we cannot use the module to register the
# plugin. Try to use the class method.
assoc = spf.register_plugin(cls, *args, **kwargs)
else:
assoc = spf.register_plugin(mod, *args, **kwargs)
return assoc
self = super(SanicPlugin, cls).__new__(cls)
try:
self._initialized # initialized may be True or Unknown
except AttributeError:
self._initialized = False
return self
def is_registered_on_framework(self, check_spf):
for reg in self.registrations:
(spf, name, url) = reg
if spf is not None and spf == check_spf:
return True
return False
def __init__(self, *args, **kwargs):
# Sometimes __init__ can be called twice.
# Ignore it on subsequent times
if self._initialized:
return
        assert len(args) < 1,\
            "Unexpected arguments passed to this SanicPlugin."
        assert len(kwargs) < 1,\
            "Unexpected keyword arguments passed to this SanicPlugin."
super(SanicPlugin, self).__init__(*args, **kwargs)
self._routes = []
self._ws = []
self._static = []
self._middlewares = []
self._exceptions = []
self._listeners = defaultdict(list)
self.registrations = set()
self._initialized = True
def __getstate__(self):
state_dict = {}
for s in SanicPlugin.__slots__:
state_dict[s] = getattr(self, s)
return state_dict
def __setstate__(self, state):
for s, v in state.items():
if s == "__weakref__":
if v is None:
continue
else:
raise NotImplementedError("Setting weakrefs on Plugin")
setattr(self, s, v)
def __reduce__(self):
state_dict = self.__getstate__()
return SanicPlugin.__new__, (self.__class__,), state_dict
|
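The `decorate` classmethod above orders route-level middleware by building sortable tuples: the phase comes first (0 for "pre", 1 for "post"), then the priority, then the registration index as a tie-breaker. The resulting ordering can be checked with plain tuples (the labels are illustrative):

```python
# (phase, priority, index, label) — an ascending tuple sort puts all
# "pre" middleware (phase 0) before "post" (phase 1); within a phase,
# lower priority values run first and registration index breaks ties.
req_middleware = [
    (0, 5, 0, "pre-a"),
    (1, 5, 1, "post-b"),
    (0, 2, 2, "pre-c"),
]
ordered = tuple(sorted(req_middleware))
print([label for (_phase, _prio, _i, label) in ordered])
# ['pre-c', 'pre-a', 'post-b']
```

For response middleware, `decorate` negates the priority and index before sorting, so within a phase the higher-priority handlers end up running later on the way back out.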
ashleysommer/sanicpluginsframework | spf/plugin.py | SanicPlugin.decorate | python | def decorate(cls, app, *args, run_middleware=False, with_context=False,
**kwargs):
from spf.framework import SanicPluginsFramework
spf = SanicPluginsFramework(app) # get the singleton from the app
try:
assoc = spf.register_plugin(cls, skip_reg=True)
except ValueError as e:
# this is normal, if this plugin has been registered previously
assert e.args and len(e.args) > 1
assoc = e.args[1]
(plugin, reg) = assoc
inst = spf.get_plugin(plugin) # plugin may not actually be registered
# registered might be True, False or None at this point
regd = True if inst else None
if regd is True:
# middleware will be run on this route anyway, because the plugin
# is registered on the app. Turn it off on the route-level.
run_middleware = False
req_middleware = deque()
resp_middleware = deque()
if run_middleware:
for i, m in enumerate(plugin._middlewares):
attach_to = m.kwargs.pop('attach_to', 'request')
priority = m.kwargs.pop('priority', 5)
with_context = m.kwargs.pop('with_context', False)
mw_handle_fn = m.middleware
if attach_to == 'response':
relative = m.kwargs.pop('relative', 'post')
if relative == "pre":
mw = (0, 0 - priority, 0 - i, mw_handle_fn,
with_context, m.args, m.kwargs)
else: # relative = "post"
mw = (1, 0 - priority, 0 - i, mw_handle_fn,
with_context, m.args, m.kwargs)
resp_middleware.append(mw)
else: # attach_to = "request"
relative = m.kwargs.pop('relative', 'pre')
if relative == "post":
mw = (1, priority, i, mw_handle_fn, with_context,
m.args, m.kwargs)
else: # relative = "pre"
mw = (0, priority, i, mw_handle_fn, with_context,
m.args, m.kwargs)
req_middleware.append(mw)
req_middleware = tuple(sorted(req_middleware))
resp_middleware = tuple(sorted(resp_middleware))
def _decorator(f):
nonlocal spf, plugin, regd, run_middleware, with_context
nonlocal req_middleware, resp_middleware, args, kwargs
async def wrapper(request, *a, **kw):
nonlocal spf, plugin, regd, run_middleware, with_context
nonlocal req_middleware, resp_middleware, f, args, kwargs
# the plugin was not registered on the app, it might be now
if regd is None:
_inst = spf.get_plugin(plugin)
regd = _inst is not None
context = plugin.get_context_from_spf(spf)
if run_middleware and not regd and len(req_middleware) > 0:
for (_a, _p, _i, handler, with_context, args, kwargs) \
in req_middleware:
if with_context:
resp = handler(request, *args, context=context,
**kwargs)
else:
resp = handler(request, *args, **kwargs)
if isawaitable(resp):
resp = await resp
if resp:
return
response = await plugin.route_wrapper(
f, request, context, a, kw, *args,
with_context=with_context, **kwargs)
if isawaitable(response):
response = await response
if run_middleware and not regd and len(resp_middleware) > 0:
for (_a, _p, _i, handler, with_context, args, kwargs) \
in resp_middleware:
if with_context:
_resp = handler(request, response, *args,
context=context, **kwargs)
else:
_resp = handler(request, response, *args, **kwargs)
if isawaitable(_resp):
_resp = await _resp
if _resp:
response = _resp
break
return response
return update_wrapper(wrapper, f)
return _decorator | This is a decorator that can be used to apply this plugin to a specific
route/view on your app, rather than the whole app.
:param app:
:type app: Sanic | Blueprint
:param args:
:type args: tuple(Any)
:param run_middleware:
:type run_middleware: bool
:param with_context:
:type with_context: bool
:param kwargs:
:type kwargs: dict(Any)
:return: the decorated route/view
:rtype: fn | train | https://github.com/ashleysommer/sanicpluginsframework/blob/2cb1656d9334f04c30c738074784b0450c1b893e/spf/plugin.py#L257-L368 | null | class SanicPlugin(object):
__slots__ = ('registrations', '_routes', '_ws', '_static',
'_middlewares', '_exceptions', '_listeners', '_initialized',
'__weakref__')
AssociatedTuple = PluginAssociated
# Decorator
def middleware(self, *args, **kwargs):
"""Decorate and register middleware
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The middleware function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('priority', 5)
kwargs.setdefault('relative', None)
kwargs.setdefault('attach_to', None)
kwargs.setdefault('with_context', False)
if len(args) == 1 and callable(args[0]):
middle_f = args[0]
self._middlewares.append(
FutureMiddleware(middle_f, args=tuple(), kwargs=kwargs))
return middle_f
def wrapper(middleware_f):
self._middlewares.append(
FutureMiddleware(middleware_f, args=args, kwargs=kwargs))
return middleware_f
return wrapper
def exception(self, *args, **kwargs):
"""Decorate and register an exception handler
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
if len(args) == 1 and callable(args[0]):
if isinstance(args[0], type) and issubclass(args[0], Exception):
pass
else: # pragma: no cover
raise RuntimeError("Cannot use the @exception decorator "
"without arguments")
def wrapper(handler_f):
self._exceptions.append(FutureException(handler_f,
exceptions=args,
kwargs=kwargs))
return handler_f
return wrapper
def listener(self, event, *args, **kwargs):
"""Create a listener from a decorated function.
:param event: Event to listen to.
:type event: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The function to use as the listener
:rtype: fn
"""
if len(args) == 1 and callable(args[0]): # pragma: no cover
raise RuntimeError("Cannot use the @listener decorator without "
"arguments")
def wrapper(listener_f):
if len(kwargs) > 0:
listener_f = (listener_f, kwargs)
self._listeners[event].append(listener_f)
return listener_f
return wrapper
def route(self, uri, *args, **kwargs):
"""Create a plugin route from a decorated function.
:param uri: endpoint at which the route will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The route function to use as the decorator
:rtype: fn
"""
if len(args) == 0 and callable(uri): # pragma: no cover
raise RuntimeError("Cannot use the @route decorator without "
"arguments.")
kwargs.setdefault('methods', frozenset({'GET'}))
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', False)
kwargs.setdefault('stream', False)
kwargs.setdefault('name', None)
def wrapper(handler_f):
self._routes.append(FutureRoute(handler_f, uri, args, kwargs))
return handler_f
return wrapper
def websocket(self, uri, *args, **kwargs):
"""Create a websocket route from a decorated function
:param uri: endpoint at which the socket endpoint will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The websocket function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
kwargs.setdefault('subprotocols', None)
kwargs.setdefault('name', None)
def wrapper(handler_f):
self._ws.append(FutureWebsocket(handler_f, uri, args, kwargs))
return handler_f
return wrapper
def static(self, uri, file_or_directory, *args, **kwargs):
"""Create a websocket route from a decorated function
:param uri: endpoint at which the socket endpoint will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('pattern', r'/?.+')
kwargs.setdefault('use_modified_since', True)
kwargs.setdefault('use_content_range', False)
kwargs.setdefault('stream_large_files', False)
kwargs.setdefault('name', 'static')
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
self._static.append(FutureStatic(uri, file_or_directory, args, kwargs))
def on_before_registered(self, context, *args, **kwargs):
pass
def on_registered(self, context, reg, *args, **kwargs):
pass
def find_plugin_registration(self, spf):
if isinstance(spf, PluginRegistration):
return spf
for reg in self.registrations:
(s, n, u) = reg
if s is not None and s == spf:
return reg
raise KeyError("Not found")
def first_plugin_context(self):
"""Returns the context is associated with the first app this plugin was
registered on"""
# Note, because registrations are stored in a set, it's not _really_
# the first one, but whichever one it sees first in the set.
first_spf_reg = next(iter(self.registrations))
return self.get_context_from_spf(first_spf_reg)
def get_context_from_spf(self, spf):
rt_err = RuntimeError(
"Cannot use the plugin's Context before it is "
"registered.")
if isinstance(spf, PluginRegistration):
reg = spf
else:
reg = self.find_plugin_registration(spf)
(s, n, u) = reg
try:
return s.get_context(n)
except KeyError as k:
raise k
except AttributeError:
raise rt_err
def get_app_from_spf_context(self, spf):
if isinstance(spf, PluginRegistration):
reg = spf
else:
reg = self.find_plugin_registration(spf)
context = self.get_context_from_spf(reg)
return context.app
def spf_resolve_url_for(self, spf, view_name, *args, **kwargs):
reg = self.find_plugin_registration(spf)
(spf, name, url_prefix) = reg
app = self.get_app_from_spf_context(reg)
if app is None:
return None
if isinstance(app, Blueprint):
self.warning("Cannot use url_for when plugin is registered "
"on a Blueprint. Use `app.url_for` instead.")
return None
constructed_name = "{}.{}".format(name, view_name)
return app.url_for(constructed_name, *args, **kwargs)
def log(self, spf, level, message, *args, **kwargs):
reg = self.find_plugin_registration(spf)
context = self.get_context_from_spf(reg)
return context.log(level, message, *args, reg=self, **kwargs)
def debug(self, message, *args, **kwargs):
return self.log(DEBUG, message, *args, **kwargs)
def info(self, message, *args, **kwargs):
return self.log(INFO, message, *args, **kwargs)
def warning(self, message, *args, **kwargs):
return self.log(WARNING, message, *args, **kwargs)
def error(self, message, *args, **kwargs):
return self.log(ERROR, message, *args, **kwargs)
def critical(self, message, *args, **kwargs):
return self.log(CRITICAL, message, *args, **kwargs)
@classmethod
async def route_wrapper(self, route, request, context, request_args,
request_kw, *decorator_args, with_context=None,
**decorator_kw):
"""This is the function that is called when a route is decorated with
your plugin decorator. Context will normally be None, but the user
can pass with_context=True so the route will get the plugin
context
"""
# by default, do nothing, just run the wrapped function
if with_context:
resp = route(request, context, *request_args, **request_kw)
else:
resp = route(request, *request_args, **request_kw)
if isawaitable(resp):
resp = await resp
return resp
def __new__(cls, *args, **kwargs):
# making a bold assumption here.
# Assuming that if a sanic plugin is initialized using
# `MyPlugin(app)`, then the user is attempting to do a legacy plugin
# instantiation, aka Flask-Style plugin instantiation.
if args and len(args) > 0 and \
(isinstance(args[0], Sanic) or isinstance(args[0], Blueprint)):
app = args[0]
try:
mod_name = cls.__module__
mod = importlib.import_module(mod_name)
assert mod
except (ImportError, AssertionError):
raise RuntimeError(
"Failed attempting a legacy plugin instantiation. "
"Cannot find the module this plugin belongs to.")
# Get the spf singleton from this app
from spf.framework import SanicPluginsFramework
spf = SanicPluginsFramework(app)
# catch cases like when the module is "__main__" or
# "__call__" or "__init__"
if mod_name.startswith("__"):
# In this case, we cannot use the module to register the
# plugin. Try to use the class method.
assoc = spf.register_plugin(cls, *args, **kwargs)
else:
assoc = spf.register_plugin(mod, *args, **kwargs)
return assoc
self = super(SanicPlugin, cls).__new__(cls)
try:
self._initialized # initialized may be True or Unknown
except AttributeError:
self._initialized = False
return self
def is_registered_on_framework(self, check_spf):
for reg in self.registrations:
(spf, name, url) = reg
if spf is not None and spf == check_spf:
return True
return False
def __init__(self, *args, **kwargs):
# Sometimes __init__ can be called twice.
# Ignore it on subsequent times
if self._initialized:
return
assert len(args) < 1,\
"Unexpected arguments passed to this Sanic Plugins."
assert len(kwargs) < 1,\
"Unexpected keyword arguments passed to this Sanic Plugins."
super(SanicPlugin, self).__init__(*args, **kwargs)
self._routes = []
self._ws = []
self._static = []
self._middlewares = []
self._exceptions = []
self._listeners = defaultdict(list)
self.registrations = set()
self._initialized = True
def __getstate__(self):
state_dict = {}
for s in SanicPlugin.__slots__:
state_dict[s] = getattr(self, s)
return state_dict
def __setstate__(self, state):
for s, v in state.items():
if s == "__weakref__":
if v is None:
continue
else:
raise NotImplementedError("Setting weakrefs on Plugin")
setattr(self, s, v)
def __reduce__(self):
state_dict = self.__getstate__()
return SanicPlugin.__new__, (self.__class__,), state_dict
|
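A standalone sketch of the middleware ordering that `decorate` builds above: route-level request middleware is stored as `(phase, priority, index, …)` tuples, so a plain `sorted()` runs all `relative="pre"` entries (phase 0) before `relative="post"` entries (phase 1), with lower priority numbers first within a phase. The tuples and names below are illustrative only, not part of the library.

```python
# Illustration of the request-middleware sort keys built by decorate().
# Each tuple is (phase, priority, index, name):
#   phase 0 = relative "pre", phase 1 = relative "post".
# sorted() orders them ascending, so all "pre" middleware runs before
# any "post" middleware, and lower priority numbers run earlier.
req_middleware = [
    (1, 5, 0, "post_default"),   # relative="post", priority=5
    (0, 5, 1, "pre_default"),    # relative="pre",  priority=5
    (0, 2, 2, "pre_urgent"),     # relative="pre",  priority=2
]
ordered = [name for (_phase, _prio, _i, name) in sorted(req_middleware)]
print(ordered)  # ['pre_urgent', 'pre_default', 'post_default']
```

The same scheme is applied to response middleware with negated priority and index, which reverses the ordering within each phase.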
ashleysommer/sanicpluginsframework | spf/plugin.py | SanicPlugin.route_wrapper | python | async def route_wrapper(self, route, request, context, request_args,
request_kw, *decorator_args, with_context=None,
**decorator_kw):
# by default, do nothing, just run the wrapped function
if with_context:
resp = route(request, context, *request_args, **request_kw)
else:
resp = route(request, *request_args, **request_kw)
if isawaitable(resp):
resp = await resp
return resp | This is the function that is called when a route is decorated with
your plugin decorator. Context will normally be None, but the user
can pass with_context=True so the route will get the plugin
context | train | https://github.com/ashleysommer/sanicpluginsframework/blob/2cb1656d9334f04c30c738074784b0450c1b893e/spf/plugin.py#L370-L385 | null | class SanicPlugin(object):
__slots__ = ('registrations', '_routes', '_ws', '_static',
'_middlewares', '_exceptions', '_listeners', '_initialized',
'__weakref__')
AssociatedTuple = PluginAssociated
# Decorator
def middleware(self, *args, **kwargs):
"""Decorate and register middleware
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The middleware function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('priority', 5)
kwargs.setdefault('relative', None)
kwargs.setdefault('attach_to', None)
kwargs.setdefault('with_context', False)
if len(args) == 1 and callable(args[0]):
middle_f = args[0]
self._middlewares.append(
FutureMiddleware(middle_f, args=tuple(), kwargs=kwargs))
return middle_f
def wrapper(middleware_f):
self._middlewares.append(
FutureMiddleware(middleware_f, args=args, kwargs=kwargs))
return middleware_f
return wrapper
def exception(self, *args, **kwargs):
"""Decorate and register an exception handler
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
if len(args) == 1 and callable(args[0]):
if isinstance(args[0], type) and issubclass(args[0], Exception):
pass
else: # pragma: no cover
raise RuntimeError("Cannot use the @exception decorator "
"without arguments")
def wrapper(handler_f):
self._exceptions.append(FutureException(handler_f,
exceptions=args,
kwargs=kwargs))
return handler_f
return wrapper
def listener(self, event, *args, **kwargs):
"""Create a listener from a decorated function.
:param event: Event to listen to.
:type event: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The function to use as the listener
:rtype: fn
"""
if len(args) == 1 and callable(args[0]): # pragma: no cover
raise RuntimeError("Cannot use the @listener decorator without "
"arguments")
def wrapper(listener_f):
if len(kwargs) > 0:
listener_f = (listener_f, kwargs)
self._listeners[event].append(listener_f)
return listener_f
return wrapper
def route(self, uri, *args, **kwargs):
"""Create a plugin route from a decorated function.
:param uri: endpoint at which the route will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The route function to use as the decorator
:rtype: fn
"""
if len(args) == 0 and callable(uri): # pragma: no cover
raise RuntimeError("Cannot use the @route decorator without "
"arguments.")
kwargs.setdefault('methods', frozenset({'GET'}))
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', False)
kwargs.setdefault('stream', False)
kwargs.setdefault('name', None)
def wrapper(handler_f):
self._routes.append(FutureRoute(handler_f, uri, args, kwargs))
return handler_f
return wrapper
def websocket(self, uri, *args, **kwargs):
"""Create a websocket route from a decorated function
:param uri: endpoint at which the socket endpoint will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The websocket function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
kwargs.setdefault('subprotocols', None)
kwargs.setdefault('name', None)
def wrapper(handler_f):
self._ws.append(FutureWebsocket(handler_f, uri, args, kwargs))
return handler_f
return wrapper
def static(self, uri, file_or_directory, *args, **kwargs):
"""Create a websocket route from a decorated function
:param uri: endpoint at which the socket endpoint will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('pattern', r'/?.+')
kwargs.setdefault('use_modified_since', True)
kwargs.setdefault('use_content_range', False)
kwargs.setdefault('stream_large_files', False)
kwargs.setdefault('name', 'static')
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
self._static.append(FutureStatic(uri, file_or_directory, args, kwargs))
def on_before_registered(self, context, *args, **kwargs):
pass
def on_registered(self, context, reg, *args, **kwargs):
pass
def find_plugin_registration(self, spf):
if isinstance(spf, PluginRegistration):
return spf
for reg in self.registrations:
(s, n, u) = reg
if s is not None and s == spf:
return reg
raise KeyError("Not found")
def first_plugin_context(self):
"""Returns the context is associated with the first app this plugin was
registered on"""
# Note, because registrations are stored in a set, it's not _really_
# the first one, but whichever one it sees first in the set.
first_spf_reg = next(iter(self.registrations))
return self.get_context_from_spf(first_spf_reg)
def get_context_from_spf(self, spf):
rt_err = RuntimeError(
"Cannot use the plugin's Context before it is "
"registered.")
if isinstance(spf, PluginRegistration):
reg = spf
else:
reg = self.find_plugin_registration(spf)
(s, n, u) = reg
try:
return s.get_context(n)
except KeyError as k:
raise k
except AttributeError:
raise rt_err
def get_app_from_spf_context(self, spf):
if isinstance(spf, PluginRegistration):
reg = spf
else:
reg = self.find_plugin_registration(spf)
context = self.get_context_from_spf(reg)
return context.app
def spf_resolve_url_for(self, spf, view_name, *args, **kwargs):
reg = self.find_plugin_registration(spf)
(spf, name, url_prefix) = reg
app = self.get_app_from_spf_context(reg)
if app is None:
return None
if isinstance(app, Blueprint):
self.warning("Cannot use url_for when plugin is registered "
"on a Blueprint. Use `app.url_for` instead.")
return None
constructed_name = "{}.{}".format(name, view_name)
return app.url_for(constructed_name, *args, **kwargs)
def log(self, spf, level, message, *args, **kwargs):
reg = self.find_plugin_registration(spf)
context = self.get_context_from_spf(reg)
return context.log(level, message, *args, reg=self, **kwargs)
def debug(self, message, *args, **kwargs):
return self.log(DEBUG, message, *args, **kwargs)
def info(self, message, *args, **kwargs):
return self.log(INFO, message, *args, **kwargs)
def warning(self, message, *args, **kwargs):
return self.log(WARNING, message, *args, **kwargs)
def error(self, message, *args, **kwargs):
return self.log(ERROR, message, *args, **kwargs)
def critical(self, message, *args, **kwargs):
return self.log(CRITICAL, message, *args, **kwargs)
@classmethod
def decorate(cls, app, *args, run_middleware=False, with_context=False,
**kwargs):
"""
This is a decorator that can be used to apply this plugin to a specific
route/view on your app, rather than the whole app.
:param app:
:type app: Sanic | Blueprint
:param args:
:type args: tuple(Any)
:param run_middleware:
:type run_middleware: bool
:param with_context:
:type with_context: bool
:param kwargs:
:type kwargs: dict(Any)
:return: the decorated route/view
:rtype: fn
"""
from spf.framework import SanicPluginsFramework
spf = SanicPluginsFramework(app) # get the singleton from the app
try:
assoc = spf.register_plugin(cls, skip_reg=True)
except ValueError as e:
# this is normal, if this plugin has been registered previously
assert e.args and len(e.args) > 1
assoc = e.args[1]
(plugin, reg) = assoc
inst = spf.get_plugin(plugin) # plugin may not actually be registered
# registered might be True, False or None at this point
regd = True if inst else None
if regd is True:
# middleware will be run on this route anyway, because the plugin
# is registered on the app. Turn it off on the route-level.
run_middleware = False
req_middleware = deque()
resp_middleware = deque()
if run_middleware:
for i, m in enumerate(plugin._middlewares):
attach_to = m.kwargs.pop('attach_to', 'request')
priority = m.kwargs.pop('priority', 5)
with_context = m.kwargs.pop('with_context', False)
mw_handle_fn = m.middleware
if attach_to == 'response':
relative = m.kwargs.pop('relative', 'post')
if relative == "pre":
mw = (0, 0 - priority, 0 - i, mw_handle_fn,
with_context, m.args, m.kwargs)
else: # relative = "post"
mw = (1, 0 - priority, 0 - i, mw_handle_fn,
with_context, m.args, m.kwargs)
resp_middleware.append(mw)
else: # attach_to = "request"
relative = m.kwargs.pop('relative', 'pre')
if relative == "post":
mw = (1, priority, i, mw_handle_fn, with_context,
m.args, m.kwargs)
else: # relative = "pre"
mw = (0, priority, i, mw_handle_fn, with_context,
m.args, m.kwargs)
req_middleware.append(mw)
req_middleware = tuple(sorted(req_middleware))
resp_middleware = tuple(sorted(resp_middleware))
def _decorator(f):
nonlocal spf, plugin, regd, run_middleware, with_context
nonlocal req_middleware, resp_middleware, args, kwargs
async def wrapper(request, *a, **kw):
nonlocal spf, plugin, regd, run_middleware, with_context
nonlocal req_middleware, resp_middleware, f, args, kwargs
# the plugin was not registered on the app, it might be now
if regd is None:
_inst = spf.get_plugin(plugin)
regd = _inst is not None
context = plugin.get_context_from_spf(spf)
if run_middleware and not regd and len(req_middleware) > 0:
for (_a, _p, _i, handler, with_context, args, kwargs) \
in req_middleware:
if with_context:
resp = handler(request, *args, context=context,
**kwargs)
else:
resp = handler(request, *args, **kwargs)
if isawaitable(resp):
resp = await resp
if resp:
return
response = await plugin.route_wrapper(
f, request, context, a, kw, *args,
with_context=with_context, **kwargs)
if isawaitable(response):
response = await response
if run_middleware and not regd and len(resp_middleware) > 0:
for (_a, _p, _i, handler, with_context, args, kwargs) \
in resp_middleware:
if with_context:
_resp = handler(request, response, *args,
context=context, **kwargs)
else:
_resp = handler(request, response, *args, **kwargs)
if isawaitable(_resp):
_resp = await _resp
if _resp:
response = _resp
break
return response
return update_wrapper(wrapper, f)
return _decorator
def __new__(cls, *args, **kwargs):
# making a bold assumption here.
# Assuming that if a sanic plugin is initialized using
# `MyPlugin(app)`, then the user is attempting to do a legacy plugin
# instantiation, aka Flask-Style plugin instantiation.
if args and len(args) > 0 and \
(isinstance(args[0], Sanic) or isinstance(args[0], Blueprint)):
app = args[0]
try:
mod_name = cls.__module__
mod = importlib.import_module(mod_name)
assert mod
except (ImportError, AssertionError):
raise RuntimeError(
"Failed attempting a legacy plugin instantiation. "
"Cannot find the module this plugin belongs to.")
# Get the spf singleton from this app
from spf.framework import SanicPluginsFramework
spf = SanicPluginsFramework(app)
# catch cases like when the module is "__main__" or
# "__call__" or "__init__"
if mod_name.startswith("__"):
# In this case, we cannot use the module to register the
# plugin. Try to use the class method.
assoc = spf.register_plugin(cls, *args, **kwargs)
else:
assoc = spf.register_plugin(mod, *args, **kwargs)
return assoc
self = super(SanicPlugin, cls).__new__(cls)
try:
self._initialized # initialized may be True or Unknown
except AttributeError:
self._initialized = False
return self
def is_registered_on_framework(self, check_spf):
for reg in self.registrations:
(spf, name, url) = reg
if spf is not None and spf == check_spf:
return True
return False
def __init__(self, *args, **kwargs):
# Sometimes __init__ can be called twice.
# Ignore it on subsequent times
if self._initialized:
return
assert len(args) < 1,\
"Unexpected arguments passed to this Sanic Plugins."
assert len(kwargs) < 1,\
"Unexpected keyword arguments passed to this Sanic Plugins."
super(SanicPlugin, self).__init__(*args, **kwargs)
self._routes = []
self._ws = []
self._static = []
self._middlewares = []
self._exceptions = []
self._listeners = defaultdict(list)
self.registrations = set()
self._initialized = True
def __getstate__(self):
state_dict = {}
for s in SanicPlugin.__slots__:
state_dict[s] = getattr(self, s)
return state_dict
def __setstate__(self, state):
for s, v in state.items():
if s == "__weakref__":
if v is None:
continue
else:
raise NotImplementedError("Setting weakrefs on Plugin")
setattr(self, s, v)
def __reduce__(self):
state_dict = self.__getstate__()
return SanicPlugin.__new__, (self.__class__,), state_dict
|
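The `route_wrapper` hook in the record above treats synchronous and asynchronous handlers uniformly by awaiting the result only when it is awaitable. A minimal self-contained sketch of that pattern (the handler names are hypothetical and no Sanic objects are involved):

```python
import asyncio
from inspect import isawaitable

async def route_wrapper(route, request, *args, **kwargs):
    # Call the handler; await only if it returned an awaitable, so
    # plain functions and async functions are handled the same way.
    resp = route(request, *args, **kwargs)
    if isawaitable(resp):
        resp = await resp
    return resp

def sync_handler(request):
    return "sync:" + request

async def async_handler(request):
    return "async:" + request

print(asyncio.run(route_wrapper(sync_handler, "req")))   # sync:req
print(asyncio.run(route_wrapper(async_handler, "req")))  # async:req
```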
ashleysommer/sanicpluginsframework | spf/context.py | ContextDict.replace | python | def replace(self, key, value):
if key in self._inner().keys():
return self.__setitem__(key, value)
parents_searched = [self]
parent = self._parent_context
while parent:
try:
if key in parent.keys():
return parent.__setitem__(key, value)
except (KeyError, AttributeError):
pass
parents_searched.append(parent)
# noinspection PyProtectedMember
next_parent = parent._parent_context
if next_parent in parents_searched:
raise RuntimeError("Recursive ContextDict found!")
parent = next_parent
return self.__setitem__(key, value) | If this ContextDict doesn't already have this key, it sets
the value on a parent ContextDict if that parent has the key,
otherwise sets the value on this ContextDict.
:param key:
:param value:
:return: Nothing
:rtype: None | train | https://github.com/ashleysommer/sanicpluginsframework/blob/2cb1656d9334f04c30c738074784b0450c1b893e/spf/context.py#L123-L149 | [
"def _inner(self):\n \"\"\"\n :return: the internal dictionary\n :rtype: dict\n \"\"\"\n return object.__getattribute__(self, '_dict')\n"
] | class ContextDict(object):
"""
This is the specialised dictionary that is used by Sanic Plugins Framework
to manage Context objects. It can be hierarchical, and it searches its
parents if it cannot find an item in its own dictionary. It can create its
own children.
"""
__slots__ = ('_spf', '_parent_context', '_dict', '__weakref__')
def _inner(self):
"""
:return: the internal dictionary
:rtype: dict
"""
return object.__getattribute__(self, '_dict')
def __repr__(self):
_dict_repr = repr(self._inner())
return "ContextDict({:s})".format(_dict_repr)
def __str__(self):
_dict_str = str(self._inner())
return "ContextDict({:s})".format(_dict_str)
def __len__(self):
return len(self._inner())
def __setitem__(self, key, value):
# TODO: If key is in __slots__, ignore it and return
return self._inner().__setitem__(key, value)
def __getitem__(self, item):
try:
return self._inner().__getitem__(item)
except KeyError as e1:
parents_searched = [self]
parent = self._parent_context
while parent:
try:
return parent._inner().__getitem__(item)
except KeyError:
parents_searched.append(parent)
# noinspection PyProtectedMember
next_parent = parent._parent_context
if next_parent in parents_searched:
raise RuntimeError("Recursive ContextDict found!")
parent = next_parent
raise e1
def __delitem__(self, key):
self._inner().__delitem__(key)
def __getattr__(self, item):
if item in self.__slots__:
return object.__getattribute__(self, item)
try:
return self.__getitem__(item)
except KeyError as e:
raise AttributeError(*e.args)
def __setattr__(self, key, value):
if key in self.__slots__:
if key == '__weakref__':
if value is None:
return
else:
raise ValueError("Cannot set weakrefs on Context")
return object.__setattr__(self, key, value)
try:
return self.__setitem__(key, value)
except Exception as e: # pragma: no cover
# what exceptions can occur on setting an item?
raise e
def __contains__(self, item):
return self._inner().__contains__(item)
def get(self, key, default=None):
try:
return self.__getattr__(key)
except (AttributeError, KeyError):
return default
def set(self, key, value):
try:
return self.__setattr__(key, value)
except Exception as e: # pragma: no cover
raise e
def items(self):
"""
A set-like read-only view of the ContextDict's (K, V) tuples
:return:
:rtype: frozenset
"""
return self._inner().items()
def keys(self):
"""
An object containing a view on the ContextDict's keys
:return:
:rtype: tuple # using tuple to represent an immutable list
"""
return self._inner().keys()
def values(self):
"""
An object containing a view on the ContextDict's values
:return:
:rtype: tuple # using tuple to represent an immutable list
"""
return self._inner().values()
# noinspection PyPep8Naming
def update(self, E=None, **F):
"""
Update ContextDict from dict/iterable E and F
:return: Nothing
:rtype: None
"""
if E is not None:
if hasattr(E, 'keys'):
for K in E:
self.replace(K, E[K])
elif hasattr(E, 'items'):
for K, V in E.items():
self.replace(K, V)
else:
for K, V in E:
self.replace(K, V)
for K in F:
self.replace(K, F[K])
def create_child_context(self, *args, **kwargs):
return ContextDict(self._spf, self, *args, **kwargs)
def __new__(cls, spf, parent, *args, **kwargs):
self = super(ContextDict, cls).__new__(cls)
self._dict = dict(*args, **kwargs)
if parent is not None:
assert isinstance(parent, ContextDict),\
"Parent context must be a valid initialised ContextDict"
self._parent_context = parent
else:
self._parent_context = None
self._spf = spf
return self
def __init__(self, *args, **kwargs):
args = list(args)
args.pop(0) # remove spf
args.pop(0) # remove parent
super(ContextDict, self).__init__()
def __getstate__(self):
state_dict = {}
for s in self.__slots__:
state_dict[s] = object.__getattribute__(self, s)
return state_dict
def __setstate__(self, state):
for s, v in state.items():
setattr(self, s, v)
def __reduce__(self):
state_dict = self.__getstate__()
spf = state_dict.pop('_spf')
parent_context = state_dict.pop('_parent_context')
return (ContextDict.__new__, (self.__class__, spf, parent_context),
state_dict)
|
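The parent-search behaviour of `ContextDict.replace` in the record above can be sketched with a simplified stand-in class (this `MiniContext` is illustrative only, not the real `ContextDict`): a key that already exists on an ancestor is updated there, while an unknown key is set locally.

```python
class MiniContext:
    """Simplified stand-in for ContextDict's parent-search replace()."""
    def __init__(self, parent=None):
        self._dict = {}
        self._parent = parent

    def replace(self, key, value):
        # Set on self if the key exists here; otherwise set it on the
        # nearest ancestor that already has the key; else set on self.
        if key in self._dict:
            self._dict[key] = value
            return
        parent = self._parent
        while parent is not None:
            if key in parent._dict:
                parent._dict[key] = value
                return
            parent = parent._parent
        self._dict[key] = value

root = MiniContext()
child = MiniContext(parent=root)
root._dict["shared"] = 1
child.replace("shared", 2)   # updates the parent, not the child
child.replace("local", 3)    # no ancestor has it -> set on the child
print(root._dict)   # {'shared': 2}
print(child._dict)  # {'local': 3}
```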
ashleysommer/sanicpluginsframework | spf/context.py | ContextDict.update | python | def update(self, E=None, **F):
if E is not None:
if hasattr(E, 'keys'):
for K in E:
self.replace(K, E[K])
elif hasattr(E, 'items'):
for K, V in E.items():
self.replace(K, V)
else:
for K, V in E:
self.replace(K, V)
for K in F:
self.replace(K, F[K]) | Update ContextDict from dict/iterable E and F
:return: Nothing
:rtype: None | train | https://github.com/ashleysommer/sanicpluginsframework/blob/2cb1656d9334f04c30c738074784b0450c1b893e/spf/context.py#L152-L169 | [
"def replace(self, key, value):\n \"\"\"\n If this ContextDict doesn't already have this key, it sets\n the value on a parent ContextDict if that parent has the key,\n otherwise sets the value on this ContextDict.\n :param key:\n :param value:\n :return: Nothing\n :rtype: None\n \"\"\"\n if key in self._inner().keys():\n return self.__setitem__(key, value)\n parents_searched = [self]\n parent = self._parent_context\n while parent:\n try:\n if key in parent.keys():\n return parent.__setitem__(key, value)\n except (KeyError, AttributeError):\n pass\n parents_searched.append(parent)\n # noinspection PyProtectedMember\n next_parent = parent._parent_context\n if next_parent in parents_searched:\n raise RuntimeError(\"Recursive ContextDict found!\")\n parent = next_parent\n return self.__setitem__(key, value)\n"
] | class ContextDict(object):
"""
This is the specialised dictionary that is used by Sanic Plugins Framework
to manage Context objects. It can be hierarchical, and it searches its
parents if it cannot find an item in its own dictionary. It can create its
own children.
"""
__slots__ = ('_spf', '_parent_context', '_dict', '__weakref__')
def _inner(self):
"""
:return: the internal dictionary
:rtype: dict
"""
return object.__getattribute__(self, '_dict')
def __repr__(self):
_dict_repr = repr(self._inner())
return "ContextDict({:s})".format(_dict_repr)
def __str__(self):
_dict_str = str(self._inner())
return "ContextDict({:s})".format(_dict_str)
def __len__(self):
return len(self._inner())
def __setitem__(self, key, value):
# TODO: If key is in __slots__, ignore it and return
return self._inner().__setitem__(key, value)
def __getitem__(self, item):
try:
return self._inner().__getitem__(item)
except KeyError as e1:
parents_searched = [self]
parent = self._parent_context
while parent:
try:
return parent._inner().__getitem__(item)
except KeyError:
parents_searched.append(parent)
# noinspection PyProtectedMember
next_parent = parent._parent_context
if next_parent in parents_searched:
raise RuntimeError("Recursive ContextDict found!")
parent = next_parent
raise e1
def __delitem__(self, key):
self._inner().__delitem__(key)
def __getattr__(self, item):
if item in self.__slots__:
return object.__getattribute__(self, item)
try:
return self.__getitem__(item)
except KeyError as e:
raise AttributeError(*e.args)
def __setattr__(self, key, value):
if key in self.__slots__:
if key == '__weakref__':
if value is None:
return
else:
raise ValueError("Cannot set weakrefs on Context")
return object.__setattr__(self, key, value)
try:
return self.__setitem__(key, value)
except Exception as e: # pragma: no cover
# what exceptions can occur on setting an item?
raise e
def __contains__(self, item):
return self._inner().__contains__(item)
def get(self, key, default=None):
try:
return self.__getattr__(key)
except (AttributeError, KeyError):
return default
def set(self, key, value):
try:
return self.__setattr__(key, value)
except Exception as e: # pragma: no cover
raise e
def items(self):
"""
        A set-like read-only view of the ContextDict's (K,V) tuples
:return:
:rtype: frozenset
"""
return self._inner().items()
def keys(self):
"""
An object containing a view on the ContextDict's keys
:return:
:rtype: tuple # using tuple to represent an immutable list
"""
return self._inner().keys()
def values(self):
"""
An object containing a view on the ContextDict's values
:return:
:rtype: tuple # using tuple to represent an immutable list
"""
return self._inner().values()
def replace(self, key, value):
"""
If this ContextDict doesn't already have this key, it sets
the value on a parent ContextDict if that parent has the key,
otherwise sets the value on this ContextDict.
:param key:
:param value:
:return: Nothing
:rtype: None
"""
if key in self._inner().keys():
return self.__setitem__(key, value)
parents_searched = [self]
parent = self._parent_context
while parent:
try:
if key in parent.keys():
return parent.__setitem__(key, value)
except (KeyError, AttributeError):
pass
parents_searched.append(parent)
# noinspection PyProtectedMember
next_parent = parent._parent_context
if next_parent in parents_searched:
raise RuntimeError("Recursive ContextDict found!")
parent = next_parent
return self.__setitem__(key, value)
# noinspection PyPep8Naming
def create_child_context(self, *args, **kwargs):
return ContextDict(self._spf, self, *args, **kwargs)
def __new__(cls, spf, parent, *args, **kwargs):
self = super(ContextDict, cls).__new__(cls)
self._dict = dict(*args, **kwargs)
if parent is not None:
assert isinstance(parent, ContextDict),\
"Parent context must be a valid initialised ContextDict"
self._parent_context = parent
else:
self._parent_context = None
self._spf = spf
return self
def __init__(self, *args, **kwargs):
args = list(args)
args.pop(0) # remove spf
args.pop(0) # remove parent
super(ContextDict, self).__init__()
def __getstate__(self):
state_dict = {}
for s in self.__slots__:
state_dict[s] = object.__getattribute__(self, s)
return state_dict
def __setstate__(self, state):
for s, v in state.items():
setattr(self, s, v)
def __reduce__(self):
state_dict = self.__getstate__()
spf = state_dict.pop('_spf')
parent_context = state_dict.pop('_parent_context')
return (ContextDict.__new__, (self.__class__, spf, parent_context),
state_dict)
|
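The parent-fallback lookup and `replace` semantics of `ContextDict` above can be sketched in plain Python. This is an illustrative re-implementation, not part of the spf API; the `ChildDict` name and its attributes are made up for the example.

```python
class ChildDict:
    """Minimal sketch of a hierarchical dict: reads fall back along the
    parent chain, replace() writes to the nearest owner of the key."""

    def __init__(self, parent=None, **items):
        self._parent = parent
        self._dict = dict(items)

    def __getitem__(self, key):
        node = self
        while node is not None:
            if key in node._dict:
                return node._dict[key]
            node = node._parent
        raise KeyError(key)

    def replace(self, key, value):
        # Walk up the chain; update the first dict that already holds
        # the key, otherwise set it locally (mirrors ContextDict.replace).
        node = self
        while node is not None:
            if key in node._dict:
                node._dict[key] = value
                return
            node = node._parent
        self._dict[key] = value


root = ChildDict(shared="root")
child = ChildDict(parent=root)
assert child["shared"] == "root"      # read falls back to the parent
child.replace("shared", "updated")    # write lands on the owning parent
assert root._dict["shared"] == "updated"
child.replace("local", 1)             # unknown key stays on the child
assert "local" not in root._dict
```

The real `ContextDict` additionally guards against cycles in the parent chain by tracking which parents it has already visited.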
ashleysommer/sanicpluginsframework | spf/plugins/contextualize.py | ContextualizeAssociated.middleware | python | def middleware(self, *args, **kwargs):
kwargs.setdefault('priority', 5)
kwargs.setdefault('relative', None)
kwargs.setdefault('attach_to', None)
kwargs['with_context'] = True # This is the whole point of this plugin
plugin = self.plugin
reg = self.reg
if len(args) == 1 and callable(args[0]):
middle_f = args[0]
return plugin._add_new_middleware(reg, middle_f, **kwargs)
def wrapper(middle_f):
nonlocal plugin, reg
nonlocal args, kwargs
return plugin._add_new_middleware(reg, middle_f, *args, **kwargs)
return wrapper | Decorate and register middleware
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The middleware function to use as the decorator
:rtype: fn | train | https://github.com/ashleysommer/sanicpluginsframework/blob/2cb1656d9334f04c30c738074784b0450c1b893e/spf/plugins/contextualize.py#L14-L38 | null | class ContextualizeAssociated(ContextualizeAssociatedTuple):
__slots__ = ()
# Decorator
def route(self, uri, *args, **kwargs):
"""Create a plugin route from a decorated function.
:param uri: endpoint at which the route will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
if len(args) == 0 and callable(uri):
raise RuntimeError("Cannot use the @route decorator without "
"arguments.")
kwargs.setdefault('methods', frozenset({'GET'}))
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', False)
kwargs.setdefault('stream', False)
kwargs.setdefault('name', None)
kwargs['with_context'] = True # This is the whole point of this plugin
plugin = self.plugin
reg = self.reg
def wrapper(handler_f):
nonlocal plugin, reg
nonlocal uri, args, kwargs
return plugin._add_new_route(reg, uri, handler_f, *args, **kwargs)
return wrapper
def listener(self, event, *args, **kwargs):
"""Create a listener from a decorated function.
:param event: Event to listen to.
:type event: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The function to use as the listener
:rtype: fn
"""
if len(args) == 1 and callable(args[0]):
raise RuntimeError("Cannot use the @listener decorator without "
"arguments")
kwargs['with_context'] = True # This is the whole point of this plugin
plugin = self.plugin
reg = self.reg
def wrapper(listener_f):
nonlocal plugin, reg
nonlocal event, args, kwargs
return plugin._add_new_listener(reg, event, listener_f, *args,
**kwargs)
return wrapper
def websocket(self, uri, *args, **kwargs):
"""Create a websocket route from a decorated function
:param uri: endpoint at which the socket endpoint will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
kwargs.setdefault('subprotocols', None)
kwargs.setdefault('name', None)
kwargs['with_context'] = True # This is the whole point of this plugin
plugin = self.plugin
reg = self.reg
def wrapper(handler_f):
nonlocal plugin, reg
nonlocal uri, args, kwargs
return plugin._add_new_ws_route(reg, uri, handler_f,
*args, **kwargs)
return wrapper
|
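The `len(args) == 1 and callable(args[0])` check in `middleware` above is the standard way to let one decorator work both bare and parenthesised. A standalone sketch of that dispatch, using an invented `REGISTERED` list in place of spf's internal registration:

```python
REGISTERED = []

def middleware(*args, **kwargs):
    """Decorator usable bare (@middleware) or with options
    (@middleware(priority=1)), mirroring the dispatch above."""
    kwargs.setdefault("priority", 5)

    def register(fn):
        REGISTERED.append((fn.__name__, kwargs["priority"]))
        return fn

    # Bare use: the sole positional argument is the decorated function.
    if len(args) == 1 and callable(args[0]):
        return register(args[0])

    # Parenthesised use: return the real decorator.
    def wrapper(fn):
        return register(fn)
    return wrapper


@middleware
def plain(request):
    return request

@middleware(priority=1)
def early(request):
    return request

assert REGISTERED == [("plain", 5), ("early", 1)]
```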
ashleysommer/sanicpluginsframework | spf/plugins/contextualize.py | ContextualizeAssociated.route | python | def route(self, uri, *args, **kwargs):
if len(args) == 0 and callable(uri):
raise RuntimeError("Cannot use the @route decorator without "
"arguments.")
kwargs.setdefault('methods', frozenset({'GET'}))
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', False)
kwargs.setdefault('stream', False)
kwargs.setdefault('name', None)
kwargs['with_context'] = True # This is the whole point of this plugin
plugin = self.plugin
reg = self.reg
def wrapper(handler_f):
nonlocal plugin, reg
nonlocal uri, args, kwargs
return plugin._add_new_route(reg, uri, handler_f, *args, **kwargs)
return wrapper | Create a plugin route from a decorated function.
:param uri: endpoint at which the route will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn | train | https://github.com/ashleysommer/sanicpluginsframework/blob/2cb1656d9334f04c30c738074784b0450c1b893e/spf/plugins/contextualize.py#L40-L67 | null | class ContextualizeAssociated(ContextualizeAssociatedTuple):
__slots__ = ()
# Decorator
def middleware(self, *args, **kwargs):
"""Decorate and register middleware
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The middleware function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('priority', 5)
kwargs.setdefault('relative', None)
kwargs.setdefault('attach_to', None)
kwargs['with_context'] = True # This is the whole point of this plugin
plugin = self.plugin
reg = self.reg
if len(args) == 1 and callable(args[0]):
middle_f = args[0]
return plugin._add_new_middleware(reg, middle_f, **kwargs)
def wrapper(middle_f):
nonlocal plugin, reg
nonlocal args, kwargs
return plugin._add_new_middleware(reg, middle_f, *args, **kwargs)
return wrapper
def listener(self, event, *args, **kwargs):
"""Create a listener from a decorated function.
:param event: Event to listen to.
:type event: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The function to use as the listener
:rtype: fn
"""
if len(args) == 1 and callable(args[0]):
raise RuntimeError("Cannot use the @listener decorator without "
"arguments")
kwargs['with_context'] = True # This is the whole point of this plugin
plugin = self.plugin
reg = self.reg
def wrapper(listener_f):
nonlocal plugin, reg
nonlocal event, args, kwargs
return plugin._add_new_listener(reg, event, listener_f, *args,
**kwargs)
return wrapper
def websocket(self, uri, *args, **kwargs):
"""Create a websocket route from a decorated function
:param uri: endpoint at which the socket endpoint will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
kwargs.setdefault('subprotocols', None)
kwargs.setdefault('name', None)
kwargs['with_context'] = True # This is the whole point of this plugin
plugin = self.plugin
reg = self.reg
def wrapper(handler_f):
nonlocal plugin, reg
nonlocal uri, args, kwargs
return plugin._add_new_ws_route(reg, uri, handler_f,
*args, **kwargs)
return wrapper
|
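`route` above takes the opposite stance from `middleware`: because a URI is mandatory, bare use is rejected outright, and `kwargs.setdefault` fills in route defaults before registration. A self-contained sketch of that shape (the `ROUTES` dict stands in for the framework's registry and is not an spf name):

```python
ROUTES = {}

def route(uri, *args, **kwargs):
    """Route decorator that, like the one above, refuses bare use and
    fills in defaults before registering the handler."""
    if len(args) == 0 and callable(uri):
        raise RuntimeError("Cannot use the @route decorator without arguments.")
    kwargs.setdefault("methods", frozenset({"GET"}))

    def wrapper(handler):
        ROUTES[uri] = (handler, kwargs["methods"])
        return handler
    return wrapper


@route("/hello", methods=frozenset({"GET", "POST"}))
def hello():
    return "hi"

assert ROUTES["/hello"][1] == frozenset({"GET", "POST"})
try:
    route(hello)  # bare use: the "uri" slot received a callable
except RuntimeError as exc:
    assert "without" in str(exc)
```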
ashleysommer/sanicpluginsframework | spf/plugins/contextualize.py | ContextualizeAssociated.listener | python | def listener(self, event, *args, **kwargs):
if len(args) == 1 and callable(args[0]):
raise RuntimeError("Cannot use the @listener decorator without "
"arguments")
kwargs['with_context'] = True # This is the whole point of this plugin
plugin = self.plugin
reg = self.reg
def wrapper(listener_f):
nonlocal plugin, reg
nonlocal event, args, kwargs
return plugin._add_new_listener(reg, event, listener_f, *args,
**kwargs)
return wrapper | Create a listener from a decorated function.
:param event: Event to listen to.
:type event: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The function to use as the listener
:rtype: fn | train | https://github.com/ashleysommer/sanicpluginsframework/blob/2cb1656d9334f04c30c738074784b0450c1b893e/spf/plugins/contextualize.py#L69-L92 | null | class ContextualizeAssociated(ContextualizeAssociatedTuple):
__slots__ = ()
# Decorator
def middleware(self, *args, **kwargs):
"""Decorate and register middleware
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The middleware function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('priority', 5)
kwargs.setdefault('relative', None)
kwargs.setdefault('attach_to', None)
kwargs['with_context'] = True # This is the whole point of this plugin
plugin = self.plugin
reg = self.reg
if len(args) == 1 and callable(args[0]):
middle_f = args[0]
return plugin._add_new_middleware(reg, middle_f, **kwargs)
def wrapper(middle_f):
nonlocal plugin, reg
nonlocal args, kwargs
return plugin._add_new_middleware(reg, middle_f, *args, **kwargs)
return wrapper
def route(self, uri, *args, **kwargs):
"""Create a plugin route from a decorated function.
:param uri: endpoint at which the route will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
if len(args) == 0 and callable(uri):
raise RuntimeError("Cannot use the @route decorator without "
"arguments.")
kwargs.setdefault('methods', frozenset({'GET'}))
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', False)
kwargs.setdefault('stream', False)
kwargs.setdefault('name', None)
kwargs['with_context'] = True # This is the whole point of this plugin
plugin = self.plugin
reg = self.reg
def wrapper(handler_f):
nonlocal plugin, reg
nonlocal uri, args, kwargs
return plugin._add_new_route(reg, uri, handler_f, *args, **kwargs)
return wrapper
def websocket(self, uri, *args, **kwargs):
"""Create a websocket route from a decorated function
:param uri: endpoint at which the socket endpoint will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
kwargs.setdefault('subprotocols', None)
kwargs.setdefault('name', None)
kwargs['with_context'] = True # This is the whole point of this plugin
plugin = self.plugin
reg = self.reg
def wrapper(handler_f):
nonlocal plugin, reg
nonlocal uri, args, kwargs
return plugin._add_new_ws_route(reg, uri, handler_f,
*args, **kwargs)
return wrapper
|
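Each decorator in this plugin hard-codes `with_context=True`, which the docstrings call "the whole point of this plugin". What that flag buys can be sketched as a wrapper that binds a shared context object onto the handler before the framework calls it. This is an illustration of the idea only; `CONTEXT` and `with_context` here are invented names, not spf internals:

```python
import functools

CONTEXT = {"request_count": 0}

def with_context(listener):
    """Sketch of what with_context=True means: the framework supplies the
    shared context as an extra argument when the event fires."""
    @functools.wraps(listener)
    def bound(event):
        return listener(event, CONTEXT)
    return bound

@with_context
def on_start(event, context):
    context["request_count"] += 1
    return event

assert on_start("before_server_start") == "before_server_start"
assert CONTEXT["request_count"] == 1
```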
ashleysommer/sanicpluginsframework | spf/plugins/contextualize.py | ContextualizeAssociated.websocket | python | def websocket(self, uri, *args, **kwargs):
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
kwargs.setdefault('subprotocols', None)
kwargs.setdefault('name', None)
kwargs['with_context'] = True # This is the whole point of this plugin
plugin = self.plugin
reg = self.reg
def wrapper(handler_f):
nonlocal plugin, reg
nonlocal uri, args, kwargs
return plugin._add_new_ws_route(reg, uri, handler_f,
*args, **kwargs)
return wrapper | Create a websocket route from a decorated function
:param uri: endpoint at which the socket endpoint will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn | train | https://github.com/ashleysommer/sanicpluginsframework/blob/2cb1656d9334f04c30c738074784b0450c1b893e/spf/plugins/contextualize.py#L94-L119 | null | class ContextualizeAssociated(ContextualizeAssociatedTuple):
__slots__ = ()
# Decorator
def middleware(self, *args, **kwargs):
"""Decorate and register middleware
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The middleware function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('priority', 5)
kwargs.setdefault('relative', None)
kwargs.setdefault('attach_to', None)
kwargs['with_context'] = True # This is the whole point of this plugin
plugin = self.plugin
reg = self.reg
if len(args) == 1 and callable(args[0]):
middle_f = args[0]
return plugin._add_new_middleware(reg, middle_f, **kwargs)
def wrapper(middle_f):
nonlocal plugin, reg
nonlocal args, kwargs
return plugin._add_new_middleware(reg, middle_f, *args, **kwargs)
return wrapper
def route(self, uri, *args, **kwargs):
"""Create a plugin route from a decorated function.
:param uri: endpoint at which the route will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
if len(args) == 0 and callable(uri):
raise RuntimeError("Cannot use the @route decorator without "
"arguments.")
kwargs.setdefault('methods', frozenset({'GET'}))
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', False)
kwargs.setdefault('stream', False)
kwargs.setdefault('name', None)
kwargs['with_context'] = True # This is the whole point of this plugin
plugin = self.plugin
reg = self.reg
def wrapper(handler_f):
nonlocal plugin, reg
nonlocal uri, args, kwargs
return plugin._add_new_route(reg, uri, handler_f, *args, **kwargs)
return wrapper
def listener(self, event, *args, **kwargs):
"""Create a listener from a decorated function.
:param event: Event to listen to.
:type event: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The function to use as the listener
:rtype: fn
"""
if len(args) == 1 and callable(args[0]):
raise RuntimeError("Cannot use the @listener decorator without "
"arguments")
kwargs['with_context'] = True # This is the whole point of this plugin
plugin = self.plugin
reg = self.reg
def wrapper(listener_f):
nonlocal plugin, reg
nonlocal event, args, kwargs
return plugin._add_new_listener(reg, event, listener_f, *args,
**kwargs)
return wrapper
|
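The `FutureRoute`/`FutureWebsocket` objects used by `_add_new_route` and `_add_new_ws_route` in the enclosing class follow a deferred-registration pattern: a registration is recorded as plain data and only applied once the app exists. A minimal sketch of that pattern (the `pending` list and `defer_route` are illustrative, not spf names):

```python
from collections import namedtuple

# A registration recorded as data, to be applied later.
FutureRoute = namedtuple("FutureRoute", ["handler", "uri", "args", "kwargs"])

pending = []

def defer_route(uri, **kwargs):
    def wrapper(handler):
        pending.append(FutureRoute(handler, uri, (), kwargs))
        return handler
    return wrapper

@defer_route("/ws", subprotocols=None)
def ws_handler(request, ws):
    return ws

# Later, a framework would walk `pending` and perform the real binding.
applied = {f.uri: f.handler.__name__ for f in pending}
assert applied == {"/ws": "ws_handler"}
```

Recording registrations this way is what lets spf attach routes both before and after a plugin is registered against an app.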
ashleysommer/sanicpluginsframework | spf/plugins/contextualize.py | Contextualize.middleware | python | def middleware(self, *args, **kwargs):
kwargs.setdefault('priority', 5)
kwargs.setdefault('relative', None)
kwargs.setdefault('attach_to', None)
kwargs['with_context'] = True # This is the whole point of this plugin
if len(args) == 1 and callable(args[0]):
middle_f = args[0]
return super(Contextualize, self).middleware(middle_f, **kwargs)
def wrapper(middle_f):
nonlocal self, args, kwargs
return super(Contextualize, self).middleware(
*args, **kwargs)(middle_f)
return wrapper | Decorate and register middleware
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The middleware function to use as the decorator
:rtype: fn | train | https://github.com/ashleysommer/sanicpluginsframework/blob/2cb1656d9334f04c30c738074784b0450c1b893e/spf/plugins/contextualize.py#L173-L194 | [
"def middleware(self, *args, **kwargs):\n \"\"\"Decorate and register middleware\n :param args: captures all of the positional arguments passed in\n :type args: tuple(Any)\n :param kwargs: captures the keyword arguments passed in\n :type kwargs: dict(Any)\n :return: The middleware function to use as the decorator\n :rtype: fn\n \"\"\"\n kwargs.setdefault('priority', 5)\n kwargs.setdefault('relative', None)\n kwargs.setdefault('attach_to', None)\n kwargs.setdefault('with_context', False)\n if len(args) == 1 and callable(args[0]):\n middle_f = args[0]\n self._middlewares.append(\n FutureMiddleware(middle_f, args=tuple(), kwargs=kwargs))\n return middle_f\n\n def wrapper(middleware_f):\n self._middlewares.append(\n FutureMiddleware(middleware_f, args=args, kwargs=kwargs))\n return middleware_f\n return wrapper\n"
] | class Contextualize(SanicPlugin):
__slots__ = ()
AssociatedTuple = ContextualizeAssociated
def _add_new_middleware(self, reg, middle_f, *args, **kwargs):
# A user should never call this directly.
# it should be called only by the AssociatedTuple
assert reg in self.registrations
(spf, p_name, url_prefix) = reg
context = self.get_context_from_spf(reg)
# This is how we add a new middleware _after_ the plugin is registered
m = FutureMiddleware(middle_f, args, kwargs)
spf._register_middleware_helper(m, spf, self, context)
return middle_f
def _add_new_route(self, reg, uri, handler_f, *args, **kwargs):
# A user should never call this directly.
# it should be called only by the AssociatedTuple
assert reg in self.registrations
(spf, p_name, url_prefix) = reg
context = self.get_context_from_spf(reg)
# This is how we add a new route _after_ the plugin is registered
r = FutureRoute(handler_f, uri, args, kwargs)
spf._register_route_helper(r, spf, self, context, p_name, url_prefix)
return handler_f
def _add_new_listener(self, reg, event, listener_f, *args, **kwargs):
# A user should never call this directly.
# it should be called only by the AssociatedTuple
assert reg in self.registrations
(spf, p_name, url_prefix) = reg
context = self.get_context_from_spf(reg)
# This is how we add a new listener _after_ the plugin is registered
spf._plugin_register_listener(event, listener_f, self, context,
*args, **kwargs)
return listener_f
def _add_new_ws_route(self, reg, uri, handler_f, *args, **kwargs):
# A user should never call this directly.
# it should be called only by the AssociatedTuple
assert reg in self.registrations
(spf, p_name, url_prefix) = reg
context = self.get_context_from_spf(reg)
# This is how we add a new route _after_ the plugin is registered
w = FutureWebsocket(handler_f, uri, args, kwargs)
spf._register_websocket_route_helper(w, spf, self, context, p_name,
url_prefix)
return handler_f
    # Decorator
def route(self, uri, *args, **kwargs):
"""Create a plugin route from a decorated function.
:param uri: endpoint at which the route will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
if len(args) == 0 and callable(uri):
raise RuntimeError("Cannot use the @route decorator without "
"arguments.")
kwargs.setdefault('methods', frozenset({'GET'}))
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', False)
kwargs.setdefault('stream', False)
kwargs.setdefault('name', None)
kwargs['with_context'] = True # This is the whole point of this plugin
def wrapper(handler_f):
nonlocal self, uri, args, kwargs
return super(Contextualize, self).route(
uri, *args, **kwargs)(handler_f)
return wrapper
# Decorator
def listener(self, event, *args, **kwargs):
"""Create a listener from a decorated function.
:param event: Event to listen to.
:type event: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the listener
:rtype: fn
"""
if len(args) == 1 and callable(args[0]):
raise RuntimeError("Cannot use the @listener decorator without "
"arguments")
kwargs['with_context'] = True # This is the whole point of this plugin
def wrapper(listener_f):
nonlocal self, event, args, kwargs
return super(Contextualize, self).listener(
event, *args, **kwargs)(listener_f)
return wrapper
def websocket(self, uri, *args, **kwargs):
"""Create a websocket route from a decorated function
:param uri: endpoint at which the socket endpoint will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
kwargs.setdefault('subprotocols', None)
kwargs.setdefault('name', None)
kwargs['with_context'] = True # This is the whole point of this plugin
def wrapper(handler_f):
nonlocal self, uri, args, kwargs
return super(Contextualize, self).websocket(
uri, *args, **kwargs)(handler_f)
return wrapper
def __init__(self, *args, **kwargs):
super(Contextualize, self).__init__(*args, **kwargs)
|
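`Contextualize.middleware` above forces the flag and then delegates to the base class from inside a closure, using the explicit two-argument `super(Contextualize, self)` form because the nested `wrapper` has no `self` parameter of its own. The same structure in a self-contained sketch (`BasePlugin`/`ContextForcing` are stand-ins, not spf classes):

```python
class BasePlugin:
    def middleware(self, *args, **kwargs):
        """Stands in for SanicPlugin.middleware: records the kwargs
        on the decorated function."""
        def decorator(fn):
            fn.registered_kwargs = dict(kwargs)
            return fn
        return decorator


class ContextForcing(BasePlugin):
    def middleware(self, *args, **kwargs):
        # Force the flag, then delegate to the base implementation.
        # The explicit super(cls, self) form is used inside the nested
        # wrapper, which has no implicit self argument.
        kwargs["with_context"] = True
        if len(args) == 1 and callable(args[0]):
            return super(ContextForcing, self).middleware(**kwargs)(args[0])

        def wrapper(fn):
            return super(ContextForcing, self).middleware(*args, **kwargs)(fn)
        return wrapper


plugin = ContextForcing()

@plugin.middleware
def bare(request):
    return request

@plugin.middleware(priority=1)
def configured(request):
    return request

assert bare.registered_kwargs == {"with_context": True}
assert configured.registered_kwargs == {"priority": 1, "with_context": True}
```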
ashleysommer/sanicpluginsframework | spf/plugins/contextualize.py | Contextualize.route | python | def route(self, uri, *args, **kwargs):
if len(args) == 0 and callable(uri):
raise RuntimeError("Cannot use the @route decorator without "
"arguments.")
kwargs.setdefault('methods', frozenset({'GET'}))
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', False)
kwargs.setdefault('stream', False)
kwargs.setdefault('name', None)
kwargs['with_context'] = True # This is the whole point of this plugin
def wrapper(handler_f):
nonlocal self, uri, args, kwargs
return super(Contextualize, self).route(
uri, *args, **kwargs)(handler_f)
return wrapper | Create a plugin route from a decorated function.
:param uri: endpoint at which the route will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn | train | https://github.com/ashleysommer/sanicpluginsframework/blob/2cb1656d9334f04c30c738074784b0450c1b893e/spf/plugins/contextualize.py#L197-L222 | null | class Contextualize(SanicPlugin):
__slots__ = ()
AssociatedTuple = ContextualizeAssociated
def _add_new_middleware(self, reg, middle_f, *args, **kwargs):
# A user should never call this directly.
# it should be called only by the AssociatedTuple
assert reg in self.registrations
(spf, p_name, url_prefix) = reg
context = self.get_context_from_spf(reg)
# This is how we add a new middleware _after_ the plugin is registered
m = FutureMiddleware(middle_f, args, kwargs)
spf._register_middleware_helper(m, spf, self, context)
return middle_f
def _add_new_route(self, reg, uri, handler_f, *args, **kwargs):
# A user should never call this directly.
# it should be called only by the AssociatedTuple
assert reg in self.registrations
(spf, p_name, url_prefix) = reg
context = self.get_context_from_spf(reg)
# This is how we add a new route _after_ the plugin is registered
r = FutureRoute(handler_f, uri, args, kwargs)
spf._register_route_helper(r, spf, self, context, p_name, url_prefix)
return handler_f
def _add_new_listener(self, reg, event, listener_f, *args, **kwargs):
# A user should never call this directly.
# it should be called only by the AssociatedTuple
assert reg in self.registrations
(spf, p_name, url_prefix) = reg
context = self.get_context_from_spf(reg)
# This is how we add a new listener _after_ the plugin is registered
spf._plugin_register_listener(event, listener_f, self, context,
*args, **kwargs)
return listener_f
def _add_new_ws_route(self, reg, uri, handler_f, *args, **kwargs):
# A user should never call this directly.
# it should be called only by the AssociatedTuple
assert reg in self.registrations
(spf, p_name, url_prefix) = reg
context = self.get_context_from_spf(reg)
# This is how we add a new route _after_ the plugin is registered
w = FutureWebsocket(handler_f, uri, args, kwargs)
spf._register_websocket_route_helper(w, spf, self, context, p_name,
url_prefix)
return handler_f
# Decorator
def middleware(self, *args, **kwargs):
"""Decorate and register middleware
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The middleware function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('priority', 5)
kwargs.setdefault('relative', None)
kwargs.setdefault('attach_to', None)
kwargs['with_context'] = True # This is the whole point of this plugin
if len(args) == 1 and callable(args[0]):
middle_f = args[0]
return super(Contextualize, self).middleware(middle_f, **kwargs)
def wrapper(middle_f):
nonlocal self, args, kwargs
return super(Contextualize, self).middleware(
*args, **kwargs)(middle_f)
return wrapper
    # Decorator
def listener(self, event, *args, **kwargs):
"""Create a listener from a decorated function.
:param event: Event to listen to.
:type event: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the listener
:rtype: fn
"""
if len(args) == 1 and callable(args[0]):
raise RuntimeError("Cannot use the @listener decorator without "
"arguments")
kwargs['with_context'] = True # This is the whole point of this plugin
def wrapper(listener_f):
nonlocal self, event, args, kwargs
return super(Contextualize, self).listener(
event, *args, **kwargs)(listener_f)
return wrapper
def websocket(self, uri, *args, **kwargs):
"""Create a websocket route from a decorated function
:param uri: endpoint at which the socket endpoint will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
kwargs.setdefault('subprotocols', None)
kwargs.setdefault('name', None)
kwargs['with_context'] = True # This is the whole point of this plugin
def wrapper(handler_f):
nonlocal self, uri, args, kwargs
return super(Contextualize, self).websocket(
uri, *args, **kwargs)(handler_f)
return wrapper
def __init__(self, *args, **kwargs):
super(Contextualize, self).__init__(*args, **kwargs)
|
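The `nonlocal self, uri, args, kwargs` declarations in the wrappers above are only strictly required when an enclosing name is rebound; plain reads of enclosing names work without them. A short illustration of the distinction, with invented helper names:

```python
def make_counter(start=0):
    count = start

    def bump():
        # nonlocal is required here because count is *rebound*;
        # the read-only captures in the wrappers above would work
        # without the declaration.
        nonlocal count
        count += 1
        return count
    return bump

counter = make_counter(10)
assert counter() == 11
assert counter() == 12


def make_reader(uri="/x"):
    def read():
        return uri  # plain read of an enclosing name: no nonlocal needed
    return read

assert make_reader("/items")() == "/items"
```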
ashleysommer/sanicpluginsframework | spf/plugins/contextualize.py | Contextualize.listener | python | def listener(self, event, *args, **kwargs):
if len(args) == 1 and callable(args[0]):
raise RuntimeError("Cannot use the @listener decorator without "
"arguments")
kwargs['with_context'] = True # This is the whole point of this plugin
def wrapper(listener_f):
nonlocal self, event, args, kwargs
return super(Contextualize, self).listener(
event, *args, **kwargs)(listener_f)
return wrapper | Create a listener from a decorated function.
:param event: Event to listen to.
:type event: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the listener
:rtype: fn | train | https://github.com/ashleysommer/sanicpluginsframework/blob/2cb1656d9334f04c30c738074784b0450c1b893e/spf/plugins/contextualize.py#L225-L245 | null | class Contextualize(SanicPlugin):
__slots__ = ()
AssociatedTuple = ContextualizeAssociated
def _add_new_middleware(self, reg, middle_f, *args, **kwargs):
# A user should never call this directly.
# it should be called only by the AssociatedTuple
assert reg in self.registrations
(spf, p_name, url_prefix) = reg
context = self.get_context_from_spf(reg)
# This is how we add a new middleware _after_ the plugin is registered
m = FutureMiddleware(middle_f, args, kwargs)
spf._register_middleware_helper(m, spf, self, context)
return middle_f
def _add_new_route(self, reg, uri, handler_f, *args, **kwargs):
# A user should never call this directly.
# it should be called only by the AssociatedTuple
assert reg in self.registrations
(spf, p_name, url_prefix) = reg
context = self.get_context_from_spf(reg)
# This is how we add a new route _after_ the plugin is registered
r = FutureRoute(handler_f, uri, args, kwargs)
spf._register_route_helper(r, spf, self, context, p_name, url_prefix)
return handler_f
def _add_new_listener(self, reg, event, listener_f, *args, **kwargs):
# A user should never call this directly.
# it should be called only by the AssociatedTuple
assert reg in self.registrations
(spf, p_name, url_prefix) = reg
context = self.get_context_from_spf(reg)
# This is how we add a new listener _after_ the plugin is registered
spf._plugin_register_listener(event, listener_f, self, context,
*args, **kwargs)
return listener_f
def _add_new_ws_route(self, reg, uri, handler_f, *args, **kwargs):
# A user should never call this directly.
# it should be called only by the AssociatedTuple
assert reg in self.registrations
(spf, p_name, url_prefix) = reg
context = self.get_context_from_spf(reg)
# This is how we add a new route _after_ the plugin is registered
w = FutureWebsocket(handler_f, uri, args, kwargs)
spf._register_websocket_route_helper(w, spf, self, context, p_name,
url_prefix)
return handler_f
# Decorator
def middleware(self, *args, **kwargs):
"""Decorate and register middleware
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The middleware function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('priority', 5)
kwargs.setdefault('relative', None)
kwargs.setdefault('attach_to', None)
kwargs['with_context'] = True # This is the whole point of this plugin
if len(args) == 1 and callable(args[0]):
middle_f = args[0]
return super(Contextualize, self).middleware(middle_f, **kwargs)
def wrapper(middle_f):
nonlocal self, args, kwargs
return super(Contextualize, self).middleware(
*args, **kwargs)(middle_f)
return wrapper
# Decorator
def route(self, uri, *args, **kwargs):
"""Create a plugin route from a decorated function.
:param uri: endpoint at which the route will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
if len(args) == 0 and callable(uri):
raise RuntimeError("Cannot use the @route decorator without "
"arguments.")
kwargs.setdefault('methods', frozenset({'GET'}))
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', False)
kwargs.setdefault('stream', False)
kwargs.setdefault('name', None)
kwargs['with_context'] = True # This is the whole point of this plugin
def wrapper(handler_f):
nonlocal self, uri, args, kwargs
return super(Contextualize, self).route(
uri, *args, **kwargs)(handler_f)
return wrapper
# Decorator
def websocket(self, uri, *args, **kwargs):
"""Create a websocket route from a decorated function
:param uri: endpoint at which the socket endpoint will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
kwargs.setdefault('subprotocols', None)
kwargs.setdefault('name', None)
kwargs['with_context'] = True # This is the whole point of this plugin
def wrapper(handler_f):
nonlocal self, uri, args, kwargs
return super(Contextualize, self).websocket(
uri, *args, **kwargs)(handler_f)
return wrapper
def __init__(self, *args, **kwargs):
super(Contextualize, self).__init__(*args, **kwargs)
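The Contextualize decorators above all delegate to the parent `SanicPlugin` implementation while forcing `with_context=True`. A minimal stand-in (no Sanic required; `BasePlugin` and `ContextualizePlugin` are invented names for illustration) shows the kwarg-injection pattern:

```python
# Sketch of the Contextualize decorator pattern: the subclass decorator
# forces `with_context=True` before delegating to the parent class.
class BasePlugin:
    def listener(self, event, *args, **kwargs):
        def decorator(fn):
            # Record what the parent would have registered.
            self.registered = (event, kwargs)
            return fn
        return decorator


class ContextualizePlugin(BasePlugin):
    def listener(self, event, *args, **kwargs):
        if len(args) == 1 and callable(args[0]):
            raise RuntimeError("Cannot use the @listener decorator "
                               "without arguments")
        kwargs['with_context'] = True  # the whole point of the plugin
        def wrapper(listener_f):
            return super(ContextualizePlugin, self).listener(
                event, *args, **kwargs)(listener_f)
        return wrapper


plugin = ContextualizePlugin()

@plugin.listener('before_server_start')
def on_start(app, loop, context):
    return context
```

The decorated function is returned unchanged; only the registration call sees the injected `with_context` flag.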
|
ashleysommer/sanicpluginsframework | spf/plugins/contextualize.py | Contextualize.websocket | python | def websocket(self, uri, *args, **kwargs):
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', None)
kwargs.setdefault('subprotocols', None)
kwargs.setdefault('name', None)
kwargs['with_context'] = True # This is the whole point of this plugin
def wrapper(handler_f):
nonlocal self, uri, args, kwargs
return super(Contextualize, self).websocket(
uri, *args, **kwargs)(handler_f)
return wrapper | Create a websocket route from a decorated function
:param uri: endpoint at which the socket endpoint will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn | train | https://github.com/ashleysommer/sanicpluginsframework/blob/2cb1656d9334f04c30c738074784b0450c1b893e/spf/plugins/contextualize.py#L247-L269 | null | class Contextualize(SanicPlugin):
__slots__ = ()
AssociatedTuple = ContextualizeAssociated
def _add_new_middleware(self, reg, middle_f, *args, **kwargs):
# A user should never call this directly.
# it should be called only by the AssociatedTuple
assert reg in self.registrations
(spf, p_name, url_prefix) = reg
context = self.get_context_from_spf(reg)
# This is how we add a new middleware _after_ the plugin is registered
m = FutureMiddleware(middle_f, args, kwargs)
spf._register_middleware_helper(m, spf, self, context)
return middle_f
def _add_new_route(self, reg, uri, handler_f, *args, **kwargs):
# A user should never call this directly.
# it should be called only by the AssociatedTuple
assert reg in self.registrations
(spf, p_name, url_prefix) = reg
context = self.get_context_from_spf(reg)
# This is how we add a new route _after_ the plugin is registered
r = FutureRoute(handler_f, uri, args, kwargs)
spf._register_route_helper(r, spf, self, context, p_name, url_prefix)
return handler_f
def _add_new_listener(self, reg, event, listener_f, *args, **kwargs):
# A user should never call this directly.
# it should be called only by the AssociatedTuple
assert reg in self.registrations
(spf, p_name, url_prefix) = reg
context = self.get_context_from_spf(reg)
# This is how we add a new listener _after_ the plugin is registered
spf._plugin_register_listener(event, listener_f, self, context,
*args, **kwargs)
return listener_f
def _add_new_ws_route(self, reg, uri, handler_f, *args, **kwargs):
# A user should never call this directly.
# it should be called only by the AssociatedTuple
assert reg in self.registrations
(spf, p_name, url_prefix) = reg
context = self.get_context_from_spf(reg)
# This is how we add a new route _after_ the plugin is registered
w = FutureWebsocket(handler_f, uri, args, kwargs)
spf._register_websocket_route_helper(w, spf, self, context, p_name,
url_prefix)
return handler_f
# Decorator
def middleware(self, *args, **kwargs):
"""Decorate and register middleware
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The middleware function to use as the decorator
:rtype: fn
"""
kwargs.setdefault('priority', 5)
kwargs.setdefault('relative', None)
kwargs.setdefault('attach_to', None)
kwargs['with_context'] = True # This is the whole point of this plugin
if len(args) == 1 and callable(args[0]):
middle_f = args[0]
return super(Contextualize, self).middleware(middle_f, **kwargs)
def wrapper(middle_f):
nonlocal self, args, kwargs
return super(Contextualize, self).middleware(
*args, **kwargs)(middle_f)
return wrapper
# Decorator
def route(self, uri, *args, **kwargs):
"""Create a plugin route from a decorated function.
:param uri: endpoint at which the route will be accessible.
:type uri: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the decorator
:rtype: fn
"""
if len(args) == 0 and callable(uri):
raise RuntimeError("Cannot use the @route decorator without "
"arguments.")
kwargs.setdefault('methods', frozenset({'GET'}))
kwargs.setdefault('host', None)
kwargs.setdefault('strict_slashes', False)
kwargs.setdefault('stream', False)
kwargs.setdefault('name', None)
kwargs['with_context'] = True # This is the whole point of this plugin
def wrapper(handler_f):
nonlocal self, uri, args, kwargs
return super(Contextualize, self).route(
uri, *args, **kwargs)(handler_f)
return wrapper
# Decorator
def listener(self, event, *args, **kwargs):
"""Create a listener from a decorated function.
:param event: Event to listen to.
:type event: str
:param args: captures all of the positional arguments passed in
:type args: tuple(Any)
:param kwargs: captures the keyword arguments passed in
:type kwargs: dict(Any)
:return: The exception function to use as the listener
:rtype: fn
"""
if len(args) == 1 and callable(args[0]):
raise RuntimeError("Cannot use the @listener decorator without "
"arguments")
kwargs['with_context'] = True # This is the whole point of this plugin
def wrapper(listener_f):
nonlocal self, event, args, kwargs
return super(Contextualize, self).listener(
event, *args, **kwargs)(listener_f)
return wrapper
def __init__(self, *args, **kwargs):
super(Contextualize, self).__init__(*args, **kwargs)
|
saltstack/salt-pylint | saltpylint/fileperms.py | FilePermsChecker.process_module | python | def process_module(self, node):
'''
process a module
'''
for listing in self.config.fileperms_ignore_paths:
if node.file.split('{0}/'.format(os.getcwd()))[-1] in glob.glob(listing):
# File is ignored, no checking should be done
return
desired_perm = self.config.fileperms_default
if '-' in desired_perm:
desired_perm = desired_perm.split('-')
else:
desired_perm = [desired_perm]
if len(desired_perm) > 2:
raise RuntimeError('Permission ranges should be like XXXX-YYYY')
for idx, perm in enumerate(desired_perm):
desired_perm[idx] = desired_perm[idx].strip('"').strip('\'').lstrip('0').zfill(4)
if desired_perm[idx][0] != '0':
# Always include a leading zero
desired_perm[idx] = '0{0}'.format(desired_perm[idx])
if sys.version_info > (3,):
# The octal representation in python 3 has changed to 0o644 instead of 0644
if desired_perm[idx][1] != 'o':
desired_perm[idx] = '0o' + desired_perm[idx][1:]
if sys.platform.startswith('win'):
# Windows does not distinguish between user/group/other.
# They must all be the same. Also, Windows will automatically
# set the execution bit on files with a known extension
# (eg .exe, .bat, .com). So we cannot reliably test the
# execution bit on other files such as .py files.
user_perm_noexec = int(desired_perm[idx][-3])
if user_perm_noexec % 2 == 1:
user_perm_noexec -= 1
desired_perm[idx] = desired_perm[idx][:-3] + (str(user_perm_noexec) * 3)
module_perms = oct(stat.S_IMODE(os.stat(node.file).st_mode))
if sys.version_info < (3,):
module_perms = str(module_perms)
if len(desired_perm) == 1:
if module_perms != desired_perm[0]:
if sys.platform.startswith('win'):
# Check the variant with execution bit set due to the
# unreliability of checking the execution bit on Windows.
user_perm_noexec = int(desired_perm[0][-3])
desired_perm_exec = desired_perm[0][:-3] + (str(user_perm_noexec + 1) * 3)
if module_perms == desired_perm_exec:
return
self.add_message('E0599', line=1, args=(desired_perm[0], module_perms))
else:
if module_perms < desired_perm[0] or module_perms > desired_perm[1]:
if sys.platform.startswith('win'):
# Check the variant with execution bit set due to the
# unreliability of checking the execution bit on Windows.
user_perm_noexec0 = int(desired_perm[0][-3])
desired_perm_exec0 = desired_perm[0][:-3] + (str(user_perm_noexec0 + 1) * 3)
user_perm_noexec1 = int(desired_perm[1][-3])
desired_perm_exec1 = desired_perm[1][:-3] + (str(user_perm_noexec1 + 1) * 3)
if desired_perm_exec0 <= module_perms <= desired_perm_exec1:
return
desired_perm = '>= {0} OR <= {1}'.format(*desired_perm)
self.add_message('E0599', line=1, args=(desired_perm, module_perms)) | process a module | train | https://github.com/saltstack/salt-pylint/blob/524a419d3bfc7dbd91c9c85040bc64935a275b24/saltpylint/fileperms.py#L52-L117 | null | class FilePermsChecker(BaseChecker):
'''
Check for files with undesirable permissions
'''
__implements__ = IRawChecker
name = 'fileperms'
msgs = {'E0599': ('Module file has the wrong file permissions(expected %s): %s',
'file-perms',
('Wrong file permissions')),
}
priority = -1
options = (('fileperms-default',
{'default': '0644', 'type': 'string', 'metavar': 'ZERO_PADDED_PERM',
'help': 'Desired file permissons. Default: 0644'}
),
('fileperms-ignore-paths',
{'default': (), 'type': 'csv', 'metavar': '<comma-separated-list>',
'help': 'File paths to ignore file permission. Glob patterns allowed.'}
)
)
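The permission normalization in `process_module` above (strip quotes, zero-pad to four digits, force a leading `0`, insert Python 3's `0o` marker) can be sketched in isolation; `normalize_perm` and `module_perms` are hypothetical helpers mirroring that logic, not part of salt-pylint:

```python
import os
import stat
import tempfile


def normalize_perm(perm):
    # Mirror FilePermsChecker's normalization: strip quotes, zero-pad to
    # four digits, ensure a leading zero, and use Python 3's 0o prefix.
    perm = perm.strip('"').strip("'").lstrip('0').zfill(4)
    if perm[0] != '0':
        perm = '0' + perm
    if perm[1] != 'o':
        perm = '0o' + perm[1:]
    return perm


def module_perms(path):
    # Same octal-string form the checker compares against.
    return oct(stat.S_IMODE(os.stat(path).st_mode))


# Demo: a freshly chmod'ed temp file reads back in the same notation.
fd, _path = tempfile.mkstemp()
os.close(fd)
os.chmod(_path, 0o640)
observed = module_perms(_path)
os.remove(_path)
```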
|
saltstack/salt-pylint | setup.py | _parse_requirements_file | python | def _parse_requirements_file(requirements_file):
'''
Parse requirements.txt and return list suitable for
passing to ``install_requires`` parameter in ``setup()``.
'''
parsed_requirements = []
with open(requirements_file) as rfh:
for line in rfh.readlines():
line = line.strip()
if not line or line.startswith(('#', '-r')):
continue
parsed_requirements.append(line)
return parsed_requirements | Parse requirements.txt and return list suitable for
passing to ``install_requires`` parameter in ``setup()``. | train | https://github.com/saltstack/salt-pylint/blob/524a419d3bfc7dbd91c9c85040bc64935a275b24/setup.py#L30-L42 | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
'''
The setup script for SaltPyLint
'''
# pylint: disable=file-perms,wrong-import-position
from __future__ import absolute_import, with_statement
import io
import os
import sys
SETUP_KWARGS = {}
USE_SETUPTOOLS = False
# Change to salt source's directory prior to running any command
try:
SETUP_DIRNAME = os.path.dirname(__file__)
except NameError:
# We're most likely being frozen and __file__ triggered this NameError
# Let's work around that
SETUP_DIRNAME = os.path.dirname(sys.argv[0])
if SETUP_DIRNAME != '':
os.chdir(SETUP_DIRNAME)
SALT_PYLINT_REQS = os.path.join(os.path.abspath(SETUP_DIRNAME), 'requirements.txt')
def _release_version():
'''
Returns release version
'''
with io.open(os.path.join(SETUP_DIRNAME, 'saltpylint', 'version.py'), encoding='utf-8') as fh_:
exec_locals = {}
exec_globals = {}
contents = fh_.read()
if not isinstance(contents, str):
contents = contents.encode('utf-8')
exec(contents, exec_globals, exec_locals) # pylint: disable=exec-used
return exec_locals['__version__']
# Use setuptools only if the user opts-in by setting the USE_SETUPTOOLS env var.
# Or if setuptools was previously imported (which is the case when using 'pip').
# This ensures consistent behavior, but allows for advanced usage with
# virtualenv, buildout, and others.
if 'USE_SETUPTOOLS' in os.environ or 'setuptools' in sys.modules:
try:
from setuptools import setup
USE_SETUPTOOLS = True
# This allows correct installation of dependencies with ``pip install``.
SETUP_KWARGS['install_requires'] = _parse_requirements_file(SALT_PYLINT_REQS)
except ImportError:
USE_SETUPTOOLS = False
if USE_SETUPTOOLS is False:
from distutils.core import setup # pylint: disable=import-error,no-name-in-module
NAME = 'SaltPyLint'
VERSION = _release_version()
DESCRIPTION = (
'Required PyLint plugins needed in the several SaltStack projects.'
)
setup(
name=NAME,
version=VERSION,
description=DESCRIPTION,
author='Pedro Algarvio',
author_email='pedro@algarvio.me',
url='https://github.com/saltstack/salt-pylint',
classifiers=[
'Programming Language :: Python',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3.4',
'Development Status :: 5 - Production/Stable',
'Environment :: Console',
'Intended Audience :: Developers',
'Intended Audience :: Information Technology',
'Intended Audience :: System Administrators',
'License :: OSI Approved :: Apache Software License',
'Operating System :: POSIX :: Linux',
],
packages=[
'saltpylint',
'saltpylint.ext',
'saltpylint/py3modernize',
'saltpylint/py3modernize/fixes',
],
**SETUP_KWARGS
)
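`_parse_requirements_file` can be exercised directly; this sketch (a standalone copy under an assumed name) writes a throwaway requirements file and checks that comments, blank lines, and `-r` includes are skipped:

```python
import os
import tempfile


def parse_requirements_file(requirements_file):
    # Same logic as setup.py's _parse_requirements_file.
    parsed = []
    with open(requirements_file) as rfh:
        for line in rfh:
            line = line.strip()
            if not line or line.startswith(('#', '-r')):
                continue
            parsed.append(line)
    return parsed


fd, path = tempfile.mkstemp(suffix='.txt')
with os.fdopen(fd, 'w') as fh:
    fh.write('# comment\n-r base.txt\n\npylint>=1.6\nmodernize\n')
reqs = parse_requirements_file(path)
os.remove(path)
```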
|
saltstack/salt-pylint | setup.py | _release_version | python | def _release_version():
'''
Returns release version
'''
with io.open(os.path.join(SETUP_DIRNAME, 'saltpylint', 'version.py'), encoding='utf-8') as fh_:
exec_locals = {}
exec_globals = {}
contents = fh_.read()
if not isinstance(contents, str):
contents = contents.encode('utf-8')
exec(contents, exec_globals, exec_locals) # pylint: disable=exec-used
return exec_locals['__version__'] | Returns release version | train | https://github.com/saltstack/salt-pylint/blob/524a419d3bfc7dbd91c9c85040bc64935a275b24/setup.py#L45-L56 | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
'''
The setup script for SaltPyLint
'''
# pylint: disable=file-perms,wrong-import-position
from __future__ import absolute_import, with_statement
import io
import os
import sys
SETUP_KWARGS = {}
USE_SETUPTOOLS = False
# Change to salt source's directory prior to running any command
try:
SETUP_DIRNAME = os.path.dirname(__file__)
except NameError:
# We're most likely being frozen and __file__ triggered this NameError
# Let's work around that
SETUP_DIRNAME = os.path.dirname(sys.argv[0])
if SETUP_DIRNAME != '':
os.chdir(SETUP_DIRNAME)
SALT_PYLINT_REQS = os.path.join(os.path.abspath(SETUP_DIRNAME), 'requirements.txt')
def _parse_requirements_file(requirements_file):
'''
Parse requirements.txt and return list suitable for
passing to ``install_requires`` parameter in ``setup()``.
'''
parsed_requirements = []
with open(requirements_file) as rfh:
for line in rfh.readlines():
line = line.strip()
if not line or line.startswith(('#', '-r')):
continue
parsed_requirements.append(line)
return parsed_requirements
# Use setuptools only if the user opts-in by setting the USE_SETUPTOOLS env var.
# Or if setuptools was previously imported (which is the case when using 'pip').
# This ensures consistent behavior, but allows for advanced usage with
# virtualenv, buildout, and others.
if 'USE_SETUPTOOLS' in os.environ or 'setuptools' in sys.modules:
try:
from setuptools import setup
USE_SETUPTOOLS = True
# This allows correct installation of dependencies with ``pip install``.
SETUP_KWARGS['install_requires'] = _parse_requirements_file(SALT_PYLINT_REQS)
except ImportError:
USE_SETUPTOOLS = False
if USE_SETUPTOOLS is False:
from distutils.core import setup # pylint: disable=import-error,no-name-in-module
NAME = 'SaltPyLint'
VERSION = _release_version()
DESCRIPTION = (
'Required PyLint plugins needed in the several SaltStack projects.'
)
setup(
name=NAME,
version=VERSION,
description=DESCRIPTION,
author='Pedro Algarvio',
author_email='pedro@algarvio.me',
url='https://github.com/saltstack/salt-pylint',
classifiers=[
'Programming Language :: Python',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3.4',
'Development Status :: 5 - Production/Stable',
'Environment :: Console',
'Intended Audience :: Developers',
'Intended Audience :: Information Technology',
'Intended Audience :: System Administrators',
'License :: OSI Approved :: Apache Software License',
'Operating System :: POSIX :: Linux',
],
packages=[
'saltpylint',
'saltpylint.ext',
'saltpylint/py3modernize',
'saltpylint/py3modernize/fixes',
],
**SETUP_KWARGS
)
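`_release_version` pulls `__version__` out of `version.py` by exec'ing its source in a scratch namespace instead of importing the package. The same trick against a temporary file (the version string here is made up):

```python
import io
import os
import tempfile


def release_version(version_file):
    # Execute the version module in an isolated namespace and read back
    # __version__, as setup.py's _release_version does.
    with io.open(version_file, encoding='utf-8') as fh_:
        exec_locals = {}
        exec(fh_.read(), {}, exec_locals)
    return exec_locals['__version__']


fd, path = tempfile.mkstemp(suffix='.py')
with os.fdopen(fd, 'w') as fh:
    fh.write("__version__ = '2019.6.7'\n")
version = release_version(path)
os.remove(path)
```

This avoids importing the package (and its dependencies) just to learn the version number.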
|
saltstack/salt-pylint | saltpylint/py3modernize/__init__.py | Py3Modernize.process_module | python | def process_module(self, node):
'''
process a module
'''
# Patch lib2to3.fixer_util.touch_import!
fixer_util.touch_import = salt_lib2to3_touch_import
flags = {}
if self.config.modernize_print_function:
flags['print_function'] = True
salt_avail_fixes = set(
refactor.get_fixers_from_package(
'saltpylint.py3modernize.fixes'
)
)
avail_fixes = set(refactor.get_fixers_from_package('libmodernize.fixes'))
avail_fixes.update(lib2to3_fix_names)
avail_fixes.update(salt_avail_fixes)
default_fixes = avail_fixes.difference(opt_in_fix_names)
unwanted_fixes = set(self.config.modernize_nofix)
# Explicitly disable libmodernize.fixes.fix_dict_six since we have our own implementation
# which only fixes `dict.iter<items|keys|values>()` calls
unwanted_fixes.add('libmodernize.fixes.fix_dict_six')
if self.config.modernize_six_unicode:
unwanted_fixes.add('libmodernize.fixes.fix_unicode_future')
elif self.config.modernize_future_unicode:
unwanted_fixes.add('libmodernize.fixes.fix_unicode')
else:
unwanted_fixes.add('libmodernize.fixes.fix_unicode_future')
unwanted_fixes.add('libmodernize.fixes.fix_unicode')
if self.config.modernize_no_six:
unwanted_fixes.update(six_fix_names)
unwanted_fixes.update(salt_avail_fixes)
else:
# We explicitly will remove fix_imports_six from libmodernize and will add
# our own fix_imports_six
unwanted_fixes.add('libmodernize.fixes.fix_imports_six')
# Remove a bunch of libmodernize.fixes since we need to properly skip them
# and we provide the proper skip rule
unwanted_fixes.add('libmodernize.fixes.fix_input_six')
unwanted_fixes.add('libmodernize.fixes.fix_filter')
unwanted_fixes.add('libmodernize.fixes.fix_map')
unwanted_fixes.add('libmodernize.fixes.fix_xrange_six')
unwanted_fixes.add('libmodernize.fixes.fix_zip')
explicit = set()
if self.config.modernize_fix:
default_present = False
for fix in self.config.modernize_fix:
if fix == 'default':
default_present = True
else:
explicit.add(fix)
requested = default_fixes.union(explicit) if default_present else explicit
else:
requested = default_fixes
fixer_names = requested.difference(unwanted_fixes)
rft = PyLintRefactoringTool(sorted(fixer_names), flags, sorted(explicit))
try:
rft.refactor_file(node.file,
write=False,
doctests_only=self.config.modernize_doctests_only)
except ParseError as exc:
# Unable to refactor, let's not make PyLint crash
try:
lineno = exc.context[1][0]
line_contents = node.file_stream.readlines()[lineno-1].rstrip()
self.add_message('W1698', line=lineno, args=line_contents)
except Exception: # pylint: disable=broad-except
self.add_message('W1698', line=1, args=exc)
return
except AssertionError as exc:
self.add_message('W1698', line=1, args=exc)
return
except (IOError, OSError) as exc:
logging.getLogger(__name__).warning('Error while processing {0}: {1}'.format(node.file, exc))
return
for lineno, diff in rft.diff:
# Since PyLint's python3 checker uses <Type>16<int><int>, we'll also use that range
self.add_message('W1699', line=lineno, args=diff)
# Restore lib2to3.fixer_util.touch_import!
fixer_util.touch_import = FIXER_UTIL_TOUCH_IMPORT | process a module | train | https://github.com/saltstack/salt-pylint/blob/524a419d3bfc7dbd91c9c85040bc64935a275b24/saltpylint/py3modernize/__init__.py#L144-L239 | null | class Py3Modernize(BaseChecker):
'''
Check for PEP263 compliant file encoding in file.
'''
__implements__ = IRawChecker
name = 'modernize'
msgs = {'W1698': ('Unable to run modernize. Parse Error: %s',
'modernize-parse-error',
('Incompatible Python 3 code found')),
'W1699': ('Incompatible Python 3 code found. Proposed fix:\n%s',
'incompatible-py3-code',
('Incompatible Python 3 code found')),
}
priority = -1
options = (('modernize-doctests-only',
{'default': 0, 'type': 'yn', 'metavar': '<y_or_n>',
'help': 'Fix up doctests only'}
),
('modernize-fix',
{'default': (), 'type': 'csv', 'metavar': '<comma-separated-list>',
'help': 'Each FIX specifies a transformation; "default" includes '
'default fixes.'}
),
('modernize-nofix',
{'default': '', 'type': 'multiple_choice', 'metavar': '<comma-separated-list>',
'choices': sorted(ALL_FIXES),
'help': 'Comma separated list of fixer names not to fix.'}
),
('modernize-print-function',
{'default': 1, 'type': 'yn', 'metavar': '<y_or_n>',
'help': 'Modify the grammar so that print() is a function.'}
),
('modernize-six-unicode',
{'default': 0, 'type': 'yn', 'metavar': '<y_or_n>',
'help': 'Wrap unicode literals in six.u().'}
),
('modernize-future-unicode',
{'default': 0, 'type': 'yn', 'metavar': '<y_or_n>',
'help': 'Use \'from __future__ import unicode_literals\' (only '
'useful for Python 2.6+).'}
),
('modernize-no-six',
{'default': 0, 'type': 'yn', 'metavar': '<y_or_n>',
'help': 'Exclude fixes that depend on the six package.'}
)
)
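The fixer selection in `Py3Modernize.process_module` is plain set arithmetic: the default set is the available fixes minus the opt-in ones, explicit requests are unioned in, and `nofix` entries are subtracted at the end. A toy run with invented fixer names (the real sets come from libmodernize and lib2to3):

```python
# Toy reproduction of Py3Modernize's fixer selection.
avail_fixes = {'fix_print', 'fix_unicode', 'fix_dict_six', 'fix_metaclass'}
opt_in_fix_names = {'fix_metaclass'}   # excluded unless explicitly requested
unwanted_fixes = {'fix_dict_six'}      # --modernize-nofix style exclusions

# 'default' plus one explicit opt-in fix was requested.
default_fixes = avail_fixes.difference(opt_in_fix_names)
explicit = {'fix_metaclass'}
requested = default_fixes.union(explicit)
fixer_names = requested.difference(unwanted_fixes)
```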
|
saltstack/salt-pylint | saltpylint/virt.py | VirtChecker.visit_functiondef | python | def visit_functiondef(self, node):
'''
Verifies no logger statements inside __virtual__
'''
if (not isinstance(node, astroid.FunctionDef) or
node.is_method()
or node.type != 'function'
or not node.body
):
# only process functions
return
try:
if not node.name == '__virtual__':
# only need to process the __virtual__ function
return
except AttributeError:
return
# walk contents of __virtual__ function
for child in node.get_children():
for functions in child.get_children():
if isinstance(functions, astroid.Call):
if isinstance(functions.func, astroid.Attribute):
try:
# Inspect each statement for an instance of 'logging'
for inferred in functions.func.expr.infer():
try:
instance_type = inferred.pytype().split('.')[0]
except TypeError:
continue
if instance_type == 'logging':
self.add_message(
self.VIRT_LOG, node=functions
)
# Found logger, don't need to keep processing this line
break
except AttributeError:
# Not a log function
return | Verifies no logger statements inside __virtual__ | train | https://github.com/saltstack/salt-pylint/blob/524a419d3bfc7dbd91c9c85040bc64935a275b24/saltpylint/virt.py#L26-L65 | null | class VirtChecker(BaseChecker):
'''
checks for compliance inside __virtual__
'''
__implements__ = IAstroidChecker
name = 'virt-checker'
VIRT_LOG = 'log-in-virtual'
msgs = {
'E1401': ('Log statement detected inside __virtual__ function. Remove it.',
VIRT_LOG,
'Loader processes __virtual__ so logging not in scope'),
}
options = ()
priority = -1
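The astroid walk above infers whether a call's receiver is actually a `logging` instance. A rough stdlib analogue (a lexical simplification, not the real checker) can spot `log.*` calls inside a module-level `__virtual__` with `ast`:

```python
import ast


def virtual_has_log_call(source, log_names=('log', 'logger')):
    # Find a module-level __virtual__ and report any call whose receiver
    # is a name conventionally bound to a logger. Unlike the astroid
    # checker, this is purely lexical: no type inference is performed.
    tree = ast.parse(source)
    for node in tree.body:
        if isinstance(node, ast.FunctionDef) and node.name == '__virtual__':
            for call in ast.walk(node):
                if (isinstance(call, ast.Call)
                        and isinstance(call.func, ast.Attribute)
                        and isinstance(call.func.value, ast.Name)
                        and call.func.value.id in log_names):
                    return True
    return False


SRC = '''
def __virtual__():
    log.debug("loading")
    return True
'''
```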
|
saltstack/salt-pylint | saltpylint/pep263.py | FileEncodingChecker.process_module | python | def process_module(self, node):
'''
process a module
the module's content is accessible via node.file_stream object
'''
pep263 = re.compile(six.b(self.RE_PEP263))
try:
file_stream = node.file_stream
except AttributeError:
# Pylint >= 1.8.1
file_stream = node.stream()
# Store a reference to the node's file stream position
current_stream_position = file_stream.tell()
# Go to the start of stream to achieve our logic
file_stream.seek(0)
# Grab the first two lines
twolines = list(itertools.islice(file_stream, 2))
pep263_encoding = [m.group(1).lower() for l in twolines for m in [pep263.search(l)] if m]
multiple_encodings = len(pep263_encoding) > 1
file_empty = len(twolines) == 0
# Reset the node's file stream position
file_stream.seek(current_stream_position)
# - If the file has an UTF-8 BOM and yet uses any other
# encoding, it will be caught by F0002
# - If the file has a PEP263 UTF-8 encoding and yet uses any
# other encoding, it will be caught by W0512
# - If there are non-ASCII characters and no PEP263, or UTF-8
# BOM, it will be caught by W0512
# - If there are ambiguous PEP263 encodings it will be caught
# by E0001, we still test for this
if multiple_encodings:
self.add_message('W9901', line=1)
if node.file_encoding:
pylint_encoding = node.file_encoding.lower()
if six.PY3:
pylint_encoding = pylint_encoding.encode('utf-8')
if pep263_encoding and pylint_encoding not in pep263_encoding:
self.add_message('W9902', line=1)
if not pep263_encoding:
if file_empty:
self.add_message('W9905', line=1)
else:
self.add_message('W9903', line=1)
elif self.REQ_ENCOD not in pep263_encoding:
self.add_message('W9904', line=1) | process a module
the module's content is accessible via node.file_stream object | train | https://github.com/saltstack/salt-pylint/blob/524a419d3bfc7dbd91c9c85040bc64935a275b24/saltpylint/pep263.py#L57-L110 | null | class FileEncodingChecker(BaseChecker):
'''
Check for PEP263 compliant file encoding in file.
'''
__implements__ = IRawChecker
name = 'pep263'
msgs = {'W9901': ('PEP263: Multiple file encodings',
'multiple-encoding-in-file',
('There are multiple encodings in file.')),
'W9902': ('PEP263: Parser and PEP263 encoding mismatch',
'encoding-mismatch-in-file',
('The pylint parser and the PEP263 file encoding in file '
'does not match.')),
'W9903': ('PEP263: Use UTF-8 file encoding',
'no-encoding-in-file',
('There is no PEP263 compliant file encoding in file.')),
'W9904': ('PEP263: Use UTF-8 file encoding',
'wrongly-encoded-file',
('Change file encoding and PEP263 header in file.')),
'W9905': ('PEP263: Use UTF-8 file encoding',
'no-encoding-in-empty-file',
('There is no PEP263 compliant file encoding in file.')),
}
priority = -1
options = ()
RE_PEP263 = r'coding[:=]\s*([-\w.]+)'
REQ_ENCOD = six.b('utf-8')
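The PEP 263 scan in `process_module` only ever needs the first two lines of a file. The same pattern applied to raw byte lines (dropping the `six` shim for a Python 3-only sketch):

```python
import itertools
import re

RE_PEP263 = rb'coding[:=]\s*([-\w.]+)'


def pep263_encodings(lines):
    # Scan at most the first two lines for PEP 263 coding declarations,
    # as FileEncodingChecker does against the module's byte stream.
    pep263 = re.compile(RE_PEP263)
    twolines = list(itertools.islice(iter(lines), 2))
    return [m.group(1).lower() for l in twolines
            for m in [pep263.search(l)] if m]


good = pep263_encodings([b'#!/usr/bin/env python\n',
                         b'# -*- coding: utf-8 -*-\n'])
missing = pep263_encodings([b'import os\n', b'import sys\n'])
```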
|
saltstack/salt-pylint | saltpylint/ext/pyqver2.py | get_versions | python | def get_versions(source):
tree = compiler.parse(source)
checker = compiler.walk(tree, NodeChecker())
return checker.vers | Return information about the Python versions required for specific features.
The return value is a dictionary whose keys are version numbers as tuples
(for example, Python 2.6 is (2, 6)) and whose values are lists of the features that
require the indicated Python version. | train | https://github.com/saltstack/salt-pylint/blob/524a419d3bfc7dbd91c9c85040bc64935a275b24/saltpylint/ext/pyqver2.py#L252-L261 | null | # -*- coding: utf-8 -*-
# This software is provided 'as-is', without any express or implied
# warranty. In no event will the author be held liable for any damages
# arising from the use of this software.
#
# Permission is granted to anyone to use this software for any purpose,
# including commercial applications, and to alter it and redistribute it
# freely, subject to the following restrictions:
#
# 1. The origin of this software must not be misrepresented; you must not
# claim that you wrote the original software. If you use this software
# in a product, an acknowledgment in the product documentation would be
# appreciated but is not required.
# 2. Altered source versions must be plainly marked as such, and must not be
# misrepresented as being the original software.
# 3. This notice may not be removed or altered from any source distribution.
#
# Copyright (c) 2009-2013 Greg Hewgill http://hewgill.com
#
#
# ############################################################################
#
# Changed the code in order for it not to process sys.argv
#
# ############################################################################
# pylint: skip-file
from __future__ import print_function
import compiler
import platform
import sys
StandardModules = {
"__future__": (2, 1),
"abc": (2, 6),
"argparse": (2, 7),
"ast": (2, 6),
"atexit": (2, 0),
"bz2": (2, 3),
"cgitb": (2, 2),
"collections": (2, 4),
"contextlib": (2, 5),
"cookielib": (2, 4),
"cProfile": (2, 5),
"csv": (2, 3),
"ctypes": (2, 5),
"datetime": (2, 3),
"decimal": (2, 4),
"difflib": (2, 1),
"DocXMLRPCServer": (2, 3),
"dummy_thread": (2, 3),
"dummy_threading": (2, 3),
"email": (2, 2),
"fractions": (2, 6),
"functools": (2, 5),
"future_builtins": (2, 6),
"hashlib": (2, 5),
"heapq": (2, 3),
"hmac": (2, 2),
"hotshot": (2, 2),
"HTMLParser": (2, 2),
"importlib": (2, 7),
"inspect": (2, 1),
"io": (2, 6),
"itertools": (2, 3),
"json": (2, 6),
"logging": (2, 3),
"modulefinder": (2, 3),
"msilib": (2, 5),
"multiprocessing": (2, 6),
"netrc": (1, 5, 2),
"numbers": (2, 6),
"optparse": (2, 3),
"ossaudiodev": (2, 3),
"pickletools": (2, 3),
"pkgutil": (2, 3),
"platform": (2, 3),
"pydoc": (2, 1),
"runpy": (2, 5),
"sets": (2, 3),
"shlex": (1, 5, 2),
"SimpleXMLRPCServer": (2, 2),
"spwd": (2, 5),
"sqlite3": (2, 5),
"ssl": (2, 6),
"stringprep": (2, 3),
"subprocess": (2, 4),
"sysconfig": (2, 7),
"tarfile": (2, 3),
"textwrap": (2, 3),
"timeit": (2, 3),
"unittest": (2, 1),
"uuid": (2, 5),
"warnings": (2, 1),
"weakref": (2, 1),
"winsound": (1, 5, 2),
"wsgiref": (2, 5),
"xml.dom": (2, 0),
"xml.dom.minidom": (2, 0),
"xml.dom.pulldom": (2, 0),
"xml.etree.ElementTree": (2, 5),
"xml.parsers.expat":(2, 0),
"xml.sax": (2, 0),
"xml.sax.handler": (2, 0),
"xml.sax.saxutils": (2, 0),
"xml.sax.xmlreader":(2, 0),
"xmlrpclib": (2, 2),
"zipfile": (1, 6),
"zipimport": (2, 3),
"_ast": (2, 5),
"_winreg": (2, 0),
}
Functions = {
"all": (2, 5),
"any": (2, 5),
"collections.Counter": (2, 7),
"collections.defaultdict": (2, 5),
"collections.OrderedDict": (2, 7),
"enumerate": (2, 3),
"frozenset": (2, 4),
"itertools.compress": (2, 7),
"math.erf": (2, 7),
"math.erfc": (2, 7),
"math.expm1": (2, 7),
"math.gamma": (2, 7),
"math.lgamma": (2, 7),
"memoryview": (2, 7),
"next": (2, 6),
"os.getresgid": (2, 7),
"os.getresuid": (2, 7),
"os.initgroups": (2, 7),
"os.setresgid": (2, 7),
"os.setresuid": (2, 7),
"reversed": (2, 4),
"set": (2, 4),
"subprocess.check_call": (2, 5),
"subprocess.check_output": (2, 7),
"sum": (2, 3),
"symtable.is_declared_global": (2, 7),
"weakref.WeakSet": (2, 7),
}
Identifiers = {
"False": (2, 2),
"True": (2, 2),
}
def uniq(a):
if len(a) == 0:
return []
else:
return [a[0]] + uniq([x for x in a if x != a[0]])
class NodeChecker(object):
def __init__(self):
self.vers = dict()
self.vers[(2,0)] = []
def add(self, node, ver, msg):
if ver not in self.vers:
self.vers[ver] = []
self.vers[ver].append((node.lineno, msg))
def default(self, node):
for child in node.getChildNodes():
self.visit(child)
def visitCallFunc(self, node):
def rollup(n):
if isinstance(n, compiler.ast.Name):
return n.name
elif isinstance(n, compiler.ast.Getattr):
r = rollup(n.expr)
if r:
return r + "." + n.attrname
name = rollup(node.node)
if name:
v = Functions.get(name)
if v is not None:
self.add(node, v, name)
self.default(node)
def visitClass(self, node):
if node.bases:
self.add(node, (2,2), "new-style class")
if node.decorators:
self.add(node, (2,6), "class decorator")
self.default(node)
def visitDictComp(self, node):
self.add(node, (2,7), "dictionary comprehension")
self.default(node)
def visitFloorDiv(self, node):
self.add(node, (2,2), "// operator")
self.default(node)
def visitFrom(self, node):
v = StandardModules.get(node.modname)
if v is not None:
self.add(node, v, node.modname)
for n in node.names:
name = node.modname + "." + n[0]
v = Functions.get(name)
if v is not None:
self.add(node, v, name)
def visitFunction(self, node):
if node.decorators:
self.add(node, (2,4), "function decorator")
self.default(node)
def visitGenExpr(self, node):
self.add(node, (2,4), "generator expression")
self.default(node)
def visitGetattr(self, node):
if (isinstance(node.expr, compiler.ast.Const)
and isinstance(node.expr.value, str)
and node.attrname == "format"):
self.add(node, (2,6), "string literal .format()")
self.default(node)
def visitIfExp(self, node):
self.add(node, (2,5), "inline if expression")
self.default(node)
def visitImport(self, node):
for n in node.names:
v = StandardModules.get(n[0])
if v is not None:
self.add(node, v, n[0])
self.default(node)
def visitName(self, node):
v = Identifiers.get(node.name)
if v is not None:
self.add(node, v, node.name)
self.default(node)
def visitSet(self, node):
self.add(node, (2,7), "set literal")
self.default(node)
def visitSetComp(self, node):
self.add(node, (2,7), "set comprehension")
self.default(node)
def visitTryFinally(self, node):
# try/finally with a suite generates a Stmt node as the body,
# but try/except/finally generates a TryExcept as the body
if isinstance(node.body, compiler.ast.TryExcept):
self.add(node, (2,5), "try/except/finally")
self.default(node)
def visitWith(self, node):
if isinstance(node.body, compiler.ast.With):
self.add(node, (2,7), "with statement with multiple contexts")
else:
self.add(node, (2,5), "with statement")
self.default(node)
def visitYield(self, node):
self.add(node, (2,2), "yield expression")
self.default(node)
def v27(source):
if sys.version_info >= (2, 7):
return qver(source)
else:
print("Not all features tested, run --test with Python 2.7", file=sys.stderr)
return (2, 7)
def qver(source):
"""Return the minimum Python version required to run a particular bit of code.
>>> qver('print "hello world"')
(2, 0)
>>> qver('class test(object): pass')
(2, 2)
>>> qver('yield 1')
(2, 2)
>>> qver('a // b')
(2, 2)
>>> qver('True')
(2, 2)
>>> qver('enumerate(a)')
(2, 3)
>>> qver('total = sum')
(2, 0)
>>> qver('sum(a)')
(2, 3)
>>> qver('(x*x for x in range(5))')
(2, 4)
>>> qver('class C:\\n @classmethod\\n def m(): pass')
(2, 4)
>>> qver('y if x else z')
(2, 5)
>>> qver('import hashlib')
(2, 5)
>>> qver('from hashlib import md5')
(2, 5)
>>> qver('import xml.etree.ElementTree')
(2, 5)
>>> qver('try:\\n try: pass;\\n except: pass;\\nfinally: pass')
(2, 0)
>>> qver('try: pass;\\nexcept: pass;\\nfinally: pass')
(2, 5)
>>> qver('from __future__ import with_statement\\nwith x: pass')
(2, 5)
>>> qver('collections.defaultdict(list)')
(2, 5)
>>> qver('from collections import defaultdict')
(2, 5)
>>> qver('"{0}".format(0)')
(2, 6)
>>> qver('memoryview(x)')
(2, 7)
>>> v27('{1, 2, 3}')
(2, 7)
>>> v27('{x for x in s}')
(2, 7)
>>> v27('{x: y for x in s}')
(2, 7)
>>> qver('from __future__ import with_statement\\nwith x:\\n with y: pass')
(2, 5)
>>> v27('from __future__ import with_statement\\nwith x, y: pass')
(2, 7)
>>> qver('@decorator\\ndef f(): pass')
(2, 4)
>>> qver('@decorator\\nclass test:\\n pass')
(2, 6)
#>>> qver('0o0')
#(2, 6)
#>>> qver('@foo\\nclass C: pass')
#(2, 6)
"""
return max(get_versions(source).keys())
if __name__ == '__main__':
Verbose = False
MinVersion = (2, 3)
Lint = False
files = []
i = 1
while i < len(sys.argv):
a = sys.argv[i]
if a == "--test":
import doctest
doctest.testmod()
sys.exit(0)
if a == "-v" or a == "--verbose":
Verbose = True
elif a == "-l" or a == "--lint":
Lint = True
elif a == "-m" or a == "--min-version":
i += 1
MinVersion = tuple(map(int, sys.argv[i].split(".")))
else:
files.append(a)
i += 1
if not files:
print("""Usage: %s [options] source ...
Report minimum Python version required to run given source files.
-m x.y or --min-version x.y (default 2.3)
report version triggers at or above version x.y in verbose mode
-v or --verbose
print more detailed report of version triggers for each version
-l or --lint
print each version trigger on its own line, as file:line: version reason
""" % sys.argv[0], file=sys.stderr)
sys.exit(1)
for fn in files:
try:
f = open(fn)
source = f.read()
f.close()
ver = get_versions(source)
if Verbose:
print(fn)
for v in sorted([k for k in ver.keys() if k >= MinVersion], reverse=True):
reasons = [x for x in uniq(ver[v]) if x]
if reasons:
# each reason is (lineno, message)
print("\t%s\t%s" % (".".join(map(str, v)), ", ".join([x[1] for x in reasons])))
elif Lint:
for v in sorted([k for k in ver.keys() if k >= MinVersion], reverse=True):
reasons = [x for x in uniq(ver[v]) if x]
for r in reasons:
# each reason is (lineno, message)
print("%s:%s: %s %s" % (fn, r[0], ".".join(map(str, v)), r[1]))
else:
print("%s\t%s" % (".".join(map(str, max(ver.keys()))), fn))
except SyntaxError as x:
print("%s: syntax error compiling with Python %s: %s" % (fn, platform.python_version(), x))
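The qver() machinery above leans on the Python-2-only `compiler` module. For readers on Python 3, a rough sketch of the same minimum-version idea can be built on `ast`; the feature table below is a small illustrative subset (an assumption for illustration, not the script's full mapping):

```python
# Hedged Python 3 sketch of the qver() idea, using ast instead of the
# removed `compiler` module. FEATURE_VERSIONS is a tiny illustrative
# subset of the tables above, not the complete mapping.
import ast

FEATURE_VERSIONS = {
    ast.GeneratorExp: (2, 4),  # generator expression
    ast.IfExp: (2, 5),         # inline if expression
    ast.SetComp: (2, 7),       # set comprehension
    ast.DictComp: (2, 7),      # dictionary comprehension
}

def min_version(source):
    """Return the highest version tuple triggered by any node in source."""
    found = [(2, 0)]  # baseline, mirroring qver()'s (2, 0) default
    for node in ast.walk(ast.parse(source)):
        ver = FEATURE_VERSIONS.get(type(node))
        if ver is not None:
            found.append(ver)
    return max(found)
```

Like qver(), this returns the max over all triggers, e.g. min_version('y if x else z') gives (2, 5).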
|
saltstack/salt-pylint | saltpylint/strings.py | register | python | def register(linter):
'''required method to auto register this checker '''
linter.register_checker(StringCurlyBracesFormatIndexChecker(linter))
linter.register_checker(StringLiteralChecker(linter)) | required method to auto register this checker | train | https://github.com/saltstack/salt-pylint/blob/524a419d3bfc7dbd91c9c85040bc64935a275b24/saltpylint/strings.py#L260-L263 | null | # -*- coding: utf-8 -*-
'''
:codeauthor: :email:`Pedro Algarvio (pedro@algarvio.me)`
==========================================
PyLint Extended String Formatting Checkers
==========================================
Proper string formatting PyLint checker
'''
# Import Python libs
from __future__ import absolute_import
import re
import sys
import tokenize
# Import PyLint libs
try:
# >= pylint 1.0
import astroid
except ImportError: # pylint < 1.0
from logilab import astng as astroid # pylint: disable=no-name-in-module
from saltpylint.checkers import BaseChecker, utils
from pylint.checkers import BaseTokenChecker
try:
# >= pylint 1.0
from pylint.interfaces import IAstroidChecker
except ImportError: # < pylint 1.0
from pylint.interfaces import IASTNGChecker as IAstroidChecker # pylint: disable=no-name-in-module
from pylint.interfaces import ITokenChecker, IRawChecker
from astroid.exceptions import InferenceError
try:
from astroid.exceptions import NameInferenceError
except ImportError:
class NameInferenceError(Exception):
pass
# Import 3rd-party libs
import six
STRING_FORMAT_MSGS = {
'W1320': ('String format call with un-indexed curly braces: %r',
'un-indexed-curly-braces-warning',
'Under python 2.6 the curly braces on a \'string.format()\' '
'call MUST be indexed.'),
'E1320': ('String format call with un-indexed curly braces: %r',
'un-indexed-curly-braces-error',
'Under python 2.6 the curly braces on a \'string.format()\' '
'call MUST be indexed.'),
'W1321': ('String substitution used instead of string formatting on: %r',
'string-substitution-usage-warning',
'String substitution used instead of string formatting'),
'E1321': ('String substitution used instead of string formatting on: %r',
'string-substitution-usage-error',
'String substitution used instead of string formatting'),
'E1322': ('Repr flag (!r) used in string: %r',
'repr-flag-used-in-string',
'Repr flag (!r) used in string'),
'E1323': ('String formatting used in logging: %r',
'str-format-in-logging',
'String formatting used in logging'),
}
BAD_FORMATTING_SLOT = re.compile(r'(\{![\w]{1}\}|\{\})')
class StringCurlyBracesFormatIndexChecker(BaseChecker):
__implements__ = IAstroidChecker
name = 'string'
msgs = STRING_FORMAT_MSGS
priority = -1
options = (('un-indexed-curly-braces-always-error',
{'default': 1, 'type': 'yn', 'metavar': '<y_or_n>',
'help': 'Force un-indexed curly braces on a '
'\'string.format()\' call to always be an error.'}
),
('enforce-string-formatting-over-substitution',
{'default': 1, 'type': 'yn', 'metavar': '<y_or_n>',
'help': 'Enforce string formatting over string substitution'}
),
('string-substitutions-usage-is-an-error',
{'default': 1, 'type': 'yn', 'metavar': '<y_or_n>',
'help': 'Force string substitution usage on strings '
'to always be an error.'}
),
)
@utils.check_messages(*(STRING_FORMAT_MSGS.keys()))
def visit_binop(self, node):
if not self.config.enforce_string_formatting_over_substitution:
return
if node.op != '%':
return
if not (isinstance(node.left, astroid.Const) and
isinstance(node.left.value, six.string_types)):
return
try:
required_keys, required_num_args = utils.parse_format_string(node.left.value)[:2]
except (utils.UnsupportedFormatCharacter, utils.IncompleteFormatString):
# This is handled elsewhere
return
if required_keys or required_num_args:
if self.config.string_substitutions_usage_is_an_error:
msgid = 'E1321'
else:
msgid = 'W1321'
self.add_message(
msgid, node=node.left, args=node.left.value
)
if '!r}' in node.left.value:
self.add_message(
'E1322', node=node.left, args=node.left.value
)
@utils.check_messages(*(STRING_FORMAT_MSGS.keys()))
def visit_call(self, node):
func = utils.safe_infer(node.func)
if isinstance(func, astroid.BoundMethod) and func.name == 'format':
# If there's a .format() call, run the code below
if isinstance(node.func.expr, (astroid.Name, astroid.Const)):
# This is for:
# foo = 'Foo {} bar'
# print(foo.format(blah))
for inferred in node.func.expr.infer():
if not hasattr(inferred, 'value'):
# If there's no value attribute, it's not worth
# checking.
continue
if not isinstance(inferred.value, six.string_types):
# If it's not a string, continue
continue
if '!r}' in inferred.value:
self.add_message(
'E1322', node=inferred, args=inferred.value
)
if BAD_FORMATTING_SLOT.findall(inferred.value):
if self.config.un_indexed_curly_braces_always_error or \
sys.version_info[:2] < (2, 7):
self.add_message(
'E1320', node=inferred, args=inferred.value
)
elif six.PY2:
self.add_message(
'W1320', node=inferred, args=inferred.value
)
try:
# Walk back up until no parents are found and look for a
# logging.RootLogger instance in the parent types
ptr = node
while True:
parent = ptr.parent
for inferred in parent.func.expr.infer():
try:
instance_type = inferred.pytype().split('.')[0]
except TypeError:
continue
if instance_type == 'logging':
self.add_message(
'E1323',
node=node,
args=node.as_string(),
)
break
ptr = parent
except (AttributeError, InferenceError, NameInferenceError):
pass
elif not hasattr(node.func.expr, 'value'):
# If it does not have an value attribute, it's not worth
# checking
return
elif isinstance(node.func.expr.value, astroid.Name):
# No need to check these either
return
elif BAD_FORMATTING_SLOT.findall(node.func.expr.value):
if self.config.un_indexed_curly_braces_always_error or \
sys.version_info[:2] < (2, 7):
msgid = 'E1320'
else:
msgid = 'W1320'
self.add_message(
msgid, node=node, args=node.func.expr.value
)
STRING_LITERALS_MSGS = {
'E1400': ('Null byte used in unicode string literal (should be wrapped in str())',
'null-byte-unicode-literal',
'Null byte used in unicode string literal'),
}
class StringLiteralChecker(BaseTokenChecker):
'''
Check string literals
'''
__implements__ = (ITokenChecker, IRawChecker)
name = 'string_literal'
msgs = STRING_LITERALS_MSGS
def process_module(self, module):
self._unicode_literals = 'unicode_literals' in module.future_imports
def process_tokens(self, tokens):
for (tok_type, token, (start_row, _), _, _) in tokens:
if tok_type == tokenize.STRING:
# 'token' is the whole un-parsed token; we can look at the start
# of it to see whether it's a raw or unicode string etc.
self.process_string_token(token, start_row)
def process_string_token(self, token, start_row):
if not six.PY2:
return
for i, c in enumerate(token):
if c in '\'\"':
quote_char = c
break
# pylint: disable=undefined-loop-variable
prefix = token[:i].lower() # markers like u, b, r.
after_prefix = token[i:]
if after_prefix[:3] == after_prefix[-3:] == 3 * quote_char:
string_body = after_prefix[3:-3]
else:
string_body = after_prefix[1:-1] # Chop off quotes
# No special checks on raw strings at the moment.
if 'r' not in prefix:
self.process_non_raw_string_token(prefix, string_body, start_row)
def process_non_raw_string_token(self, prefix, string_body, start_row):
'''
check for bad escapes in a non-raw string.
prefix: lowercase string of eg 'ur' string prefix markers.
string_body: the un-parsed body of the string, not including the quote
marks.
start_row: integer line number in the source.
'''
if 'u' in prefix:
if string_body.find('\\0') != -1:
self.add_message('null-byte-unicode-literal', line=start_row)
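The prefix/body split performed by process_string_token() above is easy to exercise in isolation; here is a standalone sketch of that slicing logic (the function name is illustrative, not part of the checker):

```python
# Standalone sketch of the token slicing in process_string_token():
# peel off prefix markers (u, b, r, ...), then strip triple or single quotes.
def split_string_token(token):
    for i, c in enumerate(token):
        if c in '\'"':
            quote_char = c
            break
    prefix = token[:i].lower()         # markers like u, b, r
    after_prefix = token[i:]
    if after_prefix[:3] == after_prefix[-3:] == 3 * quote_char:
        body = after_prefix[3:-3]      # triple-quoted string
    else:
        body = after_prefix[1:-1]      # chop off single quotes
    return prefix, body
```

For example, split_string_token("u'abc'") yields ('u', 'abc').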
|
saltstack/salt-pylint | saltpylint/strings.py | StringLiteralChecker.process_non_raw_string_token | python | def process_non_raw_string_token(self, prefix, string_body, start_row):
'''
check for bad escapes in a non-raw string.
prefix: lowercase string of eg 'ur' string prefix markers.
string_body: the un-parsed body of the string, not including the quote
marks.
start_row: integer line number in the source.
'''
if 'u' in prefix:
if string_body.find('\\0') != -1:
self.add_message('null-byte-unicode-literal', line=start_row) | check for bad escapes in a non-raw string.
prefix: lowercase string of eg 'ur' string prefix markers.
string_body: the un-parsed body of the string, not including the quote
marks.
start_row: integer line number in the source. | train | https://github.com/saltstack/salt-pylint/blob/524a419d3bfc7dbd91c9c85040bc64935a275b24/saltpylint/strings.py#L247-L258 | null | class StringLiteralChecker(BaseTokenChecker):
'''
Check string literals
'''
__implements__ = (ITokenChecker, IRawChecker)
name = 'string_literal'
msgs = STRING_LITERALS_MSGS
def process_module(self, module):
self._unicode_literals = 'unicode_literals' in module.future_imports
def process_tokens(self, tokens):
for (tok_type, token, (start_row, _), _, _) in tokens:
if tok_type == tokenize.STRING:
# 'token' is the whole un-parsed token; we can look at the start
# of it to see whether it's a raw or unicode string etc.
self.process_string_token(token, start_row)
def process_string_token(self, token, start_row):
if not six.PY2:
return
for i, c in enumerate(token):
if c in '\'\"':
quote_char = c
break
# pylint: disable=undefined-loop-variable
prefix = token[:i].lower() # markers like u, b, r.
after_prefix = token[i:]
if after_prefix[:3] == after_prefix[-3:] == 3 * quote_char:
string_body = after_prefix[3:-3]
else:
string_body = after_prefix[1:-1] # Chop off quotes
# No special checks on raw strings at the moment.
if 'r' not in prefix:
self.process_non_raw_string_token(prefix, string_body, start_row)
|
saltstack/salt-pylint | saltpylint/blacklist.py | register | python | def register(linter):
'''
Required method to auto register this checker
'''
linter.register_checker(ResourceLeakageChecker(linter))
linter.register_checker(BlacklistedImportsChecker(linter))
linter.register_checker(MovedTestCaseClassChecker(linter))
linter.register_checker(BlacklistedLoaderModulesUsageChecker(linter))
linter.register_checker(BlacklistedFunctionsChecker(linter)) | Required method to auto register this checker | train | https://github.com/saltstack/salt-pylint/blob/524a419d3bfc7dbd91c9c85040bc64935a275b24/saltpylint/blacklist.py#L560-L568 | null | # -*- coding: utf-8 -*-
'''
:codeauthor: :email:`Pedro Algarvio (pedro@algarvio.me)`
:copyright: © 2017 by the SaltStack Team, see AUTHORS for more details.
:license: Apache 2.0, see LICENSE for more details.
saltpylint.blacklist
~~~~~~~~~~~~~~~~~~~~
Checks blacklisted imports and code usage on salt
'''
# Import python libs
from __future__ import absolute_import
import os
import fnmatch
# Import pylint libs
import astroid
from saltpylint.checkers import BaseChecker, utils
from pylint.interfaces import IAstroidChecker
BLACKLISTED_IMPORTS_MSGS = {
'E9402': ('Uses of a blacklisted module %r: %s',
'blacklisted-module',
'Used a module marked as blacklisted is imported.'),
'E9403': ('Uses of a blacklisted external module %r: %s',
'blacklisted-external-module',
'Used a module marked as blacklisted is imported.'),
'E9404': ('Uses of a blacklisted import %r: %s',
'blacklisted-import',
'Used an import marked as blacklisted.'),
'E9405': ('Uses of an external blacklisted import %r: %s',
'blacklisted-external-import',
'Used an external import marked as blacklisted.'),
'E9406': ('Uses of blacklisted test module execution code: %s',
'blacklisted-test-module-execution',
'Uses of blacklisted test module execution code.'),
'E9407': ('Uses of blacklisted sys.path updating through \'ensure_in_syspath\'. '
'Please remove the import and any calls to \'ensure_in_syspath()\'.',
'blacklisted-syspath-update',
'Uses of blacklisted sys.path updating through ensure_in_syspath.'),
}
class BlacklistedImportsChecker(BaseChecker):
__implements__ = IAstroidChecker
name = 'blacklisted-imports'
msgs = BLACKLISTED_IMPORTS_MSGS
priority = -2
def open(self):
self.blacklisted_modules = ('salttesting',
'integration',
'unit',
'mock',
'six',
'distutils.version',
'unittest',
'unittest2')
@utils.check_messages('blacklisted-imports')
def visit_import(self, node):
'''triggered when an import statement is seen'''
module_filename = node.root().file
if fnmatch.fnmatch(module_filename, '__init__.py*') and \
not fnmatch.fnmatch(module_filename, 'test_*.py*'):
return
modnode = node.root()
names = [name for name, _ in node.names]
for name in names:
self._check_blacklisted_module(node, name)
@utils.check_messages('blacklisted-imports')
def visit_importfrom(self, node):
'''triggered when a from statement is seen'''
module_filename = node.root().file
if fnmatch.fnmatch(module_filename, '__init__.py*') and \
not fnmatch.fnmatch(module_filename, 'test_*.py*'):
return
basename = node.modname
self._check_blacklisted_module(node, basename)
def _check_blacklisted_module(self, node, mod_path):
'''check if the module is blacklisted'''
for mod_name in self.blacklisted_modules:
if mod_path == mod_name or mod_path.startswith(mod_name + '.'):
names = []
for name, name_as in node.names:
if name_as:
names.append('{0} as {1}'.format(name, name_as))
else:
names.append(name)
try:
import_from_module = node.modname
if import_from_module == 'salttesting.helpers':
for name in names:
if name == 'ensure_in_syspath':
self.add_message('blacklisted-syspath-update', node=node)
continue
msg = 'Please use \'from tests.support.helpers import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module in ('salttesting.mock', 'mock', 'unittest.mock', 'unittest2.mock'):
for name in names:
msg = 'Please use \'from tests.support.mock import {0}\''.format(name)
if import_from_module in ('salttesting.mock', 'unittest.mock', 'unittest2.mock'):
message_id = 'blacklisted-module'
else:
message_id = 'blacklisted-external-module'
self.add_message(message_id, node=node, args=(mod_path, msg))
continue
if import_from_module == 'salttesting.parser':
for name in names:
msg = 'Please use \'from tests.support.parser import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module == 'salttesting.case':
for name in names:
msg = 'Please use \'from tests.support.case import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module == 'salttesting.unit':
for name in names:
msg = 'Please use \'from tests.support.unit import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module.startswith(('unittest', 'unittest2')):
for name in names:
msg = 'Please use \'from tests.support.unit import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module == 'salttesting.mixins':
for name in names:
msg = 'Please use \'from tests.support.mixins import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module == 'six':
for name in names:
msg = 'Please use \'from salt.ext.six import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module == 'distutils.version':
for name in names:
msg = 'Please use \'from salt.utils.versions import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if names:
for name in names:
if name in ('TestLoader', 'TextTestRunner', 'TestCase', 'expectedFailure',
'TestSuite', 'skipIf', 'TestResult'):
msg = 'Please use \'from tests.support.unit import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if name in ('SaltReturnAssertsMixin', 'SaltMinionEventAssertsMixin'):
msg = 'Please use \'from tests.support.mixins import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if name in ('ModuleCase', 'SyndicCase', 'ShellCase', 'SSHCase'):
msg = 'Please use \'from tests.support.case import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if name == 'run_tests':
msg = 'Please remove the \'if __name__ == "__main__":\' section from the end of the module'
self.add_message('blacklisted-test-module-execution', node=node, args=msg)
continue
if mod_name in ('integration', 'unit'):
if name in ('SYS_TMP_DIR',
'TMP',
'FILES',
'PYEXEC',
'MOCKBIN',
'SCRIPT_DIR',
'TMP_STATE_TREE',
'TMP_PRODENV_STATE_TREE',
'TMP_CONF_DIR',
'TMP_SUB_MINION_CONF_DIR',
'TMP_SYNDIC_MINION_CONF_DIR',
'TMP_SYNDIC_MASTER_CONF_DIR',
'CODE_DIR',
'TESTS_DIR',
'CONF_DIR',
'PILLAR_DIR',
'TMP_SCRIPT_DIR',
'ENGINES_DIR',
'LOG_HANDLERS_DIR',
'INTEGRATION_TEST_DIR'):
msg = 'Please use \'from tests.support.paths import {0}\''.format(name)
self.add_message('blacklisted-import', node=node, args=(mod_path, msg))
continue
msg = 'Please use \'from tests.{0} import {1}\''.format(mod_path, name)
self.add_message('blacklisted-import', node=node, args=(mod_path, msg))
continue
msg = 'Please report this error to SaltStack so we can fix it: Trying to import {0} from {1}'.format(name, mod_path)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
except AttributeError:
if mod_name in ('integration', 'unit', 'mock', 'six', 'distutils.version',
'unittest', 'unittest2'):
if mod_name in ('integration', 'unit'):
msg = 'Please use \'import tests.{0} as {0}\''.format(mod_name)
message_id = 'blacklisted-import'
elif mod_name == 'mock':
msg = 'Please use \'import tests.support.{0} as {0}\''.format(mod_name)
message_id = 'blacklisted-external-import'
elif mod_name == 'six':
msg = 'Please use \'import salt.ext.{0} as {0}\''.format(name)
message_id = 'blacklisted-external-import'
elif mod_name == 'distutils.version':
msg = 'Please use \'import salt.utils.versions\' instead'
message_id = 'blacklisted-import'
elif mod_name.startswith(('unittest', 'unittest2')):
msg = 'Please use \'import tests.support.unit as {}\' instead'.format(mod_name)
message_id = 'blacklisted-import'
self.add_message(message_id, node=node, args=(mod_path, msg))
continue
msg = 'Please report this error to SaltStack so we can fix it: Trying to import {0}'.format(mod_path)
self.add_message('blacklisted-import', node=node, args=(mod_path, msg))
BLACKLISTED_LOADER_USAGE_MSGS = {
'E9501': ('Blacklisted salt loader dunder usage. Setting dunder attribute %r to module %r. '
'Use \'salt.support.mock\' and \'patch.dict()\' instead.',
'unmocked-patch-dunder',
'Uses a blacklisted salt loader dunder usage in tests.'),
'E9502': ('Blacklisted salt loader dunder usage. Setting attribute %r to module %r. '
'Use \'salt.support.mock\' and \'patch()\' instead.',
'unmocked-patch',
'Uses a blacklisted salt loader dunder usage in tests.'),
'E9503': ('Blacklisted salt loader dunder usage. Updating dunder attribute %r on module %r. '
'Use \'salt.support.mock\' and \'patch.dict()\' instead.',
'unmocked-patch-dunder-update',
'Uses a blacklisted salt loader dunder usage in tests.'),
}
class BlacklistedLoaderModulesUsageChecker(BaseChecker):
__implements__ = IAstroidChecker
name = 'blacklisted-unmocked-patching'
msgs = BLACKLISTED_LOADER_USAGE_MSGS
priority = -2
def open(self):
self.process_module = False
self.salt_dunders = (
'__opts__', '__salt__', '__runner__', '__context__', '__utils__',
'__ext_pillar__', '__thorium__', '__states__', '__serializers__',
'__ret__', '__grains__', '__pillar__', '__sdb__', '__proxy__',
'__low__', '__orchestration_jid__', '__running__', '__intance_id__',
'__lowstate__', '__env__'
)
self.imported_salt_modules = {}
def close(self):
self.process_module = False
self.imported_salt_modules = {}
@utils.check_messages('blacklisted-unmocked-patching')
def visit_module(self, node):
module_filename = node.root().file
if not fnmatch.fnmatch(os.path.basename(module_filename), 'test_*.py*'):
return
self.process_module = True
@utils.check_messages('blacklisted-unmocked-patching')
def leave_module(self, node):
if self.process_module:
# Reset
self.process_module = False
self.imported_salt_modules = {}
@utils.check_messages('blacklisted-unmocked-patching')
def visit_import(self, node):
'''triggered when an import statement is seen'''
if self.process_module:
# Store salt imported modules
for module, import_as in node.names:
if not module.startswith('salt'):
continue
if import_as and import_as not in self.imported_salt_modules:
self.imported_salt_modules[import_as] = module
continue
if module not in self.imported_salt_modules:
self.imported_salt_modules[module] = module
@utils.check_messages('blacklisted-unmocked-patching')
def visit_importfrom(self, node):
'''triggered when a from statement is seen'''
if self.process_module:
if not node.modname.startswith('salt'):
return
# Store salt imported modules
for module, import_as in node.names:
if import_as and import_as not in self.imported_salt_modules:
self.imported_salt_modules[import_as] = import_as
continue
if module not in self.imported_salt_modules:
self.imported_salt_modules[module] = module
@utils.check_messages('blacklisted-loader-usage')
def visit_assign(self, node, *args):
if not self.process_module:
return
node_left = node.targets[0]
if isinstance(node_left, astroid.Subscript):
# We're changing an existing attribute
if not isinstance(node_left.value, astroid.Attribute):
return
if node_left.value.attrname in self.salt_dunders:
self.add_message(
'unmocked-patch-dunder-update',
node=node,
args=(node_left.value.attrname,
self.imported_salt_modules[node_left.value.expr.name])
)
return
if not isinstance(node_left, astroid.AssignAttr):
return
try:
if node_left.expr.name not in self.imported_salt_modules:
# If attributes are not being set on salt's modules,
# leave it alone, for now!
return
except AttributeError:
# This might not be what we're looking for
return
# we're assigning to an imported salt module!
if node_left.attrname in self.salt_dunders:
# We're changing salt dunders
self.add_message(
'unmocked-patch-dunder',
node=node,
args=(node_left.attrname,
self.imported_salt_modules[node_left.expr.name])
)
return
# Changing random attributes
self.add_message(
'unmocked-patch',
node=node,
args=(node_left.attrname,
self.imported_salt_modules[node_left.expr.name])
)
RESOURCE_LEAKAGE_MSGS = {
'W8470': ('Resource leakage detected. %s ',
'resource-leakage',
'Resource leakage detected.'),
}
class ResourceLeakageChecker(BaseChecker):
__implements__ = IAstroidChecker
name = 'resource-leakage'
msgs = RESOURCE_LEAKAGE_MSGS
priority = -2
def open(self):
self.inside_with_ctx = False
def close(self):
self.inside_with_ctx = False
def visit_with(self, node):
self.inside_with_ctx = True
def leave_with(self, node):
self.inside_with_ctx = False
def visit_call(self, node):
if isinstance(node.func, astroid.Attribute):
if node.func.attrname == 'fopen' and self.inside_with_ctx is False:
msg = ('Please call \'salt.utils.fopen\' using the \'with\' context '
'manager, otherwise the file handle won\'t be closed and '
'resource leakage will occur.')
self.add_message('resource-leakage', node=node, args=(msg,))
elif isinstance(node.func, astroid.Name):
if utils.is_builtin(node.func.name) and node.func.name == 'open':
if self.inside_with_ctx:
msg = ('Please use \'with salt.utils.fopen()\' instead of '
'\'with open()\'. It assures salt does not leak '
'file handles.')
else:
msg = ('Please use \'salt.utils.fopen()\' instead of \'open()\' '
'using the \'with\' context manager, otherwise the file '
'handle won\'t be closed and resource leakage will occur.')
self.add_message('resource-leakage', node=node, args=(msg,))
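ResourceLeakageChecker above works on astroid nodes inside pylint; the core idea, flagging open() calls that are not a `with` context expression, can be sketched with the stdlib `ast` module alone (a simplified assumption: only direct context expressions are considered managed):

```python
# Simplified ast-only sketch of the resource-leakage check above:
# report line numbers of open() calls not used as a `with` context manager.
import ast

def find_bare_open_calls(source):
    tree = ast.parse(source)
    # Collect calls appearing directly as `with <call> as ...:` contexts.
    managed = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.With):
            for item in node.items:
                managed.add(id(item.context_expr))
    leaks = []
    for node in ast.walk(tree):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id == 'open'
                and id(node) not in managed):
            leaks.append(node.lineno)
    return leaks
```

So find_bare_open_calls("f = open('x')") reports line 1, while a `with open(...)` block is accepted.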
MOVED_TEST_CASE_CLASSES_MSGS = {
'E9490': ('Moved test case base class detected. %s',
'moved-test-case-class',
'Moved test case base class detected.'),
'E9491': ('Moved test case mixin class detected. %s',
'moved-test-case-mixin',
'Moved test case mixin class detected.'),
}
class MovedTestCaseClassChecker(BaseChecker):
__implements__ = IAstroidChecker
name = 'moved-test-case-class'
msgs = MOVED_TEST_CASE_CLASSES_MSGS
priority = -2
def open(self):
self.process_module = False
def close(self):
self.process_module = False
@utils.check_messages('moved-test-case-class')
def visit_module(self, node):
module_filename = node.root().file
if not fnmatch.fnmatch(os.path.basename(module_filename), 'test_*.py*'):
return
self.process_module = True
@utils.check_messages('moved-test-case-class')
def leave_module(self, node):
if self.process_module:
# Reset
self.process_module = False
@utils.check_messages('moved-test-case-class')
def visit_importfrom(self, node):
'''triggered when a from statement is seen'''
if self.process_module:
if not node.modname.startswith('tests.integration'):
return
# Store salt imported modules
for module, import_as in node.names:
if import_as:
self._check_moved_imports(node, module, import_as)
continue
self._check_moved_imports(node, module)
@utils.check_messages('moved-test-case-class')
def visit_classdef(self, node):
for base in node.bases:
if not hasattr(base, 'attrname'):
continue
if base.attrname in ('TestCase',):
msg = 'Please use \'from tests.support.unit import {0}\''.format(base.attrname)
self.add_message('moved-test-case-class', node=node, args=(msg,))
if base.attrname in ('ModuleCase', 'SyndicCase', 'ShellCase', 'SSHCase'):
msg = 'Please use \'from tests.support.case import {0}\''.format(base.attrname)
self.add_message('moved-test-case-class', node=node, args=(msg,))
if base.attrname in ('AdaptedConfigurationTestCaseMixin', 'ShellCaseCommonTestsMixin',
'SaltMinionEventAssertsMixin'):
msg = 'Please use \'from tests.support.mixins import {0}\''.format(base.attrname)
self.add_message('moved-test-case-mixin', node=node, args=(msg,))
def _check_moved_imports(self, node, module, import_as=None):
names = []
for name, name_as in node.names:
if name not in ('ModuleCase', 'SyndicCase', 'ShellCase', 'SSHCase'):
continue
if name_as:
msg = 'Please use \'from tests.support.case import {0} as {1}\''.format(name, name_as)
else:
msg = 'Please use \'from tests.support.case import {0}\''.format(name)
self.add_message('moved-test-case-class', node=node, args=(msg,))
BLACKLISTED_FUNCTIONS_MSGS = {
'E9601': ('Use of blacklisted function %s (use %s instead)',
'blacklisted-function',
'Used a function marked as blacklisted'),
}
class BlacklistedFunctionsChecker(BaseChecker):
__implements__ = IAstroidChecker
name = 'blacklisted-functions'
msgs = BLACKLISTED_FUNCTIONS_MSGS
priority = -2
max_depth = 20
options = (
('blacklisted-functions',
{'default': '', 'type': 'string',
'metavar': 'bad1=good1,bad2=good2',
'help': 'List of blacklisted functions and their recommended '
'replacements'}),
)
def open(self):
self.blacklisted_functions = {}
for item in [x.strip() for x in
self.config.blacklisted_functions.split(',')]:
try:
key, val = [x.strip() for x in item.split('=')]
except ValueError:
pass
else:
self.blacklisted_functions[key] = val
def _get_full_name(self, node):
try:
func = utils.safe_infer(node.func)
if func.name.__str__() == 'Uninferable':
return
except Exception:
func = None
if func is None:
return
ret = []
depth = 0
while func is not None:
depth += 1
if depth > self.max_depth:
# Prevent endless loop
return
try:
ret.append(func.name)
except AttributeError:
return
func = func.parent
# ret will contain the levels of the function from last to first (e.g.
# ['walk', 'os']). Reverse it and join with dots to get the correct
# full name for the function.
return '.'.join(ret[::-1])
@utils.check_messages('blacklisted-functions')
def visit_call(self, node):
if self.blacklisted_functions:
full_name = self._get_full_name(node)
if full_name is not None:
try:
self.add_message(
'blacklisted-function',
node=node,
args=(full_name, self.blacklisted_functions[full_name])
)
except KeyError:
# Not blacklisted
pass
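The `blacklisted-functions` option above is a `bad1=good1,bad2=good2` string parsed in `open()`; a minimal standalone sketch of that same parsing logic (the `parse_blacklist` helper name is illustrative, not part of the checker's API):

```python
def parse_blacklist(option_value):
    """Parse a 'bad1=good1,bad2=good2' option string into a mapping of
    blacklisted function names to their recommended replacements."""
    blacklisted = {}
    for item in option_value.split(','):
        item = item.strip()
        try:
            key, val = [part.strip() for part in item.split('=')]
        except ValueError:
            # Entries without exactly one '=' are silently skipped,
            # mirroring the checker's behaviour above.
            continue
        blacklisted[key] = val
    return blacklisted

print(parse_blacklist('os.system=subprocess.run, eval=ast.literal_eval'))
# {'os.system': 'subprocess.run', 'eval': 'ast.literal_eval'}
```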
|
saltstack/salt-pylint | saltpylint/blacklist.py | BlacklistedImportsChecker.visit_import | python | def visit_import(self, node):
'''triggered when an import statement is seen'''
module_filename = node.root().file
if fnmatch.fnmatch(module_filename, '__init__.py*') and \
not fnmatch.fnmatch(module_filename, 'test_*.py*'):
return
modnode = node.root()
names = [name for name, _ in node.names]
for name in names:
self._check_blacklisted_module(node, name) | triggered when an import statement is seen | train | https://github.com/saltstack/salt-pylint/blob/524a419d3bfc7dbd91c9c85040bc64935a275b24/saltpylint/blacklist.py#L66-L76 | null | class BlacklistedImportsChecker(BaseChecker):
__implements__ = IAstroidChecker
name = 'blacklisted-imports'
msgs = BLACKLISTED_IMPORTS_MSGS
priority = -2
def open(self):
self.blacklisted_modules = ('salttesting',
'integration',
'unit',
'mock',
'six',
'distutils.version',
'unittest',
'unittest2')
@utils.check_messages('blacklisted-imports')
@utils.check_messages('blacklisted-imports')
def visit_importfrom(self, node):
'''triggered when a from statement is seen'''
module_filename = node.root().file
if fnmatch.fnmatch(module_filename, '__init__.py*') and \
not fnmatch.fnmatch(module_filename, 'test_*.py*'):
return
basename = node.modname
self._check_blacklisted_module(node, basename)
def _check_blacklisted_module(self, node, mod_path):
'''check if the module is blacklisted'''
for mod_name in self.blacklisted_modules:
if mod_path == mod_name or mod_path.startswith(mod_name + '.'):
names = []
for name, name_as in node.names:
if name_as:
names.append('{0} as {1}'.format(name, name_as))
else:
names.append(name)
try:
import_from_module = node.modname
if import_from_module == 'salttesting.helpers':
for name in names:
if name == 'ensure_in_syspath':
self.add_message('blacklisted-syspath-update', node=node)
continue
msg = 'Please use \'from tests.support.helpers import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module in ('salttesting.mock', 'mock', 'unittest.mock', 'unittest2.mock'):
for name in names:
msg = 'Please use \'from tests.support.mock import {0}\''.format(name)
if import_from_module in ('salttesting.mock', 'unittest.mock', 'unittest2.mock'):
message_id = 'blacklisted-module'
else:
message_id = 'blacklisted-external-module'
self.add_message(message_id, node=node, args=(mod_path, msg))
continue
if import_from_module == 'salttesting.parser':
for name in names:
msg = 'Please use \'from tests.support.parser import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module == 'salttesting.case':
for name in names:
msg = 'Please use \'from tests.support.case import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module == 'salttesting.unit':
for name in names:
msg = 'Please use \'from tests.support.unit import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module.startswith(('unittest', 'unittest2')):
for name in names:
msg = 'Please use \'from tests.support.unit import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module == 'salttesting.mixins':
for name in names:
msg = 'Please use \'from tests.support.mixins import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module == 'six':
for name in names:
msg = 'Please use \'from salt.ext.six import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module == 'distutils.version':
for name in names:
msg = 'Please use \'from salt.utils.versions import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if names:
for name in names:
if name in ('TestLoader', 'TextTestRunner', 'TestCase', 'expectedFailure',
'TestSuite', 'skipIf', 'TestResult'):
msg = 'Please use \'from tests.support.unit import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if name in ('SaltReturnAssertsMixin', 'SaltMinionEventAssertsMixin'):
msg = 'Please use \'from tests.support.mixins import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if name in ('ModuleCase', 'SyndicCase', 'ShellCase', 'SSHCase'):
msg = 'Please use \'from tests.support.case import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if name == 'run_tests':
msg = 'Please remove the \'if __name__ == "__main__":\' section from the end of the module'
self.add_message('blacklisted-test-module-execution', node=node, args=msg)
continue
if mod_name in ('integration', 'unit'):
if name in ('SYS_TMP_DIR',
'TMP',
'FILES',
'PYEXEC',
'MOCKBIN',
'SCRIPT_DIR',
'TMP_STATE_TREE',
'TMP_PRODENV_STATE_TREE',
'TMP_CONF_DIR',
'TMP_SUB_MINION_CONF_DIR',
'TMP_SYNDIC_MINION_CONF_DIR',
'TMP_SYNDIC_MASTER_CONF_DIR',
'CODE_DIR',
'TESTS_DIR',
'CONF_DIR',
'PILLAR_DIR',
'TMP_SCRIPT_DIR',
'ENGINES_DIR',
'LOG_HANDLERS_DIR',
'INTEGRATION_TEST_DIR'):
msg = 'Please use \'from tests.support.paths import {0}\''.format(name)
self.add_message('blacklisted-import', node=node, args=(mod_path, msg))
continue
msg = 'Please use \'from tests.{0} import {1}\''.format(mod_path, name)
self.add_message('blacklisted-import', node=node, args=(mod_path, msg))
continue
msg = 'Please report this error to SaltStack so we can fix it: Trying to import {0} from {1}'.format(name, mod_path)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
except AttributeError:
if mod_name in ('integration', 'unit', 'mock', 'six', 'distutils.version',
'unittest', 'unittest2'):
if mod_name in ('integration', 'unit'):
msg = 'Please use \'import tests.{0} as {0}\''.format(mod_name)
message_id = 'blacklisted-import'
elif mod_name == 'mock':
msg = 'Please use \'import tests.support.{0} as {0}\''.format(mod_name)
message_id = 'blacklisted-external-import'
elif mod_name == 'six':
msg = 'Please use \'import salt.ext.{0} as {0}\''.format(mod_name)
message_id = 'blacklisted-external-import'
elif mod_name == 'distutils.version':
msg = 'Please use \'import salt.utils.versions\' instead'
message_id = 'blacklisted-import'
elif mod_name.startswith(('unittest', 'unittest2')):
msg = 'Please use \'import tests.support.unit as {}\' instead'.format(mod_name)
message_id = 'blacklisted-import'
self.add_message(message_id, node=node, args=(mod_path, msg))
continue
msg = 'Please report this error to SaltStack so we can fix it: Trying to import {0}'.format(mod_path)
self.add_message('blacklisted-import', node=node, args=(mod_path, msg))
|
saltstack/salt-pylint | saltpylint/blacklist.py | BlacklistedImportsChecker.visit_importfrom | python | def visit_importfrom(self, node):
'''triggered when a from statement is seen'''
module_filename = node.root().file
if fnmatch.fnmatch(module_filename, '__init__.py*') and \
not fnmatch.fnmatch(module_filename, 'test_*.py*'):
return
basename = node.modname
self._check_blacklisted_module(node, basename) | triggered when a from statement is seen | train | https://github.com/saltstack/salt-pylint/blob/524a419d3bfc7dbd91c9c85040bc64935a275b24/saltpylint/blacklist.py#L79-L86 | null | class BlacklistedImportsChecker(BaseChecker):
__implements__ = IAstroidChecker
name = 'blacklisted-imports'
msgs = BLACKLISTED_IMPORTS_MSGS
priority = -2
def open(self):
self.blacklisted_modules = ('salttesting',
'integration',
'unit',
'mock',
'six',
'distutils.version',
'unittest',
'unittest2')
@utils.check_messages('blacklisted-imports')
def visit_import(self, node):
'''triggered when an import statement is seen'''
module_filename = node.root().file
if fnmatch.fnmatch(module_filename, '__init__.py*') and \
not fnmatch.fnmatch(module_filename, 'test_*.py*'):
return
modnode = node.root()
names = [name for name, _ in node.names]
for name in names:
self._check_blacklisted_module(node, name)
@utils.check_messages('blacklisted-imports')
def _check_blacklisted_module(self, node, mod_path):
'''check if the module is blacklisted'''
for mod_name in self.blacklisted_modules:
if mod_path == mod_name or mod_path.startswith(mod_name + '.'):
names = []
for name, name_as in node.names:
if name_as:
names.append('{0} as {1}'.format(name, name_as))
else:
names.append(name)
try:
import_from_module = node.modname
if import_from_module == 'salttesting.helpers':
for name in names:
if name == 'ensure_in_syspath':
self.add_message('blacklisted-syspath-update', node=node)
continue
msg = 'Please use \'from tests.support.helpers import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module in ('salttesting.mock', 'mock', 'unittest.mock', 'unittest2.mock'):
for name in names:
msg = 'Please use \'from tests.support.mock import {0}\''.format(name)
if import_from_module in ('salttesting.mock', 'unittest.mock', 'unittest2.mock'):
message_id = 'blacklisted-module'
else:
message_id = 'blacklisted-external-module'
self.add_message(message_id, node=node, args=(mod_path, msg))
continue
if import_from_module == 'salttesting.parser':
for name in names:
msg = 'Please use \'from tests.support.parser import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module == 'salttesting.case':
for name in names:
msg = 'Please use \'from tests.support.case import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module == 'salttesting.unit':
for name in names:
msg = 'Please use \'from tests.support.unit import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module.startswith(('unittest', 'unittest2')):
for name in names:
msg = 'Please use \'from tests.support.unit import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module == 'salttesting.mixins':
for name in names:
msg = 'Please use \'from tests.support.mixins import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module == 'six':
for name in names:
msg = 'Please use \'from salt.ext.six import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module == 'distutils.version':
for name in names:
msg = 'Please use \'from salt.utils.versions import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if names:
for name in names:
if name in ('TestLoader', 'TextTestRunner', 'TestCase', 'expectedFailure',
'TestSuite', 'skipIf', 'TestResult'):
msg = 'Please use \'from tests.support.unit import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if name in ('SaltReturnAssertsMixin', 'SaltMinionEventAssertsMixin'):
msg = 'Please use \'from tests.support.mixins import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if name in ('ModuleCase', 'SyndicCase', 'ShellCase', 'SSHCase'):
msg = 'Please use \'from tests.support.case import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if name == 'run_tests':
msg = 'Please remove the \'if __name__ == "__main__":\' section from the end of the module'
self.add_message('blacklisted-test-module-execution', node=node, args=msg)
continue
if mod_name in ('integration', 'unit'):
if name in ('SYS_TMP_DIR',
'TMP',
'FILES',
'PYEXEC',
'MOCKBIN',
'SCRIPT_DIR',
'TMP_STATE_TREE',
'TMP_PRODENV_STATE_TREE',
'TMP_CONF_DIR',
'TMP_SUB_MINION_CONF_DIR',
'TMP_SYNDIC_MINION_CONF_DIR',
'TMP_SYNDIC_MASTER_CONF_DIR',
'CODE_DIR',
'TESTS_DIR',
'CONF_DIR',
'PILLAR_DIR',
'TMP_SCRIPT_DIR',
'ENGINES_DIR',
'LOG_HANDLERS_DIR',
'INTEGRATION_TEST_DIR'):
msg = 'Please use \'from tests.support.paths import {0}\''.format(name)
self.add_message('blacklisted-import', node=node, args=(mod_path, msg))
continue
msg = 'Please use \'from tests.{0} import {1}\''.format(mod_path, name)
self.add_message('blacklisted-import', node=node, args=(mod_path, msg))
continue
msg = 'Please report this error to SaltStack so we can fix it: Trying to import {0} from {1}'.format(name, mod_path)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
except AttributeError:
if mod_name in ('integration', 'unit', 'mock', 'six', 'distutils.version',
'unittest', 'unittest2'):
if mod_name in ('integration', 'unit'):
msg = 'Please use \'import tests.{0} as {0}\''.format(mod_name)
message_id = 'blacklisted-import'
elif mod_name == 'mock':
msg = 'Please use \'import tests.support.{0} as {0}\''.format(mod_name)
message_id = 'blacklisted-external-import'
elif mod_name == 'six':
msg = 'Please use \'import salt.ext.{0} as {0}\''.format(mod_name)
message_id = 'blacklisted-external-import'
elif mod_name == 'distutils.version':
msg = 'Please use \'import salt.utils.versions\' instead'
message_id = 'blacklisted-import'
elif mod_name.startswith(('unittest', 'unittest2')):
msg = 'Please use \'import tests.support.unit as {}\' instead'.format(mod_name)
message_id = 'blacklisted-import'
self.add_message(message_id, node=node, args=(mod_path, msg))
continue
msg = 'Please report this error to SaltStack so we can fix it: Trying to import {0}'.format(mod_path)
self.add_message('blacklisted-import', node=node, args=(mod_path, msg))
|
saltstack/salt-pylint | saltpylint/blacklist.py | BlacklistedImportsChecker._check_blacklisted_module | python | def _check_blacklisted_module(self, node, mod_path):
'''check if the module is blacklisted'''
for mod_name in self.blacklisted_modules:
if mod_path == mod_name or mod_path.startswith(mod_name + '.'):
names = []
for name, name_as in node.names:
if name_as:
names.append('{0} as {1}'.format(name, name_as))
else:
names.append(name)
try:
import_from_module = node.modname
if import_from_module == 'salttesting.helpers':
for name in names:
if name == 'ensure_in_syspath':
self.add_message('blacklisted-syspath-update', node=node)
continue
msg = 'Please use \'from tests.support.helpers import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module in ('salttesting.mock', 'mock', 'unittest.mock', 'unittest2.mock'):
for name in names:
msg = 'Please use \'from tests.support.mock import {0}\''.format(name)
if import_from_module in ('salttesting.mock', 'unittest.mock', 'unittest2.mock'):
message_id = 'blacklisted-module'
else:
message_id = 'blacklisted-external-module'
self.add_message(message_id, node=node, args=(mod_path, msg))
continue
if import_from_module == 'salttesting.parser':
for name in names:
msg = 'Please use \'from tests.support.parser import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module == 'salttesting.case':
for name in names:
msg = 'Please use \'from tests.support.case import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module == 'salttesting.unit':
for name in names:
msg = 'Please use \'from tests.support.unit import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module.startswith(('unittest', 'unittest2')):
for name in names:
msg = 'Please use \'from tests.support.unit import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module == 'salttesting.mixins':
for name in names:
msg = 'Please use \'from tests.support.mixins import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module == 'six':
for name in names:
msg = 'Please use \'from salt.ext.six import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if import_from_module == 'distutils.version':
for name in names:
msg = 'Please use \'from salt.utils.versions import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if names:
for name in names:
if name in ('TestLoader', 'TextTestRunner', 'TestCase', 'expectedFailure',
'TestSuite', 'skipIf', 'TestResult'):
msg = 'Please use \'from tests.support.unit import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if name in ('SaltReturnAssertsMixin', 'SaltMinionEventAssertsMixin'):
msg = 'Please use \'from tests.support.mixins import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if name in ('ModuleCase', 'SyndicCase', 'ShellCase', 'SSHCase'):
msg = 'Please use \'from tests.support.case import {0}\''.format(name)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
continue
if name == 'run_tests':
msg = 'Please remove the \'if __name__ == "__main__":\' section from the end of the module'
self.add_message('blacklisted-test-module-execution', node=node, args=msg)
continue
if mod_name in ('integration', 'unit'):
if name in ('SYS_TMP_DIR',
'TMP',
'FILES',
'PYEXEC',
'MOCKBIN',
'SCRIPT_DIR',
'TMP_STATE_TREE',
'TMP_PRODENV_STATE_TREE',
'TMP_CONF_DIR',
'TMP_SUB_MINION_CONF_DIR',
'TMP_SYNDIC_MINION_CONF_DIR',
'TMP_SYNDIC_MASTER_CONF_DIR',
'CODE_DIR',
'TESTS_DIR',
'CONF_DIR',
'PILLAR_DIR',
'TMP_SCRIPT_DIR',
'ENGINES_DIR',
'LOG_HANDLERS_DIR',
'INTEGRATION_TEST_DIR'):
msg = 'Please use \'from tests.support.paths import {0}\''.format(name)
self.add_message('blacklisted-import', node=node, args=(mod_path, msg))
continue
msg = 'Please use \'from tests.{0} import {1}\''.format(mod_path, name)
self.add_message('blacklisted-import', node=node, args=(mod_path, msg))
continue
msg = 'Please report this error to SaltStack so we can fix it: Trying to import {0} from {1}'.format(name, mod_path)
self.add_message('blacklisted-module', node=node, args=(mod_path, msg))
except AttributeError:
if mod_name in ('integration', 'unit', 'mock', 'six', 'distutils.version',
'unittest', 'unittest2'):
if mod_name in ('integration', 'unit'):
msg = 'Please use \'import tests.{0} as {0}\''.format(mod_name)
message_id = 'blacklisted-import'
elif mod_name == 'mock':
msg = 'Please use \'import tests.support.{0} as {0}\''.format(mod_name)
message_id = 'blacklisted-external-import'
elif mod_name == 'six':
msg = 'Please use \'import salt.ext.{0} as {0}\''.format(mod_name)
message_id = 'blacklisted-external-import'
elif mod_name == 'distutils.version':
msg = 'Please use \'import salt.utils.versions\' instead'
message_id = 'blacklisted-import'
elif mod_name.startswith(('unittest', 'unittest2')):
msg = 'Please use \'import tests.support.unit as {}\' instead'.format(mod_name)
message_id = 'blacklisted-import'
self.add_message(message_id, node=node, args=(mod_path, msg))
continue
msg = 'Please report this error to SaltStack so we can fix it: Trying to import {0}'.format(mod_path)
self.add_message('blacklisted-import', node=node, args=(mod_path, msg)) | check if the module is blacklisted | train | https://github.com/saltstack/salt-pylint/blob/524a419d3bfc7dbd91c9c85040bc64935a275b24/saltpylint/blacklist.py#L88-L221 | null | class BlacklistedImportsChecker(BaseChecker):
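The module match in `_check_blacklisted_module` treats `mod_path` as blacklisted when it equals a blacklisted name or is a submodule of one. A standalone sketch of that prefix test (the `is_blacklisted` helper name is illustrative):

```python
def is_blacklisted(mod_path, blacklisted_modules):
    """Return True when mod_path is a blacklisted module or any
    submodule of one (e.g. 'unittest.mock' under 'unittest')."""
    for mod_name in blacklisted_modules:
        if mod_path == mod_name or mod_path.startswith(mod_name + '.'):
            return True
    return False

blacklist = ('salttesting', 'unittest', 'mock')
print(is_blacklisted('unittest.mock', blacklist))  # True
print(is_blacklisted('unittest2x', blacklist))     # False: no 'unittest.' prefix
```

The `+ '.'` in the prefix check is what prevents `unittest2x` from matching `unittest` by raw string prefix.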
__implements__ = IAstroidChecker
name = 'blacklisted-imports'
msgs = BLACKLISTED_IMPORTS_MSGS
priority = -2
def open(self):
self.blacklisted_modules = ('salttesting',
'integration',
'unit',
'mock',
'six',
'distutils.version',
'unittest',
'unittest2')
@utils.check_messages('blacklisted-imports')
def visit_import(self, node):
'''triggered when an import statement is seen'''
module_filename = node.root().file
if fnmatch.fnmatch(module_filename, '__init__.py*') and \
not fnmatch.fnmatch(module_filename, 'test_*.py*'):
return
modnode = node.root()
names = [name for name, _ in node.names]
for name in names:
self._check_blacklisted_module(node, name)
@utils.check_messages('blacklisted-imports')
def visit_importfrom(self, node):
'''triggered when a from statement is seen'''
module_filename = node.root().file
if fnmatch.fnmatch(module_filename, '__init__.py*') and \
not fnmatch.fnmatch(module_filename, 'test_*.py*'):
return
basename = node.modname
self._check_blacklisted_module(node, basename)
|
saltstack/salt-pylint | saltpylint/blacklist.py | BlacklistedLoaderModulesUsageChecker.visit_import | python | def visit_import(self, node):
'''triggered when an import statement is seen'''
if self.process_module:
# Store salt imported modules
for module, import_as in node.names:
if not module.startswith('salt'):
continue
if import_as and import_as not in self.imported_salt_modules:
self.imported_salt_modules[import_as] = module
continue
if module not in self.imported_salt_modules:
self.imported_salt_modules[module] = module | triggered when an import statement is seen | train | https://github.com/saltstack/salt-pylint/blob/524a419d3bfc7dbd91c9c85040bc64935a275b24/saltpylint/blacklist.py#L277-L288 | null | class BlacklistedLoaderModulesUsageChecker(BaseChecker):
__implements__ = IAstroidChecker
name = 'blacklisted-unmocked-patching'
msgs = BLACKLISTED_LOADER_USAGE_MSGS
priority = -2
def open(self):
self.process_module = False
self.salt_dunders = (
'__opts__', '__salt__', '__runner__', '__context__', '__utils__',
'__ext_pillar__', '__thorium__', '__states__', '__serializers__',
'__ret__', '__grains__', '__pillar__', '__sdb__', '__proxy__',
'__low__', '__orchestration_jid__', '__running__', '__instance_id__',
'__lowstate__', '__env__'
)
self.imported_salt_modules = {}
def close(self):
self.process_module = False
self.imported_salt_modules = {}
@utils.check_messages('blacklisted-unmocked-patching')
def visit_module(self, node):
module_filename = node.root().file
if not fnmatch.fnmatch(os.path.basename(module_filename), 'test_*.py*'):
return
self.process_module = True
@utils.check_messages('blacklisted-unmocked-patching')
def leave_module(self, node):
if self.process_module:
# Reset
self.process_module = False
self.imported_salt_modules = {}
@utils.check_messages('blacklisted-unmocked-patching')
@utils.check_messages('blacklisted-unmocked-patching')
def visit_importfrom(self, node):
'''triggered when a from statement is seen'''
if self.process_module:
if not node.modname.startswith('salt'):
return
# Store salt imported modules
for module, import_as in node.names:
if import_as and import_as not in self.imported_salt_modules:
self.imported_salt_modules[import_as] = import_as
continue
if module not in self.imported_salt_modules:
self.imported_salt_modules[module] = module
@utils.check_messages('blacklisted-loader-usage')
def visit_assign(self, node, *args):
if not self.process_module:
return
node_left = node.targets[0]
if isinstance(node_left, astroid.Subscript):
# We're changing an existing attribute
if not isinstance(node_left.value, astroid.Attribute):
return
if node_left.value.attrname in self.salt_dunders:
self.add_message(
'unmocked-patch-dunder-update',
node=node,
args=(node_left.value.attrname,
self.imported_salt_modules[node_left.value.expr.name])
)
return
if not isinstance(node_left, astroid.AssignAttr):
return
try:
if node_left.expr.name not in self.imported_salt_modules:
# If attributes are not being set on salt's modules,
# leave it alone, for now!
return
except AttributeError:
# This might not be what we're looking for
return
# we're assigning to an imported salt module!
if node_left.attrname in self.salt_dunders:
# We're changing salt dunders
self.add_message(
'unmocked-patch-dunder',
node=node,
args=(node_left.attrname,
self.imported_salt_modules[node_left.expr.name])
)
return
# Changing random attributes
self.add_message(
'unmocked-patch',
node=node,
args=(node_left.attrname,
self.imported_salt_modules[node_left.expr.name])
)
|
saltstack/salt-pylint | saltpylint/blacklist.py | BlacklistedLoaderModulesUsageChecker.visit_importfrom | python | def visit_importfrom(self, node):
'''triggered when a from statement is seen'''
if self.process_module:
if not node.modname.startswith('salt'):
return
# Store salt imported modules
for module, import_as in node.names:
if import_as and import_as not in self.imported_salt_modules:
self.imported_salt_modules[import_as] = import_as
continue
if module not in self.imported_salt_modules:
self.imported_salt_modules[module] = module | triggered when a from statement is seen | train | https://github.com/saltstack/salt-pylint/blob/524a419d3bfc7dbd91c9c85040bc64935a275b24/saltpylint/blacklist.py#L291-L302 | null | class BlacklistedLoaderModulesUsageChecker(BaseChecker):
__implements__ = IAstroidChecker
name = 'blacklisted-unmocked-patching'
msgs = BLACKLISTED_LOADER_USAGE_MSGS
priority = -2
def open(self):
self.process_module = False
self.salt_dunders = (
'__opts__', '__salt__', '__runner__', '__context__', '__utils__',
'__ext_pillar__', '__thorium__', '__states__', '__serializers__',
'__ret__', '__grains__', '__pillar__', '__sdb__', '__proxy__',
'__low__', '__orchestration_jid__', '__running__', '__instance_id__',
'__lowstate__', '__env__'
)
self.imported_salt_modules = {}
def close(self):
self.process_module = False
self.imported_salt_modules = {}
@utils.check_messages('blacklisted-unmocked-patching')
def visit_module(self, node):
module_filename = node.root().file
if not fnmatch.fnmatch(os.path.basename(module_filename), 'test_*.py*'):
return
self.process_module = True
@utils.check_messages('blacklisted-unmocked-patching')
def leave_module(self, node):
if self.process_module:
# Reset
self.process_module = False
self.imported_salt_modules = {}
@utils.check_messages('blacklisted-unmocked-patching')
def visit_import(self, node):
'''triggered when an import statement is seen'''
if self.process_module:
# Store salt imported modules
for module, import_as in node.names:
if not module.startswith('salt'):
continue
if import_as and import_as not in self.imported_salt_modules:
self.imported_salt_modules[import_as] = module
continue
if module not in self.imported_salt_modules:
self.imported_salt_modules[module] = module
@utils.check_messages('blacklisted-unmocked-patching')
@utils.check_messages('blacklisted-loader-usage')
def visit_assign(self, node, *args):
if not self.process_module:
return
node_left = node.targets[0]
if isinstance(node_left, astroid.Subscript):
# We're changing an existing attribute
if not isinstance(node_left.value, astroid.Attribute):
return
if node_left.value.attrname in self.salt_dunders:
self.add_message(
'unmocked-patch-dunder-update',
node=node,
args=(node_left.value.attrname,
self.imported_salt_modules[node_left.value.expr.name])
)
return
if not isinstance(node_left, astroid.AssignAttr):
return
try:
if node_left.expr.name not in self.imported_salt_modules:
# If attributes are not being set on salt's modules,
# leave it alone, for now!
return
except AttributeError:
# This might not be what we're looking for
return
# we're assigning to an imported salt module!
if node_left.attrname in self.salt_dunders:
# We're changing salt dunders
self.add_message(
'unmocked-patch-dunder',
node=node,
args=(node_left.attrname,
self.imported_salt_modules[node_left.expr.name])
)
return
# Changing random attributes
self.add_message(
'unmocked-patch',
node=node,
args=(node_left.attrname,
self.imported_salt_modules[node_left.expr.name])
)
|
saltstack/salt-pylint | saltpylint/smartup.py | register | python | def register(linter):
'''
Register the transformation functions.
'''
try:
MANAGER.register_transform(nodes.Class, rootlogger_transform)
except AttributeError:
MANAGER.register_transform(nodes.ClassDef, rootlogger_transform) | Register the transformation functions. | train | https://github.com/saltstack/salt-pylint/blob/524a419d3bfc7dbd91c9c85040bc64935a275b24/saltpylint/smartup.py#L39-L46 | null | # -*- coding: utf-8 -*-
'''
:codeauthor: :email:`Pedro Algarvio (pedro@algarvio.me)`
:copyright: © 2013-2018 by the SaltStack Team, see AUTHORS for more details.
:license: Apache 2.0, see LICENSE for more details.
===========================
Pylint Smartup Transformers
===========================
This plugin will register some transform functions which will allow PyLint to better
understand some classes used in Salt which trigger `no-member` and `maybe-no-member`
'''
# Import Python libs
from __future__ import absolute_import
# Import PyLint libs
from astroid import nodes, MANAGER
def rootlogger_transform(obj):
if obj.name != 'RootLogger':
return
def _inject_method(cls, msg, *args, **kwargs):
pass
if not hasattr(obj, 'trace'):
setattr(obj, 'trace', _inject_method)
if not hasattr(obj, 'garbage'):
setattr(obj, 'garbage', _inject_method)
|
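The transform above patches the inferred `RootLogger` class so PyLint's inference knows about Salt's extra `trace`/`garbage` log methods and stops raising `no-member` for them. A minimal, self-contained sketch of the same guard-then-patch pattern, using a plain Python stand-in instead of a real astroid `ClassDef` node (the `FakeClassNode` type is illustrative only):

```python
class FakeClassNode:
    """Stand-in for an astroid ClassDef node (illustrative only)."""
    def __init__(self, name):
        self.name = name

def rootlogger_transform(obj):
    # Only patch the class we care about.
    if obj.name != 'RootLogger':
        return
    def _inject_method(cls, msg, *args, **kwargs):
        pass
    # Give the class the extra Salt log methods so static analysis
    # no longer flags log.trace(...) / log.garbage(...) as missing.
    for method in ('trace', 'garbage'):
        if not hasattr(obj, method):
            setattr(obj, method, _inject_method)

node = FakeClassNode('RootLogger')
rootlogger_transform(node)
print(hasattr(node, 'trace'), hasattr(node, 'garbage'))  # True True
```

In the real plugin the same function is handed to `MANAGER.register_transform(...)`, so astroid applies it to every class node it builds.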
saltstack/salt-pylint | saltpylint/pep8.py | register | python | def register(linter):
'''
required method to auto register this checker
'''
if HAS_PEP8 is False:
return
linter.register_checker(PEP8Indentation(linter))
linter.register_checker(PEP8Whitespace(linter))
linter.register_checker(PEP8BlankLine(linter))
linter.register_checker(PEP8Import(linter))
linter.register_checker(PEP8LineLength(linter))
linter.register_checker(PEP8Statement(linter))
linter.register_checker(PEP8Runtime(linter))
linter.register_checker(PEP8IndentationWarning(linter))
linter.register_checker(PEP8WhitespaceWarning(linter))
linter.register_checker(PEP8BlankLineWarning(linter))
linter.register_checker(PEP8DeprecationWarning(linter)) | required method to auto register this checker | train | https://github.com/saltstack/salt-pylint/blob/524a419d3bfc7dbd91c9c85040bc64935a275b24/saltpylint/pep8.py#L462-L479 | null | # -*- coding: utf-8 -*-
'''
:codeauthor: :email:`Pedro Algarvio (pedro@algarvio.me)`
:copyright: © 2013-2018 by the SaltStack Team, see AUTHORS for more details.
:license: Apache 2.0, see LICENSE for more details.
===================
PEP-8 PyLint Plugin
===================
A bridge between the `pep8`_ library and PyLint
.. _`pep8`: http://pep8.readthedocs.org
'''
# Import Python libs
from __future__ import absolute_import
import sys
import logging
import warnings
# Import 3rd-party libs
import six
# Import PyLint libs
from pylint.interfaces import IRawChecker
from pylint.__pkginfo__ import numversion as pylint_version_info
from saltpylint.checkers import BaseChecker
# Import PEP8 libs
try:
from pycodestyle import StyleGuide, BaseReport
HAS_PEP8 = True
except ImportError:
HAS_PEP8 = False
warnings.warn(
'No pycodestyle library could be imported. No PEP8 check\'s will be done',
RuntimeWarning
)
_PROCESSED_NODES = {}
_KNOWN_PEP8_IDS = []
_UNHANDLED_PEP8_IDS = []
if HAS_PEP8 is True:
class PyLintPEP8Reporter(BaseReport):
def __init__(self, options):
super(PyLintPEP8Reporter, self).__init__(options)
self.locations = []
def error(self, line_number, offset, text, check):
code = super(PyLintPEP8Reporter, self).error(
line_number, offset, text, check
)
if code:
# E123, at least, is not reporting its code in the above call,
# don't want to bother about that now
self.locations.append((code, line_number, text.split(code, 1)[-1].strip()))
class _PEP8BaseChecker(BaseChecker):
__implements__ = IRawChecker
name = 'pep8'
priority = -1
options = ()
msgs = None
_msgs = {}
msgs_map = {}
def __init__(self, linter=None):
# To avoid PyLint's deprecation warning about a missing symbolic name and
# because I don't want to add descriptions, let's make the descriptions
# equal to the messages.
if self.msgs is None:
self.msgs = {}
for code, (message, symbolic) in six.iteritems(self._msgs):
self.msgs[code] = (message, symbolic, message)
BaseChecker.__init__(self, linter=linter)
def process_module(self, node):
'''
process a module
the module's content is accessible via node.file_stream object
'''
nodepaths = []
if not isinstance(node.path, list):
nodepaths = [node.path]
else:
nodepaths = node.path
for node_path in nodepaths:
if node_path not in _PROCESSED_NODES:
stylechecker = StyleGuide(
parse_argv=False, config_file=False, quiet=2,
reporter=PyLintPEP8Reporter
)
_PROCESSED_NODES[node_path] = stylechecker.check_files([node_path])
for code, lineno, text in _PROCESSED_NODES[node_path].locations:
pylintcode = '{0}8{1}'.format(code[0], code[1:])
if pylintcode in self.msgs_map:
# This will be handled by PyLint itself, skip it
continue
if pylintcode not in _KNOWN_PEP8_IDS:
if pylintcode not in _UNHANDLED_PEP8_IDS:
_UNHANDLED_PEP8_IDS.append(pylintcode)
msg = 'The following code, {0}, was not handled by the PEP8 plugin'.format(pylintcode)
if logging.root.handlers:
logging.getLogger(__name__).warning(msg)
else:
sys.stderr.write('{0}\n'.format(msg))
continue
if pylintcode not in self._msgs:
# Not for our class implementation to handle
continue
if code in ('E111', 'E113'):
if _PROCESSED_NODES[node_path].lines[lineno-1].strip().startswith('#'):
# If E111 is triggered in a comment I consider it, at
# least, bad judgement. See https://github.com/jcrocholl/pep8/issues/300
# If E113 is triggered in comments, which I consider a bug,
# skip it. See https://github.com/jcrocholl/pep8/issues/274
continue
try:
self.add_message(pylintcode, line=lineno, args=(code, text))
except TypeError as exc:
if 'not all arguments' not in str(exc):
raise
# Message does not support being passed the text arg
self.add_message(pylintcode, line=lineno, args=(code,))
class PEP8Indentation(_PEP8BaseChecker):
'''
Process PEP8 E1 codes
'''
_msgs = {
'E8101': ('PEP8 %s: %s',
'indentation-contains-mixed-spaces-and-tabs'),
'E8111': ('PEP8 %s: %s',
'indentation-is-not-a-multiple-of-four'),
'E8112': ('PEP8 %s: %s',
'expected-an-indented-block'),
'E8113': ('PEP8 %s: %s',
'unexpected-indentation'),
'E8114': ('PEP8 %s: %s',
'indentation-is-not-a-multiple-of-four-comment'),
'E8115': ('PEP8 %s: %s',
'expected-an-indented-block-comment'),
'E8116': ('PEP8 %s: %s',
'unexpected-indentation-comment'),
'E8121': ('PEP8 %s: %s',
'continuation-line-indentation-is-not-a-multiple-of-four'),
'E8122': ('PEP8 %s: %s',
'continuation-line-missing-indentation-or-outdented'),
'E8123': ('PEP8 %s: %s',
'closing-bracket-does-not-match-indentation-of-opening-brackets-line'),
'E8124': ('PEP8 %s: %s',
'closing-bracket-does-not-match-visual-indentation'),
'E8125': ('PEP8 %s: %s',
'continuation-line-does-not-distinguish-itself-from-next-logical-line'),
'E8126': ('PEP8 %s: %s',
'continuation-line-over-indented-for-hanging-indent'),
'E8127': ('PEP8 %s: %s',
'continuation-line-over-indented-for-visual-indent'),
'E8128': ('PEP8 %s: %s',
'continuation-line-under-indented-for-visual-indent'),
'E8129': ('PEP8 %s: %s',
'visually-indented-line-with-same-indent-as-next-logical-line'),
'E8131': ('PEP8 %s: %s',
'unaligned-for-hanging-indent'),
'E8133': ('PEP8 %s: %s',
'closing-bracket-is-missing-indentation'),
}
msgs_map = {
'E8126': 'C0330'
}
class PEP8Whitespace(_PEP8BaseChecker):
'''
Process PEP8 E2 codes
'''
_msgs = {
'E8201': ('PEP8 %s: %s',
'whitespace-after-left-parenthesis'),
'E8202': ('PEP8 %s: %s',
'whitespace-before-right-parenthesis'),
'E8203': ('PEP8 %s: %s',
'whitespace-before-colon'),
'E8211': ('PEP8 %s: %s',
'whitespace-before-left-parenthesis'),
'E8221': ('PEP8 %s: %s',
'multiple-spaces-before-operator'),
'E8222': ('PEP8 %s: %s',
'multiple-spaces-after-operator'),
'E8223': ('PEP8 %s: %s',
'tab-before-operator'),
'E8224': ('PEP8 %s: %s',
'tab-after-operator'),
'E8225': ('PEP8 %s: %s',
'missing-whitespace-around-operator'),
'E8226': ('PEP8 %s: %s',
'missing-whitespace-around-arithmetic-operator'),
'E8227': ('PEP8 %s: %s',
'missing-whitespace-around-bitwise-or-shift-operator'),
'E8228': ('PEP8 %s: %s',
'missing-whitespace-around-modulo-operator'),
'E8231': ('PEP8 %s: %s',
'missing-whitespace-after-comma'),
'E8241': ('PEP8 %s: %s',
'multiple-spaces-after-comma'),
'E8242': ('PEP8 %s: %s',
'tab-after-comma'),
'E8251': ('PEP8 %s: %s',
'unexpected-spaces-around-keyword-or-parameter-equals'),
'E8261': ('PEP8 %s: %s',
'at-least-two-spaces-before-inline-comment'),
'E8262': ('PEP8 %s: %s',
'inline-comment-should-start-with-cardinal-space'),
'E8265': ('PEP8 %s: %s',
'block-comment-should-start-with-cardinal-space'),
'E8266': ('PEP8 %s: %s',
'too-many-leading-hastag-for-block-comment'),
'E8271': ('PEP8 %s: %s',
'multiple-spaces-after-keyword'),
'E8272': ('PEP8 %s: %s',
'multiple-spaces-before-keyword'),
'E8273': ('PEP8 %s: %s',
'tab-after-keyword'),
'E8274': ('PEP8 %s: %s',
'tab-before-keyword'),
'E8275': ('PEP8 %s: %s',
'pep8-missing-whitespace-after-keyword')
}
msgs_map = {
'E8222': 'C0326',
'E8225': 'C0326',
'E8251': 'C0326'
}
class PEP8BlankLine(_PEP8BaseChecker):
'''
Process PEP8 E3 codes
'''
_msgs = {
'E8301': ('PEP8 %s: %s',
'expected-1-blank-line-found-0'),
'E8302': ('PEP8 %s: %s',
'expected-2-blank-lines-found-0'),
'E8303': ('PEP8 %s: %s',
'too-many-blank-lines'),
'E8304': ('PEP8 %s: %s',
'blank-lines-found-after-function-decorator'),
'E8305': ('PEP8 %s: %s',
'blank-lines-found-after-class-or-function-decorator'),
'E8306': ('PEP8 %s: %s',
'pep8-blank-lines-before-nested-definition'),
}
class PEP8Import(_PEP8BaseChecker):
'''
Process PEP8 E4 codes
'''
_msgs = {
'E8401': ('PEP8 %s: %s',
'multiple-imports-on-one-line'),
'E8402': ('PEP8 %s: %s',
'module-level-import-not-at-top-of-file')
}
class PEP8LineLength(_PEP8BaseChecker):
'''
Process PEP8 E5 codes
'''
_msgs = {
'E8501': ('PEP8 %s: %s',
'line-too-long'),
'E8502': ('PEP8 %s: %s',
'the-backslash-is-redundant-between-brackets')
}
msgs_map = {
'E8501': 'C0301'
}
class PEP8Statement(_PEP8BaseChecker):
'''
Process PEP8 E7 codes
'''
_msgs = {
'E8701': ('PEP8 %s: %s',
'multiple-statements-on-one-line-colon'),
'E8702': ('PEP8 %s: %s',
'multiple-statements-on-one-line-semicolon'),
'E8703': ('PEP8 %s: %s',
'statement-ends-with-a-semicolon'),
'E8704': ('PEP8 %s: %s',
'pep8-multiple-statements-on-one-line'),
'E8711': ('PEP8 %s: %s',
'comparison-to-None-should-be-if-cond-is-None'),
'E8712': ('PEP8 %s: %s',
'comparison-to-True-should-be-if-cond-is-True-or-if-cond'),
'E8713': ('PEP8 %s: %s',
'test-for-membership-should-be-not-in'),
'E8714': ('PEP8 %s: %s',
'test-for-object-identity-should-be-is-not'),
'E8721': ('PEP8 %s: %s',
'do-not-compare-types-use-isinstance'),
'E8722': ('PEP8 %s: %s',
'pep8-bare-except'),
'E8731': ('PEP8 %s: %s',
'do-not-assign-a-lambda-expression-use-a-def'),
'E8741': ('PEP8 %s: %s',
'bad-variable-identifier-name'),
'E8742': ('PEP8 %s: %s',
'bad-class-identifier-name'),
'E8743': ('PEP8 %s: %s',
'bad-funtion-identifier-name'),
}
msgs_map = {
'E8722': 'W0702',
'E8741': 'C0103'
}
class PEP8Runtime(_PEP8BaseChecker):
'''
Process PEP8 E9 codes
'''
_msgs = {
'E8901': ('PEP8 %s: %s',
'SyntaxError-or-IndentationError'),
'E8902': ('PEP8 %s: %s',
'IOError'),
}
class PEP8IndentationWarning(_PEP8BaseChecker):
'''
Process PEP8 W1 codes
'''
_msgs = {
'W8191': ('PEP8 %s: %s',
'indentation-contains-tabs'),
}
class PEP8WhitespaceWarning(_PEP8BaseChecker):
'''
Process PEP8 W2 codes
'''
_msgs = {
'W8291': ('PEP8 %s: %s',
'trailing-whitespace' if pylint_version_info < (1, 0) else
'pep8-trailing-whitespace'),
'W8292': ('PEP8 %s: %s',
'no-newline-at-end-of-file'),
'W8293': ('PEP8 %s: %s',
'blank-line-contains-whitespace'),
}
msgs_map = {
'W8291': 'C0303',
'W8293': 'C0303'
}
class PEP8BlankLineWarning(_PEP8BaseChecker):
'''
Process PEP8 W3 codes
'''
_msgs = {
'W8391': ('PEP8 %s: %s',
'blank-line-at-end-of-file'),
}
class BinaryOperatorLineBreaks(_PEP8BaseChecker):
'''
Process PEP8 W5 codes
'''
_msgs = {
'W8503': ('PEP8 %s: %s',
'line-break-before-binary-operator'),
'W8504': ('PEP8 %s: %s',
'pep8-line-break-after-binary-operator'),
'W8505': ('PEP8 %s: %s',
'pep8-line-doc-too-long'),
}
class PEP8DeprecationWarning(_PEP8BaseChecker):
'''
Process PEP8 W6 codes
'''
_msgs = {
'W8601': ('PEP8 %s: %s',
'.has_key-is-deprecated-use-in'),
'W8602': ('PEP8 %s: %s',
'deprecated-form-of-raising-exception'),
'W8603': ('PEP8 %s: %s',
'less-or-more-is-deprecated-use-no-equal'),
'W8604': ('PEP8 %s: %s',
'backticks-are-deprecated-use-repr'),
'W8605': ('PEP8 %s: %s',
'pep8-invalid-escape-sequence'),
'W8606': ('PEP8 %s: %s',
'pep8-reserved-keywords')
}
msgs_map = {
'W8605': 'W1401'
}
# ----- Keep Track Of Handled PEP8 MSG IDs -------------------------------------------------------------------------->
for checker in list(locals().values()):
try:
if issubclass(checker, _PEP8BaseChecker):
_KNOWN_PEP8_IDS.extend(checker._msgs.keys())
except TypeError:
# Not class
continue
# <---- Keep Track Of Handled PEP8 MSG IDs ---------------------------------------------------------------------------
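The `_PEP8BaseChecker.__init__` above turns each two-tuple `(message, symbolic)` into the three-tuple `(message, symbolic, description)` PyLint expects, reusing the message as its own description. A minimal sketch of that step (the entries shown are copied from the checker tables above):

```python
# Two-tuple table in the same shape as the checkers' _msgs dicts.
_msgs = {
    'E8101': ('PEP8 %s: %s', 'indentation-contains-mixed-spaces-and-tabs'),
    'E8501': ('PEP8 %s: %s', 'line-too-long'),
}

# PyLint wants (message, symbolic-name, description); reusing the
# message as the description avoids writing each text twice.
msgs = {code: (message, symbolic, message)
        for code, (message, symbolic) in _msgs.items()}
```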
|
saltstack/salt-pylint | saltpylint/pep8.py | _PEP8BaseChecker.process_module | python | def process_module(self, node):
'''
process a module
the module's content is accessible via node.file_stream object
'''
nodepaths = []
if not isinstance(node.path, list):
nodepaths = [node.path]
else:
nodepaths = node.path
for node_path in nodepaths:
if node_path not in _PROCESSED_NODES:
stylechecker = StyleGuide(
parse_argv=False, config_file=False, quiet=2,
reporter=PyLintPEP8Reporter
)
_PROCESSED_NODES[node_path] = stylechecker.check_files([node_path])
for code, lineno, text in _PROCESSED_NODES[node_path].locations:
pylintcode = '{0}8{1}'.format(code[0], code[1:])
if pylintcode in self.msgs_map:
# This will be handled by PyLint itself, skip it
continue
if pylintcode not in _KNOWN_PEP8_IDS:
if pylintcode not in _UNHANDLED_PEP8_IDS:
_UNHANDLED_PEP8_IDS.append(pylintcode)
msg = 'The following code, {0}, was not handled by the PEP8 plugin'.format(pylintcode)
if logging.root.handlers:
logging.getLogger(__name__).warning(msg)
else:
sys.stderr.write('{0}\n'.format(msg))
continue
if pylintcode not in self._msgs:
# Not for our class implementation to handle
continue
if code in ('E111', 'E113'):
if _PROCESSED_NODES[node_path].lines[lineno-1].strip().startswith('#'):
# If E111 is triggered in a comment I consider it, at
# least, bad judgement. See https://github.com/jcrocholl/pep8/issues/300
# If E113 is triggered in comments, which I consider a bug,
# skip it. See https://github.com/jcrocholl/pep8/issues/274
continue
try:
self.add_message(pylintcode, line=lineno, args=(code, text))
except TypeError as exc:
if 'not all arguments' not in str(exc):
raise
# Message does not support being passed the text arg
self.add_message(pylintcode, line=lineno, args=(code,)) | process a module
the module's content is accessible via node.file_stream object | train | https://github.com/saltstack/salt-pylint/blob/524a419d3bfc7dbd91c9c85040bc64935a275b24/saltpylint/pep8.py#L90-L145 | null | class _PEP8BaseChecker(BaseChecker):
__implements__ = IRawChecker
name = 'pep8'
priority = -1
options = ()
msgs = None
_msgs = {}
msgs_map = {}
def __init__(self, linter=None):
# To avoid PyLint's deprecation warning about a missing symbolic name and
# because I don't want to add descriptions, let's make the descriptions
# equal to the messages.
if self.msgs is None:
self.msgs = {}
for code, (message, symbolic) in six.iteritems(self._msgs):
self.msgs[code] = (message, symbolic, message)
BaseChecker.__init__(self, linter=linter)
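`process_module` maps each pep8/pycodestyle code onto the plugin's reserved PyLint id range by inserting an `8` after the letter (`'{0}8{1}'.format(code[0], code[1:])`). As a tiny standalone illustration of that mapping:

```python
def to_pylint_code(pep8_code):
    # 'E501' -> 'E8501', 'W291' -> 'W8291': keep the leading letter and
    # prefix the digits with an '8' so the ids land in a range the
    # plugin reserves for itself, matching the checkers' _msgs keys.
    return '{0}8{1}'.format(pep8_code[0], pep8_code[1:])
```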
|
saltstack/salt-pylint | saltpylint/minpyver.py | MininumPythonVersionChecker.process_module | python | def process_module(self, node):
'''
process a module
'''
if not HAS_PYQVER:
return
minimum_version = tuple([int(x) for x in self.config.minimum_python_version.split('.')])
with open(node.path, 'r') as rfh:
for version, reasons in pyqver2.get_versions(rfh.read()).iteritems():
if version > minimum_version:
for lineno, msg in reasons:
self.add_message(
'E0598', line=lineno,
args=(self.config.minimum_python_version, msg)
) | process a module | train | https://github.com/saltstack/salt-pylint/blob/524a419d3bfc7dbd91c9c85040bc64935a275b24/saltpylint/minpyver.py#L59-L74 | [
"def get_versions(source):\n \"\"\"Return information about the Python versions required for specific features.\n\n The return value is a dictionary with keys as a version number as a tuple\n (for example Python 2.6 is (2,6)) and the value are a list of features that\n require the indicated Python version.\n \"\"\"\n tree = compiler.parse(source)\n checker = compiler.walk(tree, NodeChecker())\n return checker.vers\n"
] | class MininumPythonVersionChecker(BaseChecker):
'''
Check the minimal required python version
'''
__implements__ = IRawChecker
name = 'mininum-python-version'
msgs = {'E0598': ('Incompatible Python %s code found: %s',
'minimum-python-version',
'The code does not meet the required minimum python version'),
}
priority = -1
options = (('minimum-python-version',
{'default': '2.6', 'type': 'string', 'metavar': 'MIN_PYTHON_VERSION',
'help': 'The desired minimum python version to enforce. Default: 2.6'}
),
)
|
mezz64/pyHik | pyhik/hikvision.py | HikCamera.get_motion_detection | python | def get_motion_detection(self):
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch MotionDetection, error: %s', err)
self.motion_detection = None
return self.motion_detection
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
self.motion_detection = None
return self.motion_detection
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch motion detection.')
self.motion_detection = None
return self.motion_detection
try:
tree = ET.fromstring(response.text)
ET.register_namespace("", self.namespace)
enabled = tree.find(self.element_query('enabled'))
if enabled is not None:
self._motion_detection_xml = tree
self.motion_detection = {'true': True, 'false': False}[enabled.text]
return self.motion_detection
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
self.motion_detection = None
return self.motion_detection | Fetch current motion state from camera | train | https://github.com/mezz64/pyHik/blob/1e7afca926e2b045257a43cbf8b1236a435493c2/pyhik/hikvision.py#L141-L179 | [
"def element_query(self, element):\n \"\"\"Build tree query for a given element.\"\"\"\n return '{%s}%s' % (self.namespace, element)\n"
] | class HikCamera(object):
"""Creates a new Hikvision api device."""
def __init__(self, host=None, port=DEFAULT_PORT,
usr=None, pwd=None):
"""Initialize device."""
_LOGGING.debug("pyHik %s initializing new hikvision device at: %s",
__version__, host)
self.event_states = {}
self.watchdog = Watchdog(300.0, self.watchdog_handler)
self.namespace = XML_NAMESPACE
if not host:
_LOGGING.error('Host not specified! Cannot continue.')
return
self.host = host
self.usr = usr
self.pwd = pwd
self.cam_id = 0
self.name = ''
self.device_type = None
self.motion_detection = None
self._motion_detection_xml = None
self.root_url = '{}:{}'.format(host, port)
# Build requests session for main thread calls
# Default to basic authentication. It will change to digest inside
# get_device_info if basic fails
self.hik_request = requests.Session()
self.hik_request.auth = (usr, pwd)
self.hik_request.headers.update(DEFAULT_HEADERS)
# Define event stream processing thread
self.kill_thrd = threading.Event()
self.reset_thrd = threading.Event()
self.thrd = threading.Thread(
target=self.alert_stream, args=(self.reset_thrd, self.kill_thrd,))
self.thrd.daemon = False
# Callbacks
self._updateCallbacks = []
self.initialize()
@property
def get_id(self):
"""Returns unique camera/nvr identifier."""
return self.cam_id
@property
def get_name(self):
"""Return camera/nvr name."""
return self.name
@property
def get_type(self):
"""Return device type."""
return self.device_type
@property
def current_event_states(self):
"""Return Event states dictionary"""
return self.event_states
@property
def current_motion_detection_state(self):
"""Return current state of motion detection property"""
return self.motion_detection
def enable_motion_detection(self):
"""Enable motion detection"""
self._set_motion_detection(True)
def disable_motion_detection(self):
"""Disable motion detection"""
self._set_motion_detection(False)
def _set_motion_detection(self, enable):
"""Set desired motion detection state on camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
enabled = self._motion_detection_xml.find(self.element_query('enabled'))
if enabled is None:
_LOGGING.error("Couldn't find 'enabled' in the xml")
_LOGGING.error('XML: %s', ET.tostring(self._motion_detection_xml))
return
enabled.text = 'true' if enable else 'false'
xml = ET.tostring(self._motion_detection_xml)
try:
response = self.hik_request.put(url, data=xml, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to set MotionDetection, error: %s', err)
return
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.error('Unable to set motion detection: %s', response.text)
self.motion_detection = enable
def add_update_callback(self, callback, sensor):
"""Register as callback for when a matching device sensor changes."""
self._updateCallbacks.append([callback, sensor])
_LOGGING.debug('Added update callback to %s on %s', callback, sensor)
def _do_update_callback(self, msg):
"""Call registered callback functions."""
for callback, sensor in self._updateCallbacks:
if sensor == msg:
_LOGGING.debug('Update callback %s for sensor %s',
callback, sensor)
callback(msg)
def element_query(self, element):
"""Build tree query for a given element."""
return '{%s}%s' % (self.namespace, element)
def initialize(self):
"""Initialize deviceInfo and available events."""
device_info = self.get_device_info()
if device_info is None:
self.name = None
self.cam_id = None
self.event_states = None
return
for key in device_info:
if key == 'deviceName':
self.name = device_info[key]
elif key == 'deviceID':
if len(device_info[key]) > 10:
self.cam_id = device_info[key]
else:
self.cam_id = uuid.uuid4()
events_available = self.get_event_triggers()
if events_available:
for event, channel_list in events_available.items():
for channel in channel_list:
try:
self.event_states.setdefault(
SENSOR_MAP[event.lower()], []).append(
[False, channel, 0, datetime.datetime.now()])
except KeyError:
# Sensor type doesn't have a known friendly name
# We can't reliably handle it at this time...
_LOGGING.warning(
'Sensor type "%s" is unsupported.', event)
_LOGGING.debug('Initialized Dictionary: %s', self.event_states)
else:
_LOGGING.debug('No Events available in dictionary.')
self.get_motion_detection()
def get_event_triggers(self):
"""
Returns dict of supported events.
Key = Event Type
List = Channels that have that event activated
"""
events = {}
nvrflag = False
event_xml = []
url = '%s/ISAPI/Event/triggers' % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.not_found:
# Try alternate URL for triggers
_LOGGING.debug('Using alternate triggers URL.')
url = '%s/Event/triggers' % self.root_url
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch events, error: %s', err)
return None
if response.status_code != 200:
# If we didn't receive 200, abort
return None
# pylint: disable=too-many-nested-blocks
try:
content = ET.fromstring(response.text)
if content[0].find(self.element_query('EventTrigger')):
event_xml = content[0].findall(
self.element_query('EventTrigger'))
elif content.find(self.element_query('EventTrigger')):
# This is either an NVR or a rebadged camera
event_xml = content.findall(
self.element_query('EventTrigger'))
for eventtrigger in event_xml:
ettype = eventtrigger.find(self.element_query('eventType'))
# Catch empty xml definitions
if ettype is None:
break
etnotify = eventtrigger.find(
self.element_query('EventTriggerNotificationList'))
etchannel = None
etchannel_num = 0
for node_name in CHANNEL_NAMES:
etchannel = eventtrigger.find(
self.element_query(node_name))
if etchannel is not None:
try:
# Need to make sure this is actually a number
etchannel_num = int(etchannel.text)
if etchannel_num > 1:
# Must be an nvr
nvrflag = True
break
except ValueError:
# Field must not be an integer
pass
if etnotify:
for notifytrigger in etnotify:
ntype = notifytrigger.find(
self.element_query('notificationMethod'))
if ntype.text == 'center' or ntype.text == 'HTTP':
"""
If we got this far we found an event that we want
to track.
"""
events.setdefault(ettype.text, []) \
.append(etchannel_num)
except (AttributeError, ET.ParseError) as err:
_LOGGING.error(
'There was a problem finding an element: %s', err)
return None
if nvrflag:
self.device_type = NVR_DEVICE
else:
self.device_type = CAM_DEVICE
_LOGGING.debug('Processed %s as %s Device.',
self.cam_id, self.device_type)
_LOGGING.debug('Found events: %s', events)
self.hik_request.close()
return events
def get_device_info(self):
"""Parse deviceInfo into dictionary."""
device_info = {}
url = '%s/ISAPI/System/deviceInfo' % self.root_url
using_digest = False
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
if response.status_code == requests.codes.not_found:
# Try alternate URL for deviceInfo
_LOGGING.debug('Using alternate deviceInfo URL.')
url = '%s/System/deviceInfo' % self.root_url
response = self.hik_request.get(url)
# There seems to be a difference between camera and NVR: they can't
# agree whether they should 404 or 401 first
if not using_digest and response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch deviceInfo, error: %s', err)
return None
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return None
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch device info.')
return None
try:
tree = ET.fromstring(response.text)
# Try to fetch namespace from XML
nmsp = tree.tag.split('}')[0].strip('{')
self.namespace = nmsp if nmsp.startswith('http') else XML_NAMESPACE
_LOGGING.debug('Using Namespace: %s', self.namespace)
for item in tree:
tag = item.tag.split('}')[1]
device_info[tag] = item.text
return device_info
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
return None
def watchdog_handler(self):
"""Take care of threads if wachdog expires."""
_LOGGING.debug('%s Watchdog expired. Resetting connection.', self.name)
self.watchdog.stop()
self.reset_thrd.set()
def disconnect(self):
"""Disconnect from event stream."""
_LOGGING.debug('Disconnecting from stream: %s', self.name)
self.kill_thrd.set()
self.thrd.join()
_LOGGING.debug('Event stream thread for %s is stopped', self.name)
self.kill_thrd.clear()
def start_stream(self):
"""Start thread to process event stream."""
# self.watchdog.start()
self.thrd.start()
def alert_stream(self, reset_event, kill_event):
"""Open event stream."""
_LOGGING.debug('Stream Thread Started: %s, %s', self.name, self.cam_id)
start_event = False
parse_string = ""
fail_count = 0
url = '%s/ISAPI/Event/notification/alertStream' % self.root_url
# pylint: disable=too-many-nested-blocks
while True:
try:
stream = self.hik_request.get(url, stream=True,
timeout=(CONNECT_TIMEOUT,
READ_TIMEOUT))
if stream.status_code == requests.codes.not_found:
# Try alternate URL for stream
url = '%s/Event/notification/alertStream' % self.root_url
stream = self.hik_request.get(url, stream=True)
if stream.status_code != requests.codes.ok:
raise ValueError('Connection unsuccessful.')
else:
_LOGGING.debug('%s Connection Successful.', self.name)
fail_count = 0
self.watchdog.start()
for line in stream.iter_lines():
# _LOGGING.debug('Processing line from %s', self.name)
# filter out keep-alive new lines
if line:
str_line = line.decode("utf-8", "ignore")
# New events start with --boundary
if str_line.find('<EventNotificationAlert') != -1:
# Start of event message
start_event = True
parse_string += str_line
elif str_line.find('</EventNotificationAlert>') != -1:
# Message end found
parse_string += str_line
start_event = False
if parse_string:
tree = ET.fromstring(parse_string)
self.process_stream(tree)
self.update_stale()
parse_string = ""
else:
if start_event:
parse_string += str_line
if kill_event.is_set():
# We were asked to stop the thread so lets do so.
break
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
if kill_event.is_set():
# We were asked to stop the thread so lets do so.
_LOGGING.debug('Stopping event stream thread for %s',
self.name)
self.watchdog.stop()
self.hik_request.close()
return
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
except (ValueError,
requests.exceptions.ConnectionError,
requests.exceptions.ChunkedEncodingError) as err:
fail_count += 1
reset_event.clear()
_LOGGING.warning('%s Connection Failed (count=%d). Waiting %ss. Err: %s',
self.name, fail_count, (fail_count * 5) + 5, err)
parse_string = ""
self.watchdog.stop()
self.hik_request.close()
time.sleep(5)
self.update_stale()
time.sleep(fail_count * 5)
continue
def process_stream(self, tree):
"""Process incoming event stream packets."""
try:
etype = SENSOR_MAP[tree.find(
self.element_query('eventType')).text.lower()]
estate = tree.find(
self.element_query('eventState')).text
echid = tree.find(
self.element_query('channelID'))
if echid is None:
# Some devices use a different key
echid = tree.find(
self.element_query('dynChannelID'))
echid = int(echid.text)
ecount = tree.find(
self.element_query('activePostCount')).text
except (AttributeError, KeyError, IndexError) as err:
_LOGGING.error('Problem finding attribute: %s', err)
return
# Take care of keep-alive
if len(etype) > 0 and etype == 'Video Loss':
self.watchdog.pet()
# Track state if it's in the event list.
if len(etype) > 0:
state = self.fetch_attributes(etype, echid)
if state:
# Determine if state has changed
# If so, publish, otherwise do nothing
estate = (estate == 'active')
old_state = state[0]
attr = [estate, echid, int(ecount),
datetime.datetime.now()]
self.update_attributes(etype, echid, attr)
if estate != old_state:
self.publish_changes(etype, echid)
self.watchdog.pet()
def update_stale(self):
"""Update stale active statuses"""
# Some events don't post an inactive XML, only active.
# If we don't get an active update for 5 seconds we can
# assume the event is no longer active and update accordingly.
for etype, echannels in self.event_states.items():
for eprop in echannels:
if eprop[3] is not None:
sec_elap = ((datetime.datetime.now()-eprop[3])
.total_seconds())
# print('Seconds since last update: {}'.format(sec_elap))
if sec_elap > 5 and eprop[0] is True:
_LOGGING.debug('Updating stale event %s on CH(%s)',
etype, eprop[1])
attr = [False, eprop[1], eprop[2],
datetime.datetime.now()]
self.update_attributes(etype, eprop[1], attr)
self.publish_changes(etype, eprop[1])
def publish_changes(self, etype, echid):
"""Post updates for specified event type."""
_LOGGING.debug('%s Update: %s, %s',
self.name, etype, self.fetch_attributes(etype, echid))
signal = 'ValueChanged.{}'.format(self.cam_id)
sender = '{}.{}'.format(etype, echid)
if dispatcher:
dispatcher.send(signal=signal, sender=sender)
self._do_update_callback('{}.{}.{}'.format(self.cam_id, etype, echid))
def fetch_attributes(self, event, channel):
"""Returns attribute list for a given event/channel."""
try:
for sensor in self.event_states[event]:
if sensor[1] == int(channel):
return sensor
except KeyError:
return None
def update_attributes(self, event, channel, attr):
"""Update attribute list for current event/channel."""
try:
for i, sensor in enumerate(self.event_states[event]):
if sensor[1] == int(channel):
self.event_states[event][i] = attr
except KeyError:
_LOGGING.debug('Error updating attributes for: (%s, %s)',
event, channel)
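`get_motion_detection` and `_set_motion_detection` both rely on `element_query`, which builds the `'{namespace}tag'` form that `xml.etree.ElementTree` uses to address namespaced elements. A minimal sketch of that lookup against a fabricated response body (the namespace URI here is an assumption for illustration, not necessarily the one a real device returns):

```python
import xml.etree.ElementTree as ET

XML_NAMESPACE = 'http://www.example.com/ver20/XMLSchema'  # assumed value

def element_query(namespace, element):
    # Same pattern as HikCamera.element_query: ElementTree addresses
    # namespaced tags as '{namespace}tag'.
    return '{%s}%s' % (namespace, element)

# Fabricated stand-in for the camera's motionDetection response.
SAMPLE = (
    '<MotionDetection xmlns="%s">'
    '<enabled>true</enabled>'
    '</MotionDetection>'
) % XML_NAMESPACE

tree = ET.fromstring(SAMPLE)
enabled = tree.find(element_query(XML_NAMESPACE, 'enabled'))
# Mirror of the {'true': True, 'false': False} mapping in the method.
state = {'true': True, 'false': False}[enabled.text]
```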
|
mezz64/pyHik | pyhik/hikvision.py | HikCamera._set_motion_detection | python | def _set_motion_detection(self, enable):
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
enabled = self._motion_detection_xml.find(self.element_query('enabled'))
if enabled is None:
_LOGGING.error("Couldn't find 'enabled' in the xml")
_LOGGING.error('XML: %s', ET.tostring(self._motion_detection_xml))
return
enabled.text = 'true' if enable else 'false'
xml = ET.tostring(self._motion_detection_xml)
try:
response = self.hik_request.put(url, data=xml, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to set MotionDetection, error: %s', err)
return
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return
        if response.status_code != requests.codes.ok:
            # If we didn't receive 200, abort
            _LOGGING.error('Unable to set motion detection: %s', response.text)
            return
self.motion_detection = enable | Set desired motion detection state on camera | train | https://github.com/mezz64/pyHik/blob/1e7afca926e2b045257a43cbf8b1236a435493c2/pyhik/hikvision.py#L189-L218 | [
"def element_query(self, element):\n \"\"\"Build tree query for a given element.\"\"\"\n return '{%s}%s' % (self.namespace, element)\n"
] | class HikCamera(object):
"""Creates a new Hikvision api device."""
def __init__(self, host=None, port=DEFAULT_PORT,
usr=None, pwd=None):
"""Initialize device."""
_LOGGING.debug("pyHik %s initializing new hikvision device at: %s",
__version__, host)
self.event_states = {}
self.watchdog = Watchdog(300.0, self.watchdog_handler)
self.namespace = XML_NAMESPACE
if not host:
_LOGGING.error('Host not specified! Cannot continue.')
return
self.host = host
self.usr = usr
self.pwd = pwd
self.cam_id = 0
self.name = ''
self.device_type = None
self.motion_detection = None
self._motion_detection_xml = None
self.root_url = '{}:{}'.format(host, port)
# Build requests session for main thread calls
# Default to basic authentication. It will change to digest inside
# get_device_info if basic fails
self.hik_request = requests.Session()
self.hik_request.auth = (usr, pwd)
self.hik_request.headers.update(DEFAULT_HEADERS)
# Define event stream processing thread
self.kill_thrd = threading.Event()
self.reset_thrd = threading.Event()
self.thrd = threading.Thread(
target=self.alert_stream, args=(self.reset_thrd, self.kill_thrd,))
self.thrd.daemon = False
# Callbacks
self._updateCallbacks = []
self.initialize()
@property
def get_id(self):
"""Returns unique camera/nvr identifier."""
return self.cam_id
@property
def get_name(self):
"""Return camera/nvr name."""
return self.name
@property
def get_type(self):
"""Return device type."""
return self.device_type
@property
def current_event_states(self):
"""Return Event states dictionary"""
return self.event_states
@property
def current_motion_detection_state(self):
"""Return current state of motion detection property"""
return self.motion_detection
def get_motion_detection(self):
"""Fetch current motion state from camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch MotionDetection, error: %s', err)
self.motion_detection = None
return self.motion_detection
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
self.motion_detection = None
return self.motion_detection
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch motion detection.')
self.motion_detection = None
return self.motion_detection
try:
tree = ET.fromstring(response.text)
ET.register_namespace("", self.namespace)
enabled = tree.find(self.element_query('enabled'))
if enabled is not None:
self._motion_detection_xml = tree
self.motion_detection = {'true': True, 'false': False}[enabled.text]
return self.motion_detection
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
self.motion_detection = None
return self.motion_detection
def enable_motion_detection(self):
"""Enable motion detection"""
self._set_motion_detection(True)
def disable_motion_detection(self):
"""Disable motion detection"""
self._set_motion_detection(False)
def add_update_callback(self, callback, sensor):
"""Register as callback for when a matching device sensor changes."""
self._updateCallbacks.append([callback, sensor])
_LOGGING.debug('Added update callback to %s on %s', callback, sensor)
def _do_update_callback(self, msg):
"""Call registered callback functions."""
for callback, sensor in self._updateCallbacks:
if sensor == msg:
_LOGGING.debug('Update callback %s for sensor %s',
callback, sensor)
callback(msg)
def element_query(self, element):
"""Build tree query for a given element."""
return '{%s}%s' % (self.namespace, element)
def initialize(self):
"""Initialize deviceInfo and available events."""
device_info = self.get_device_info()
if device_info is None:
self.name = None
self.cam_id = None
self.event_states = None
return
for key in device_info:
if key == 'deviceName':
self.name = device_info[key]
elif key == 'deviceID':
if len(device_info[key]) > 10:
self.cam_id = device_info[key]
else:
self.cam_id = uuid.uuid4()
events_available = self.get_event_triggers()
if events_available:
for event, channel_list in events_available.items():
for channel in channel_list:
try:
self.event_states.setdefault(
SENSOR_MAP[event.lower()], []).append(
[False, channel, 0, datetime.datetime.now()])
except KeyError:
# Sensor type doesn't have a known friendly name
# We can't reliably handle it at this time...
_LOGGING.warning(
'Sensor type "%s" is unsupported.', event)
_LOGGING.debug('Initialized Dictionary: %s', self.event_states)
else:
_LOGGING.debug('No Events available in dictionary.')
self.get_motion_detection()
def get_event_triggers(self):
"""
Returns dict of supported events.
Key = Event Type
List = Channels that have that event activated
"""
events = {}
nvrflag = False
event_xml = []
url = '%s/ISAPI/Event/triggers' % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.not_found:
# Try alternate URL for triggers
_LOGGING.debug('Using alternate triggers URL.')
url = '%s/Event/triggers' % self.root_url
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch events, error: %s', err)
return None
if response.status_code != 200:
            # If we didn't receive 200, abort
return None
# pylint: disable=too-many-nested-blocks
try:
content = ET.fromstring(response.text)
if content[0].find(self.element_query('EventTrigger')):
event_xml = content[0].findall(
self.element_query('EventTrigger'))
elif content.find(self.element_query('EventTrigger')):
# This is either an NVR or a rebadged camera
event_xml = content.findall(
self.element_query('EventTrigger'))
for eventtrigger in event_xml:
ettype = eventtrigger.find(self.element_query('eventType'))
                # Catch empty xml definitions
if ettype is None:
break
etnotify = eventtrigger.find(
self.element_query('EventTriggerNotificationList'))
etchannel = None
etchannel_num = 0
for node_name in CHANNEL_NAMES:
etchannel = eventtrigger.find(
self.element_query(node_name))
if etchannel is not None:
try:
# Need to make sure this is actually a number
etchannel_num = int(etchannel.text)
if etchannel_num > 1:
# Must be an nvr
nvrflag = True
break
except ValueError:
# Field must not be an integer
pass
if etnotify:
for notifytrigger in etnotify:
ntype = notifytrigger.find(
self.element_query('notificationMethod'))
if ntype.text == 'center' or ntype.text == 'HTTP':
"""
If we got this far we found an event that we want
to track.
"""
events.setdefault(ettype.text, []) \
.append(etchannel_num)
except (AttributeError, ET.ParseError) as err:
_LOGGING.error(
'There was a problem finding an element: %s', err)
return None
if nvrflag:
self.device_type = NVR_DEVICE
else:
self.device_type = CAM_DEVICE
_LOGGING.debug('Processed %s as %s Device.',
self.cam_id, self.device_type)
_LOGGING.debug('Found events: %s', events)
self.hik_request.close()
return events
def get_device_info(self):
"""Parse deviceInfo into dictionary."""
device_info = {}
url = '%s/ISAPI/System/deviceInfo' % self.root_url
using_digest = False
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
if response.status_code == requests.codes.not_found:
# Try alternate URL for deviceInfo
_LOGGING.debug('Using alternate deviceInfo URL.')
url = '%s/System/deviceInfo' % self.root_url
response = self.hik_request.get(url)
            # Cameras and NVRs behave differently here: they don't agree on
            # whether to return 404 or 401 first
if not using_digest and response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch deviceInfo, error: %s', err)
return None
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return None
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch device info.')
return None
try:
tree = ET.fromstring(response.text)
# Try to fetch namespace from XML
nmsp = tree.tag.split('}')[0].strip('{')
self.namespace = nmsp if nmsp.startswith('http') else XML_NAMESPACE
_LOGGING.debug('Using Namespace: %s', self.namespace)
for item in tree:
tag = item.tag.split('}')[1]
device_info[tag] = item.text
return device_info
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
return None
def watchdog_handler(self):
"""Take care of threads if wachdog expires."""
_LOGGING.debug('%s Watchdog expired. Resetting connection.', self.name)
self.watchdog.stop()
self.reset_thrd.set()
def disconnect(self):
"""Disconnect from event stream."""
_LOGGING.debug('Disconnecting from stream: %s', self.name)
self.kill_thrd.set()
self.thrd.join()
_LOGGING.debug('Event stream thread for %s is stopped', self.name)
self.kill_thrd.clear()
def start_stream(self):
"""Start thread to process event stream."""
# self.watchdog.start()
self.thrd.start()
def alert_stream(self, reset_event, kill_event):
"""Open event stream."""
_LOGGING.debug('Stream Thread Started: %s, %s', self.name, self.cam_id)
start_event = False
parse_string = ""
fail_count = 0
url = '%s/ISAPI/Event/notification/alertStream' % self.root_url
# pylint: disable=too-many-nested-blocks
while True:
try:
stream = self.hik_request.get(url, stream=True,
timeout=(CONNECT_TIMEOUT,
READ_TIMEOUT))
if stream.status_code == requests.codes.not_found:
# Try alternate URL for stream
url = '%s/Event/notification/alertStream' % self.root_url
stream = self.hik_request.get(url, stream=True)
if stream.status_code != requests.codes.ok:
                    raise ValueError('Connection unsuccessful.')
else:
_LOGGING.debug('%s Connection Successful.', self.name)
fail_count = 0
self.watchdog.start()
for line in stream.iter_lines():
# _LOGGING.debug('Processing line from %s', self.name)
# filter out keep-alive new lines
if line:
str_line = line.decode("utf-8", "ignore")
                        # New events start with --boundary
if str_line.find('<EventNotificationAlert') != -1:
# Start of event message
start_event = True
parse_string += str_line
elif str_line.find('</EventNotificationAlert>') != -1:
                            # Message end found
parse_string += str_line
start_event = False
if parse_string:
tree = ET.fromstring(parse_string)
self.process_stream(tree)
self.update_stale()
parse_string = ""
else:
if start_event:
parse_string += str_line
if kill_event.is_set():
                        # We were asked to stop the thread so let's do so.
break
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
if kill_event.is_set():
                    # We were asked to stop the thread so let's do so.
_LOGGING.debug('Stopping event stream thread for %s',
self.name)
self.watchdog.stop()
self.hik_request.close()
return
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
except (ValueError,
requests.exceptions.ConnectionError,
requests.exceptions.ChunkedEncodingError) as err:
fail_count += 1
reset_event.clear()
_LOGGING.warning('%s Connection Failed (count=%d). Waiting %ss. Err: %s',
self.name, fail_count, (fail_count * 5) + 5, err)
parse_string = ""
self.watchdog.stop()
self.hik_request.close()
time.sleep(5)
self.update_stale()
time.sleep(fail_count * 5)
continue
def process_stream(self, tree):
"""Process incoming event stream packets."""
try:
etype = SENSOR_MAP[tree.find(
self.element_query('eventType')).text.lower()]
estate = tree.find(
self.element_query('eventState')).text
echid = tree.find(
self.element_query('channelID'))
if echid is None:
# Some devices use a different key
echid = tree.find(
self.element_query('dynChannelID'))
echid = int(echid.text)
ecount = tree.find(
self.element_query('activePostCount')).text
except (AttributeError, KeyError, IndexError) as err:
_LOGGING.error('Problem finding attribute: %s', err)
return
# Take care of keep-alive
if len(etype) > 0 and etype == 'Video Loss':
self.watchdog.pet()
# Track state if it's in the event list.
if len(etype) > 0:
state = self.fetch_attributes(etype, echid)
if state:
# Determine if state has changed
# If so, publish, otherwise do nothing
estate = (estate == 'active')
old_state = state[0]
attr = [estate, echid, int(ecount),
datetime.datetime.now()]
self.update_attributes(etype, echid, attr)
if estate != old_state:
self.publish_changes(etype, echid)
self.watchdog.pet()
def update_stale(self):
"""Update stale active statuses"""
# Some events don't post an inactive XML, only active.
# If we don't get an active update for 5 seconds we can
# assume the event is no longer active and update accordingly.
for etype, echannels in self.event_states.items():
for eprop in echannels:
if eprop[3] is not None:
sec_elap = ((datetime.datetime.now()-eprop[3])
.total_seconds())
# print('Seconds since last update: {}'.format(sec_elap))
if sec_elap > 5 and eprop[0] is True:
_LOGGING.debug('Updating stale event %s on CH(%s)',
etype, eprop[1])
attr = [False, eprop[1], eprop[2],
datetime.datetime.now()]
self.update_attributes(etype, eprop[1], attr)
self.publish_changes(etype, eprop[1])
def publish_changes(self, etype, echid):
"""Post updates for specified event type."""
_LOGGING.debug('%s Update: %s, %s',
self.name, etype, self.fetch_attributes(etype, echid))
signal = 'ValueChanged.{}'.format(self.cam_id)
sender = '{}.{}'.format(etype, echid)
if dispatcher:
dispatcher.send(signal=signal, sender=sender)
self._do_update_callback('{}.{}.{}'.format(self.cam_id, etype, echid))
def fetch_attributes(self, event, channel):
"""Returns attribute list for a given event/channel."""
try:
for sensor in self.event_states[event]:
if sensor[1] == int(channel):
return sensor
except KeyError:
return None
def update_attributes(self, event, channel, attr):
"""Update attribute list for current event/channel."""
try:
for i, sensor in enumerate(self.event_states[event]):
if sensor[1] == int(channel):
self.event_states[event][i] = attr
except KeyError:
_LOGGING.debug('Error updating attributes for: (%s, %s)',
event, channel)
|
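The `_set_motion_detection` record above toggles motion detection by flipping the cached `<enabled>` element in the motionDetection XML and PUTting the document back to the camera. That find/flip/serialize round-trip can be sketched offline with no camera attached; the namespace URI and the trimmed payload below are assumptions for illustration, not a real device response:

```python
import xml.etree.ElementTree as ET

# Assumed ISAPI namespace and a hypothetical, heavily trimmed payload;
# a real camera returns a much larger motionDetection document.
NAMESPACE = "http://www.hikvision.com/ver20/XMLSchema"
SAMPLE = ('<MotionDetection xmlns="{0}">'
          '<enabled>false</enabled>'
          '</MotionDetection>').format(NAMESPACE)

def toggle_motion_xml(xml_text, enable):
    """Flip the <enabled> flag the way _set_motion_detection does."""
    ET.register_namespace("", NAMESPACE)  # keep the default namespace on output
    tree = ET.fromstring(xml_text)
    enabled = tree.find('{%s}enabled' % NAMESPACE)
    if enabled is None:
        raise ValueError('no <enabled> element in payload')
    enabled.text = 'true' if enable else 'false'
    return ET.tostring(tree, encoding='unicode')

print(toggle_motion_xml(SAMPLE, True))
```

The method performs exactly this sequence on its cached tree before issuing the HTTP PUT, which is why it refuses to proceed when the `<enabled>` element is missing.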
mezz64/pyHik | pyhik/hikvision.py | HikCamera.add_update_callback | python | def add_update_callback(self, callback, sensor):
self._updateCallbacks.append([callback, sensor])
_LOGGING.debug('Added update callback to %s on %s', callback, sensor) | Register as callback for when a matching device sensor changes. | train | https://github.com/mezz64/pyHik/blob/1e7afca926e2b045257a43cbf8b1236a435493c2/pyhik/hikvision.py#L220-L223 | null | class HikCamera(object):
"""Creates a new Hikvision api device."""
def __init__(self, host=None, port=DEFAULT_PORT,
usr=None, pwd=None):
"""Initialize device."""
_LOGGING.debug("pyHik %s initializing new hikvision device at: %s",
__version__, host)
self.event_states = {}
self.watchdog = Watchdog(300.0, self.watchdog_handler)
self.namespace = XML_NAMESPACE
if not host:
_LOGGING.error('Host not specified! Cannot continue.')
return
self.host = host
self.usr = usr
self.pwd = pwd
self.cam_id = 0
self.name = ''
self.device_type = None
self.motion_detection = None
self._motion_detection_xml = None
self.root_url = '{}:{}'.format(host, port)
# Build requests session for main thread calls
# Default to basic authentication. It will change to digest inside
# get_device_info if basic fails
self.hik_request = requests.Session()
self.hik_request.auth = (usr, pwd)
self.hik_request.headers.update(DEFAULT_HEADERS)
# Define event stream processing thread
self.kill_thrd = threading.Event()
self.reset_thrd = threading.Event()
self.thrd = threading.Thread(
target=self.alert_stream, args=(self.reset_thrd, self.kill_thrd,))
self.thrd.daemon = False
# Callbacks
self._updateCallbacks = []
self.initialize()
@property
def get_id(self):
"""Returns unique camera/nvr identifier."""
return self.cam_id
@property
def get_name(self):
"""Return camera/nvr name."""
return self.name
@property
def get_type(self):
"""Return device type."""
return self.device_type
@property
def current_event_states(self):
"""Return Event states dictionary"""
return self.event_states
@property
def current_motion_detection_state(self):
"""Return current state of motion detection property"""
return self.motion_detection
def get_motion_detection(self):
"""Fetch current motion state from camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch MotionDetection, error: %s', err)
self.motion_detection = None
return self.motion_detection
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
self.motion_detection = None
return self.motion_detection
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch motion detection.')
self.motion_detection = None
return self.motion_detection
try:
tree = ET.fromstring(response.text)
ET.register_namespace("", self.namespace)
enabled = tree.find(self.element_query('enabled'))
if enabled is not None:
self._motion_detection_xml = tree
self.motion_detection = {'true': True, 'false': False}[enabled.text]
return self.motion_detection
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
self.motion_detection = None
return self.motion_detection
def enable_motion_detection(self):
"""Enable motion detection"""
self._set_motion_detection(True)
def disable_motion_detection(self):
"""Disable motion detection"""
self._set_motion_detection(False)
def _set_motion_detection(self, enable):
"""Set desired motion detection state on camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
enabled = self._motion_detection_xml.find(self.element_query('enabled'))
if enabled is None:
_LOGGING.error("Couldn't find 'enabled' in the xml")
_LOGGING.error('XML: %s', ET.tostring(self._motion_detection_xml))
return
enabled.text = 'true' if enable else 'false'
xml = ET.tostring(self._motion_detection_xml)
try:
response = self.hik_request.put(url, data=xml, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to set MotionDetection, error: %s', err)
return
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return
        if response.status_code != requests.codes.ok:
            # If we didn't receive 200, abort
            _LOGGING.error('Unable to set motion detection: %s', response.text)
            return
self.motion_detection = enable
def _do_update_callback(self, msg):
"""Call registered callback functions."""
for callback, sensor in self._updateCallbacks:
if sensor == msg:
_LOGGING.debug('Update callback %s for sensor %s',
callback, sensor)
callback(msg)
def element_query(self, element):
"""Build tree query for a given element."""
return '{%s}%s' % (self.namespace, element)
def initialize(self):
"""Initialize deviceInfo and available events."""
device_info = self.get_device_info()
if device_info is None:
self.name = None
self.cam_id = None
self.event_states = None
return
for key in device_info:
if key == 'deviceName':
self.name = device_info[key]
elif key == 'deviceID':
if len(device_info[key]) > 10:
self.cam_id = device_info[key]
else:
self.cam_id = uuid.uuid4()
events_available = self.get_event_triggers()
if events_available:
for event, channel_list in events_available.items():
for channel in channel_list:
try:
self.event_states.setdefault(
SENSOR_MAP[event.lower()], []).append(
[False, channel, 0, datetime.datetime.now()])
except KeyError:
# Sensor type doesn't have a known friendly name
# We can't reliably handle it at this time...
_LOGGING.warning(
'Sensor type "%s" is unsupported.', event)
_LOGGING.debug('Initialized Dictionary: %s', self.event_states)
else:
_LOGGING.debug('No Events available in dictionary.')
self.get_motion_detection()
def get_event_triggers(self):
"""
Returns dict of supported events.
Key = Event Type
List = Channels that have that event activated
"""
events = {}
nvrflag = False
event_xml = []
url = '%s/ISAPI/Event/triggers' % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.not_found:
# Try alternate URL for triggers
_LOGGING.debug('Using alternate triggers URL.')
url = '%s/Event/triggers' % self.root_url
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch events, error: %s', err)
return None
if response.status_code != 200:
            # If we didn't receive 200, abort
return None
# pylint: disable=too-many-nested-blocks
try:
content = ET.fromstring(response.text)
if content[0].find(self.element_query('EventTrigger')):
event_xml = content[0].findall(
self.element_query('EventTrigger'))
elif content.find(self.element_query('EventTrigger')):
# This is either an NVR or a rebadged camera
event_xml = content.findall(
self.element_query('EventTrigger'))
for eventtrigger in event_xml:
ettype = eventtrigger.find(self.element_query('eventType'))
                # Catch empty xml definitions
if ettype is None:
break
etnotify = eventtrigger.find(
self.element_query('EventTriggerNotificationList'))
etchannel = None
etchannel_num = 0
for node_name in CHANNEL_NAMES:
etchannel = eventtrigger.find(
self.element_query(node_name))
if etchannel is not None:
try:
# Need to make sure this is actually a number
etchannel_num = int(etchannel.text)
if etchannel_num > 1:
# Must be an nvr
nvrflag = True
break
except ValueError:
# Field must not be an integer
pass
if etnotify:
for notifytrigger in etnotify:
ntype = notifytrigger.find(
self.element_query('notificationMethod'))
if ntype.text == 'center' or ntype.text == 'HTTP':
"""
If we got this far we found an event that we want
to track.
"""
events.setdefault(ettype.text, []) \
.append(etchannel_num)
except (AttributeError, ET.ParseError) as err:
_LOGGING.error(
'There was a problem finding an element: %s', err)
return None
if nvrflag:
self.device_type = NVR_DEVICE
else:
self.device_type = CAM_DEVICE
_LOGGING.debug('Processed %s as %s Device.',
self.cam_id, self.device_type)
_LOGGING.debug('Found events: %s', events)
self.hik_request.close()
return events
def get_device_info(self):
"""Parse deviceInfo into dictionary."""
device_info = {}
url = '%s/ISAPI/System/deviceInfo' % self.root_url
using_digest = False
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
if response.status_code == requests.codes.not_found:
# Try alternate URL for deviceInfo
_LOGGING.debug('Using alternate deviceInfo URL.')
url = '%s/System/deviceInfo' % self.root_url
response = self.hik_request.get(url)
            # Cameras and NVRs behave differently here: they don't agree on
            # whether to return 404 or 401 first
if not using_digest and response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch deviceInfo, error: %s', err)
return None
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return None
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch device info.')
return None
try:
tree = ET.fromstring(response.text)
# Try to fetch namespace from XML
nmsp = tree.tag.split('}')[0].strip('{')
self.namespace = nmsp if nmsp.startswith('http') else XML_NAMESPACE
_LOGGING.debug('Using Namespace: %s', self.namespace)
for item in tree:
tag = item.tag.split('}')[1]
device_info[tag] = item.text
return device_info
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
return None
def watchdog_handler(self):
"""Take care of threads if wachdog expires."""
_LOGGING.debug('%s Watchdog expired. Resetting connection.', self.name)
self.watchdog.stop()
self.reset_thrd.set()
def disconnect(self):
"""Disconnect from event stream."""
_LOGGING.debug('Disconnecting from stream: %s', self.name)
self.kill_thrd.set()
self.thrd.join()
_LOGGING.debug('Event stream thread for %s is stopped', self.name)
self.kill_thrd.clear()
def start_stream(self):
"""Start thread to process event stream."""
# self.watchdog.start()
self.thrd.start()
def alert_stream(self, reset_event, kill_event):
"""Open event stream."""
_LOGGING.debug('Stream Thread Started: %s, %s', self.name, self.cam_id)
start_event = False
parse_string = ""
fail_count = 0
url = '%s/ISAPI/Event/notification/alertStream' % self.root_url
# pylint: disable=too-many-nested-blocks
while True:
try:
stream = self.hik_request.get(url, stream=True,
timeout=(CONNECT_TIMEOUT,
READ_TIMEOUT))
if stream.status_code == requests.codes.not_found:
# Try alternate URL for stream
url = '%s/Event/notification/alertStream' % self.root_url
stream = self.hik_request.get(url, stream=True)
if stream.status_code != requests.codes.ok:
                    raise ValueError('Connection unsuccessful.')
else:
_LOGGING.debug('%s Connection Successful.', self.name)
fail_count = 0
self.watchdog.start()
for line in stream.iter_lines():
# _LOGGING.debug('Processing line from %s', self.name)
# filter out keep-alive new lines
if line:
str_line = line.decode("utf-8", "ignore")
                        # New events start with --boundary
if str_line.find('<EventNotificationAlert') != -1:
# Start of event message
start_event = True
parse_string += str_line
elif str_line.find('</EventNotificationAlert>') != -1:
                            # Message end found
parse_string += str_line
start_event = False
if parse_string:
tree = ET.fromstring(parse_string)
self.process_stream(tree)
self.update_stale()
parse_string = ""
else:
if start_event:
parse_string += str_line
if kill_event.is_set():
                        # We were asked to stop the thread so let's do so.
break
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
if kill_event.is_set():
                    # We were asked to stop the thread so let's do so.
_LOGGING.debug('Stopping event stream thread for %s',
self.name)
self.watchdog.stop()
self.hik_request.close()
return
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
except (ValueError,
requests.exceptions.ConnectionError,
requests.exceptions.ChunkedEncodingError) as err:
fail_count += 1
reset_event.clear()
_LOGGING.warning('%s Connection Failed (count=%d). Waiting %ss. Err: %s',
self.name, fail_count, (fail_count * 5) + 5, err)
parse_string = ""
self.watchdog.stop()
self.hik_request.close()
time.sleep(5)
self.update_stale()
time.sleep(fail_count * 5)
continue
def process_stream(self, tree):
"""Process incoming event stream packets."""
try:
etype = SENSOR_MAP[tree.find(
self.element_query('eventType')).text.lower()]
estate = tree.find(
self.element_query('eventState')).text
echid = tree.find(
self.element_query('channelID'))
if echid is None:
# Some devices use a different key
echid = tree.find(
self.element_query('dynChannelID'))
echid = int(echid.text)
ecount = tree.find(
self.element_query('activePostCount')).text
except (AttributeError, KeyError, IndexError) as err:
_LOGGING.error('Problem finding attribute: %s', err)
return
# Take care of keep-alive
if len(etype) > 0 and etype == 'Video Loss':
self.watchdog.pet()
# Track state if it's in the event list.
if len(etype) > 0:
state = self.fetch_attributes(etype, echid)
if state:
# Determine if state has changed
# If so, publish, otherwise do nothing
estate = (estate == 'active')
old_state = state[0]
attr = [estate, echid, int(ecount),
datetime.datetime.now()]
self.update_attributes(etype, echid, attr)
if estate != old_state:
self.publish_changes(etype, echid)
self.watchdog.pet()
def update_stale(self):
"""Update stale active statuses"""
# Some events don't post an inactive XML, only active.
# If we don't get an active update for 5 seconds we can
# assume the event is no longer active and update accordingly.
for etype, echannels in self.event_states.items():
for eprop in echannels:
if eprop[3] is not None:
sec_elap = ((datetime.datetime.now()-eprop[3])
.total_seconds())
# print('Seconds since last update: {}'.format(sec_elap))
if sec_elap > 5 and eprop[0] is True:
_LOGGING.debug('Updating stale event %s on CH(%s)',
etype, eprop[1])
attr = [False, eprop[1], eprop[2],
datetime.datetime.now()]
self.update_attributes(etype, eprop[1], attr)
self.publish_changes(etype, eprop[1])
def publish_changes(self, etype, echid):
"""Post updates for specified event type."""
_LOGGING.debug('%s Update: %s, %s',
self.name, etype, self.fetch_attributes(etype, echid))
signal = 'ValueChanged.{}'.format(self.cam_id)
sender = '{}.{}'.format(etype, echid)
if dispatcher:
dispatcher.send(signal=signal, sender=sender)
self._do_update_callback('{}.{}.{}'.format(self.cam_id, etype, echid))
def fetch_attributes(self, event, channel):
"""Returns attribute list for a given event/channel."""
try:
for sensor in self.event_states[event]:
if sensor[1] == int(channel):
return sensor
except KeyError:
return None
def update_attributes(self, event, channel, attr):
"""Update attribute list for current event/channel."""
try:
for i, sensor in enumerate(self.event_states[event]):
if sensor[1] == int(channel):
self.event_states[event][i] = attr
except KeyError:
_LOGGING.debug('Error updating attributes for: (%s, %s)',
event, channel)
|
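`add_update_callback` above, together with `_do_update_callback`, forms a small exact-match callback registry keyed on `'<cam_id>.<event type>.<channel>'` strings (the key format used by `publish_changes`). A detached sketch of that registry, with a made-up camera id:

```python
# Minimal sketch of the callback registry used by add_update_callback /
# _do_update_callback, separated from the camera class for illustration.
class CallbackRegistry:
    def __init__(self):
        self._callbacks = []  # list of [callback, sensor] pairs

    def add_update_callback(self, callback, sensor):
        """Register a callback for one sensor key."""
        self._callbacks.append([callback, sensor])

    def do_update_callback(self, msg):
        """Fire only the callbacks whose sensor key matches exactly."""
        for callback, sensor in self._callbacks:
            if sensor == msg:
                callback(msg)

fired = []
reg = CallbackRegistry()
# The id 'abc123' is invented; real keys use the device's cam_id.
reg.add_update_callback(fired.append, 'abc123.Motion.1')
reg.do_update_callback('abc123.Motion.1')   # matches -> callback fires
reg.do_update_callback('abc123.Motion.2')   # no match -> ignored
print(fired)  # -> ['abc123.Motion.1']
```

Because matching is exact string equality, a consumer must register one callback per (event type, channel) pair it cares about.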
mezz64/pyHik | pyhik/hikvision.py | HikCamera._do_update_callback | python | def _do_update_callback(self, msg):
for callback, sensor in self._updateCallbacks:
if sensor == msg:
_LOGGING.debug('Update callback %s for sensor %s',
callback, sensor)
callback(msg) | Call registered callback functions. | train | https://github.com/mezz64/pyHik/blob/1e7afca926e2b045257a43cbf8b1236a435493c2/pyhik/hikvision.py#L225-L231 | null | class HikCamera(object):
"""Creates a new Hikvision api device."""
def __init__(self, host=None, port=DEFAULT_PORT,
usr=None, pwd=None):
"""Initialize device."""
_LOGGING.debug("pyHik %s initializing new hikvision device at: %s",
__version__, host)
self.event_states = {}
self.watchdog = Watchdog(300.0, self.watchdog_handler)
self.namespace = XML_NAMESPACE
if not host:
_LOGGING.error('Host not specified! Cannot continue.')
return
self.host = host
self.usr = usr
self.pwd = pwd
self.cam_id = 0
self.name = ''
self.device_type = None
self.motion_detection = None
self._motion_detection_xml = None
self.root_url = '{}:{}'.format(host, port)
# Build requests session for main thread calls
# Default to basic authentication. It will change to digest inside
# get_device_info if basic fails
self.hik_request = requests.Session()
self.hik_request.auth = (usr, pwd)
self.hik_request.headers.update(DEFAULT_HEADERS)
# Define event stream processing thread
self.kill_thrd = threading.Event()
self.reset_thrd = threading.Event()
self.thrd = threading.Thread(
target=self.alert_stream, args=(self.reset_thrd, self.kill_thrd,))
self.thrd.daemon = False
# Callbacks
self._updateCallbacks = []
self.initialize()
@property
def get_id(self):
"""Returns unique camera/nvr identifier."""
return self.cam_id
@property
def get_name(self):
"""Return camera/nvr name."""
return self.name
@property
def get_type(self):
"""Return device type."""
return self.device_type
@property
def current_event_states(self):
"""Return Event states dictionary"""
return self.event_states
@property
def current_motion_detection_state(self):
"""Return current state of motion detection property"""
return self.motion_detection
def get_motion_detection(self):
"""Fetch current motion state from camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch MotionDetection, error: %s', err)
self.motion_detection = None
return self.motion_detection
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
self.motion_detection = None
return self.motion_detection
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch motion detection.')
self.motion_detection = None
return self.motion_detection
try:
tree = ET.fromstring(response.text)
ET.register_namespace("", self.namespace)
enabled = tree.find(self.element_query('enabled'))
if enabled is not None:
self._motion_detection_xml = tree
self.motion_detection = {'true': True, 'false': False}[enabled.text]
return self.motion_detection
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
self.motion_detection = None
return self.motion_detection
def enable_motion_detection(self):
"""Enable motion detection"""
self._set_motion_detection(True)
def disable_motion_detection(self):
"""Disable motion detection"""
self._set_motion_detection(False)
def _set_motion_detection(self, enable):
"""Set desired motion detection state on camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
enabled = self._motion_detection_xml.find(self.element_query('enabled'))
if enabled is None:
_LOGGING.error("Couldn't find 'enabled' in the xml")
_LOGGING.error('XML: %s', ET.tostring(self._motion_detection_xml))
return
enabled.text = 'true' if enable else 'false'
xml = ET.tostring(self._motion_detection_xml)
try:
response = self.hik_request.put(url, data=xml, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to set MotionDetection, error: %s', err)
return
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.error('Unable to set motion detection: %s', response.text)
self.motion_detection = enable
def add_update_callback(self, callback, sensor):
"""Register as callback for when a matching device sensor changes."""
self._updateCallbacks.append([callback, sensor])
_LOGGING.debug('Added update callback to %s on %s', callback, sensor)
def element_query(self, element):
"""Build tree query for a given element."""
return '{%s}%s' % (self.namespace, element)
def initialize(self):
"""Initialize deviceInfo and available events."""
device_info = self.get_device_info()
if device_info is None:
self.name = None
self.cam_id = None
self.event_states = None
return
for key in device_info:
if key == 'deviceName':
self.name = device_info[key]
elif key == 'deviceID':
if len(device_info[key]) > 10:
self.cam_id = device_info[key]
else:
self.cam_id = uuid.uuid4()
events_available = self.get_event_triggers()
if events_available:
for event, channel_list in events_available.items():
for channel in channel_list:
try:
self.event_states.setdefault(
SENSOR_MAP[event.lower()], []).append(
[False, channel, 0, datetime.datetime.now()])
except KeyError:
# Sensor type doesn't have a known friendly name
# We can't reliably handle it at this time...
_LOGGING.warning(
'Sensor type "%s" is unsupported.', event)
_LOGGING.debug('Initialized Dictionary: %s', self.event_states)
else:
_LOGGING.debug('No Events available in dictionary.')
self.get_motion_detection()
def get_event_triggers(self):
"""
Returns dict of supported events.
Key = Event Type
List = Channels that have that event activated
"""
events = {}
nvrflag = False
event_xml = []
url = '%s/ISAPI/Event/triggers' % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.not_found:
# Try alternate URL for triggers
_LOGGING.debug('Using alternate triggers URL.')
url = '%s/Event/triggers' % self.root_url
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch events, error: %s', err)
return None
if response.status_code != 200:
            # If we didn't receive 200, abort
return None
# pylint: disable=too-many-nested-blocks
try:
content = ET.fromstring(response.text)
if content[0].find(self.element_query('EventTrigger')):
event_xml = content[0].findall(
self.element_query('EventTrigger'))
elif content.find(self.element_query('EventTrigger')):
# This is either an NVR or a rebadged camera
event_xml = content.findall(
self.element_query('EventTrigger'))
for eventtrigger in event_xml:
ettype = eventtrigger.find(self.element_query('eventType'))
                # Catch empty XML definitions
if ettype is None:
break
etnotify = eventtrigger.find(
self.element_query('EventTriggerNotificationList'))
etchannel = None
etchannel_num = 0
for node_name in CHANNEL_NAMES:
etchannel = eventtrigger.find(
self.element_query(node_name))
if etchannel is not None:
try:
# Need to make sure this is actually a number
etchannel_num = int(etchannel.text)
if etchannel_num > 1:
# Must be an nvr
nvrflag = True
break
except ValueError:
# Field must not be an integer
pass
if etnotify:
for notifytrigger in etnotify:
ntype = notifytrigger.find(
self.element_query('notificationMethod'))
if ntype.text == 'center' or ntype.text == 'HTTP':
"""
If we got this far we found an event that we want
to track.
"""
events.setdefault(ettype.text, []) \
.append(etchannel_num)
except (AttributeError, ET.ParseError) as err:
_LOGGING.error(
'There was a problem finding an element: %s', err)
return None
if nvrflag:
self.device_type = NVR_DEVICE
else:
self.device_type = CAM_DEVICE
_LOGGING.debug('Processed %s as %s Device.',
self.cam_id, self.device_type)
_LOGGING.debug('Found events: %s', events)
self.hik_request.close()
return events
def get_device_info(self):
"""Parse deviceInfo into dictionary."""
device_info = {}
url = '%s/ISAPI/System/deviceInfo' % self.root_url
using_digest = False
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
if response.status_code == requests.codes.not_found:
# Try alternate URL for deviceInfo
_LOGGING.debug('Using alternate deviceInfo URL.')
url = '%s/System/deviceInfo' % self.root_url
response = self.hik_request.get(url)
# Seems to be difference between camera and nvr, they can't seem to
# agree if they should 404 or 401 first
if not using_digest and response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch deviceInfo, error: %s', err)
return None
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return None
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch device info.')
return None
try:
tree = ET.fromstring(response.text)
# Try to fetch namespace from XML
nmsp = tree.tag.split('}')[0].strip('{')
self.namespace = nmsp if nmsp.startswith('http') else XML_NAMESPACE
_LOGGING.debug('Using Namespace: %s', self.namespace)
for item in tree:
tag = item.tag.split('}')[1]
device_info[tag] = item.text
return device_info
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
return None
def watchdog_handler(self):
        """Take care of threads if watchdog expires."""
_LOGGING.debug('%s Watchdog expired. Resetting connection.', self.name)
self.watchdog.stop()
self.reset_thrd.set()
def disconnect(self):
"""Disconnect from event stream."""
_LOGGING.debug('Disconnecting from stream: %s', self.name)
self.kill_thrd.set()
self.thrd.join()
_LOGGING.debug('Event stream thread for %s is stopped', self.name)
self.kill_thrd.clear()
def start_stream(self):
"""Start thread to process event stream."""
# self.watchdog.start()
self.thrd.start()
def alert_stream(self, reset_event, kill_event):
"""Open event stream."""
_LOGGING.debug('Stream Thread Started: %s, %s', self.name, self.cam_id)
start_event = False
parse_string = ""
fail_count = 0
url = '%s/ISAPI/Event/notification/alertStream' % self.root_url
# pylint: disable=too-many-nested-blocks
while True:
try:
stream = self.hik_request.get(url, stream=True,
timeout=(CONNECT_TIMEOUT,
READ_TIMEOUT))
if stream.status_code == requests.codes.not_found:
# Try alternate URL for stream
url = '%s/Event/notification/alertStream' % self.root_url
stream = self.hik_request.get(url, stream=True)
if stream.status_code != requests.codes.ok:
raise ValueError('Connection unsucessful.')
else:
_LOGGING.debug('%s Connection Successful.', self.name)
fail_count = 0
self.watchdog.start()
for line in stream.iter_lines():
# _LOGGING.debug('Processing line from %s', self.name)
# filter out keep-alive new lines
if line:
str_line = line.decode("utf-8", "ignore")
                        # New events start with --boundary
if str_line.find('<EventNotificationAlert') != -1:
# Start of event message
start_event = True
parse_string += str_line
elif str_line.find('</EventNotificationAlert>') != -1:
                            # Message end found
parse_string += str_line
start_event = False
if parse_string:
tree = ET.fromstring(parse_string)
self.process_stream(tree)
self.update_stale()
parse_string = ""
else:
if start_event:
parse_string += str_line
if kill_event.is_set():
# We were asked to stop the thread so lets do so.
break
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
if kill_event.is_set():
# We were asked to stop the thread so lets do so.
_LOGGING.debug('Stopping event stream thread for %s',
self.name)
self.watchdog.stop()
self.hik_request.close()
return
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
except (ValueError,
requests.exceptions.ConnectionError,
requests.exceptions.ChunkedEncodingError) as err:
fail_count += 1
reset_event.clear()
_LOGGING.warning('%s Connection Failed (count=%d). Waiting %ss. Err: %s',
self.name, fail_count, (fail_count * 5) + 5, err)
parse_string = ""
self.watchdog.stop()
self.hik_request.close()
time.sleep(5)
self.update_stale()
time.sleep(fail_count * 5)
continue
def process_stream(self, tree):
"""Process incoming event stream packets."""
try:
etype = SENSOR_MAP[tree.find(
self.element_query('eventType')).text.lower()]
estate = tree.find(
self.element_query('eventState')).text
echid = tree.find(
self.element_query('channelID'))
if echid is None:
# Some devices use a different key
echid = tree.find(
self.element_query('dynChannelID'))
echid = int(echid.text)
ecount = tree.find(
self.element_query('activePostCount')).text
except (AttributeError, KeyError, IndexError) as err:
_LOGGING.error('Problem finding attribute: %s', err)
return
# Take care of keep-alive
if len(etype) > 0 and etype == 'Video Loss':
self.watchdog.pet()
# Track state if it's in the event list.
if len(etype) > 0:
state = self.fetch_attributes(etype, echid)
if state:
# Determine if state has changed
# If so, publish, otherwise do nothing
estate = (estate == 'active')
old_state = state[0]
attr = [estate, echid, int(ecount),
datetime.datetime.now()]
self.update_attributes(etype, echid, attr)
if estate != old_state:
self.publish_changes(etype, echid)
self.watchdog.pet()
def update_stale(self):
"""Update stale active statuses"""
# Some events don't post an inactive XML, only active.
# If we don't get an active update for 5 seconds we can
# assume the event is no longer active and update accordingly.
for etype, echannels in self.event_states.items():
for eprop in echannels:
if eprop[3] is not None:
sec_elap = ((datetime.datetime.now()-eprop[3])
.total_seconds())
# print('Seconds since last update: {}'.format(sec_elap))
if sec_elap > 5 and eprop[0] is True:
_LOGGING.debug('Updating stale event %s on CH(%s)',
etype, eprop[1])
attr = [False, eprop[1], eprop[2],
datetime.datetime.now()]
self.update_attributes(etype, eprop[1], attr)
self.publish_changes(etype, eprop[1])
def publish_changes(self, etype, echid):
"""Post updates for specified event type."""
_LOGGING.debug('%s Update: %s, %s',
self.name, etype, self.fetch_attributes(etype, echid))
signal = 'ValueChanged.{}'.format(self.cam_id)
sender = '{}.{}'.format(etype, echid)
if dispatcher:
dispatcher.send(signal=signal, sender=sender)
self._do_update_callback('{}.{}.{}'.format(self.cam_id, etype, echid))
def fetch_attributes(self, event, channel):
"""Returns attribute list for a given event/channel."""
try:
for sensor in self.event_states[event]:
if sensor[1] == int(channel):
return sensor
except KeyError:
return None
def update_attributes(self, event, channel, attr):
"""Update attribute list for current event/channel."""
try:
for i, sensor in enumerate(self.event_states[event]):
if sensor[1] == int(channel):
self.event_states[event][i] = attr
except KeyError:
_LOGGING.debug('Error updating attributes for: (%s, %s)',
event, channel)
|
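The `_do_update_callback` entry above dispatches to callbacks stored as `[callback, sensor]` pairs by `add_update_callback`, firing only those whose registered sensor string matches the published `camid.event.channel` message. A minimal, stand-alone sketch of that pattern (illustrative class and sensor ids, not the pyHik API):

```python
class CallbackRegistry:
    """Minimal stand-in for the [callback, sensor] dispatch used above."""

    def __init__(self):
        self._update_callbacks = []

    def add_update_callback(self, callback, sensor):
        # Register a callback keyed to one 'camid.event.channel' string.
        self._update_callbacks.append([callback, sensor])

    def do_update_callback(self, msg):
        # Fire only the callbacks whose registered sensor matches the message.
        for callback, sensor in self._update_callbacks:
            if sensor == msg:
                callback(msg)

registry = CallbackRegistry()
hits = []
registry.add_update_callback(hits.append, 'cam1.Motion.1')
registry.do_update_callback('cam1.Motion.1')  # matches, fires
registry.do_update_callback('cam1.Motion.2')  # no registration, ignored
```

Registering the same callback under several sensor ids is how one handler can serve multiple channels, since matching is by exact string equality.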
mezz64/pyHik | pyhik/hikvision.py | HikCamera.initialize | python | def initialize(self):
device_info = self.get_device_info()
if device_info is None:
self.name = None
self.cam_id = None
self.event_states = None
return
for key in device_info:
if key == 'deviceName':
self.name = device_info[key]
elif key == 'deviceID':
if len(device_info[key]) > 10:
self.cam_id = device_info[key]
else:
self.cam_id = uuid.uuid4()
events_available = self.get_event_triggers()
if events_available:
for event, channel_list in events_available.items():
for channel in channel_list:
try:
self.event_states.setdefault(
SENSOR_MAP[event.lower()], []).append(
[False, channel, 0, datetime.datetime.now()])
except KeyError:
# Sensor type doesn't have a known friendly name
# We can't reliably handle it at this time...
_LOGGING.warning(
'Sensor type "%s" is unsupported.', event)
_LOGGING.debug('Initialized Dictionary: %s', self.event_states)
else:
_LOGGING.debug('No Events available in dictionary.')
self.get_motion_detection() | Initialize deviceInfo and available events. | train | https://github.com/mezz64/pyHik/blob/1e7afca926e2b045257a43cbf8b1236a435493c2/pyhik/hikvision.py#L237-L274 | [
"def get_motion_detection(self):\n \"\"\"Fetch current motion state from camera\"\"\"\n url = ('%s/ISAPI/System/Video/inputs/'\n 'channels/1/motionDetection') % self.root_url\n\n try:\n response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)\n except (requests.exceptions.RequestException,\n requests.exceptions.ConnectionError) as err:\n _LOGGING.error('Unable to fetch MotionDetection, error: %s', err)\n self.motion_detection = None\n return self.motion_detection\n\n if response.status_code == requests.codes.unauthorized:\n _LOGGING.error('Authentication failed')\n self.motion_detection = None\n return self.motion_detection\n\n if response.status_code != requests.codes.ok:\n # If we didn't receive 200, abort\n _LOGGING.debug('Unable to fetch motion detection.')\n self.motion_detection = None\n return self.motion_detection\n\n try:\n tree = ET.fromstring(response.text)\n ET.register_namespace(\"\", self.namespace)\n enabled = tree.find(self.element_query('enabled'))\n\n if enabled is not None:\n self._motion_detection_xml = tree\n self.motion_detection = {'true': True, 'false': False}[enabled.text]\n return self.motion_detection\n\n except AttributeError as err:\n _LOGGING.error('Entire response: %s', response.text)\n _LOGGING.error('There was a problem: %s', err)\n self.motion_detection = None\n return self.motion_detection\n",
"def get_event_triggers(self):\n \"\"\"\n Returns dict of supported events.\n Key = Event Type\n List = Channels that have that event activated\n \"\"\"\n events = {}\n nvrflag = False\n event_xml = []\n\n url = '%s/ISAPI/Event/triggers' % self.root_url\n\n try:\n response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)\n if response.status_code == requests.codes.not_found:\n # Try alternate URL for triggers\n _LOGGING.debug('Using alternate triggers URL.')\n url = '%s/Event/triggers' % self.root_url\n response = self.hik_request.get(url)\n\n except (requests.exceptions.RequestException,\n requests.exceptions.ConnectionError) as err:\n _LOGGING.error('Unable to fetch events, error: %s', err)\n return None\n\n if response.status_code != 200:\n # If we didn't recieve 200, abort\n return None\n\n # pylint: disable=too-many-nested-blocks\n try:\n content = ET.fromstring(response.text)\n\n if content[0].find(self.element_query('EventTrigger')):\n event_xml = content[0].findall(\n self.element_query('EventTrigger'))\n elif content.find(self.element_query('EventTrigger')):\n # This is either an NVR or a rebadged camera\n event_xml = content.findall(\n self.element_query('EventTrigger'))\n\n for eventtrigger in event_xml:\n ettype = eventtrigger.find(self.element_query('eventType'))\n # Catch empty xml defintions\n if ettype is None:\n break\n etnotify = eventtrigger.find(\n self.element_query('EventTriggerNotificationList'))\n\n etchannel = None\n etchannel_num = 0\n\n for node_name in CHANNEL_NAMES:\n etchannel = eventtrigger.find(\n self.element_query(node_name))\n if etchannel is not None:\n try:\n # Need to make sure this is actually a number\n etchannel_num = int(etchannel.text)\n if etchannel_num > 1:\n # Must be an nvr\n nvrflag = True\n break\n except ValueError:\n # Field must not be an integer\n pass\n\n if etnotify:\n for notifytrigger in etnotify:\n ntype = notifytrigger.find(\n self.element_query('notificationMethod'))\n if ntype.text == 'center' or 
ntype.text == 'HTTP':\n \"\"\"\n If we got this far we found an event that we want\n to track.\n \"\"\"\n events.setdefault(ettype.text, []) \\\n .append(etchannel_num)\n\n except (AttributeError, ET.ParseError) as err:\n _LOGGING.error(\n 'There was a problem finding an element: %s', err)\n return None\n\n if nvrflag:\n self.device_type = NVR_DEVICE\n else:\n self.device_type = CAM_DEVICE\n _LOGGING.debug('Processed %s as %s Device.',\n self.cam_id, self.device_type)\n\n _LOGGING.debug('Found events: %s', events)\n self.hik_request.close()\n return events\n",
"def get_device_info(self):\n \"\"\"Parse deviceInfo into dictionary.\"\"\"\n device_info = {}\n url = '%s/ISAPI/System/deviceInfo' % self.root_url\n using_digest = False\n\n try:\n response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)\n if response.status_code == requests.codes.unauthorized:\n _LOGGING.debug('Basic authentication failed. Using digest.')\n self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)\n using_digest = True\n response = self.hik_request.get(url)\n\n if response.status_code == requests.codes.not_found:\n # Try alternate URL for deviceInfo\n _LOGGING.debug('Using alternate deviceInfo URL.')\n url = '%s/System/deviceInfo' % self.root_url\n response = self.hik_request.get(url)\n # Seems to be difference between camera and nvr, they can't seem to\n # agree if they should 404 or 401 first\n if not using_digest and response.status_code == requests.codes.unauthorized:\n _LOGGING.debug('Basic authentication failed. Using digest.')\n self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)\n using_digest = True\n response = self.hik_request.get(url)\n\n except (requests.exceptions.RequestException,\n requests.exceptions.ConnectionError) as err:\n _LOGGING.error('Unable to fetch deviceInfo, error: %s', err)\n return None\n\n if response.status_code == requests.codes.unauthorized:\n _LOGGING.error('Authentication failed')\n return None\n\n if response.status_code != requests.codes.ok:\n # If we didn't receive 200, abort\n _LOGGING.debug('Unable to fetch device info.')\n return None\n\n try:\n tree = ET.fromstring(response.text)\n # Try to fetch namespace from XML\n nmsp = tree.tag.split('}')[0].strip('{')\n self.namespace = nmsp if nmsp.startswith('http') else XML_NAMESPACE\n _LOGGING.debug('Using Namespace: %s', self.namespace)\n\n for item in tree:\n tag = item.tag.split('}')[1]\n device_info[tag] = item.text\n\n return device_info\n\n except AttributeError as err:\n _LOGGING.error('Entire response: %s', response.text)\n 
_LOGGING.error('There was a problem: %s', err)\n return None\n"
] | class HikCamera(object):
"""Creates a new Hikvision api device."""
def __init__(self, host=None, port=DEFAULT_PORT,
usr=None, pwd=None):
"""Initialize device."""
_LOGGING.debug("pyHik %s initializing new hikvision device at: %s",
__version__, host)
self.event_states = {}
self.watchdog = Watchdog(300.0, self.watchdog_handler)
self.namespace = XML_NAMESPACE
if not host:
_LOGGING.error('Host not specified! Cannot continue.')
return
self.host = host
self.usr = usr
self.pwd = pwd
self.cam_id = 0
self.name = ''
self.device_type = None
self.motion_detection = None
self._motion_detection_xml = None
self.root_url = '{}:{}'.format(host, port)
# Build requests session for main thread calls
# Default to basic authentication. It will change to digest inside
# get_device_info if basic fails
self.hik_request = requests.Session()
self.hik_request.auth = (usr, pwd)
self.hik_request.headers.update(DEFAULT_HEADERS)
# Define event stream processing thread
self.kill_thrd = threading.Event()
self.reset_thrd = threading.Event()
self.thrd = threading.Thread(
target=self.alert_stream, args=(self.reset_thrd, self.kill_thrd,))
self.thrd.daemon = False
# Callbacks
self._updateCallbacks = []
self.initialize()
@property
def get_id(self):
"""Returns unique camera/nvr identifier."""
return self.cam_id
@property
def get_name(self):
"""Return camera/nvr name."""
return self.name
@property
def get_type(self):
"""Return device type."""
return self.device_type
@property
def current_event_states(self):
"""Return Event states dictionary"""
return self.event_states
@property
def current_motion_detection_state(self):
"""Return current state of motion detection property"""
return self.motion_detection
def get_motion_detection(self):
"""Fetch current motion state from camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch MotionDetection, error: %s', err)
self.motion_detection = None
return self.motion_detection
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
self.motion_detection = None
return self.motion_detection
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch motion detection.')
self.motion_detection = None
return self.motion_detection
try:
tree = ET.fromstring(response.text)
ET.register_namespace("", self.namespace)
enabled = tree.find(self.element_query('enabled'))
if enabled is not None:
self._motion_detection_xml = tree
self.motion_detection = {'true': True, 'false': False}[enabled.text]
return self.motion_detection
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
self.motion_detection = None
return self.motion_detection
def enable_motion_detection(self):
"""Enable motion detection"""
self._set_motion_detection(True)
def disable_motion_detection(self):
"""Disable motion detection"""
self._set_motion_detection(False)
def _set_motion_detection(self, enable):
"""Set desired motion detection state on camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
enabled = self._motion_detection_xml.find(self.element_query('enabled'))
if enabled is None:
_LOGGING.error("Couldn't find 'enabled' in the xml")
_LOGGING.error('XML: %s', ET.tostring(self._motion_detection_xml))
return
enabled.text = 'true' if enable else 'false'
xml = ET.tostring(self._motion_detection_xml)
try:
response = self.hik_request.put(url, data=xml, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to set MotionDetection, error: %s', err)
return
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.error('Unable to set motion detection: %s', response.text)
self.motion_detection = enable
def add_update_callback(self, callback, sensor):
"""Register as callback for when a matching device sensor changes."""
self._updateCallbacks.append([callback, sensor])
_LOGGING.debug('Added update callback to %s on %s', callback, sensor)
def _do_update_callback(self, msg):
"""Call registered callback functions."""
for callback, sensor in self._updateCallbacks:
if sensor == msg:
_LOGGING.debug('Update callback %s for sensor %s',
callback, sensor)
callback(msg)
def element_query(self, element):
"""Build tree query for a given element."""
return '{%s}%s' % (self.namespace, element)
def get_event_triggers(self):
"""
Returns dict of supported events.
Key = Event Type
List = Channels that have that event activated
"""
events = {}
nvrflag = False
event_xml = []
url = '%s/ISAPI/Event/triggers' % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.not_found:
# Try alternate URL for triggers
_LOGGING.debug('Using alternate triggers URL.')
url = '%s/Event/triggers' % self.root_url
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch events, error: %s', err)
return None
if response.status_code != 200:
            # If we didn't receive 200, abort
return None
# pylint: disable=too-many-nested-blocks
try:
content = ET.fromstring(response.text)
if content[0].find(self.element_query('EventTrigger')):
event_xml = content[0].findall(
self.element_query('EventTrigger'))
elif content.find(self.element_query('EventTrigger')):
# This is either an NVR or a rebadged camera
event_xml = content.findall(
self.element_query('EventTrigger'))
for eventtrigger in event_xml:
ettype = eventtrigger.find(self.element_query('eventType'))
                # Catch empty XML definitions
if ettype is None:
break
etnotify = eventtrigger.find(
self.element_query('EventTriggerNotificationList'))
etchannel = None
etchannel_num = 0
for node_name in CHANNEL_NAMES:
etchannel = eventtrigger.find(
self.element_query(node_name))
if etchannel is not None:
try:
# Need to make sure this is actually a number
etchannel_num = int(etchannel.text)
if etchannel_num > 1:
# Must be an nvr
nvrflag = True
break
except ValueError:
# Field must not be an integer
pass
if etnotify:
for notifytrigger in etnotify:
ntype = notifytrigger.find(
self.element_query('notificationMethod'))
if ntype.text == 'center' or ntype.text == 'HTTP':
"""
If we got this far we found an event that we want
to track.
"""
events.setdefault(ettype.text, []) \
.append(etchannel_num)
except (AttributeError, ET.ParseError) as err:
_LOGGING.error(
'There was a problem finding an element: %s', err)
return None
if nvrflag:
self.device_type = NVR_DEVICE
else:
self.device_type = CAM_DEVICE
_LOGGING.debug('Processed %s as %s Device.',
self.cam_id, self.device_type)
_LOGGING.debug('Found events: %s', events)
self.hik_request.close()
return events
def get_device_info(self):
"""Parse deviceInfo into dictionary."""
device_info = {}
url = '%s/ISAPI/System/deviceInfo' % self.root_url
using_digest = False
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
if response.status_code == requests.codes.not_found:
# Try alternate URL for deviceInfo
_LOGGING.debug('Using alternate deviceInfo URL.')
url = '%s/System/deviceInfo' % self.root_url
response = self.hik_request.get(url)
# Seems to be difference between camera and nvr, they can't seem to
# agree if they should 404 or 401 first
if not using_digest and response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch deviceInfo, error: %s', err)
return None
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return None
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch device info.')
return None
try:
tree = ET.fromstring(response.text)
# Try to fetch namespace from XML
nmsp = tree.tag.split('}')[0].strip('{')
self.namespace = nmsp if nmsp.startswith('http') else XML_NAMESPACE
_LOGGING.debug('Using Namespace: %s', self.namespace)
for item in tree:
tag = item.tag.split('}')[1]
device_info[tag] = item.text
return device_info
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
return None
def watchdog_handler(self):
"""Take care of threads if wachdog expires."""
_LOGGING.debug('%s Watchdog expired. Resetting connection.', self.name)
self.watchdog.stop()
self.reset_thrd.set()
def disconnect(self):
"""Disconnect from event stream."""
_LOGGING.debug('Disconnecting from stream: %s', self.name)
self.kill_thrd.set()
self.thrd.join()
_LOGGING.debug('Event stream thread for %s is stopped', self.name)
self.kill_thrd.clear()
def start_stream(self):
"""Start thread to process event stream."""
# self.watchdog.start()
self.thrd.start()
def alert_stream(self, reset_event, kill_event):
"""Open event stream."""
_LOGGING.debug('Stream Thread Started: %s, %s', self.name, self.cam_id)
start_event = False
parse_string = ""
fail_count = 0
url = '%s/ISAPI/Event/notification/alertStream' % self.root_url
# pylint: disable=too-many-nested-blocks
while True:
try:
stream = self.hik_request.get(url, stream=True,
timeout=(CONNECT_TIMEOUT,
READ_TIMEOUT))
if stream.status_code == requests.codes.not_found:
# Try alternate URL for stream
url = '%s/Event/notification/alertStream' % self.root_url
stream = self.hik_request.get(url, stream=True)
if stream.status_code != requests.codes.ok:
raise ValueError('Connection unsucessful.')
else:
_LOGGING.debug('%s Connection Successful.', self.name)
fail_count = 0
self.watchdog.start()
for line in stream.iter_lines():
# _LOGGING.debug('Processing line from %s', self.name)
# filter out keep-alive new lines
if line:
str_line = line.decode("utf-8", "ignore")
                        # New events start with --boundary
if str_line.find('<EventNotificationAlert') != -1:
# Start of event message
start_event = True
parse_string += str_line
elif str_line.find('</EventNotificationAlert>') != -1:
                            # Message end found
parse_string += str_line
start_event = False
if parse_string:
tree = ET.fromstring(parse_string)
self.process_stream(tree)
self.update_stale()
parse_string = ""
else:
if start_event:
parse_string += str_line
if kill_event.is_set():
                        # We were asked to stop the thread so let's do so.
break
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
if kill_event.is_set():
                    # We were asked to stop the thread so let's do so.
_LOGGING.debug('Stopping event stream thread for %s',
self.name)
self.watchdog.stop()
self.hik_request.close()
return
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
except (ValueError,
requests.exceptions.ConnectionError,
requests.exceptions.ChunkedEncodingError) as err:
fail_count += 1
reset_event.clear()
_LOGGING.warning('%s Connection Failed (count=%d). Waiting %ss. Err: %s',
self.name, fail_count, (fail_count * 5) + 5, err)
parse_string = ""
self.watchdog.stop()
self.hik_request.close()
time.sleep(5)
self.update_stale()
time.sleep(fail_count * 5)
continue
def process_stream(self, tree):
"""Process incoming event stream packets."""
try:
etype = SENSOR_MAP[tree.find(
self.element_query('eventType')).text.lower()]
estate = tree.find(
self.element_query('eventState')).text
echid = tree.find(
self.element_query('channelID'))
if echid is None:
# Some devices use a different key
echid = tree.find(
self.element_query('dynChannelID'))
echid = int(echid.text)
ecount = tree.find(
self.element_query('activePostCount')).text
except (AttributeError, KeyError, IndexError) as err:
_LOGGING.error('Problem finding attribute: %s', err)
return
# Take care of keep-alive
if len(etype) > 0 and etype == 'Video Loss':
self.watchdog.pet()
# Track state if it's in the event list.
if len(etype) > 0:
state = self.fetch_attributes(etype, echid)
if state:
# Determine if state has changed
# If so, publish, otherwise do nothing
estate = (estate == 'active')
old_state = state[0]
attr = [estate, echid, int(ecount),
datetime.datetime.now()]
self.update_attributes(etype, echid, attr)
if estate != old_state:
self.publish_changes(etype, echid)
self.watchdog.pet()
def update_stale(self):
"""Update stale active statuses"""
# Some events don't post an inactive XML, only active.
# If we don't get an active update for 5 seconds we can
# assume the event is no longer active and update accordingly.
for etype, echannels in self.event_states.items():
for eprop in echannels:
if eprop[3] is not None:
sec_elap = ((datetime.datetime.now()-eprop[3])
.total_seconds())
# print('Seconds since last update: {}'.format(sec_elap))
if sec_elap > 5 and eprop[0] is True:
_LOGGING.debug('Updating stale event %s on CH(%s)',
etype, eprop[1])
attr = [False, eprop[1], eprop[2],
datetime.datetime.now()]
self.update_attributes(etype, eprop[1], attr)
self.publish_changes(etype, eprop[1])
def publish_changes(self, etype, echid):
"""Post updates for specified event type."""
_LOGGING.debug('%s Update: %s, %s',
self.name, etype, self.fetch_attributes(etype, echid))
signal = 'ValueChanged.{}'.format(self.cam_id)
sender = '{}.{}'.format(etype, echid)
if dispatcher:
dispatcher.send(signal=signal, sender=sender)
self._do_update_callback('{}.{}.{}'.format(self.cam_id, etype, echid))
def fetch_attributes(self, event, channel):
"""Returns attribute list for a given event/channel."""
try:
for sensor in self.event_states[event]:
if sensor[1] == int(channel):
return sensor
except KeyError:
return None
def update_attributes(self, event, channel, attr):
"""Update attribute list for current event/channel."""
try:
for i, sensor in enumerate(self.event_states[event]):
if sensor[1] == int(channel):
self.event_states[event][i] = attr
except KeyError:
_LOGGING.debug('Error updating attributes for: (%s, %s)',
event, channel)
|
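The namespace handling that `get_device_info` and `element_query` rely on can be sketched standalone. The sample XML and namespace URL below are illustrative stand-ins for a real device response:

```python
import xml.etree.ElementTree as ET

# ElementTree qualifies every tag as "{namespace}tag" when the document
# declares a default namespace; HikCamera recovers the namespace from the
# root tag and uses it to build qualified find() queries.
SAMPLE = ('<DeviceInfo xmlns="http://www.hikvision.com/ver20/XMLSchema">'
          '<deviceName>cam1</deviceName></DeviceInfo>')

tree = ET.fromstring(SAMPLE)
# Same extraction as get_device_info: "{ns}DeviceInfo" -> "ns"
namespace = tree.tag.split('}')[0].strip('{')

def element_query(element):
    """Build a qualified tag query, mirroring HikCamera.element_query."""
    return '{%s}%s' % (namespace, element)

name = tree.find(element_query('deviceName')).text
```

An unqualified `tree.find('deviceName')` would return `None` here, which is why every lookup in the class goes through `element_query`.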
mezz64/pyHik | pyhik/hikvision.py | HikCamera.get_event_triggers | python | def get_event_triggers(self):
events = {}
nvrflag = False
event_xml = []
url = '%s/ISAPI/Event/triggers' % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.not_found:
# Try alternate URL for triggers
_LOGGING.debug('Using alternate triggers URL.')
url = '%s/Event/triggers' % self.root_url
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch events, error: %s', err)
return None
if response.status_code != 200:
            # If we didn't receive 200, abort
return None
# pylint: disable=too-many-nested-blocks
try:
content = ET.fromstring(response.text)
            if content[0].find(self.element_query('EventTrigger')) is not None:
event_xml = content[0].findall(
self.element_query('EventTrigger'))
            elif content.find(self.element_query('EventTrigger')) is not None:
# This is either an NVR or a rebadged camera
event_xml = content.findall(
self.element_query('EventTrigger'))
for eventtrigger in event_xml:
ettype = eventtrigger.find(self.element_query('eventType'))
                # Catch empty xml definitions
if ettype is None:
break
etnotify = eventtrigger.find(
self.element_query('EventTriggerNotificationList'))
etchannel = None
etchannel_num = 0
for node_name in CHANNEL_NAMES:
etchannel = eventtrigger.find(
self.element_query(node_name))
if etchannel is not None:
try:
# Need to make sure this is actually a number
etchannel_num = int(etchannel.text)
if etchannel_num > 1:
# Must be an nvr
nvrflag = True
break
except ValueError:
# Field must not be an integer
pass
                if etnotify is not None:
for notifytrigger in etnotify:
ntype = notifytrigger.find(
self.element_query('notificationMethod'))
if ntype.text == 'center' or ntype.text == 'HTTP':
"""
If we got this far we found an event that we want
to track.
"""
events.setdefault(ettype.text, []) \
.append(etchannel_num)
except (AttributeError, ET.ParseError) as err:
_LOGGING.error(
'There was a problem finding an element: %s', err)
return None
if nvrflag:
self.device_type = NVR_DEVICE
else:
self.device_type = CAM_DEVICE
_LOGGING.debug('Processed %s as %s Device.',
self.cam_id, self.device_type)
_LOGGING.debug('Found events: %s', events)
self.hik_request.close()
return events | Returns dict of supported events.
Key = Event Type
List = Channels that have that event activated | train | https://github.com/mezz64/pyHik/blob/1e7afca926e2b045257a43cbf8b1236a435493c2/pyhik/hikvision.py#L276-L369 | [
"def element_query(self, element):\n \"\"\"Build tree query for a given element.\"\"\"\n return '{%s}%s' % (self.namespace, element)\n"
] | class HikCamera(object):
"""Creates a new Hikvision api device."""
def __init__(self, host=None, port=DEFAULT_PORT,
usr=None, pwd=None):
"""Initialize device."""
_LOGGING.debug("pyHik %s initializing new hikvision device at: %s",
__version__, host)
self.event_states = {}
self.watchdog = Watchdog(300.0, self.watchdog_handler)
self.namespace = XML_NAMESPACE
if not host:
_LOGGING.error('Host not specified! Cannot continue.')
return
self.host = host
self.usr = usr
self.pwd = pwd
self.cam_id = 0
self.name = ''
self.device_type = None
self.motion_detection = None
self._motion_detection_xml = None
self.root_url = '{}:{}'.format(host, port)
# Build requests session for main thread calls
# Default to basic authentication. It will change to digest inside
# get_device_info if basic fails
self.hik_request = requests.Session()
self.hik_request.auth = (usr, pwd)
self.hik_request.headers.update(DEFAULT_HEADERS)
# Define event stream processing thread
self.kill_thrd = threading.Event()
self.reset_thrd = threading.Event()
self.thrd = threading.Thread(
target=self.alert_stream, args=(self.reset_thrd, self.kill_thrd,))
self.thrd.daemon = False
# Callbacks
self._updateCallbacks = []
self.initialize()
@property
def get_id(self):
"""Returns unique camera/nvr identifier."""
return self.cam_id
@property
def get_name(self):
"""Return camera/nvr name."""
return self.name
@property
def get_type(self):
"""Return device type."""
return self.device_type
@property
def current_event_states(self):
"""Return Event states dictionary"""
return self.event_states
@property
def current_motion_detection_state(self):
"""Return current state of motion detection property"""
return self.motion_detection
def get_motion_detection(self):
"""Fetch current motion state from camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch MotionDetection, error: %s', err)
self.motion_detection = None
return self.motion_detection
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
self.motion_detection = None
return self.motion_detection
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch motion detection.')
self.motion_detection = None
return self.motion_detection
try:
tree = ET.fromstring(response.text)
ET.register_namespace("", self.namespace)
enabled = tree.find(self.element_query('enabled'))
if enabled is not None:
self._motion_detection_xml = tree
self.motion_detection = {'true': True, 'false': False}[enabled.text]
return self.motion_detection
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
self.motion_detection = None
return self.motion_detection
def enable_motion_detection(self):
"""Enable motion detection"""
self._set_motion_detection(True)
def disable_motion_detection(self):
"""Disable motion detection"""
self._set_motion_detection(False)
def _set_motion_detection(self, enable):
"""Set desired motion detection state on camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
enabled = self._motion_detection_xml.find(self.element_query('enabled'))
if enabled is None:
_LOGGING.error("Couldn't find 'enabled' in the xml")
_LOGGING.error('XML: %s', ET.tostring(self._motion_detection_xml))
return
enabled.text = 'true' if enable else 'false'
xml = ET.tostring(self._motion_detection_xml)
try:
response = self.hik_request.put(url, data=xml, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to set MotionDetection, error: %s', err)
return
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.error('Unable to set motion detection: %s', response.text)
self.motion_detection = enable
def add_update_callback(self, callback, sensor):
"""Register as callback for when a matching device sensor changes."""
self._updateCallbacks.append([callback, sensor])
_LOGGING.debug('Added update callback to %s on %s', callback, sensor)
def _do_update_callback(self, msg):
"""Call registered callback functions."""
for callback, sensor in self._updateCallbacks:
if sensor == msg:
_LOGGING.debug('Update callback %s for sensor %s',
callback, sensor)
callback(msg)
def element_query(self, element):
"""Build tree query for a given element."""
return '{%s}%s' % (self.namespace, element)
def initialize(self):
"""Initialize deviceInfo and available events."""
device_info = self.get_device_info()
if device_info is None:
self.name = None
self.cam_id = None
self.event_states = None
return
for key in device_info:
if key == 'deviceName':
self.name = device_info[key]
elif key == 'deviceID':
if len(device_info[key]) > 10:
self.cam_id = device_info[key]
else:
self.cam_id = uuid.uuid4()
events_available = self.get_event_triggers()
if events_available:
for event, channel_list in events_available.items():
for channel in channel_list:
try:
self.event_states.setdefault(
SENSOR_MAP[event.lower()], []).append(
[False, channel, 0, datetime.datetime.now()])
except KeyError:
# Sensor type doesn't have a known friendly name
# We can't reliably handle it at this time...
_LOGGING.warning(
'Sensor type "%s" is unsupported.', event)
_LOGGING.debug('Initialized Dictionary: %s', self.event_states)
else:
_LOGGING.debug('No Events available in dictionary.')
self.get_motion_detection()
def get_device_info(self):
"""Parse deviceInfo into dictionary."""
device_info = {}
url = '%s/ISAPI/System/deviceInfo' % self.root_url
using_digest = False
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
if response.status_code == requests.codes.not_found:
# Try alternate URL for deviceInfo
_LOGGING.debug('Using alternate deviceInfo URL.')
url = '%s/System/deviceInfo' % self.root_url
response = self.hik_request.get(url)
# Seems to be difference between camera and nvr, they can't seem to
# agree if they should 404 or 401 first
if not using_digest and response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch deviceInfo, error: %s', err)
return None
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return None
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch device info.')
return None
try:
tree = ET.fromstring(response.text)
# Try to fetch namespace from XML
nmsp = tree.tag.split('}')[0].strip('{')
self.namespace = nmsp if nmsp.startswith('http') else XML_NAMESPACE
_LOGGING.debug('Using Namespace: %s', self.namespace)
for item in tree:
tag = item.tag.split('}')[1]
device_info[tag] = item.text
return device_info
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
return None
def watchdog_handler(self):
"""Take care of threads if wachdog expires."""
_LOGGING.debug('%s Watchdog expired. Resetting connection.', self.name)
self.watchdog.stop()
self.reset_thrd.set()
def disconnect(self):
"""Disconnect from event stream."""
_LOGGING.debug('Disconnecting from stream: %s', self.name)
self.kill_thrd.set()
self.thrd.join()
_LOGGING.debug('Event stream thread for %s is stopped', self.name)
self.kill_thrd.clear()
def start_stream(self):
"""Start thread to process event stream."""
# self.watchdog.start()
self.thrd.start()
def alert_stream(self, reset_event, kill_event):
"""Open event stream."""
_LOGGING.debug('Stream Thread Started: %s, %s', self.name, self.cam_id)
start_event = False
parse_string = ""
fail_count = 0
url = '%s/ISAPI/Event/notification/alertStream' % self.root_url
# pylint: disable=too-many-nested-blocks
while True:
try:
stream = self.hik_request.get(url, stream=True,
timeout=(CONNECT_TIMEOUT,
READ_TIMEOUT))
if stream.status_code == requests.codes.not_found:
# Try alternate URL for stream
url = '%s/Event/notification/alertStream' % self.root_url
stream = self.hik_request.get(url, stream=True)
if stream.status_code != requests.codes.ok:
                    raise ValueError('Connection unsuccessful.')
else:
_LOGGING.debug('%s Connection Successful.', self.name)
fail_count = 0
self.watchdog.start()
for line in stream.iter_lines():
# _LOGGING.debug('Processing line from %s', self.name)
# filter out keep-alive new lines
if line:
str_line = line.decode("utf-8", "ignore")
                        # New events start with --boundary
if str_line.find('<EventNotificationAlert') != -1:
# Start of event message
start_event = True
parse_string += str_line
elif str_line.find('</EventNotificationAlert>') != -1:
                            # Message end found
parse_string += str_line
start_event = False
if parse_string:
tree = ET.fromstring(parse_string)
self.process_stream(tree)
self.update_stale()
parse_string = ""
else:
if start_event:
parse_string += str_line
if kill_event.is_set():
                        # We were asked to stop the thread so let's do so.
break
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
if kill_event.is_set():
                    # We were asked to stop the thread so let's do so.
_LOGGING.debug('Stopping event stream thread for %s',
self.name)
self.watchdog.stop()
self.hik_request.close()
return
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
except (ValueError,
requests.exceptions.ConnectionError,
requests.exceptions.ChunkedEncodingError) as err:
fail_count += 1
reset_event.clear()
_LOGGING.warning('%s Connection Failed (count=%d). Waiting %ss. Err: %s',
self.name, fail_count, (fail_count * 5) + 5, err)
parse_string = ""
self.watchdog.stop()
self.hik_request.close()
time.sleep(5)
self.update_stale()
time.sleep(fail_count * 5)
continue
def process_stream(self, tree):
"""Process incoming event stream packets."""
try:
etype = SENSOR_MAP[tree.find(
self.element_query('eventType')).text.lower()]
estate = tree.find(
self.element_query('eventState')).text
echid = tree.find(
self.element_query('channelID'))
if echid is None:
# Some devices use a different key
echid = tree.find(
self.element_query('dynChannelID'))
echid = int(echid.text)
ecount = tree.find(
self.element_query('activePostCount')).text
except (AttributeError, KeyError, IndexError) as err:
_LOGGING.error('Problem finding attribute: %s', err)
return
# Take care of keep-alive
if len(etype) > 0 and etype == 'Video Loss':
self.watchdog.pet()
# Track state if it's in the event list.
if len(etype) > 0:
state = self.fetch_attributes(etype, echid)
if state:
# Determine if state has changed
# If so, publish, otherwise do nothing
estate = (estate == 'active')
old_state = state[0]
attr = [estate, echid, int(ecount),
datetime.datetime.now()]
self.update_attributes(etype, echid, attr)
if estate != old_state:
self.publish_changes(etype, echid)
self.watchdog.pet()
def update_stale(self):
"""Update stale active statuses"""
# Some events don't post an inactive XML, only active.
# If we don't get an active update for 5 seconds we can
# assume the event is no longer active and update accordingly.
for etype, echannels in self.event_states.items():
for eprop in echannels:
if eprop[3] is not None:
sec_elap = ((datetime.datetime.now()-eprop[3])
.total_seconds())
# print('Seconds since last update: {}'.format(sec_elap))
if sec_elap > 5 and eprop[0] is True:
_LOGGING.debug('Updating stale event %s on CH(%s)',
etype, eprop[1])
attr = [False, eprop[1], eprop[2],
datetime.datetime.now()]
self.update_attributes(etype, eprop[1], attr)
self.publish_changes(etype, eprop[1])
def publish_changes(self, etype, echid):
"""Post updates for specified event type."""
_LOGGING.debug('%s Update: %s, %s',
self.name, etype, self.fetch_attributes(etype, echid))
signal = 'ValueChanged.{}'.format(self.cam_id)
sender = '{}.{}'.format(etype, echid)
if dispatcher:
dispatcher.send(signal=signal, sender=sender)
self._do_update_callback('{}.{}.{}'.format(self.cam_id, etype, echid))
def fetch_attributes(self, event, channel):
"""Returns attribute list for a given event/channel."""
try:
for sensor in self.event_states[event]:
if sensor[1] == int(channel):
return sensor
except KeyError:
return None
def update_attributes(self, event, channel, attr):
"""Update attribute list for current event/channel."""
try:
for i, sensor in enumerate(self.event_states[event]):
if sensor[1] == int(channel):
self.event_states[event][i] = attr
except KeyError:
_LOGGING.debug('Error updating attributes for: (%s, %s)',
event, channel)
|
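The per-channel bookkeeping that `fetch_attributes` and `update_attributes` perform on `event_states` can be reproduced in isolation. The dictionary below follows the `[active, channel, activePostCount, last_update]` entry shape used by the class; the event name and values are made up for the sketch:

```python
import datetime

# Each event type maps to a list of per-channel entries:
# [active, channel, activePostCount, last_update].
event_states = {'Motion': [[False, 1, 0, datetime.datetime.now()]]}

def fetch_attributes(event, channel):
    """Return the entry for event/channel, or None if untracked."""
    for sensor in event_states.get(event, []):
        if sensor[1] == int(channel):
            return sensor
    return None

def update_attributes(event, channel, attr):
    """Replace the entry for event/channel in place."""
    for i, sensor in enumerate(event_states.get(event, [])):
        if sensor[1] == int(channel):
            event_states[event][i] = attr

# Mark channel 1 of 'Motion' active, roughly as process_stream would on
# an incoming 'active' notification packet.
update_attributes('Motion', 1, [True, 1, 3, datetime.datetime.now()])
state = fetch_attributes('Motion', 1)
```

`process_stream` compares `state[0]` before and after the update and only publishes a change when the active flag flips, which is why the old entry is fetched before `update_attributes` is called.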
mezz64/pyHik | pyhik/hikvision.py | HikCamera.get_device_info | python | def get_device_info(self):
device_info = {}
url = '%s/ISAPI/System/deviceInfo' % self.root_url
using_digest = False
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
if response.status_code == requests.codes.not_found:
# Try alternate URL for deviceInfo
_LOGGING.debug('Using alternate deviceInfo URL.')
url = '%s/System/deviceInfo' % self.root_url
response = self.hik_request.get(url)
# Seems to be difference between camera and nvr, they can't seem to
# agree if they should 404 or 401 first
if not using_digest and response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch deviceInfo, error: %s', err)
return None
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return None
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch device info.')
return None
try:
tree = ET.fromstring(response.text)
# Try to fetch namespace from XML
nmsp = tree.tag.split('}')[0].strip('{')
self.namespace = nmsp if nmsp.startswith('http') else XML_NAMESPACE
_LOGGING.debug('Using Namespace: %s', self.namespace)
for item in tree:
tag = item.tag.split('}')[1]
device_info[tag] = item.text
return device_info
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
return None | Parse deviceInfo into dictionary. | train | https://github.com/mezz64/pyHik/blob/1e7afca926e2b045257a43cbf8b1236a435493c2/pyhik/hikvision.py#L371-L428 | null | class HikCamera(object):
"""Creates a new Hikvision api device."""
def __init__(self, host=None, port=DEFAULT_PORT,
usr=None, pwd=None):
"""Initialize device."""
_LOGGING.debug("pyHik %s initializing new hikvision device at: %s",
__version__, host)
self.event_states = {}
self.watchdog = Watchdog(300.0, self.watchdog_handler)
self.namespace = XML_NAMESPACE
if not host:
_LOGGING.error('Host not specified! Cannot continue.')
return
self.host = host
self.usr = usr
self.pwd = pwd
self.cam_id = 0
self.name = ''
self.device_type = None
self.motion_detection = None
self._motion_detection_xml = None
self.root_url = '{}:{}'.format(host, port)
# Build requests session for main thread calls
# Default to basic authentication. It will change to digest inside
# get_device_info if basic fails
self.hik_request = requests.Session()
self.hik_request.auth = (usr, pwd)
self.hik_request.headers.update(DEFAULT_HEADERS)
# Define event stream processing thread
self.kill_thrd = threading.Event()
self.reset_thrd = threading.Event()
self.thrd = threading.Thread(
target=self.alert_stream, args=(self.reset_thrd, self.kill_thrd,))
self.thrd.daemon = False
# Callbacks
self._updateCallbacks = []
self.initialize()
@property
def get_id(self):
"""Returns unique camera/nvr identifier."""
return self.cam_id
@property
def get_name(self):
"""Return camera/nvr name."""
return self.name
@property
def get_type(self):
"""Return device type."""
return self.device_type
@property
def current_event_states(self):
"""Return Event states dictionary"""
return self.event_states
@property
def current_motion_detection_state(self):
"""Return current state of motion detection property"""
return self.motion_detection
def get_motion_detection(self):
"""Fetch current motion state from camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch MotionDetection, error: %s', err)
self.motion_detection = None
return self.motion_detection
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
self.motion_detection = None
return self.motion_detection
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch motion detection.')
self.motion_detection = None
return self.motion_detection
try:
tree = ET.fromstring(response.text)
ET.register_namespace("", self.namespace)
enabled = tree.find(self.element_query('enabled'))
if enabled is not None:
self._motion_detection_xml = tree
self.motion_detection = {'true': True, 'false': False}[enabled.text]
return self.motion_detection
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
self.motion_detection = None
return self.motion_detection
def enable_motion_detection(self):
"""Enable motion detection"""
self._set_motion_detection(True)
def disable_motion_detection(self):
"""Disable motion detection"""
self._set_motion_detection(False)
def _set_motion_detection(self, enable):
"""Set desired motion detection state on camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
enabled = self._motion_detection_xml.find(self.element_query('enabled'))
if enabled is None:
_LOGGING.error("Couldn't find 'enabled' in the xml")
_LOGGING.error('XML: %s', ET.tostring(self._motion_detection_xml))
return
enabled.text = 'true' if enable else 'false'
xml = ET.tostring(self._motion_detection_xml)
try:
response = self.hik_request.put(url, data=xml, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to set MotionDetection, error: %s', err)
return
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.error('Unable to set motion detection: %s', response.text)
self.motion_detection = enable
def add_update_callback(self, callback, sensor):
"""Register as callback for when a matching device sensor changes."""
self._updateCallbacks.append([callback, sensor])
_LOGGING.debug('Added update callback to %s on %s', callback, sensor)
def _do_update_callback(self, msg):
"""Call registered callback functions."""
for callback, sensor in self._updateCallbacks:
if sensor == msg:
_LOGGING.debug('Update callback %s for sensor %s',
callback, sensor)
callback(msg)
def element_query(self, element):
"""Build tree query for a given element."""
return '{%s}%s' % (self.namespace, element)
def initialize(self):
"""Initialize deviceInfo and available events."""
device_info = self.get_device_info()
if device_info is None:
self.name = None
self.cam_id = None
self.event_states = None
return
for key in device_info:
if key == 'deviceName':
self.name = device_info[key]
elif key == 'deviceID':
if len(device_info[key]) > 10:
self.cam_id = device_info[key]
else:
self.cam_id = uuid.uuid4()
events_available = self.get_event_triggers()
if events_available:
for event, channel_list in events_available.items():
for channel in channel_list:
try:
self.event_states.setdefault(
SENSOR_MAP[event.lower()], []).append(
[False, channel, 0, datetime.datetime.now()])
except KeyError:
# Sensor type doesn't have a known friendly name
# We can't reliably handle it at this time...
_LOGGING.warning(
'Sensor type "%s" is unsupported.', event)
_LOGGING.debug('Initialized Dictionary: %s', self.event_states)
else:
_LOGGING.debug('No Events available in dictionary.')
self.get_motion_detection()
def get_event_triggers(self):
"""
Returns dict of supported events.
Key = Event Type
List = Channels that have that event activated
"""
events = {}
nvrflag = False
event_xml = []
url = '%s/ISAPI/Event/triggers' % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.not_found:
# Try alternate URL for triggers
_LOGGING.debug('Using alternate triggers URL.')
url = '%s/Event/triggers' % self.root_url
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch events, error: %s', err)
return None
if response.status_code != 200:
            # If we didn't receive 200, abort
return None
# pylint: disable=too-many-nested-blocks
try:
content = ET.fromstring(response.text)
            if content[0].find(self.element_query('EventTrigger')) is not None:
event_xml = content[0].findall(
self.element_query('EventTrigger'))
            elif content.find(self.element_query('EventTrigger')) is not None:
# This is either an NVR or a rebadged camera
event_xml = content.findall(
self.element_query('EventTrigger'))
for eventtrigger in event_xml:
ettype = eventtrigger.find(self.element_query('eventType'))
                # Catch empty xml definitions
if ettype is None:
break
etnotify = eventtrigger.find(
self.element_query('EventTriggerNotificationList'))
etchannel = None
etchannel_num = 0
for node_name in CHANNEL_NAMES:
etchannel = eventtrigger.find(
self.element_query(node_name))
if etchannel is not None:
try:
# Need to make sure this is actually a number
etchannel_num = int(etchannel.text)
if etchannel_num > 1:
# Must be an nvr
nvrflag = True
break
except ValueError:
# Field must not be an integer
pass
                if etnotify is not None:
for notifytrigger in etnotify:
ntype = notifytrigger.find(
self.element_query('notificationMethod'))
if ntype.text == 'center' or ntype.text == 'HTTP':
"""
If we got this far we found an event that we want
to track.
"""
events.setdefault(ettype.text, []) \
.append(etchannel_num)
except (AttributeError, ET.ParseError) as err:
_LOGGING.error(
'There was a problem finding an element: %s', err)
return None
if nvrflag:
self.device_type = NVR_DEVICE
else:
self.device_type = CAM_DEVICE
_LOGGING.debug('Processed %s as %s Device.',
self.cam_id, self.device_type)
_LOGGING.debug('Found events: %s', events)
self.hik_request.close()
return events
def watchdog_handler(self):
"""Take care of threads if wachdog expires."""
_LOGGING.debug('%s Watchdog expired. Resetting connection.', self.name)
self.watchdog.stop()
self.reset_thrd.set()
def disconnect(self):
"""Disconnect from event stream."""
_LOGGING.debug('Disconnecting from stream: %s', self.name)
self.kill_thrd.set()
self.thrd.join()
_LOGGING.debug('Event stream thread for %s is stopped', self.name)
self.kill_thrd.clear()
def start_stream(self):
"""Start thread to process event stream."""
# self.watchdog.start()
self.thrd.start()
def alert_stream(self, reset_event, kill_event):
"""Open event stream."""
_LOGGING.debug('Stream Thread Started: %s, %s', self.name, self.cam_id)
start_event = False
parse_string = ""
fail_count = 0
url = '%s/ISAPI/Event/notification/alertStream' % self.root_url
# pylint: disable=too-many-nested-blocks
while True:
try:
stream = self.hik_request.get(url, stream=True,
timeout=(CONNECT_TIMEOUT,
READ_TIMEOUT))
if stream.status_code == requests.codes.not_found:
# Try alternate URL for stream
url = '%s/Event/notification/alertStream' % self.root_url
stream = self.hik_request.get(url, stream=True)
if stream.status_code != requests.codes.ok:
                    raise ValueError('Connection unsuccessful.')
else:
_LOGGING.debug('%s Connection Successful.', self.name)
fail_count = 0
self.watchdog.start()
for line in stream.iter_lines():
# _LOGGING.debug('Processing line from %s', self.name)
# filter out keep-alive new lines
if line:
str_line = line.decode("utf-8", "ignore")
                        # New events start with --boundary
if str_line.find('<EventNotificationAlert') != -1:
# Start of event message
start_event = True
parse_string += str_line
elif str_line.find('</EventNotificationAlert>') != -1:
# Message end found
parse_string += str_line
start_event = False
if parse_string:
tree = ET.fromstring(parse_string)
self.process_stream(tree)
self.update_stale()
parse_string = ""
else:
if start_event:
parse_string += str_line
if kill_event.is_set():
# We were asked to stop the thread so let's do so.
break
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
if kill_event.is_set():
# We were asked to stop the thread so let's do so.
_LOGGING.debug('Stopping event stream thread for %s',
self.name)
self.watchdog.stop()
self.hik_request.close()
return
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
except (ValueError,
requests.exceptions.ConnectionError,
requests.exceptions.ChunkedEncodingError) as err:
fail_count += 1
reset_event.clear()
_LOGGING.warning('%s Connection Failed (count=%d). Waiting %ss. Err: %s',
self.name, fail_count, (fail_count * 5) + 5, err)
parse_string = ""
self.watchdog.stop()
self.hik_request.close()
time.sleep(5)
self.update_stale()
time.sleep(fail_count * 5)
continue
def process_stream(self, tree):
"""Process incoming event stream packets."""
try:
etype = SENSOR_MAP[tree.find(
self.element_query('eventType')).text.lower()]
estate = tree.find(
self.element_query('eventState')).text
echid = tree.find(
self.element_query('channelID'))
if echid is None:
# Some devices use a different key
echid = tree.find(
self.element_query('dynChannelID'))
echid = int(echid.text)
ecount = tree.find(
self.element_query('activePostCount')).text
except (AttributeError, KeyError, IndexError) as err:
_LOGGING.error('Problem finding attribute: %s', err)
return
# Take care of keep-alive
if len(etype) > 0 and etype == 'Video Loss':
self.watchdog.pet()
# Track state if it's in the event list.
if len(etype) > 0:
state = self.fetch_attributes(etype, echid)
if state:
# Determine if state has changed
# If so, publish, otherwise do nothing
estate = (estate == 'active')
old_state = state[0]
attr = [estate, echid, int(ecount),
datetime.datetime.now()]
self.update_attributes(etype, echid, attr)
if estate != old_state:
self.publish_changes(etype, echid)
self.watchdog.pet()
def update_stale(self):
"""Update stale active statuses"""
# Some events don't post an inactive XML, only active.
# If we don't get an active update for 5 seconds we can
# assume the event is no longer active and update accordingly.
for etype, echannels in self.event_states.items():
for eprop in echannels:
if eprop[3] is not None:
sec_elap = ((datetime.datetime.now()-eprop[3])
.total_seconds())
# print('Seconds since last update: {}'.format(sec_elap))
if sec_elap > 5 and eprop[0] is True:
_LOGGING.debug('Updating stale event %s on CH(%s)',
etype, eprop[1])
attr = [False, eprop[1], eprop[2],
datetime.datetime.now()]
self.update_attributes(etype, eprop[1], attr)
self.publish_changes(etype, eprop[1])
def publish_changes(self, etype, echid):
"""Post updates for specified event type."""
_LOGGING.debug('%s Update: %s, %s',
self.name, etype, self.fetch_attributes(etype, echid))
signal = 'ValueChanged.{}'.format(self.cam_id)
sender = '{}.{}'.format(etype, echid)
if dispatcher:
dispatcher.send(signal=signal, sender=sender)
self._do_update_callback('{}.{}.{}'.format(self.cam_id, etype, echid))
def fetch_attributes(self, event, channel):
"""Returns attribute list for a given event/channel."""
try:
for sensor in self.event_states[event]:
if sensor[1] == int(channel):
return sensor
except KeyError:
return None
def update_attributes(self, event, channel, attr):
"""Update attribute list for current event/channel."""
try:
for i, sensor in enumerate(self.event_states[event]):
if sensor[1] == int(channel):
self.event_states[event][i] = attr
except KeyError:
_LOGGING.debug('Error updating attributes for: (%s, %s)',
event, channel)
|
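The `fetch_attributes`/`update_attributes` pair above treats `event_states` as a dict mapping an event type to per-channel records of `[active, channel, count, timestamp]`. A minimal standalone sketch of that bookkeeping (the names mirror the class for readability, but this is an illustration, not the pyHik API itself):

```python
import datetime

# event_states maps event type -> list of
# [active?, channel, active post count, last-update timestamp]
event_states = {
    'Motion': [[False, 1, 0, datetime.datetime.now()],
               [False, 2, 0, datetime.datetime.now()]],
}

def fetch_attributes(event, channel):
    """Return the record for event/channel, or None if unknown."""
    for sensor in event_states.get(event, []):
        if sensor[1] == int(channel):
            return sensor
    return None

def update_attributes(event, channel, attr):
    """Replace the record for event/channel in place."""
    for i, sensor in enumerate(event_states.get(event, [])):
        if sensor[1] == int(channel):
            event_states[event][i] = attr

# Mark channel 2 active with a post count of 3.
update_attributes('Motion', 2, [True, 2, 3, datetime.datetime.now()])
record = fetch_attributes('Motion', 2)
```

Using `.get(event, [])` also makes the unknown-event case an explicit `None` instead of relying on a caught `KeyError`.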
mezz64/pyHik | pyhik/hikvision.py | HikCamera.watchdog_handler | python | def watchdog_handler(self):
_LOGGING.debug('%s Watchdog expired. Resetting connection.', self.name)
self.watchdog.stop()
self.reset_thrd.set() | Take care of threads if watchdog expires. | train | https://github.com/mezz64/pyHik/blob/1e7afca926e2b045257a43cbf8b1236a435493c2/pyhik/hikvision.py#L430-L434 | null | class HikCamera(object):
"""Creates a new Hikvision api device."""
def __init__(self, host=None, port=DEFAULT_PORT,
usr=None, pwd=None):
"""Initialize device."""
_LOGGING.debug("pyHik %s initializing new hikvision device at: %s",
__version__, host)
self.event_states = {}
self.watchdog = Watchdog(300.0, self.watchdog_handler)
self.namespace = XML_NAMESPACE
if not host:
_LOGGING.error('Host not specified! Cannot continue.')
return
self.host = host
self.usr = usr
self.pwd = pwd
self.cam_id = 0
self.name = ''
self.device_type = None
self.motion_detection = None
self._motion_detection_xml = None
self.root_url = '{}:{}'.format(host, port)
# Build requests session for main thread calls
# Default to basic authentication. It will change to digest inside
# get_device_info if basic fails
self.hik_request = requests.Session()
self.hik_request.auth = (usr, pwd)
self.hik_request.headers.update(DEFAULT_HEADERS)
# Define event stream processing thread
self.kill_thrd = threading.Event()
self.reset_thrd = threading.Event()
self.thrd = threading.Thread(
target=self.alert_stream, args=(self.reset_thrd, self.kill_thrd,))
self.thrd.daemon = False
# Callbacks
self._updateCallbacks = []
self.initialize()
@property
def get_id(self):
"""Returns unique camera/nvr identifier."""
return self.cam_id
@property
def get_name(self):
"""Return camera/nvr name."""
return self.name
@property
def get_type(self):
"""Return device type."""
return self.device_type
@property
def current_event_states(self):
"""Return Event states dictionary"""
return self.event_states
@property
def current_motion_detection_state(self):
"""Return current state of motion detection property"""
return self.motion_detection
def get_motion_detection(self):
"""Fetch current motion state from camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch MotionDetection, error: %s', err)
self.motion_detection = None
return self.motion_detection
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
self.motion_detection = None
return self.motion_detection
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch motion detection.')
self.motion_detection = None
return self.motion_detection
try:
tree = ET.fromstring(response.text)
ET.register_namespace("", self.namespace)
enabled = tree.find(self.element_query('enabled'))
if enabled is not None:
self._motion_detection_xml = tree
self.motion_detection = {'true': True, 'false': False}[enabled.text]
return self.motion_detection
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
self.motion_detection = None
return self.motion_detection
def enable_motion_detection(self):
"""Enable motion detection"""
self._set_motion_detection(True)
def disable_motion_detection(self):
"""Disable motion detection"""
self._set_motion_detection(False)
def _set_motion_detection(self, enable):
"""Set desired motion detection state on camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
enabled = self._motion_detection_xml.find(self.element_query('enabled'))
if enabled is None:
_LOGGING.error("Couldn't find 'enabled' in the xml")
_LOGGING.error('XML: %s', ET.tostring(self._motion_detection_xml))
return
enabled.text = 'true' if enable else 'false'
xml = ET.tostring(self._motion_detection_xml)
try:
response = self.hik_request.put(url, data=xml, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to set MotionDetection, error: %s', err)
return
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort without updating local state
_LOGGING.error('Unable to set motion detection: %s', response.text)
return
self.motion_detection = enable
def add_update_callback(self, callback, sensor):
"""Register as callback for when a matching device sensor changes."""
self._updateCallbacks.append([callback, sensor])
_LOGGING.debug('Added update callback to %s on %s', callback, sensor)
def _do_update_callback(self, msg):
"""Call registered callback functions."""
for callback, sensor in self._updateCallbacks:
if sensor == msg:
_LOGGING.debug('Update callback %s for sensor %s',
callback, sensor)
callback(msg)
def element_query(self, element):
"""Build tree query for a given element."""
return '{%s}%s' % (self.namespace, element)
def initialize(self):
"""Initialize deviceInfo and available events."""
device_info = self.get_device_info()
if device_info is None:
self.name = None
self.cam_id = None
self.event_states = None
return
for key in device_info:
if key == 'deviceName':
self.name = device_info[key]
elif key == 'deviceID':
if len(device_info[key]) > 10:
self.cam_id = device_info[key]
else:
self.cam_id = uuid.uuid4()
events_available = self.get_event_triggers()
if events_available:
for event, channel_list in events_available.items():
for channel in channel_list:
try:
self.event_states.setdefault(
SENSOR_MAP[event.lower()], []).append(
[False, channel, 0, datetime.datetime.now()])
except KeyError:
# Sensor type doesn't have a known friendly name
# We can't reliably handle it at this time...
_LOGGING.warning(
'Sensor type "%s" is unsupported.', event)
_LOGGING.debug('Initialized Dictionary: %s', self.event_states)
else:
_LOGGING.debug('No Events available in dictionary.')
self.get_motion_detection()
def get_event_triggers(self):
"""
Returns dict of supported events.
Key = Event Type
List = Channels that have that event activated
"""
events = {}
nvrflag = False
event_xml = []
url = '%s/ISAPI/Event/triggers' % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.not_found:
# Try alternate URL for triggers
_LOGGING.debug('Using alternate triggers URL.')
url = '%s/Event/triggers' % self.root_url
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch events, error: %s', err)
return None
if response.status_code != 200:
# If we didn't receive 200, abort
return None
# pylint: disable=too-many-nested-blocks
try:
content = ET.fromstring(response.text)
if content[0].find(self.element_query('EventTrigger')) is not None:
event_xml = content[0].findall(
self.element_query('EventTrigger'))
elif content.find(self.element_query('EventTrigger')) is not None:
# This is either an NVR or a rebadged camera
event_xml = content.findall(
self.element_query('EventTrigger'))
for eventtrigger in event_xml:
ettype = eventtrigger.find(self.element_query('eventType'))
# Catch empty xml definitions
if ettype is None:
break
etnotify = eventtrigger.find(
self.element_query('EventTriggerNotificationList'))
etchannel = None
etchannel_num = 0
for node_name in CHANNEL_NAMES:
etchannel = eventtrigger.find(
self.element_query(node_name))
if etchannel is not None:
try:
# Need to make sure this is actually a number
etchannel_num = int(etchannel.text)
if etchannel_num > 1:
# Must be an nvr
nvrflag = True
break
except ValueError:
# Field must not be an integer
pass
if etnotify is not None:
for notifytrigger in etnotify:
ntype = notifytrigger.find(
self.element_query('notificationMethod'))
if ntype.text == 'center' or ntype.text == 'HTTP':
"""
If we got this far we found an event that we want
to track.
"""
events.setdefault(ettype.text, []) \
.append(etchannel_num)
except (AttributeError, ET.ParseError) as err:
_LOGGING.error(
'There was a problem finding an element: %s', err)
return None
if nvrflag:
self.device_type = NVR_DEVICE
else:
self.device_type = CAM_DEVICE
_LOGGING.debug('Processed %s as %s Device.',
self.cam_id, self.device_type)
_LOGGING.debug('Found events: %s', events)
self.hik_request.close()
return events
def get_device_info(self):
"""Parse deviceInfo into dictionary."""
device_info = {}
url = '%s/ISAPI/System/deviceInfo' % self.root_url
using_digest = False
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
if response.status_code == requests.codes.not_found:
# Try alternate URL for deviceInfo
_LOGGING.debug('Using alternate deviceInfo URL.')
url = '%s/System/deviceInfo' % self.root_url
response = self.hik_request.get(url)
# There seems to be a difference between cameras and NVRs; they can't
# agree whether to return 404 or 401 first
if not using_digest and response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch deviceInfo, error: %s', err)
return None
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return None
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch device info.')
return None
try:
tree = ET.fromstring(response.text)
# Try to fetch namespace from XML
nmsp = tree.tag.split('}')[0].strip('{')
self.namespace = nmsp if nmsp.startswith('http') else XML_NAMESPACE
_LOGGING.debug('Using Namespace: %s', self.namespace)
for item in tree:
tag = item.tag.split('}')[1]
device_info[tag] = item.text
return device_info
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
return None
def disconnect(self):
"""Disconnect from event stream."""
_LOGGING.debug('Disconnecting from stream: %s', self.name)
self.kill_thrd.set()
self.thrd.join()
_LOGGING.debug('Event stream thread for %s is stopped', self.name)
self.kill_thrd.clear()
def start_stream(self):
"""Start thread to process event stream."""
# self.watchdog.start()
self.thrd.start()
def alert_stream(self, reset_event, kill_event):
"""Open event stream."""
_LOGGING.debug('Stream Thread Started: %s, %s', self.name, self.cam_id)
start_event = False
parse_string = ""
fail_count = 0
url = '%s/ISAPI/Event/notification/alertStream' % self.root_url
# pylint: disable=too-many-nested-blocks
while True:
try:
stream = self.hik_request.get(url, stream=True,
timeout=(CONNECT_TIMEOUT,
READ_TIMEOUT))
if stream.status_code == requests.codes.not_found:
# Try alternate URL for stream
url = '%s/Event/notification/alertStream' % self.root_url
stream = self.hik_request.get(url, stream=True)
if stream.status_code != requests.codes.ok:
raise ValueError('Connection unsuccessful.')
else:
_LOGGING.debug('%s Connection Successful.', self.name)
fail_count = 0
self.watchdog.start()
for line in stream.iter_lines():
# _LOGGING.debug('Processing line from %s', self.name)
# filter out keep-alive new lines
if line:
str_line = line.decode("utf-8", "ignore")
# New events start with --boundary
if str_line.find('<EventNotificationAlert') != -1:
# Start of event message
start_event = True
parse_string += str_line
elif str_line.find('</EventNotificationAlert>') != -1:
# Message end found
parse_string += str_line
start_event = False
if parse_string:
tree = ET.fromstring(parse_string)
self.process_stream(tree)
self.update_stale()
parse_string = ""
else:
if start_event:
parse_string += str_line
if kill_event.is_set():
# We were asked to stop the thread so let's do so.
break
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
if kill_event.is_set():
# We were asked to stop the thread so let's do so.
_LOGGING.debug('Stopping event stream thread for %s',
self.name)
self.watchdog.stop()
self.hik_request.close()
return
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
except (ValueError,
requests.exceptions.ConnectionError,
requests.exceptions.ChunkedEncodingError) as err:
fail_count += 1
reset_event.clear()
_LOGGING.warning('%s Connection Failed (count=%d). Waiting %ss. Err: %s',
self.name, fail_count, (fail_count * 5) + 5, err)
parse_string = ""
self.watchdog.stop()
self.hik_request.close()
time.sleep(5)
self.update_stale()
time.sleep(fail_count * 5)
continue
def process_stream(self, tree):
"""Process incoming event stream packets."""
try:
etype = SENSOR_MAP[tree.find(
self.element_query('eventType')).text.lower()]
estate = tree.find(
self.element_query('eventState')).text
echid = tree.find(
self.element_query('channelID'))
if echid is None:
# Some devices use a different key
echid = tree.find(
self.element_query('dynChannelID'))
echid = int(echid.text)
ecount = tree.find(
self.element_query('activePostCount')).text
except (AttributeError, KeyError, IndexError) as err:
_LOGGING.error('Problem finding attribute: %s', err)
return
# Take care of keep-alive
if len(etype) > 0 and etype == 'Video Loss':
self.watchdog.pet()
# Track state if it's in the event list.
if len(etype) > 0:
state = self.fetch_attributes(etype, echid)
if state:
# Determine if state has changed
# If so, publish, otherwise do nothing
estate = (estate == 'active')
old_state = state[0]
attr = [estate, echid, int(ecount),
datetime.datetime.now()]
self.update_attributes(etype, echid, attr)
if estate != old_state:
self.publish_changes(etype, echid)
self.watchdog.pet()
def update_stale(self):
"""Update stale active statuses"""
# Some events don't post an inactive XML, only active.
# If we don't get an active update for 5 seconds we can
# assume the event is no longer active and update accordingly.
for etype, echannels in self.event_states.items():
for eprop in echannels:
if eprop[3] is not None:
sec_elap = ((datetime.datetime.now()-eprop[3])
.total_seconds())
# print('Seconds since last update: {}'.format(sec_elap))
if sec_elap > 5 and eprop[0] is True:
_LOGGING.debug('Updating stale event %s on CH(%s)',
etype, eprop[1])
attr = [False, eprop[1], eprop[2],
datetime.datetime.now()]
self.update_attributes(etype, eprop[1], attr)
self.publish_changes(etype, eprop[1])
def publish_changes(self, etype, echid):
"""Post updates for specified event type."""
_LOGGING.debug('%s Update: %s, %s',
self.name, etype, self.fetch_attributes(etype, echid))
signal = 'ValueChanged.{}'.format(self.cam_id)
sender = '{}.{}'.format(etype, echid)
if dispatcher:
dispatcher.send(signal=signal, sender=sender)
self._do_update_callback('{}.{}.{}'.format(self.cam_id, etype, echid))
def fetch_attributes(self, event, channel):
"""Returns attribute list for a given event/channel."""
try:
for sensor in self.event_states[event]:
if sensor[1] == int(channel):
return sensor
except KeyError:
return None
def update_attributes(self, event, channel, attr):
"""Update attribute list for current event/channel."""
try:
for i, sensor in enumerate(self.event_states[event]):
if sensor[1] == int(channel):
self.event_states[event][i] = attr
except KeyError:
_LOGGING.debug('Error updating attributes for: (%s, %s)',
event, channel)
|
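The `Watchdog(300.0, self.watchdog_handler)` helper used throughout this class is not shown in the excerpt; its `start()`/`pet()`/`stop()` interface suggests a resettable expiry timer. A hedged sketch of such a helper built on `threading.Timer` (an assumption about the real implementation, not a copy of pyHik's):

```python
import threading

class Watchdog(object):
    """Resettable timer: fires `handler` if pet() isn't called in time."""

    def __init__(self, timeout, handler):
        self._timeout = timeout  # seconds of allowed silence
        self._handler = handler  # callable invoked on expiry
        self._timer = None

    def start(self):
        """(Re)arm the countdown, cancelling any running timer first."""
        self.stop()
        self._timer = threading.Timer(self._timeout, self._handler)
        self._timer.daemon = True
        self._timer.start()

    def pet(self):
        """Feed the watchdog: restart the countdown from zero."""
        self.start()

    def stop(self):
        """Disarm the watchdog so the handler will not fire."""
        if self._timer is not None:
            self._timer.cancel()
            self._timer = None
```

In `alert_stream`, each processed event calls `pet()`, so the handler only runs after the stream has been silent for the full timeout, which is exactly when `watchdog_handler` resets the connection.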
mezz64/pyHik | pyhik/hikvision.py | HikCamera.disconnect | python | def disconnect(self):
_LOGGING.debug('Disconnecting from stream: %s', self.name)
self.kill_thrd.set()
self.thrd.join()
_LOGGING.debug('Event stream thread for %s is stopped', self.name)
self.kill_thrd.clear() | Disconnect from event stream. | train | https://github.com/mezz64/pyHik/blob/1e7afca926e2b045257a43cbf8b1236a435493c2/pyhik/hikvision.py#L436-L442 | null | class HikCamera(object):
"""Creates a new Hikvision api device."""
def __init__(self, host=None, port=DEFAULT_PORT,
usr=None, pwd=None):
"""Initialize device."""
_LOGGING.debug("pyHik %s initializing new hikvision device at: %s",
__version__, host)
self.event_states = {}
self.watchdog = Watchdog(300.0, self.watchdog_handler)
self.namespace = XML_NAMESPACE
if not host:
_LOGGING.error('Host not specified! Cannot continue.')
return
self.host = host
self.usr = usr
self.pwd = pwd
self.cam_id = 0
self.name = ''
self.device_type = None
self.motion_detection = None
self._motion_detection_xml = None
self.root_url = '{}:{}'.format(host, port)
# Build requests session for main thread calls
# Default to basic authentication. It will change to digest inside
# get_device_info if basic fails
self.hik_request = requests.Session()
self.hik_request.auth = (usr, pwd)
self.hik_request.headers.update(DEFAULT_HEADERS)
# Define event stream processing thread
self.kill_thrd = threading.Event()
self.reset_thrd = threading.Event()
self.thrd = threading.Thread(
target=self.alert_stream, args=(self.reset_thrd, self.kill_thrd,))
self.thrd.daemon = False
# Callbacks
self._updateCallbacks = []
self.initialize()
@property
def get_id(self):
"""Returns unique camera/nvr identifier."""
return self.cam_id
@property
def get_name(self):
"""Return camera/nvr name."""
return self.name
@property
def get_type(self):
"""Return device type."""
return self.device_type
@property
def current_event_states(self):
"""Return Event states dictionary"""
return self.event_states
@property
def current_motion_detection_state(self):
"""Return current state of motion detection property"""
return self.motion_detection
def get_motion_detection(self):
"""Fetch current motion state from camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch MotionDetection, error: %s', err)
self.motion_detection = None
return self.motion_detection
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
self.motion_detection = None
return self.motion_detection
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch motion detection.')
self.motion_detection = None
return self.motion_detection
try:
tree = ET.fromstring(response.text)
ET.register_namespace("", self.namespace)
enabled = tree.find(self.element_query('enabled'))
if enabled is not None:
self._motion_detection_xml = tree
self.motion_detection = {'true': True, 'false': False}[enabled.text]
return self.motion_detection
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
self.motion_detection = None
return self.motion_detection
def enable_motion_detection(self):
"""Enable motion detection"""
self._set_motion_detection(True)
def disable_motion_detection(self):
"""Disable motion detection"""
self._set_motion_detection(False)
def _set_motion_detection(self, enable):
"""Set desired motion detection state on camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
enabled = self._motion_detection_xml.find(self.element_query('enabled'))
if enabled is None:
_LOGGING.error("Couldn't find 'enabled' in the xml")
_LOGGING.error('XML: %s', ET.tostring(self._motion_detection_xml))
return
enabled.text = 'true' if enable else 'false'
xml = ET.tostring(self._motion_detection_xml)
try:
response = self.hik_request.put(url, data=xml, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to set MotionDetection, error: %s', err)
return
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort without updating local state
_LOGGING.error('Unable to set motion detection: %s', response.text)
return
self.motion_detection = enable
def add_update_callback(self, callback, sensor):
"""Register as callback for when a matching device sensor changes."""
self._updateCallbacks.append([callback, sensor])
_LOGGING.debug('Added update callback to %s on %s', callback, sensor)
def _do_update_callback(self, msg):
"""Call registered callback functions."""
for callback, sensor in self._updateCallbacks:
if sensor == msg:
_LOGGING.debug('Update callback %s for sensor %s',
callback, sensor)
callback(msg)
def element_query(self, element):
"""Build tree query for a given element."""
return '{%s}%s' % (self.namespace, element)
def initialize(self):
"""Initialize deviceInfo and available events."""
device_info = self.get_device_info()
if device_info is None:
self.name = None
self.cam_id = None
self.event_states = None
return
for key in device_info:
if key == 'deviceName':
self.name = device_info[key]
elif key == 'deviceID':
if len(device_info[key]) > 10:
self.cam_id = device_info[key]
else:
self.cam_id = uuid.uuid4()
events_available = self.get_event_triggers()
if events_available:
for event, channel_list in events_available.items():
for channel in channel_list:
try:
self.event_states.setdefault(
SENSOR_MAP[event.lower()], []).append(
[False, channel, 0, datetime.datetime.now()])
except KeyError:
# Sensor type doesn't have a known friendly name
# We can't reliably handle it at this time...
_LOGGING.warning(
'Sensor type "%s" is unsupported.', event)
_LOGGING.debug('Initialized Dictionary: %s', self.event_states)
else:
_LOGGING.debug('No Events available in dictionary.')
self.get_motion_detection()
def get_event_triggers(self):
"""
Returns dict of supported events.
Key = Event Type
List = Channels that have that event activated
"""
events = {}
nvrflag = False
event_xml = []
url = '%s/ISAPI/Event/triggers' % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.not_found:
# Try alternate URL for triggers
_LOGGING.debug('Using alternate triggers URL.')
url = '%s/Event/triggers' % self.root_url
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch events, error: %s', err)
return None
if response.status_code != 200:
# If we didn't receive 200, abort
return None
# pylint: disable=too-many-nested-blocks
try:
content = ET.fromstring(response.text)
if content[0].find(self.element_query('EventTrigger')) is not None:
event_xml = content[0].findall(
self.element_query('EventTrigger'))
elif content.find(self.element_query('EventTrigger')) is not None:
# This is either an NVR or a rebadged camera
event_xml = content.findall(
self.element_query('EventTrigger'))
for eventtrigger in event_xml:
ettype = eventtrigger.find(self.element_query('eventType'))
# Catch empty xml definitions
if ettype is None:
break
etnotify = eventtrigger.find(
self.element_query('EventTriggerNotificationList'))
etchannel = None
etchannel_num = 0
for node_name in CHANNEL_NAMES:
etchannel = eventtrigger.find(
self.element_query(node_name))
if etchannel is not None:
try:
# Need to make sure this is actually a number
etchannel_num = int(etchannel.text)
if etchannel_num > 1:
# Must be an nvr
nvrflag = True
break
except ValueError:
# Field must not be an integer
pass
if etnotify is not None:
for notifytrigger in etnotify:
ntype = notifytrigger.find(
self.element_query('notificationMethod'))
if ntype.text == 'center' or ntype.text == 'HTTP':
"""
If we got this far we found an event that we want
to track.
"""
events.setdefault(ettype.text, []) \
.append(etchannel_num)
except (AttributeError, ET.ParseError) as err:
_LOGGING.error(
'There was a problem finding an element: %s', err)
return None
if nvrflag:
self.device_type = NVR_DEVICE
else:
self.device_type = CAM_DEVICE
_LOGGING.debug('Processed %s as %s Device.',
self.cam_id, self.device_type)
_LOGGING.debug('Found events: %s', events)
self.hik_request.close()
return events
def get_device_info(self):
"""Parse deviceInfo into dictionary."""
device_info = {}
url = '%s/ISAPI/System/deviceInfo' % self.root_url
using_digest = False
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
if response.status_code == requests.codes.not_found:
# Try alternate URL for deviceInfo
_LOGGING.debug('Using alternate deviceInfo URL.')
url = '%s/System/deviceInfo' % self.root_url
response = self.hik_request.get(url)
# There seems to be a difference between cameras and NVRs; they can't
# agree whether to return 404 or 401 first
if not using_digest and response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch deviceInfo, error: %s', err)
return None
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return None
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch device info.')
return None
try:
tree = ET.fromstring(response.text)
# Try to fetch namespace from XML
nmsp = tree.tag.split('}')[0].strip('{')
self.namespace = nmsp if nmsp.startswith('http') else XML_NAMESPACE
_LOGGING.debug('Using Namespace: %s', self.namespace)
for item in tree:
tag = item.tag.split('}')[1]
device_info[tag] = item.text
return device_info
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
return None
def watchdog_handler(self):
"""Take care of threads if wachdog expires."""
_LOGGING.debug('%s Watchdog expired. Resetting connection.', self.name)
self.watchdog.stop()
self.reset_thrd.set()
def start_stream(self):
"""Start thread to process event stream."""
# self.watchdog.start()
self.thrd.start()
def alert_stream(self, reset_event, kill_event):
"""Open event stream."""
_LOGGING.debug('Stream Thread Started: %s, %s', self.name, self.cam_id)
start_event = False
parse_string = ""
fail_count = 0
url = '%s/ISAPI/Event/notification/alertStream' % self.root_url
# pylint: disable=too-many-nested-blocks
while True:
try:
stream = self.hik_request.get(url, stream=True,
timeout=(CONNECT_TIMEOUT,
READ_TIMEOUT))
if stream.status_code == requests.codes.not_found:
# Try alternate URL for stream
url = '%s/Event/notification/alertStream' % self.root_url
stream = self.hik_request.get(url, stream=True)
if stream.status_code != requests.codes.ok:
raise ValueError('Connection unsuccessful.')
else:
_LOGGING.debug('%s Connection Successful.', self.name)
fail_count = 0
self.watchdog.start()
for line in stream.iter_lines():
# _LOGGING.debug('Processing line from %s', self.name)
# filter out keep-alive new lines
if line:
str_line = line.decode("utf-8", "ignore")
# New events start with --boundary
if str_line.find('<EventNotificationAlert') != -1:
# Start of event message
start_event = True
parse_string += str_line
elif str_line.find('</EventNotificationAlert>') != -1:
# Message end found
parse_string += str_line
start_event = False
if parse_string:
tree = ET.fromstring(parse_string)
self.process_stream(tree)
self.update_stale()
parse_string = ""
else:
if start_event:
parse_string += str_line
if kill_event.is_set():
# We were asked to stop the thread so let's do so.
break
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
if kill_event.is_set():
                # We were asked to stop the thread so let's do so.
_LOGGING.debug('Stopping event stream thread for %s',
self.name)
self.watchdog.stop()
self.hik_request.close()
return
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
except (ValueError,
requests.exceptions.ConnectionError,
requests.exceptions.ChunkedEncodingError) as err:
fail_count += 1
reset_event.clear()
_LOGGING.warning('%s Connection Failed (count=%d). Waiting %ss. Err: %s',
self.name, fail_count, (fail_count * 5) + 5, err)
parse_string = ""
self.watchdog.stop()
self.hik_request.close()
time.sleep(5)
self.update_stale()
time.sleep(fail_count * 5)
continue
def process_stream(self, tree):
"""Process incoming event stream packets."""
try:
etype = SENSOR_MAP[tree.find(
self.element_query('eventType')).text.lower()]
estate = tree.find(
self.element_query('eventState')).text
echid = tree.find(
self.element_query('channelID'))
if echid is None:
# Some devices use a different key
echid = tree.find(
self.element_query('dynChannelID'))
echid = int(echid.text)
ecount = tree.find(
self.element_query('activePostCount')).text
except (AttributeError, KeyError, IndexError) as err:
_LOGGING.error('Problem finding attribute: %s', err)
return
# Take care of keep-alive
if len(etype) > 0 and etype == 'Video Loss':
self.watchdog.pet()
# Track state if it's in the event list.
if len(etype) > 0:
state = self.fetch_attributes(etype, echid)
if state:
# Determine if state has changed
# If so, publish, otherwise do nothing
estate = (estate == 'active')
old_state = state[0]
attr = [estate, echid, int(ecount),
datetime.datetime.now()]
self.update_attributes(etype, echid, attr)
if estate != old_state:
self.publish_changes(etype, echid)
self.watchdog.pet()
def update_stale(self):
"""Update stale active statuses"""
# Some events don't post an inactive XML, only active.
# If we don't get an active update for 5 seconds we can
# assume the event is no longer active and update accordingly.
for etype, echannels in self.event_states.items():
for eprop in echannels:
if eprop[3] is not None:
sec_elap = ((datetime.datetime.now()-eprop[3])
.total_seconds())
# print('Seconds since last update: {}'.format(sec_elap))
if sec_elap > 5 and eprop[0] is True:
_LOGGING.debug('Updating stale event %s on CH(%s)',
etype, eprop[1])
attr = [False, eprop[1], eprop[2],
datetime.datetime.now()]
self.update_attributes(etype, eprop[1], attr)
self.publish_changes(etype, eprop[1])
def publish_changes(self, etype, echid):
"""Post updates for specified event type."""
_LOGGING.debug('%s Update: %s, %s',
self.name, etype, self.fetch_attributes(etype, echid))
signal = 'ValueChanged.{}'.format(self.cam_id)
sender = '{}.{}'.format(etype, echid)
if dispatcher:
dispatcher.send(signal=signal, sender=sender)
self._do_update_callback('{}.{}.{}'.format(self.cam_id, etype, echid))
def fetch_attributes(self, event, channel):
"""Returns attribute list for a given event/channel."""
try:
for sensor in self.event_states[event]:
if sensor[1] == int(channel):
return sensor
except KeyError:
return None
def update_attributes(self, event, channel, attr):
"""Update attribute list for current event/channel."""
try:
for i, sensor in enumerate(self.event_states[event]):
if sensor[1] == int(channel):
self.event_states[event][i] = attr
except KeyError:
_LOGGING.debug('Error updating attributes for: (%s, %s)',
event, channel)
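The `fetch_attributes`/`update_attributes` pair above searches the `event_states` dictionary, whose values are lists of `[state, channel, count, last_update]` entries. A minimal standalone sketch of that lookup-and-replace logic (the layout is assumed from the class above; the function names mirror the methods but take the dict explicitly):

```python
import datetime


def fetch_attributes(event_states, event, channel):
    """Return the [state, channel, count, last_update] entry, or None."""
    try:
        for sensor in event_states[event]:
            if sensor[1] == int(channel):
                return sensor
    except KeyError:
        return None


def update_attributes(event_states, event, channel, attr):
    """Replace the entry for the matching event/channel in place."""
    try:
        for i, sensor in enumerate(event_states[event]):
            if sensor[1] == int(channel):
                event_states[event][i] = attr
    except KeyError:
        pass


states = {'Motion': [[False, 1, 0, None]]}
update_attributes(states, 'Motion', 1, [True, 1, 3, datetime.datetime.now()])
assert fetch_attributes(states, 'Motion', 1)[0] is True
assert fetch_attributes(states, 'Tamper', 1) is None
```

An unknown event type simply yields `None` rather than raising, which is why the class methods catch `KeyError`.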
|
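The stream code below relies on a `Watchdog` with `start()`, `stop()`, and `pet()` methods; its implementation is not part of this chunk. One plausible minimal sketch of that interface, built on `threading.Timer` (an illustration only, not the project's actual class):

```python
import threading


class Watchdog(object):
    """Call `handler` if `pet()` is not invoked within `timeout` seconds."""

    def __init__(self, timeout, handler):
        self._timeout = timeout
        self._handler = handler
        self._timer = None

    def start(self):
        """Arm (or re-arm) the expiry timer."""
        self.stop()
        self._timer = threading.Timer(self._timeout, self._handler)
        self._timer.daemon = True
        self._timer.start()

    def pet(self):
        """Reset the countdown; called whenever a keep-alive arrives."""
        self.start()

    def stop(self):
        """Disarm the timer, if armed."""
        if self._timer is not None:
            self._timer.cancel()
            self._timer = None


fired = threading.Event()
dog = Watchdog(0.05, fired.set)
dog.start()
fired.wait(1.0)
assert fired.is_set()
```

In `alert_stream`, the keep-alive "Video Loss" events pet the watchdog; if the stream stalls, the handler sets `reset_thrd` and the loop reconnects.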
mezz64/pyHik | pyhik/hikvision.py | HikCamera.alert_stream | python | def alert_stream(self, reset_event, kill_event):
_LOGGING.debug('Stream Thread Started: %s, %s', self.name, self.cam_id)
start_event = False
parse_string = ""
fail_count = 0
url = '%s/ISAPI/Event/notification/alertStream' % self.root_url
# pylint: disable=too-many-nested-blocks
while True:
try:
stream = self.hik_request.get(url, stream=True,
timeout=(CONNECT_TIMEOUT,
READ_TIMEOUT))
if stream.status_code == requests.codes.not_found:
# Try alternate URL for stream
url = '%s/Event/notification/alertStream' % self.root_url
stream = self.hik_request.get(url, stream=True)
if stream.status_code != requests.codes.ok:
                    raise ValueError('Connection unsuccessful.')
else:
_LOGGING.debug('%s Connection Successful.', self.name)
fail_count = 0
self.watchdog.start()
for line in stream.iter_lines():
# _LOGGING.debug('Processing line from %s', self.name)
# filter out keep-alive new lines
if line:
str_line = line.decode("utf-8", "ignore")
                        # New events start with --boundary
if str_line.find('<EventNotificationAlert') != -1:
# Start of event message
start_event = True
parse_string += str_line
elif str_line.find('</EventNotificationAlert>') != -1:
                            # Message end found
parse_string += str_line
start_event = False
if parse_string:
tree = ET.fromstring(parse_string)
self.process_stream(tree)
self.update_stale()
parse_string = ""
else:
if start_event:
parse_string += str_line
if kill_event.is_set():
                    # We were asked to stop the thread so let's do so.
break
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
if kill_event.is_set():
                # We were asked to stop the thread so let's do so.
_LOGGING.debug('Stopping event stream thread for %s',
self.name)
self.watchdog.stop()
self.hik_request.close()
return
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
except (ValueError,
requests.exceptions.ConnectionError,
requests.exceptions.ChunkedEncodingError) as err:
fail_count += 1
reset_event.clear()
_LOGGING.warning('%s Connection Failed (count=%d). Waiting %ss. Err: %s',
self.name, fail_count, (fail_count * 5) + 5, err)
parse_string = ""
self.watchdog.stop()
self.hik_request.close()
time.sleep(5)
self.update_stale()
time.sleep(fail_count * 5)
continue | Open event stream. | train | https://github.com/mezz64/pyHik/blob/1e7afca926e2b045257a43cbf8b1236a435493c2/pyhik/hikvision.py#L449-L531 | [
"def update_stale(self):\n \"\"\"Update stale active statuses\"\"\"\n # Some events don't post an inactive XML, only active.\n # If we don't get an active update for 5 seconds we can\n # assume the event is no longer active and update accordingly.\n for etype, echannels in self.event_states.items():\n for eprop in echannels:\n if eprop[3] is not None:\n sec_elap = ((datetime.datetime.now()-eprop[3])\n .total_seconds())\n # print('Seconds since last update: {}'.format(sec_elap))\n if sec_elap > 5 and eprop[0] is True:\n _LOGGING.debug('Updating stale event %s on CH(%s)',\n etype, eprop[1])\n attr = [False, eprop[1], eprop[2],\n datetime.datetime.now()]\n self.update_attributes(etype, eprop[1], attr)\n self.publish_changes(etype, eprop[1])\n"
] | class HikCamera(object):
"""Creates a new Hikvision api device."""
def __init__(self, host=None, port=DEFAULT_PORT,
usr=None, pwd=None):
"""Initialize device."""
_LOGGING.debug("pyHik %s initializing new hikvision device at: %s",
__version__, host)
self.event_states = {}
self.watchdog = Watchdog(300.0, self.watchdog_handler)
self.namespace = XML_NAMESPACE
if not host:
_LOGGING.error('Host not specified! Cannot continue.')
return
self.host = host
self.usr = usr
self.pwd = pwd
self.cam_id = 0
self.name = ''
self.device_type = None
self.motion_detection = None
self._motion_detection_xml = None
self.root_url = '{}:{}'.format(host, port)
# Build requests session for main thread calls
# Default to basic authentication. It will change to digest inside
# get_device_info if basic fails
self.hik_request = requests.Session()
self.hik_request.auth = (usr, pwd)
self.hik_request.headers.update(DEFAULT_HEADERS)
# Define event stream processing thread
self.kill_thrd = threading.Event()
self.reset_thrd = threading.Event()
self.thrd = threading.Thread(
target=self.alert_stream, args=(self.reset_thrd, self.kill_thrd,))
self.thrd.daemon = False
# Callbacks
self._updateCallbacks = []
self.initialize()
@property
def get_id(self):
"""Returns unique camera/nvr identifier."""
return self.cam_id
@property
def get_name(self):
"""Return camera/nvr name."""
return self.name
@property
def get_type(self):
"""Return device type."""
return self.device_type
@property
def current_event_states(self):
"""Return Event states dictionary"""
return self.event_states
@property
def current_motion_detection_state(self):
"""Return current state of motion detection property"""
return self.motion_detection
def get_motion_detection(self):
"""Fetch current motion state from camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch MotionDetection, error: %s', err)
self.motion_detection = None
return self.motion_detection
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
self.motion_detection = None
return self.motion_detection
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch motion detection.')
self.motion_detection = None
return self.motion_detection
try:
tree = ET.fromstring(response.text)
ET.register_namespace("", self.namespace)
enabled = tree.find(self.element_query('enabled'))
if enabled is not None:
self._motion_detection_xml = tree
self.motion_detection = {'true': True, 'false': False}[enabled.text]
return self.motion_detection
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
self.motion_detection = None
return self.motion_detection
def enable_motion_detection(self):
"""Enable motion detection"""
self._set_motion_detection(True)
def disable_motion_detection(self):
"""Disable motion detection"""
self._set_motion_detection(False)
def _set_motion_detection(self, enable):
"""Set desired motion detection state on camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
enabled = self._motion_detection_xml.find(self.element_query('enabled'))
if enabled is None:
_LOGGING.error("Couldn't find 'enabled' in the xml")
_LOGGING.error('XML: %s', ET.tostring(self._motion_detection_xml))
return
enabled.text = 'true' if enable else 'false'
xml = ET.tostring(self._motion_detection_xml)
try:
response = self.hik_request.put(url, data=xml, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to set MotionDetection, error: %s', err)
return
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.error('Unable to set motion detection: %s', response.text)
self.motion_detection = enable
def add_update_callback(self, callback, sensor):
"""Register as callback for when a matching device sensor changes."""
self._updateCallbacks.append([callback, sensor])
_LOGGING.debug('Added update callback to %s on %s', callback, sensor)
def _do_update_callback(self, msg):
"""Call registered callback functions."""
for callback, sensor in self._updateCallbacks:
if sensor == msg:
_LOGGING.debug('Update callback %s for sensor %s',
callback, sensor)
callback(msg)
def element_query(self, element):
"""Build tree query for a given element."""
return '{%s}%s' % (self.namespace, element)
def initialize(self):
"""Initialize deviceInfo and available events."""
device_info = self.get_device_info()
if device_info is None:
self.name = None
self.cam_id = None
self.event_states = None
return
for key in device_info:
if key == 'deviceName':
self.name = device_info[key]
elif key == 'deviceID':
if len(device_info[key]) > 10:
self.cam_id = device_info[key]
else:
self.cam_id = uuid.uuid4()
events_available = self.get_event_triggers()
if events_available:
for event, channel_list in events_available.items():
for channel in channel_list:
try:
self.event_states.setdefault(
SENSOR_MAP[event.lower()], []).append(
[False, channel, 0, datetime.datetime.now()])
except KeyError:
# Sensor type doesn't have a known friendly name
# We can't reliably handle it at this time...
_LOGGING.warning(
'Sensor type "%s" is unsupported.', event)
_LOGGING.debug('Initialized Dictionary: %s', self.event_states)
else:
_LOGGING.debug('No Events available in dictionary.')
self.get_motion_detection()
def get_event_triggers(self):
"""
Returns dict of supported events.
Key = Event Type
List = Channels that have that event activated
"""
events = {}
nvrflag = False
event_xml = []
url = '%s/ISAPI/Event/triggers' % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.not_found:
# Try alternate URL for triggers
_LOGGING.debug('Using alternate triggers URL.')
url = '%s/Event/triggers' % self.root_url
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch events, error: %s', err)
return None
if response.status_code != 200:
            # If we didn't receive 200, abort
return None
# pylint: disable=too-many-nested-blocks
try:
content = ET.fromstring(response.text)
if content[0].find(self.element_query('EventTrigger')):
event_xml = content[0].findall(
self.element_query('EventTrigger'))
elif content.find(self.element_query('EventTrigger')):
# This is either an NVR or a rebadged camera
event_xml = content.findall(
self.element_query('EventTrigger'))
for eventtrigger in event_xml:
ettype = eventtrigger.find(self.element_query('eventType'))
                # Catch empty xml definitions
if ettype is None:
break
etnotify = eventtrigger.find(
self.element_query('EventTriggerNotificationList'))
etchannel = None
etchannel_num = 0
for node_name in CHANNEL_NAMES:
etchannel = eventtrigger.find(
self.element_query(node_name))
if etchannel is not None:
try:
# Need to make sure this is actually a number
etchannel_num = int(etchannel.text)
if etchannel_num > 1:
# Must be an nvr
nvrflag = True
break
except ValueError:
# Field must not be an integer
pass
if etnotify:
for notifytrigger in etnotify:
ntype = notifytrigger.find(
self.element_query('notificationMethod'))
if ntype.text == 'center' or ntype.text == 'HTTP':
"""
If we got this far we found an event that we want
to track.
"""
events.setdefault(ettype.text, []) \
.append(etchannel_num)
except (AttributeError, ET.ParseError) as err:
_LOGGING.error(
'There was a problem finding an element: %s', err)
return None
if nvrflag:
self.device_type = NVR_DEVICE
else:
self.device_type = CAM_DEVICE
_LOGGING.debug('Processed %s as %s Device.',
self.cam_id, self.device_type)
_LOGGING.debug('Found events: %s', events)
self.hik_request.close()
return events
def get_device_info(self):
"""Parse deviceInfo into dictionary."""
device_info = {}
url = '%s/ISAPI/System/deviceInfo' % self.root_url
using_digest = False
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
if response.status_code == requests.codes.not_found:
# Try alternate URL for deviceInfo
_LOGGING.debug('Using alternate deviceInfo URL.')
url = '%s/System/deviceInfo' % self.root_url
response = self.hik_request.get(url)
            # There seems to be a difference between cameras and NVRs; they
            # can't agree on whether to return 404 or 401 first
if not using_digest and response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch deviceInfo, error: %s', err)
return None
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return None
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch device info.')
return None
try:
tree = ET.fromstring(response.text)
# Try to fetch namespace from XML
nmsp = tree.tag.split('}')[0].strip('{')
self.namespace = nmsp if nmsp.startswith('http') else XML_NAMESPACE
_LOGGING.debug('Using Namespace: %s', self.namespace)
for item in tree:
tag = item.tag.split('}')[1]
device_info[tag] = item.text
return device_info
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
return None
def watchdog_handler(self):
"""Take care of threads if wachdog expires."""
_LOGGING.debug('%s Watchdog expired. Resetting connection.', self.name)
self.watchdog.stop()
self.reset_thrd.set()
def disconnect(self):
"""Disconnect from event stream."""
_LOGGING.debug('Disconnecting from stream: %s', self.name)
self.kill_thrd.set()
self.thrd.join()
_LOGGING.debug('Event stream thread for %s is stopped', self.name)
self.kill_thrd.clear()
def start_stream(self):
"""Start thread to process event stream."""
# self.watchdog.start()
self.thrd.start()
def process_stream(self, tree):
"""Process incoming event stream packets."""
try:
etype = SENSOR_MAP[tree.find(
self.element_query('eventType')).text.lower()]
estate = tree.find(
self.element_query('eventState')).text
echid = tree.find(
self.element_query('channelID'))
if echid is None:
# Some devices use a different key
echid = tree.find(
self.element_query('dynChannelID'))
echid = int(echid.text)
ecount = tree.find(
self.element_query('activePostCount')).text
except (AttributeError, KeyError, IndexError) as err:
_LOGGING.error('Problem finding attribute: %s', err)
return
# Take care of keep-alive
if len(etype) > 0 and etype == 'Video Loss':
self.watchdog.pet()
# Track state if it's in the event list.
if len(etype) > 0:
state = self.fetch_attributes(etype, echid)
if state:
# Determine if state has changed
# If so, publish, otherwise do nothing
estate = (estate == 'active')
old_state = state[0]
attr = [estate, echid, int(ecount),
datetime.datetime.now()]
self.update_attributes(etype, echid, attr)
if estate != old_state:
self.publish_changes(etype, echid)
self.watchdog.pet()
def update_stale(self):
"""Update stale active statuses"""
# Some events don't post an inactive XML, only active.
# If we don't get an active update for 5 seconds we can
# assume the event is no longer active and update accordingly.
for etype, echannels in self.event_states.items():
for eprop in echannels:
if eprop[3] is not None:
sec_elap = ((datetime.datetime.now()-eprop[3])
.total_seconds())
# print('Seconds since last update: {}'.format(sec_elap))
if sec_elap > 5 and eprop[0] is True:
_LOGGING.debug('Updating stale event %s on CH(%s)',
etype, eprop[1])
attr = [False, eprop[1], eprop[2],
datetime.datetime.now()]
self.update_attributes(etype, eprop[1], attr)
self.publish_changes(etype, eprop[1])
def publish_changes(self, etype, echid):
"""Post updates for specified event type."""
_LOGGING.debug('%s Update: %s, %s',
self.name, etype, self.fetch_attributes(etype, echid))
signal = 'ValueChanged.{}'.format(self.cam_id)
sender = '{}.{}'.format(etype, echid)
if dispatcher:
dispatcher.send(signal=signal, sender=sender)
self._do_update_callback('{}.{}.{}'.format(self.cam_id, etype, echid))
def fetch_attributes(self, event, channel):
"""Returns attribute list for a given event/channel."""
try:
for sensor in self.event_states[event]:
if sensor[1] == int(channel):
return sensor
except KeyError:
return None
def update_attributes(self, event, channel, attr):
"""Update attribute list for current event/channel."""
try:
for i, sensor in enumerate(self.event_states[event]):
if sensor[1] == int(channel):
self.event_states[event][i] = attr
except KeyError:
_LOGGING.debug('Error updating attributes for: (%s, %s)',
event, channel)
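`update_stale` encodes a simple rule: some devices never post an "inactive" event, so any entry still marked active with no update for more than 5 seconds is flipped back to inactive. A standalone sketch of that rule (the `[state, channel, count, last_update]` layout is assumed from the class above; `expire_stale` is a hypothetical helper name):

```python
import datetime


def expire_stale(event_states, now=None, max_age=5.0):
    """Flip active entries older than max_age seconds back to inactive.

    Returns the (event_type, channel) pairs that were expired, so the
    caller can publish the state changes.
    """
    expired = []
    now = now or datetime.datetime.now()
    for etype, channels in event_states.items():
        for i, (active, chan, count, last) in enumerate(channels):
            if active and last is not None and \
                    (now - last).total_seconds() > max_age:
                channels[i] = [False, chan, count, now]
                expired.append((etype, chan))
    return expired


old = datetime.datetime.now() - datetime.timedelta(seconds=10)
states = {'Motion': [[True, 1, 2, old]]}
assert expire_stale(states) == [('Motion', 1)]
assert states['Motion'][0][0] is False
```

A freshly updated active entry is left untouched, which is why the real method is safe to call on every stream iteration.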
|
mezz64/pyHik | pyhik/hikvision.py | HikCamera.process_stream | python | def process_stream(self, tree):
try:
etype = SENSOR_MAP[tree.find(
self.element_query('eventType')).text.lower()]
estate = tree.find(
self.element_query('eventState')).text
echid = tree.find(
self.element_query('channelID'))
if echid is None:
# Some devices use a different key
echid = tree.find(
self.element_query('dynChannelID'))
echid = int(echid.text)
ecount = tree.find(
self.element_query('activePostCount')).text
except (AttributeError, KeyError, IndexError) as err:
_LOGGING.error('Problem finding attribute: %s', err)
return
# Take care of keep-alive
if len(etype) > 0 and etype == 'Video Loss':
self.watchdog.pet()
# Track state if it's in the event list.
if len(etype) > 0:
state = self.fetch_attributes(etype, echid)
if state:
# Determine if state has changed
# If so, publish, otherwise do nothing
estate = (estate == 'active')
old_state = state[0]
attr = [estate, echid, int(ecount),
datetime.datetime.now()]
self.update_attributes(etype, echid, attr)
if estate != old_state:
self.publish_changes(etype, echid)
self.watchdog.pet() | Process incoming event stream packets. | train | https://github.com/mezz64/pyHik/blob/1e7afca926e2b045257a43cbf8b1236a435493c2/pyhik/hikvision.py#L533-L571 | [
"def element_query(self, element):\n \"\"\"Build tree query for a given element.\"\"\"\n return '{%s}%s' % (self.namespace, element)\n",
"def publish_changes(self, etype, echid):\n \"\"\"Post updates for specified event type.\"\"\"\n _LOGGING.debug('%s Update: %s, %s',\n self.name, etype, self.fetch_attributes(etype, echid))\n signal = 'ValueChanged.{}'.format(self.cam_id)\n sender = '{}.{}'.format(etype, echid)\n if dispatcher:\n dispatcher.send(signal=signal, sender=sender)\n\n self._do_update_callback('{}.{}.{}'.format(self.cam_id, etype, echid))\n",
"def fetch_attributes(self, event, channel):\n \"\"\"Returns attribute list for a given event/channel.\"\"\"\n try:\n for sensor in self.event_states[event]:\n if sensor[1] == int(channel):\n return sensor\n except KeyError:\n return None\n",
"def update_attributes(self, event, channel, attr):\n \"\"\"Update attribute list for current event/channel.\"\"\"\n try:\n for i, sensor in enumerate(self.event_states[event]):\n if sensor[1] == int(channel):\n self.event_states[event][i] = attr\n except KeyError:\n _LOGGING.debug('Error updating attributes for: (%s, %s)',\n event, channel)\n"
] | class HikCamera(object):
"""Creates a new Hikvision api device."""
def __init__(self, host=None, port=DEFAULT_PORT,
usr=None, pwd=None):
"""Initialize device."""
_LOGGING.debug("pyHik %s initializing new hikvision device at: %s",
__version__, host)
self.event_states = {}
self.watchdog = Watchdog(300.0, self.watchdog_handler)
self.namespace = XML_NAMESPACE
if not host:
_LOGGING.error('Host not specified! Cannot continue.')
return
self.host = host
self.usr = usr
self.pwd = pwd
self.cam_id = 0
self.name = ''
self.device_type = None
self.motion_detection = None
self._motion_detection_xml = None
self.root_url = '{}:{}'.format(host, port)
# Build requests session for main thread calls
# Default to basic authentication. It will change to digest inside
# get_device_info if basic fails
self.hik_request = requests.Session()
self.hik_request.auth = (usr, pwd)
self.hik_request.headers.update(DEFAULT_HEADERS)
# Define event stream processing thread
self.kill_thrd = threading.Event()
self.reset_thrd = threading.Event()
self.thrd = threading.Thread(
target=self.alert_stream, args=(self.reset_thrd, self.kill_thrd,))
self.thrd.daemon = False
# Callbacks
self._updateCallbacks = []
self.initialize()
@property
def get_id(self):
"""Returns unique camera/nvr identifier."""
return self.cam_id
@property
def get_name(self):
"""Return camera/nvr name."""
return self.name
@property
def get_type(self):
"""Return device type."""
return self.device_type
@property
def current_event_states(self):
"""Return Event states dictionary"""
return self.event_states
@property
def current_motion_detection_state(self):
"""Return current state of motion detection property"""
return self.motion_detection
def get_motion_detection(self):
"""Fetch current motion state from camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch MotionDetection, error: %s', err)
self.motion_detection = None
return self.motion_detection
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
self.motion_detection = None
return self.motion_detection
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch motion detection.')
self.motion_detection = None
return self.motion_detection
try:
tree = ET.fromstring(response.text)
ET.register_namespace("", self.namespace)
enabled = tree.find(self.element_query('enabled'))
if enabled is not None:
self._motion_detection_xml = tree
self.motion_detection = {'true': True, 'false': False}[enabled.text]
return self.motion_detection
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
self.motion_detection = None
return self.motion_detection
def enable_motion_detection(self):
"""Enable motion detection"""
self._set_motion_detection(True)
def disable_motion_detection(self):
"""Disable motion detection"""
self._set_motion_detection(False)
def _set_motion_detection(self, enable):
"""Set desired motion detection state on camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
enabled = self._motion_detection_xml.find(self.element_query('enabled'))
if enabled is None:
_LOGGING.error("Couldn't find 'enabled' in the xml")
_LOGGING.error('XML: %s', ET.tostring(self._motion_detection_xml))
return
enabled.text = 'true' if enable else 'false'
xml = ET.tostring(self._motion_detection_xml)
try:
response = self.hik_request.put(url, data=xml, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to set MotionDetection, error: %s', err)
return
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.error('Unable to set motion detection: %s', response.text)
self.motion_detection = enable
def add_update_callback(self, callback, sensor):
"""Register as callback for when a matching device sensor changes."""
self._updateCallbacks.append([callback, sensor])
_LOGGING.debug('Added update callback to %s on %s', callback, sensor)
def _do_update_callback(self, msg):
"""Call registered callback functions."""
for callback, sensor in self._updateCallbacks:
if sensor == msg:
_LOGGING.debug('Update callback %s for sensor %s',
callback, sensor)
callback(msg)
def element_query(self, element):
"""Build tree query for a given element."""
return '{%s}%s' % (self.namespace, element)
def initialize(self):
"""Initialize deviceInfo and available events."""
device_info = self.get_device_info()
if device_info is None:
self.name = None
self.cam_id = None
self.event_states = None
return
for key in device_info:
if key == 'deviceName':
self.name = device_info[key]
elif key == 'deviceID':
if len(device_info[key]) > 10:
self.cam_id = device_info[key]
else:
self.cam_id = uuid.uuid4()
events_available = self.get_event_triggers()
if events_available:
for event, channel_list in events_available.items():
for channel in channel_list:
try:
self.event_states.setdefault(
SENSOR_MAP[event.lower()], []).append(
[False, channel, 0, datetime.datetime.now()])
except KeyError:
# Sensor type doesn't have a known friendly name
# We can't reliably handle it at this time...
_LOGGING.warning(
'Sensor type "%s" is unsupported.', event)
_LOGGING.debug('Initialized Dictionary: %s', self.event_states)
else:
_LOGGING.debug('No Events available in dictionary.')
self.get_motion_detection()
def get_event_triggers(self):
"""
Returns dict of supported events.
Key = Event Type
List = Channels that have that event activated
"""
events = {}
nvrflag = False
event_xml = []
url = '%s/ISAPI/Event/triggers' % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.not_found:
# Try alternate URL for triggers
_LOGGING.debug('Using alternate triggers URL.')
url = '%s/Event/triggers' % self.root_url
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch events, error: %s', err)
return None
if response.status_code != 200:
            # If we didn't receive 200, abort
return None
# pylint: disable=too-many-nested-blocks
try:
content = ET.fromstring(response.text)
if content[0].find(self.element_query('EventTrigger')):
event_xml = content[0].findall(
self.element_query('EventTrigger'))
elif content.find(self.element_query('EventTrigger')):
# This is either an NVR or a rebadged camera
event_xml = content.findall(
self.element_query('EventTrigger'))
for eventtrigger in event_xml:
ettype = eventtrigger.find(self.element_query('eventType'))
                # Catch empty xml definitions
if ettype is None:
break
etnotify = eventtrigger.find(
self.element_query('EventTriggerNotificationList'))
etchannel = None
etchannel_num = 0
for node_name in CHANNEL_NAMES:
etchannel = eventtrigger.find(
self.element_query(node_name))
if etchannel is not None:
try:
# Need to make sure this is actually a number
etchannel_num = int(etchannel.text)
if etchannel_num > 1:
# Must be an nvr
nvrflag = True
break
except ValueError:
# Field must not be an integer
pass
if etnotify:
for notifytrigger in etnotify:
ntype = notifytrigger.find(
self.element_query('notificationMethod'))
if ntype.text == 'center' or ntype.text == 'HTTP':
"""
If we got this far we found an event that we want
to track.
"""
events.setdefault(ettype.text, []) \
.append(etchannel_num)
except (AttributeError, ET.ParseError) as err:
_LOGGING.error(
'There was a problem finding an element: %s', err)
return None
if nvrflag:
self.device_type = NVR_DEVICE
else:
self.device_type = CAM_DEVICE
_LOGGING.debug('Processed %s as %s Device.',
self.cam_id, self.device_type)
_LOGGING.debug('Found events: %s', events)
self.hik_request.close()
return events
def get_device_info(self):
"""Parse deviceInfo into dictionary."""
device_info = {}
url = '%s/ISAPI/System/deviceInfo' % self.root_url
using_digest = False
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
if response.status_code == requests.codes.not_found:
# Try alternate URL for deviceInfo
_LOGGING.debug('Using alternate deviceInfo URL.')
url = '%s/System/deviceInfo' % self.root_url
response = self.hik_request.get(url)
            # There seems to be a difference between cameras and NVRs; they
            # can't agree on whether to return 404 or 401 first
if not using_digest and response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch deviceInfo, error: %s', err)
return None
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return None
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch device info.')
return None
try:
tree = ET.fromstring(response.text)
# Try to fetch namespace from XML
nmsp = tree.tag.split('}')[0].strip('{')
self.namespace = nmsp if nmsp.startswith('http') else XML_NAMESPACE
_LOGGING.debug('Using Namespace: %s', self.namespace)
for item in tree:
tag = item.tag.split('}')[1]
device_info[tag] = item.text
return device_info
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
return None
def watchdog_handler(self):
"""Take care of threads if wachdog expires."""
_LOGGING.debug('%s Watchdog expired. Resetting connection.', self.name)
self.watchdog.stop()
self.reset_thrd.set()
def disconnect(self):
"""Disconnect from event stream."""
_LOGGING.debug('Disconnecting from stream: %s', self.name)
self.kill_thrd.set()
self.thrd.join()
_LOGGING.debug('Event stream thread for %s is stopped', self.name)
self.kill_thrd.clear()
def start_stream(self):
"""Start thread to process event stream."""
# self.watchdog.start()
self.thrd.start()
def alert_stream(self, reset_event, kill_event):
"""Open event stream."""
_LOGGING.debug('Stream Thread Started: %s, %s', self.name, self.cam_id)
start_event = False
parse_string = ""
fail_count = 0
url = '%s/ISAPI/Event/notification/alertStream' % self.root_url
# pylint: disable=too-many-nested-blocks
while True:
try:
stream = self.hik_request.get(url, stream=True,
timeout=(CONNECT_TIMEOUT,
READ_TIMEOUT))
if stream.status_code == requests.codes.not_found:
# Try alternate URL for stream
url = '%s/Event/notification/alertStream' % self.root_url
stream = self.hik_request.get(url, stream=True)
if stream.status_code != requests.codes.ok:
                    raise ValueError('Connection unsuccessful.')
else:
_LOGGING.debug('%s Connection Successful.', self.name)
fail_count = 0
self.watchdog.start()
for line in stream.iter_lines():
# _LOGGING.debug('Processing line from %s', self.name)
# filter out keep-alive new lines
if line:
str_line = line.decode("utf-8", "ignore")
                            # New events start with --boundary
if str_line.find('<EventNotificationAlert') != -1:
# Start of event message
start_event = True
parse_string += str_line
elif str_line.find('</EventNotificationAlert>') != -1:
                                # Message end found
parse_string += str_line
start_event = False
if parse_string:
tree = ET.fromstring(parse_string)
self.process_stream(tree)
self.update_stale()
parse_string = ""
else:
if start_event:
parse_string += str_line
if kill_event.is_set():
# We were asked to stop the thread so lets do so.
break
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
if kill_event.is_set():
# We were asked to stop the thread so lets do so.
_LOGGING.debug('Stopping event stream thread for %s',
self.name)
self.watchdog.stop()
self.hik_request.close()
return
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
except (ValueError,
requests.exceptions.ConnectionError,
requests.exceptions.ChunkedEncodingError) as err:
fail_count += 1
reset_event.clear()
_LOGGING.warning('%s Connection Failed (count=%d). Waiting %ss. Err: %s',
self.name, fail_count, (fail_count * 5) + 5, err)
parse_string = ""
self.watchdog.stop()
self.hik_request.close()
time.sleep(5)
self.update_stale()
time.sleep(fail_count * 5)
continue
def update_stale(self):
"""Update stale active statuses"""
# Some events don't post an inactive XML, only active.
# If we don't get an active update for 5 seconds we can
# assume the event is no longer active and update accordingly.
for etype, echannels in self.event_states.items():
for eprop in echannels:
if eprop[3] is not None:
sec_elap = ((datetime.datetime.now()-eprop[3])
.total_seconds())
# print('Seconds since last update: {}'.format(sec_elap))
if sec_elap > 5 and eprop[0] is True:
_LOGGING.debug('Updating stale event %s on CH(%s)',
etype, eprop[1])
attr = [False, eprop[1], eprop[2],
datetime.datetime.now()]
self.update_attributes(etype, eprop[1], attr)
self.publish_changes(etype, eprop[1])
def publish_changes(self, etype, echid):
"""Post updates for specified event type."""
_LOGGING.debug('%s Update: %s, %s',
self.name, etype, self.fetch_attributes(etype, echid))
signal = 'ValueChanged.{}'.format(self.cam_id)
sender = '{}.{}'.format(etype, echid)
if dispatcher:
dispatcher.send(signal=signal, sender=sender)
self._do_update_callback('{}.{}.{}'.format(self.cam_id, etype, echid))
def fetch_attributes(self, event, channel):
"""Returns attribute list for a given event/channel."""
try:
for sensor in self.event_states[event]:
if sensor[1] == int(channel):
return sensor
except KeyError:
return None
def update_attributes(self, event, channel, attr):
"""Update attribute list for current event/channel."""
try:
for i, sensor in enumerate(self.event_states[event]):
if sensor[1] == int(channel):
self.event_states[event][i] = attr
except KeyError:
_LOGGING.debug('Error updating attributes for: (%s, %s)',
event, channel)
|
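The alert_stream() code above accumulates stream lines between the `<EventNotificationAlert` open marker and the `</EventNotificationAlert>` close marker before handing the buffer to ElementTree. A minimal standalone sketch of that accumulator follows; the sample payload is hypothetical and unnamespaced, unlike real Hikvision events, and the function name is illustrative:

```python
import xml.etree.ElementTree as ET

def parse_alert_lines(lines):
    """Accumulate lines between <EventNotificationAlert> markers and
    yield each parsed XML tree, mirroring the loop in alert_stream()."""
    buf = ""
    in_event = False
    for line in lines:
        if '<EventNotificationAlert' in line:
            # Start of an event message
            in_event = True
            buf += line
        elif '</EventNotificationAlert>' in line:
            # End of the message: parse and reset the buffer
            buf += line
            in_event = False
            yield ET.fromstring(buf)
            buf = ""
        elif in_event:
            buf += line

sample = [
    '--boundary',
    '<EventNotificationAlert>',
    '<eventType>VMD</eventType>',
    '<eventState>active</eventState>',
    '</EventNotificationAlert>',
]
trees = list(parse_alert_lines(sample))
print(trees[0].find('eventType').text)  # -> VMD
```

Keep-alive lines and multipart boundaries fall through all three branches, which is why the real loop can also use them to pet the watchdog.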
mezz64/pyHik | pyhik/hikvision.py | HikCamera.update_stale | python | def update_stale(self):
# Some events don't post an inactive XML, only active.
# If we don't get an active update for 5 seconds we can
# assume the event is no longer active and update accordingly.
for etype, echannels in self.event_states.items():
for eprop in echannels:
if eprop[3] is not None:
sec_elap = ((datetime.datetime.now()-eprop[3])
.total_seconds())
# print('Seconds since last update: {}'.format(sec_elap))
if sec_elap > 5 and eprop[0] is True:
_LOGGING.debug('Updating stale event %s on CH(%s)',
etype, eprop[1])
attr = [False, eprop[1], eprop[2],
datetime.datetime.now()]
self.update_attributes(etype, eprop[1], attr)
self.publish_changes(etype, eprop[1]) | Update stale active statuses | train | https://github.com/mezz64/pyHik/blob/1e7afca926e2b045257a43cbf8b1236a435493c2/pyhik/hikvision.py#L573-L590 | [
"def publish_changes(self, etype, echid):\n \"\"\"Post updates for specified event type.\"\"\"\n _LOGGING.debug('%s Update: %s, %s',\n self.name, etype, self.fetch_attributes(etype, echid))\n signal = 'ValueChanged.{}'.format(self.cam_id)\n sender = '{}.{}'.format(etype, echid)\n if dispatcher:\n dispatcher.send(signal=signal, sender=sender)\n\n self._do_update_callback('{}.{}.{}'.format(self.cam_id, etype, echid))\n",
"def update_attributes(self, event, channel, attr):\n \"\"\"Update attribute list for current event/channel.\"\"\"\n try:\n for i, sensor in enumerate(self.event_states[event]):\n if sensor[1] == int(channel):\n self.event_states[event][i] = attr\n except KeyError:\n _LOGGING.debug('Error updating attributes for: (%s, %s)',\n event, channel)\n"
] | class HikCamera(object):
"""Creates a new Hikvision api device."""
def __init__(self, host=None, port=DEFAULT_PORT,
usr=None, pwd=None):
"""Initialize device."""
_LOGGING.debug("pyHik %s initializing new hikvision device at: %s",
__version__, host)
self.event_states = {}
self.watchdog = Watchdog(300.0, self.watchdog_handler)
self.namespace = XML_NAMESPACE
if not host:
_LOGGING.error('Host not specified! Cannot continue.')
return
self.host = host
self.usr = usr
self.pwd = pwd
self.cam_id = 0
self.name = ''
self.device_type = None
self.motion_detection = None
self._motion_detection_xml = None
self.root_url = '{}:{}'.format(host, port)
# Build requests session for main thread calls
# Default to basic authentication. It will change to digest inside
# get_device_info if basic fails
self.hik_request = requests.Session()
self.hik_request.auth = (usr, pwd)
self.hik_request.headers.update(DEFAULT_HEADERS)
# Define event stream processing thread
self.kill_thrd = threading.Event()
self.reset_thrd = threading.Event()
self.thrd = threading.Thread(
target=self.alert_stream, args=(self.reset_thrd, self.kill_thrd,))
self.thrd.daemon = False
# Callbacks
self._updateCallbacks = []
self.initialize()
@property
def get_id(self):
"""Returns unique camera/nvr identifier."""
return self.cam_id
@property
def get_name(self):
"""Return camera/nvr name."""
return self.name
@property
def get_type(self):
"""Return device type."""
return self.device_type
@property
def current_event_states(self):
"""Return Event states dictionary"""
return self.event_states
@property
def current_motion_detection_state(self):
"""Return current state of motion detection property"""
return self.motion_detection
def get_motion_detection(self):
"""Fetch current motion state from camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch MotionDetection, error: %s', err)
self.motion_detection = None
return self.motion_detection
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
self.motion_detection = None
return self.motion_detection
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch motion detection.')
self.motion_detection = None
return self.motion_detection
try:
tree = ET.fromstring(response.text)
ET.register_namespace("", self.namespace)
enabled = tree.find(self.element_query('enabled'))
if enabled is not None:
self._motion_detection_xml = tree
self.motion_detection = {'true': True, 'false': False}[enabled.text]
return self.motion_detection
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
self.motion_detection = None
return self.motion_detection
def enable_motion_detection(self):
"""Enable motion detection"""
self._set_motion_detection(True)
def disable_motion_detection(self):
"""Disable motion detection"""
self._set_motion_detection(False)
def _set_motion_detection(self, enable):
"""Set desired motion detection state on camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
enabled = self._motion_detection_xml.find(self.element_query('enabled'))
if enabled is None:
_LOGGING.error("Couldn't find 'enabled' in the xml")
_LOGGING.error('XML: %s', ET.tostring(self._motion_detection_xml))
return
enabled.text = 'true' if enable else 'false'
xml = ET.tostring(self._motion_detection_xml)
try:
response = self.hik_request.put(url, data=xml, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to set MotionDetection, error: %s', err)
return
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.error('Unable to set motion detection: %s', response.text)
self.motion_detection = enable
def add_update_callback(self, callback, sensor):
"""Register as callback for when a matching device sensor changes."""
self._updateCallbacks.append([callback, sensor])
_LOGGING.debug('Added update callback to %s on %s', callback, sensor)
def _do_update_callback(self, msg):
"""Call registered callback functions."""
for callback, sensor in self._updateCallbacks:
if sensor == msg:
_LOGGING.debug('Update callback %s for sensor %s',
callback, sensor)
callback(msg)
def element_query(self, element):
"""Build tree query for a given element."""
return '{%s}%s' % (self.namespace, element)
def initialize(self):
"""Initialize deviceInfo and available events."""
device_info = self.get_device_info()
if device_info is None:
self.name = None
self.cam_id = None
self.event_states = None
return
for key in device_info:
if key == 'deviceName':
self.name = device_info[key]
elif key == 'deviceID':
if len(device_info[key]) > 10:
self.cam_id = device_info[key]
else:
self.cam_id = uuid.uuid4()
events_available = self.get_event_triggers()
if events_available:
for event, channel_list in events_available.items():
for channel in channel_list:
try:
self.event_states.setdefault(
SENSOR_MAP[event.lower()], []).append(
[False, channel, 0, datetime.datetime.now()])
except KeyError:
# Sensor type doesn't have a known friendly name
# We can't reliably handle it at this time...
_LOGGING.warning(
'Sensor type "%s" is unsupported.', event)
_LOGGING.debug('Initialized Dictionary: %s', self.event_states)
else:
_LOGGING.debug('No Events available in dictionary.')
self.get_motion_detection()
def get_event_triggers(self):
"""
Returns dict of supported events.
Key = Event Type
List = Channels that have that event activated
"""
events = {}
nvrflag = False
event_xml = []
url = '%s/ISAPI/Event/triggers' % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.not_found:
# Try alternate URL for triggers
_LOGGING.debug('Using alternate triggers URL.')
url = '%s/Event/triggers' % self.root_url
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch events, error: %s', err)
return None
if response.status_code != 200:
            # If we didn't receive 200, abort
return None
# pylint: disable=too-many-nested-blocks
try:
content = ET.fromstring(response.text)
            if content[0].find(self.element_query('EventTrigger')) is not None:
event_xml = content[0].findall(
self.element_query('EventTrigger'))
            elif content.find(self.element_query('EventTrigger')) is not None:
# This is either an NVR or a rebadged camera
event_xml = content.findall(
self.element_query('EventTrigger'))
for eventtrigger in event_xml:
ettype = eventtrigger.find(self.element_query('eventType'))
                # Catch empty xml definitions
if ettype is None:
break
etnotify = eventtrigger.find(
self.element_query('EventTriggerNotificationList'))
etchannel = None
etchannel_num = 0
for node_name in CHANNEL_NAMES:
etchannel = eventtrigger.find(
self.element_query(node_name))
if etchannel is not None:
try:
# Need to make sure this is actually a number
etchannel_num = int(etchannel.text)
if etchannel_num > 1:
# Must be an nvr
nvrflag = True
break
except ValueError:
# Field must not be an integer
pass
                if etnotify is not None:
for notifytrigger in etnotify:
ntype = notifytrigger.find(
self.element_query('notificationMethod'))
if ntype.text == 'center' or ntype.text == 'HTTP':
"""
If we got this far we found an event that we want
to track.
"""
events.setdefault(ettype.text, []) \
.append(etchannel_num)
except (AttributeError, ET.ParseError) as err:
_LOGGING.error(
'There was a problem finding an element: %s', err)
return None
if nvrflag:
self.device_type = NVR_DEVICE
else:
self.device_type = CAM_DEVICE
_LOGGING.debug('Processed %s as %s Device.',
self.cam_id, self.device_type)
_LOGGING.debug('Found events: %s', events)
self.hik_request.close()
return events
def get_device_info(self):
"""Parse deviceInfo into dictionary."""
device_info = {}
url = '%s/ISAPI/System/deviceInfo' % self.root_url
using_digest = False
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
if response.status_code == requests.codes.not_found:
# Try alternate URL for deviceInfo
_LOGGING.debug('Using alternate deviceInfo URL.')
url = '%s/System/deviceInfo' % self.root_url
response = self.hik_request.get(url)
            # Cameras and NVRs seem to differ here: they don't agree
            # on whether to return 404 or 401 first
if not using_digest and response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch deviceInfo, error: %s', err)
return None
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return None
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch device info.')
return None
try:
tree = ET.fromstring(response.text)
# Try to fetch namespace from XML
nmsp = tree.tag.split('}')[0].strip('{')
self.namespace = nmsp if nmsp.startswith('http') else XML_NAMESPACE
_LOGGING.debug('Using Namespace: %s', self.namespace)
for item in tree:
tag = item.tag.split('}')[1]
device_info[tag] = item.text
return device_info
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
return None
def watchdog_handler(self):
"""Take care of threads if wachdog expires."""
_LOGGING.debug('%s Watchdog expired. Resetting connection.', self.name)
self.watchdog.stop()
self.reset_thrd.set()
def disconnect(self):
"""Disconnect from event stream."""
_LOGGING.debug('Disconnecting from stream: %s', self.name)
self.kill_thrd.set()
self.thrd.join()
_LOGGING.debug('Event stream thread for %s is stopped', self.name)
self.kill_thrd.clear()
def start_stream(self):
"""Start thread to process event stream."""
# self.watchdog.start()
self.thrd.start()
def alert_stream(self, reset_event, kill_event):
"""Open event stream."""
_LOGGING.debug('Stream Thread Started: %s, %s', self.name, self.cam_id)
start_event = False
parse_string = ""
fail_count = 0
url = '%s/ISAPI/Event/notification/alertStream' % self.root_url
# pylint: disable=too-many-nested-blocks
while True:
try:
stream = self.hik_request.get(url, stream=True,
timeout=(CONNECT_TIMEOUT,
READ_TIMEOUT))
if stream.status_code == requests.codes.not_found:
# Try alternate URL for stream
url = '%s/Event/notification/alertStream' % self.root_url
stream = self.hik_request.get(url, stream=True)
if stream.status_code != requests.codes.ok:
                    raise ValueError('Connection unsuccessful.')
else:
_LOGGING.debug('%s Connection Successful.', self.name)
fail_count = 0
self.watchdog.start()
for line in stream.iter_lines():
# _LOGGING.debug('Processing line from %s', self.name)
# filter out keep-alive new lines
if line:
str_line = line.decode("utf-8", "ignore")
                            # New events start with --boundary
if str_line.find('<EventNotificationAlert') != -1:
# Start of event message
start_event = True
parse_string += str_line
elif str_line.find('</EventNotificationAlert>') != -1:
                                # Message end found
parse_string += str_line
start_event = False
if parse_string:
tree = ET.fromstring(parse_string)
self.process_stream(tree)
self.update_stale()
parse_string = ""
else:
if start_event:
parse_string += str_line
if kill_event.is_set():
# We were asked to stop the thread so lets do so.
break
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
if kill_event.is_set():
# We were asked to stop the thread so lets do so.
_LOGGING.debug('Stopping event stream thread for %s',
self.name)
self.watchdog.stop()
self.hik_request.close()
return
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
except (ValueError,
requests.exceptions.ConnectionError,
requests.exceptions.ChunkedEncodingError) as err:
fail_count += 1
reset_event.clear()
_LOGGING.warning('%s Connection Failed (count=%d). Waiting %ss. Err: %s',
self.name, fail_count, (fail_count * 5) + 5, err)
parse_string = ""
self.watchdog.stop()
self.hik_request.close()
time.sleep(5)
self.update_stale()
time.sleep(fail_count * 5)
continue
def process_stream(self, tree):
"""Process incoming event stream packets."""
try:
etype = SENSOR_MAP[tree.find(
self.element_query('eventType')).text.lower()]
estate = tree.find(
self.element_query('eventState')).text
echid = tree.find(
self.element_query('channelID'))
if echid is None:
# Some devices use a different key
echid = tree.find(
self.element_query('dynChannelID'))
echid = int(echid.text)
ecount = tree.find(
self.element_query('activePostCount')).text
except (AttributeError, KeyError, IndexError) as err:
_LOGGING.error('Problem finding attribute: %s', err)
return
# Take care of keep-alive
if len(etype) > 0 and etype == 'Video Loss':
self.watchdog.pet()
# Track state if it's in the event list.
if len(etype) > 0:
state = self.fetch_attributes(etype, echid)
if state:
# Determine if state has changed
# If so, publish, otherwise do nothing
estate = (estate == 'active')
old_state = state[0]
attr = [estate, echid, int(ecount),
datetime.datetime.now()]
self.update_attributes(etype, echid, attr)
if estate != old_state:
self.publish_changes(etype, echid)
self.watchdog.pet()
def publish_changes(self, etype, echid):
"""Post updates for specified event type."""
_LOGGING.debug('%s Update: %s, %s',
self.name, etype, self.fetch_attributes(etype, echid))
signal = 'ValueChanged.{}'.format(self.cam_id)
sender = '{}.{}'.format(etype, echid)
if dispatcher:
dispatcher.send(signal=signal, sender=sender)
self._do_update_callback('{}.{}.{}'.format(self.cam_id, etype, echid))
def fetch_attributes(self, event, channel):
"""Returns attribute list for a given event/channel."""
try:
for sensor in self.event_states[event]:
if sensor[1] == int(channel):
return sensor
except KeyError:
return None
def update_attributes(self, event, channel, attr):
"""Update attribute list for current event/channel."""
try:
for i, sensor in enumerate(self.event_states[event]):
if sensor[1] == int(channel):
self.event_states[event][i] = attr
except KeyError:
_LOGGING.debug('Error updating attributes for: (%s, %s)',
event, channel)
|
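The update_stale() record above treats an event as over once no "active" update has arrived for 5 seconds, flipping its stored state to inactive. A minimal sketch of that timeout check, using the same `[state, channel, count, last_update]` attribute layout as `event_states` (the function and constant names are illustrative, not from pyHik):

```python
import datetime

STALE_AFTER = 5  # seconds without an 'active' update before expiring

def expire_stale(states, now=None):
    """Flip [state, channel, count, last_update] entries to inactive
    once last_update is older than STALE_AFTER seconds; return the
    (event_type, channel) pairs that changed."""
    now = now or datetime.datetime.now()
    changed = []
    for etype, channels in states.items():
        for i, (active, chan, count, last) in enumerate(channels):
            if active and (now - last).total_seconds() > STALE_AFTER:
                channels[i] = [False, chan, count, now]
                changed.append((etype, chan))
    return changed

now = datetime.datetime.now()
states = {'Motion': [[True, 1, 3, now - datetime.timedelta(seconds=10)]]}
result = expire_stale(states, now)
print(result)  # -> [('Motion', 1)]
```

In the real class, each changed pair would then be passed to publish_changes() so dispatcher listeners and registered callbacks see the transition.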
mezz64/pyHik | pyhik/hikvision.py | HikCamera.publish_changes | python | def publish_changes(self, etype, echid):
_LOGGING.debug('%s Update: %s, %s',
self.name, etype, self.fetch_attributes(etype, echid))
signal = 'ValueChanged.{}'.format(self.cam_id)
sender = '{}.{}'.format(etype, echid)
if dispatcher:
dispatcher.send(signal=signal, sender=sender)
self._do_update_callback('{}.{}.{}'.format(self.cam_id, etype, echid)) | Post updates for specified event type. | train | https://github.com/mezz64/pyHik/blob/1e7afca926e2b045257a43cbf8b1236a435493c2/pyhik/hikvision.py#L592-L601 | [
"def _do_update_callback(self, msg):\n \"\"\"Call registered callback functions.\"\"\"\n for callback, sensor in self._updateCallbacks:\n if sensor == msg:\n _LOGGING.debug('Update callback %s for sensor %s',\n callback, sensor)\n callback(msg)\n",
"def fetch_attributes(self, event, channel):\n \"\"\"Returns attribute list for a given event/channel.\"\"\"\n try:\n for sensor in self.event_states[event]:\n if sensor[1] == int(channel):\n return sensor\n except KeyError:\n return None\n"
] | class HikCamera(object):
"""Creates a new Hikvision api device."""
def __init__(self, host=None, port=DEFAULT_PORT,
usr=None, pwd=None):
"""Initialize device."""
_LOGGING.debug("pyHik %s initializing new hikvision device at: %s",
__version__, host)
self.event_states = {}
self.watchdog = Watchdog(300.0, self.watchdog_handler)
self.namespace = XML_NAMESPACE
if not host:
_LOGGING.error('Host not specified! Cannot continue.')
return
self.host = host
self.usr = usr
self.pwd = pwd
self.cam_id = 0
self.name = ''
self.device_type = None
self.motion_detection = None
self._motion_detection_xml = None
self.root_url = '{}:{}'.format(host, port)
# Build requests session for main thread calls
# Default to basic authentication. It will change to digest inside
# get_device_info if basic fails
self.hik_request = requests.Session()
self.hik_request.auth = (usr, pwd)
self.hik_request.headers.update(DEFAULT_HEADERS)
# Define event stream processing thread
self.kill_thrd = threading.Event()
self.reset_thrd = threading.Event()
self.thrd = threading.Thread(
target=self.alert_stream, args=(self.reset_thrd, self.kill_thrd,))
self.thrd.daemon = False
# Callbacks
self._updateCallbacks = []
self.initialize()
@property
def get_id(self):
"""Returns unique camera/nvr identifier."""
return self.cam_id
@property
def get_name(self):
"""Return camera/nvr name."""
return self.name
@property
def get_type(self):
"""Return device type."""
return self.device_type
@property
def current_event_states(self):
"""Return Event states dictionary"""
return self.event_states
@property
def current_motion_detection_state(self):
"""Return current state of motion detection property"""
return self.motion_detection
def get_motion_detection(self):
"""Fetch current motion state from camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch MotionDetection, error: %s', err)
self.motion_detection = None
return self.motion_detection
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
self.motion_detection = None
return self.motion_detection
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch motion detection.')
self.motion_detection = None
return self.motion_detection
try:
tree = ET.fromstring(response.text)
ET.register_namespace("", self.namespace)
enabled = tree.find(self.element_query('enabled'))
if enabled is not None:
self._motion_detection_xml = tree
self.motion_detection = {'true': True, 'false': False}[enabled.text]
return self.motion_detection
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
self.motion_detection = None
return self.motion_detection
def enable_motion_detection(self):
"""Enable motion detection"""
self._set_motion_detection(True)
def disable_motion_detection(self):
"""Disable motion detection"""
self._set_motion_detection(False)
def _set_motion_detection(self, enable):
"""Set desired motion detection state on camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
enabled = self._motion_detection_xml.find(self.element_query('enabled'))
if enabled is None:
_LOGGING.error("Couldn't find 'enabled' in the xml")
_LOGGING.error('XML: %s', ET.tostring(self._motion_detection_xml))
return
enabled.text = 'true' if enable else 'false'
xml = ET.tostring(self._motion_detection_xml)
try:
response = self.hik_request.put(url, data=xml, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to set MotionDetection, error: %s', err)
return
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.error('Unable to set motion detection: %s', response.text)
self.motion_detection = enable
def add_update_callback(self, callback, sensor):
"""Register as callback for when a matching device sensor changes."""
self._updateCallbacks.append([callback, sensor])
_LOGGING.debug('Added update callback to %s on %s', callback, sensor)
def _do_update_callback(self, msg):
"""Call registered callback functions."""
for callback, sensor in self._updateCallbacks:
if sensor == msg:
_LOGGING.debug('Update callback %s for sensor %s',
callback, sensor)
callback(msg)
def element_query(self, element):
"""Build tree query for a given element."""
return '{%s}%s' % (self.namespace, element)
def initialize(self):
"""Initialize deviceInfo and available events."""
device_info = self.get_device_info()
if device_info is None:
self.name = None
self.cam_id = None
self.event_states = None
return
for key in device_info:
if key == 'deviceName':
self.name = device_info[key]
elif key == 'deviceID':
if len(device_info[key]) > 10:
self.cam_id = device_info[key]
else:
self.cam_id = uuid.uuid4()
events_available = self.get_event_triggers()
if events_available:
for event, channel_list in events_available.items():
for channel in channel_list:
try:
self.event_states.setdefault(
SENSOR_MAP[event.lower()], []).append(
[False, channel, 0, datetime.datetime.now()])
except KeyError:
# Sensor type doesn't have a known friendly name
# We can't reliably handle it at this time...
_LOGGING.warning(
'Sensor type "%s" is unsupported.', event)
_LOGGING.debug('Initialized Dictionary: %s', self.event_states)
else:
_LOGGING.debug('No Events available in dictionary.')
self.get_motion_detection()
def get_event_triggers(self):
"""
Returns dict of supported events.
Key = Event Type
List = Channels that have that event activated
"""
events = {}
nvrflag = False
event_xml = []
url = '%s/ISAPI/Event/triggers' % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.not_found:
# Try alternate URL for triggers
_LOGGING.debug('Using alternate triggers URL.')
url = '%s/Event/triggers' % self.root_url
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch events, error: %s', err)
return None
if response.status_code != 200:
            # If we didn't receive 200, abort
return None
# pylint: disable=too-many-nested-blocks
try:
content = ET.fromstring(response.text)
            if content[0].find(self.element_query('EventTrigger')) is not None:
event_xml = content[0].findall(
self.element_query('EventTrigger'))
            elif content.find(self.element_query('EventTrigger')) is not None:
# This is either an NVR or a rebadged camera
event_xml = content.findall(
self.element_query('EventTrigger'))
for eventtrigger in event_xml:
ettype = eventtrigger.find(self.element_query('eventType'))
                # Catch empty xml definitions
if ettype is None:
break
etnotify = eventtrigger.find(
self.element_query('EventTriggerNotificationList'))
etchannel = None
etchannel_num = 0
for node_name in CHANNEL_NAMES:
etchannel = eventtrigger.find(
self.element_query(node_name))
if etchannel is not None:
try:
# Need to make sure this is actually a number
etchannel_num = int(etchannel.text)
if etchannel_num > 1:
# Must be an nvr
nvrflag = True
break
except ValueError:
# Field must not be an integer
pass
                if etnotify is not None:
for notifytrigger in etnotify:
ntype = notifytrigger.find(
self.element_query('notificationMethod'))
if ntype.text == 'center' or ntype.text == 'HTTP':
"""
If we got this far we found an event that we want
to track.
"""
events.setdefault(ettype.text, []) \
.append(etchannel_num)
except (AttributeError, ET.ParseError) as err:
_LOGGING.error(
'There was a problem finding an element: %s', err)
return None
if nvrflag:
self.device_type = NVR_DEVICE
else:
self.device_type = CAM_DEVICE
_LOGGING.debug('Processed %s as %s Device.',
self.cam_id, self.device_type)
_LOGGING.debug('Found events: %s', events)
self.hik_request.close()
return events
def get_device_info(self):
"""Parse deviceInfo into dictionary."""
device_info = {}
url = '%s/ISAPI/System/deviceInfo' % self.root_url
using_digest = False
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
if response.status_code == requests.codes.not_found:
# Try alternate URL for deviceInfo
_LOGGING.debug('Using alternate deviceInfo URL.')
url = '%s/System/deviceInfo' % self.root_url
response = self.hik_request.get(url)
            # Cameras and NVRs seem to differ here: they don't agree
            # on whether to return 404 or 401 first
if not using_digest and response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch deviceInfo, error: %s', err)
return None
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return None
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch device info.')
return None
try:
tree = ET.fromstring(response.text)
# Try to fetch namespace from XML
nmsp = tree.tag.split('}')[0].strip('{')
self.namespace = nmsp if nmsp.startswith('http') else XML_NAMESPACE
_LOGGING.debug('Using Namespace: %s', self.namespace)
for item in tree:
tag = item.tag.split('}')[1]
device_info[tag] = item.text
return device_info
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
return None
def watchdog_handler(self):
"""Take care of threads if wachdog expires."""
_LOGGING.debug('%s Watchdog expired. Resetting connection.', self.name)
self.watchdog.stop()
self.reset_thrd.set()
def disconnect(self):
"""Disconnect from event stream."""
_LOGGING.debug('Disconnecting from stream: %s', self.name)
self.kill_thrd.set()
self.thrd.join()
_LOGGING.debug('Event stream thread for %s is stopped', self.name)
self.kill_thrd.clear()
def start_stream(self):
"""Start thread to process event stream."""
# self.watchdog.start()
self.thrd.start()
def alert_stream(self, reset_event, kill_event):
"""Open event stream."""
_LOGGING.debug('Stream Thread Started: %s, %s', self.name, self.cam_id)
start_event = False
parse_string = ""
fail_count = 0
url = '%s/ISAPI/Event/notification/alertStream' % self.root_url
# pylint: disable=too-many-nested-blocks
while True:
try:
stream = self.hik_request.get(url, stream=True,
timeout=(CONNECT_TIMEOUT,
READ_TIMEOUT))
if stream.status_code == requests.codes.not_found:
# Try alternate URL for stream
url = '%s/Event/notification/alertStream' % self.root_url
stream = self.hik_request.get(url, stream=True)
if stream.status_code != requests.codes.ok:
raise ValueError('Connection unsuccessful.')
else:
_LOGGING.debug('%s Connection Successful.', self.name)
fail_count = 0
self.watchdog.start()
for line in stream.iter_lines():
# _LOGGING.debug('Processing line from %s', self.name)
# filter out keep-alive new lines
if line:
str_line = line.decode("utf-8", "ignore")
# New events start with --boundary
if str_line.find('<EventNotificationAlert') != -1:
# Start of event message
start_event = True
parse_string += str_line
elif str_line.find('</EventNotificationAlert>') != -1:
# Message end found
parse_string += str_line
start_event = False
if parse_string:
tree = ET.fromstring(parse_string)
self.process_stream(tree)
self.update_stale()
parse_string = ""
else:
if start_event:
parse_string += str_line
if kill_event.is_set():
# We were asked to stop the thread so lets do so.
break
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
if kill_event.is_set():
# We were asked to stop the thread so lets do so.
_LOGGING.debug('Stopping event stream thread for %s',
self.name)
self.watchdog.stop()
self.hik_request.close()
return
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
except (ValueError,
requests.exceptions.ConnectionError,
requests.exceptions.ChunkedEncodingError) as err:
fail_count += 1
reset_event.clear()
_LOGGING.warning('%s Connection Failed (count=%d). Waiting %ss. Err: %s',
self.name, fail_count, (fail_count * 5) + 5, err)
parse_string = ""
self.watchdog.stop()
self.hik_request.close()
time.sleep(5)
self.update_stale()
time.sleep(fail_count * 5)
continue
def process_stream(self, tree):
"""Process incoming event stream packets."""
try:
etype = SENSOR_MAP[tree.find(
self.element_query('eventType')).text.lower()]
estate = tree.find(
self.element_query('eventState')).text
echid = tree.find(
self.element_query('channelID'))
if echid is None:
# Some devices use a different key
echid = tree.find(
self.element_query('dynChannelID'))
echid = int(echid.text)
ecount = tree.find(
self.element_query('activePostCount')).text
except (AttributeError, KeyError, IndexError) as err:
_LOGGING.error('Problem finding attribute: %s', err)
return
# Take care of keep-alive
if len(etype) > 0 and etype == 'Video Loss':
self.watchdog.pet()
# Track state if it's in the event list.
if len(etype) > 0:
state = self.fetch_attributes(etype, echid)
if state:
# Determine if state has changed
# If so, publish, otherwise do nothing
estate = (estate == 'active')
old_state = state[0]
attr = [estate, echid, int(ecount),
datetime.datetime.now()]
self.update_attributes(etype, echid, attr)
if estate != old_state:
self.publish_changes(etype, echid)
self.watchdog.pet()
def update_stale(self):
"""Update stale active statuses"""
# Some events don't post an inactive XML, only active.
# If we don't get an active update for 5 seconds we can
# assume the event is no longer active and update accordingly.
for etype, echannels in self.event_states.items():
for eprop in echannels:
if eprop[3] is not None:
sec_elap = ((datetime.datetime.now()-eprop[3])
.total_seconds())
# print('Seconds since last update: {}'.format(sec_elap))
if sec_elap > 5 and eprop[0] is True:
_LOGGING.debug('Updating stale event %s on CH(%s)',
etype, eprop[1])
attr = [False, eprop[1], eprop[2],
datetime.datetime.now()]
self.update_attributes(etype, eprop[1], attr)
self.publish_changes(etype, eprop[1])
def fetch_attributes(self, event, channel):
"""Returns attribute list for a given event/channel."""
try:
for sensor in self.event_states[event]:
if sensor[1] == int(channel):
return sensor
except KeyError:
return None
def update_attributes(self, event, channel, attr):
"""Update attribute list for current event/channel."""
try:
for i, sensor in enumerate(self.event_states[event]):
if sensor[1] == int(channel):
self.event_states[event][i] = attr
except KeyError:
_LOGGING.debug('Error updating attributes for: (%s, %s)',
event, channel)
|
mezz64/pyHik | pyhik/hikvision.py | HikCamera.fetch_attributes | python | def fetch_attributes(self, event, channel):
try:
for sensor in self.event_states[event]:
if sensor[1] == int(channel):
return sensor
except KeyError:
return None | Returns attribute list for a given event/channel. | train | https://github.com/mezz64/pyHik/blob/1e7afca926e2b045257a43cbf8b1236a435493c2/pyhik/hikvision.py#L603-L610 | null | class HikCamera(object):
"""Creates a new Hikvision api device."""
def __init__(self, host=None, port=DEFAULT_PORT,
usr=None, pwd=None):
"""Initialize device."""
_LOGGING.debug("pyHik %s initializing new hikvision device at: %s",
__version__, host)
self.event_states = {}
self.watchdog = Watchdog(300.0, self.watchdog_handler)
self.namespace = XML_NAMESPACE
if not host:
_LOGGING.error('Host not specified! Cannot continue.')
return
self.host = host
self.usr = usr
self.pwd = pwd
self.cam_id = 0
self.name = ''
self.device_type = None
self.motion_detection = None
self._motion_detection_xml = None
self.root_url = '{}:{}'.format(host, port)
# Build requests session for main thread calls
# Default to basic authentication. It will change to digest inside
# get_device_info if basic fails
self.hik_request = requests.Session()
self.hik_request.auth = (usr, pwd)
self.hik_request.headers.update(DEFAULT_HEADERS)
# Define event stream processing thread
self.kill_thrd = threading.Event()
self.reset_thrd = threading.Event()
self.thrd = threading.Thread(
target=self.alert_stream, args=(self.reset_thrd, self.kill_thrd,))
self.thrd.daemon = False
# Callbacks
self._updateCallbacks = []
self.initialize()
@property
def get_id(self):
"""Returns unique camera/nvr identifier."""
return self.cam_id
@property
def get_name(self):
"""Return camera/nvr name."""
return self.name
@property
def get_type(self):
"""Return device type."""
return self.device_type
@property
def current_event_states(self):
"""Return Event states dictionary"""
return self.event_states
@property
def current_motion_detection_state(self):
"""Return current state of motion detection property"""
return self.motion_detection
def get_motion_detection(self):
"""Fetch current motion state from camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch MotionDetection, error: %s', err)
self.motion_detection = None
return self.motion_detection
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
self.motion_detection = None
return self.motion_detection
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch motion detection.')
self.motion_detection = None
return self.motion_detection
try:
tree = ET.fromstring(response.text)
ET.register_namespace("", self.namespace)
enabled = tree.find(self.element_query('enabled'))
if enabled is not None:
self._motion_detection_xml = tree
self.motion_detection = {'true': True, 'false': False}[enabled.text]
return self.motion_detection
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
self.motion_detection = None
return self.motion_detection
def enable_motion_detection(self):
"""Enable motion detection"""
self._set_motion_detection(True)
def disable_motion_detection(self):
"""Disable motion detection"""
self._set_motion_detection(False)
def _set_motion_detection(self, enable):
"""Set desired motion detection state on camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
enabled = self._motion_detection_xml.find(self.element_query('enabled'))
if enabled is None:
_LOGGING.error("Couldn't find 'enabled' in the xml")
_LOGGING.error('XML: %s', ET.tostring(self._motion_detection_xml))
return
enabled.text = 'true' if enable else 'false'
xml = ET.tostring(self._motion_detection_xml)
try:
response = self.hik_request.put(url, data=xml, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to set MotionDetection, error: %s', err)
return
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.error('Unable to set motion detection: %s', response.text)
self.motion_detection = enable
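The enable/disable path above just flips the `<enabled>` element in the previously fetched motionDetection XML and PUTs it back. The ElementTree part of that round trip can be shown standalone; the namespace URI here is a placeholder, not a real device schema.

```python
import xml.etree.ElementTree as ET

NS = "http://example.com/schema"  # placeholder namespace
ET.register_namespace("", NS)  # avoid ns0: prefixes on re-serialization

doc = ET.fromstring(
    '<MotionDetection xmlns="%s">'
    '<enabled>false</enabled></MotionDetection>' % NS)

# Flip the enabled flag in place, as _set_motion_detection() does.
enabled = doc.find("{%s}enabled" % NS)
enabled.text = "true"

xml = ET.tostring(doc)
print(b"<enabled>true</enabled>" in xml)  # True
```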
def add_update_callback(self, callback, sensor):
"""Register as callback for when a matching device sensor changes."""
self._updateCallbacks.append([callback, sensor])
_LOGGING.debug('Added update callback to %s on %s', callback, sensor)
def _do_update_callback(self, msg):
"""Call registered callback functions."""
for callback, sensor in self._updateCallbacks:
if sensor == msg:
_LOGGING.debug('Update callback %s for sensor %s',
callback, sensor)
callback(msg)
def element_query(self, element):
"""Build tree query for a given element."""
return '{%s}%s' % (self.namespace, element)
def initialize(self):
"""Initialize deviceInfo and available events."""
device_info = self.get_device_info()
if device_info is None:
self.name = None
self.cam_id = None
self.event_states = None
return
for key in device_info:
if key == 'deviceName':
self.name = device_info[key]
elif key == 'deviceID':
if len(device_info[key]) > 10:
self.cam_id = device_info[key]
else:
self.cam_id = uuid.uuid4()
events_available = self.get_event_triggers()
if events_available:
for event, channel_list in events_available.items():
for channel in channel_list:
try:
self.event_states.setdefault(
SENSOR_MAP[event.lower()], []).append(
[False, channel, 0, datetime.datetime.now()])
except KeyError:
# Sensor type doesn't have a known friendly name
# We can't reliably handle it at this time...
_LOGGING.warning(
'Sensor type "%s" is unsupported.', event)
_LOGGING.debug('Initialized Dictionary: %s', self.event_states)
else:
_LOGGING.debug('No Events available in dictionary.')
self.get_motion_detection()
def get_event_triggers(self):
"""
Returns dict of supported events.
Key = Event Type
List = Channels that have that event activated
"""
events = {}
nvrflag = False
event_xml = []
url = '%s/ISAPI/Event/triggers' % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.not_found:
# Try alternate URL for triggers
_LOGGING.debug('Using alternate triggers URL.')
url = '%s/Event/triggers' % self.root_url
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch events, error: %s', err)
return None
if response.status_code != 200:
# If we didn't receive 200, abort
return None
# pylint: disable=too-many-nested-blocks
try:
content = ET.fromstring(response.text)
if content[0].find(self.element_query('EventTrigger')) is not None:
event_xml = content[0].findall(
self.element_query('EventTrigger'))
elif content.find(self.element_query('EventTrigger')) is not None:
# This is either an NVR or a rebadged camera
event_xml = content.findall(
self.element_query('EventTrigger'))
for eventtrigger in event_xml:
ettype = eventtrigger.find(self.element_query('eventType'))
# Catch empty xml definitions
if ettype is None:
break
etnotify = eventtrigger.find(
self.element_query('EventTriggerNotificationList'))
etchannel = None
etchannel_num = 0
for node_name in CHANNEL_NAMES:
etchannel = eventtrigger.find(
self.element_query(node_name))
if etchannel is not None:
try:
# Need to make sure this is actually a number
etchannel_num = int(etchannel.text)
if etchannel_num > 1:
# Must be an nvr
nvrflag = True
break
except ValueError:
# Field must not be an integer
pass
if etnotify:
for notifytrigger in etnotify:
ntype = notifytrigger.find(
self.element_query('notificationMethod'))
if ntype.text == 'center' or ntype.text == 'HTTP':
"""
If we got this far we found an event that we want
to track.
"""
events.setdefault(ettype.text, []) \
.append(etchannel_num)
except (AttributeError, ET.ParseError) as err:
_LOGGING.error(
'There was a problem finding an element: %s', err)
return None
if nvrflag:
self.device_type = NVR_DEVICE
else:
self.device_type = CAM_DEVICE
_LOGGING.debug('Processed %s as %s Device.',
self.cam_id, self.device_type)
_LOGGING.debug('Found events: %s', events)
self.hik_request.close()
return events
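The events dictionary `get_event_triggers` returns (event type → list of channels) can be built with the same namespace-qualified queries against a sample trigger list. The XML below is an illustrative ISAPI-like shape, and both the namespace URI and the `videoInputChannelID` tag are assumptions for the sketch, not a verbatim device response.

```python
import xml.etree.ElementTree as ET

NS = "http://www.hikvision.com/ver20/XMLSchema"  # assumed namespace

def q(element):
    """Namespace-qualify a tag, like element_query() above."""
    return "{%s}%s" % (NS, element)

SAMPLE = """<EventTriggerList xmlns="%s">
  <EventTrigger>
    <eventType>VMD</eventType>
    <videoInputChannelID>1</videoInputChannelID>
  </EventTrigger>
  <EventTrigger>
    <eventType>VMD</eventType>
    <videoInputChannelID>2</videoInputChannelID>
  </EventTrigger>
</EventTriggerList>""" % NS

# Key = event type, value = list of channels with that event active.
events = {}
root = ET.fromstring(SAMPLE)
for trigger in root.findall(q("EventTrigger")):
    etype = trigger.find(q("eventType")).text
    chan = int(trigger.find(q("videoInputChannelID")).text)
    events.setdefault(etype, []).append(chan)

print(events)  # {'VMD': [1, 2]}
```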
def get_device_info(self):
"""Parse deviceInfo into dictionary."""
device_info = {}
url = '%s/ISAPI/System/deviceInfo' % self.root_url
using_digest = False
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
if response.status_code == requests.codes.not_found:
# Try alternate URL for deviceInfo
_LOGGING.debug('Using alternate deviceInfo URL.')
url = '%s/System/deviceInfo' % self.root_url
response = self.hik_request.get(url)
# Cameras and NVRs behave differently here: they don't agree
# on whether to return 404 or 401 first
if not using_digest and response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch deviceInfo, error: %s', err)
return None
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return None
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch device info.')
return None
try:
tree = ET.fromstring(response.text)
# Try to fetch namespace from XML
nmsp = tree.tag.split('}')[0].strip('{')
self.namespace = nmsp if nmsp.startswith('http') else XML_NAMESPACE
_LOGGING.debug('Using Namespace: %s', self.namespace)
for item in tree:
tag = item.tag.split('}')[1]
device_info[tag] = item.text
return device_info
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
return None
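The namespace-extraction trick in `get_device_info` above (splitting the root tag on `}` and falling back to a default when the result isn't a URI) works standalone; `XML_NAMESPACE` here stands in for the module-level fallback constant.

```python
import xml.etree.ElementTree as ET

XML_NAMESPACE = "http://www.hikvision.com/ver20/XMLSchema"  # fallback

def extract_namespace(xml_text):
    """Pull the namespace out of the root tag, as get_device_info() does."""
    tree = ET.fromstring(xml_text)
    nmsp = tree.tag.split("}")[0].strip("{")
    return nmsp if nmsp.startswith("http") else XML_NAMESPACE

doc = ('<DeviceInfo xmlns="http://example.com/schema">'
       '<deviceName>cam</deviceName></DeviceInfo>')
print(extract_namespace(doc))            # http://example.com/schema
print(extract_namespace("<DeviceInfo/>"))  # falls back to XML_NAMESPACE
```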
def watchdog_handler(self):
"""Take care of threads if wachdog expires."""
_LOGGING.debug('%s Watchdog expired. Resetting connection.', self.name)
self.watchdog.stop()
self.reset_thrd.set()
def disconnect(self):
"""Disconnect from event stream."""
_LOGGING.debug('Disconnecting from stream: %s', self.name)
self.kill_thrd.set()
self.thrd.join()
_LOGGING.debug('Event stream thread for %s is stopped', self.name)
self.kill_thrd.clear()
def start_stream(self):
"""Start thread to process event stream."""
# self.watchdog.start()
self.thrd.start()
def alert_stream(self, reset_event, kill_event):
"""Open event stream."""
_LOGGING.debug('Stream Thread Started: %s, %s', self.name, self.cam_id)
start_event = False
parse_string = ""
fail_count = 0
url = '%s/ISAPI/Event/notification/alertStream' % self.root_url
# pylint: disable=too-many-nested-blocks
while True:
try:
stream = self.hik_request.get(url, stream=True,
timeout=(CONNECT_TIMEOUT,
READ_TIMEOUT))
if stream.status_code == requests.codes.not_found:
# Try alternate URL for stream
url = '%s/Event/notification/alertStream' % self.root_url
stream = self.hik_request.get(url, stream=True)
if stream.status_code != requests.codes.ok:
raise ValueError('Connection unsuccessful.')
else:
_LOGGING.debug('%s Connection Successful.', self.name)
fail_count = 0
self.watchdog.start()
for line in stream.iter_lines():
# _LOGGING.debug('Processing line from %s', self.name)
# filter out keep-alive new lines
if line:
str_line = line.decode("utf-8", "ignore")
# New events start with --boundary
if str_line.find('<EventNotificationAlert') != -1:
# Start of event message
start_event = True
parse_string += str_line
elif str_line.find('</EventNotificationAlert>') != -1:
# Message end found
parse_string += str_line
start_event = False
if parse_string:
tree = ET.fromstring(parse_string)
self.process_stream(tree)
self.update_stale()
parse_string = ""
else:
if start_event:
parse_string += str_line
if kill_event.is_set():
# We were asked to stop the thread so lets do so.
break
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
if kill_event.is_set():
# We were asked to stop the thread so lets do so.
_LOGGING.debug('Stopping event stream thread for %s',
self.name)
self.watchdog.stop()
self.hik_request.close()
return
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
except (ValueError,
requests.exceptions.ConnectionError,
requests.exceptions.ChunkedEncodingError) as err:
fail_count += 1
reset_event.clear()
_LOGGING.warning('%s Connection Failed (count=%d). Waiting %ss. Err: %s',
self.name, fail_count, (fail_count * 5) + 5, err)
parse_string = ""
self.watchdog.stop()
self.hik_request.close()
time.sleep(5)
self.update_stale()
time.sleep(fail_count * 5)
continue
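The core of `alert_stream` above is accumulating lines between `<EventNotificationAlert>` and `</EventNotificationAlert>` and parsing each completed buffer. That loop can be exercised against a canned sequence of lines; the multipart framing below is illustrative of what `iter_lines()` might yield, not a captured device stream.

```python
import xml.etree.ElementTree as ET

# Simulated multipart body lines (illustrative).
LINES = [
    b"--boundary",
    b"Content-Type: application/xml",
    b"",
    b"<EventNotificationAlert>",
    b"<eventType>VMD</eventType>",
    b"<eventState>active</eventState>",
    b"</EventNotificationAlert>",
]

def collect_alerts(lines):
    """Accumulate lines between the alert open/close tags, parse each."""
    alerts, buf, in_event = [], "", False
    for line in lines:
        if not line:
            continue  # skip keep-alive blank lines
        text = line.decode("utf-8", "ignore")
        if "<EventNotificationAlert" in text:
            in_event = True
            buf += text
        elif "</EventNotificationAlert>" in text:
            buf += text
            in_event = False
            alerts.append(ET.fromstring(buf))
            buf = ""
        elif in_event:
            buf += text
    return alerts

tree = collect_alerts(LINES)[0]
print(tree.find("eventType").text)  # VMD
```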
def process_stream(self, tree):
"""Process incoming event stream packets."""
try:
etype = SENSOR_MAP[tree.find(
self.element_query('eventType')).text.lower()]
estate = tree.find(
self.element_query('eventState')).text
echid = tree.find(
self.element_query('channelID'))
if echid is None:
# Some devices use a different key
echid = tree.find(
self.element_query('dynChannelID'))
echid = int(echid.text)
ecount = tree.find(
self.element_query('activePostCount')).text
except (AttributeError, KeyError, IndexError) as err:
_LOGGING.error('Problem finding attribute: %s', err)
return
# Take care of keep-alive
if len(etype) > 0 and etype == 'Video Loss':
self.watchdog.pet()
# Track state if it's in the event list.
if len(etype) > 0:
state = self.fetch_attributes(etype, echid)
if state:
# Determine if state has changed
# If so, publish, otherwise do nothing
estate = (estate == 'active')
old_state = state[0]
attr = [estate, echid, int(ecount),
datetime.datetime.now()]
self.update_attributes(etype, echid, attr)
if estate != old_state:
self.publish_changes(etype, echid)
self.watchdog.pet()
def update_stale(self):
"""Update stale active statuses"""
# Some events don't post an inactive XML, only active.
# If we don't get an active update for 5 seconds we can
# assume the event is no longer active and update accordingly.
for etype, echannels in self.event_states.items():
for eprop in echannels:
if eprop[3] is not None:
sec_elap = ((datetime.datetime.now()-eprop[3])
.total_seconds())
# print('Seconds since last update: {}'.format(sec_elap))
if sec_elap > 5 and eprop[0] is True:
_LOGGING.debug('Updating stale event %s on CH(%s)',
etype, eprop[1])
attr = [False, eprop[1], eprop[2],
datetime.datetime.now()]
self.update_attributes(etype, eprop[1], attr)
self.publish_changes(etype, eprop[1])
def publish_changes(self, etype, echid):
"""Post updates for specified event type."""
_LOGGING.debug('%s Update: %s, %s',
self.name, etype, self.fetch_attributes(etype, echid))
signal = 'ValueChanged.{}'.format(self.cam_id)
sender = '{}.{}'.format(etype, echid)
if dispatcher:
dispatcher.send(signal=signal, sender=sender)
self._do_update_callback('{}.{}.{}'.format(self.cam_id, etype, echid))
def update_attributes(self, event, channel, attr):
"""Update attribute list for current event/channel."""
try:
for i, sensor in enumerate(self.event_states[event]):
if sensor[1] == int(channel):
self.event_states[event][i] = attr
except KeyError:
_LOGGING.debug('Error updating attributes for: (%s, %s)',
event, channel)
|
mezz64/pyHik | pyhik/hikvision.py | HikCamera.update_attributes | python | def update_attributes(self, event, channel, attr):
try:
for i, sensor in enumerate(self.event_states[event]):
if sensor[1] == int(channel):
self.event_states[event][i] = attr
except KeyError:
_LOGGING.debug('Error updating attributes for: (%s, %s)',
event, channel) | Update attribute list for current event/channel. | train | https://github.com/mezz64/pyHik/blob/1e7afca926e2b045257a43cbf8b1236a435493c2/pyhik/hikvision.py#L612-L620 | null | class HikCamera(object):
"""Creates a new Hikvision api device."""
def __init__(self, host=None, port=DEFAULT_PORT,
usr=None, pwd=None):
"""Initialize device."""
_LOGGING.debug("pyHik %s initializing new hikvision device at: %s",
__version__, host)
self.event_states = {}
self.watchdog = Watchdog(300.0, self.watchdog_handler)
self.namespace = XML_NAMESPACE
if not host:
_LOGGING.error('Host not specified! Cannot continue.')
return
self.host = host
self.usr = usr
self.pwd = pwd
self.cam_id = 0
self.name = ''
self.device_type = None
self.motion_detection = None
self._motion_detection_xml = None
self.root_url = '{}:{}'.format(host, port)
# Build requests session for main thread calls
# Default to basic authentication. It will change to digest inside
# get_device_info if basic fails
self.hik_request = requests.Session()
self.hik_request.auth = (usr, pwd)
self.hik_request.headers.update(DEFAULT_HEADERS)
# Define event stream processing thread
self.kill_thrd = threading.Event()
self.reset_thrd = threading.Event()
self.thrd = threading.Thread(
target=self.alert_stream, args=(self.reset_thrd, self.kill_thrd,))
self.thrd.daemon = False
# Callbacks
self._updateCallbacks = []
self.initialize()
@property
def get_id(self):
"""Returns unique camera/nvr identifier."""
return self.cam_id
@property
def get_name(self):
"""Return camera/nvr name."""
return self.name
@property
def get_type(self):
"""Return device type."""
return self.device_type
@property
def current_event_states(self):
"""Return Event states dictionary"""
return self.event_states
@property
def current_motion_detection_state(self):
"""Return current state of motion detection property"""
return self.motion_detection
def get_motion_detection(self):
"""Fetch current motion state from camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch MotionDetection, error: %s', err)
self.motion_detection = None
return self.motion_detection
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
self.motion_detection = None
return self.motion_detection
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch motion detection.')
self.motion_detection = None
return self.motion_detection
try:
tree = ET.fromstring(response.text)
ET.register_namespace("", self.namespace)
enabled = tree.find(self.element_query('enabled'))
if enabled is not None:
self._motion_detection_xml = tree
self.motion_detection = {'true': True, 'false': False}[enabled.text]
return self.motion_detection
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
self.motion_detection = None
return self.motion_detection
def enable_motion_detection(self):
"""Enable motion detection"""
self._set_motion_detection(True)
def disable_motion_detection(self):
"""Disable motion detection"""
self._set_motion_detection(False)
def _set_motion_detection(self, enable):
"""Set desired motion detection state on camera"""
url = ('%s/ISAPI/System/Video/inputs/'
'channels/1/motionDetection') % self.root_url
enabled = self._motion_detection_xml.find(self.element_query('enabled'))
if enabled is None:
_LOGGING.error("Couldn't find 'enabled' in the xml")
_LOGGING.error('XML: %s', ET.tostring(self._motion_detection_xml))
return
enabled.text = 'true' if enable else 'false'
xml = ET.tostring(self._motion_detection_xml)
try:
response = self.hik_request.put(url, data=xml, timeout=CONNECT_TIMEOUT)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to set MotionDetection, error: %s', err)
return
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.error('Unable to set motion detection: %s', response.text)
self.motion_detection = enable
def add_update_callback(self, callback, sensor):
"""Register as callback for when a matching device sensor changes."""
self._updateCallbacks.append([callback, sensor])
_LOGGING.debug('Added update callback to %s on %s', callback, sensor)
def _do_update_callback(self, msg):
"""Call registered callback functions."""
for callback, sensor in self._updateCallbacks:
if sensor == msg:
_LOGGING.debug('Update callback %s for sensor %s',
callback, sensor)
callback(msg)
def element_query(self, element):
"""Build tree query for a given element."""
return '{%s}%s' % (self.namespace, element)
def initialize(self):
"""Initialize deviceInfo and available events."""
device_info = self.get_device_info()
if device_info is None:
self.name = None
self.cam_id = None
self.event_states = None
return
for key in device_info:
if key == 'deviceName':
self.name = device_info[key]
elif key == 'deviceID':
if len(device_info[key]) > 10:
self.cam_id = device_info[key]
else:
self.cam_id = uuid.uuid4()
events_available = self.get_event_triggers()
if events_available:
for event, channel_list in events_available.items():
for channel in channel_list:
try:
self.event_states.setdefault(
SENSOR_MAP[event.lower()], []).append(
[False, channel, 0, datetime.datetime.now()])
except KeyError:
# Sensor type doesn't have a known friendly name
# We can't reliably handle it at this time...
_LOGGING.warning(
'Sensor type "%s" is unsupported.', event)
_LOGGING.debug('Initialized Dictionary: %s', self.event_states)
else:
_LOGGING.debug('No Events available in dictionary.')
self.get_motion_detection()
def get_event_triggers(self):
"""
Returns dict of supported events.
Key = Event Type
List = Channels that have that event activated
"""
events = {}
nvrflag = False
event_xml = []
url = '%s/ISAPI/Event/triggers' % self.root_url
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.not_found:
# Try alternate URL for triggers
_LOGGING.debug('Using alternate triggers URL.')
url = '%s/Event/triggers' % self.root_url
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch events, error: %s', err)
return None
if response.status_code != 200:
# If we didn't receive 200, abort
return None
# pylint: disable=too-many-nested-blocks
try:
content = ET.fromstring(response.text)
if content[0].find(self.element_query('EventTrigger')) is not None:
event_xml = content[0].findall(
self.element_query('EventTrigger'))
elif content.find(self.element_query('EventTrigger')) is not None:
# This is either an NVR or a rebadged camera
event_xml = content.findall(
self.element_query('EventTrigger'))
for eventtrigger in event_xml:
ettype = eventtrigger.find(self.element_query('eventType'))
# Catch empty xml definitions
if ettype is None:
break
etnotify = eventtrigger.find(
self.element_query('EventTriggerNotificationList'))
etchannel = None
etchannel_num = 0
for node_name in CHANNEL_NAMES:
etchannel = eventtrigger.find(
self.element_query(node_name))
if etchannel is not None:
try:
# Need to make sure this is actually a number
etchannel_num = int(etchannel.text)
if etchannel_num > 1:
# Must be an nvr
nvrflag = True
break
except ValueError:
# Field must not be an integer
pass
if etnotify:
for notifytrigger in etnotify:
ntype = notifytrigger.find(
self.element_query('notificationMethod'))
if ntype.text == 'center' or ntype.text == 'HTTP':
"""
If we got this far we found an event that we want
to track.
"""
events.setdefault(ettype.text, []) \
.append(etchannel_num)
except (AttributeError, ET.ParseError) as err:
_LOGGING.error(
'There was a problem finding an element: %s', err)
return None
if nvrflag:
self.device_type = NVR_DEVICE
else:
self.device_type = CAM_DEVICE
_LOGGING.debug('Processed %s as %s Device.',
self.cam_id, self.device_type)
_LOGGING.debug('Found events: %s', events)
self.hik_request.close()
return events
def get_device_info(self):
"""Parse deviceInfo into dictionary."""
device_info = {}
url = '%s/ISAPI/System/deviceInfo' % self.root_url
using_digest = False
try:
response = self.hik_request.get(url, timeout=CONNECT_TIMEOUT)
if response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
if response.status_code == requests.codes.not_found:
# Try alternate URL for deviceInfo
_LOGGING.debug('Using alternate deviceInfo URL.')
url = '%s/System/deviceInfo' % self.root_url
response = self.hik_request.get(url)
# Cameras and NVRs behave differently here: they don't agree
# on whether to return 404 or 401 first
if not using_digest and response.status_code == requests.codes.unauthorized:
_LOGGING.debug('Basic authentication failed. Using digest.')
self.hik_request.auth = HTTPDigestAuth(self.usr, self.pwd)
using_digest = True
response = self.hik_request.get(url)
except (requests.exceptions.RequestException,
requests.exceptions.ConnectionError) as err:
_LOGGING.error('Unable to fetch deviceInfo, error: %s', err)
return None
if response.status_code == requests.codes.unauthorized:
_LOGGING.error('Authentication failed')
return None
if response.status_code != requests.codes.ok:
# If we didn't receive 200, abort
_LOGGING.debug('Unable to fetch device info.')
return None
try:
tree = ET.fromstring(response.text)
# Try to fetch namespace from XML
nmsp = tree.tag.split('}')[0].strip('{')
self.namespace = nmsp if nmsp.startswith('http') else XML_NAMESPACE
_LOGGING.debug('Using Namespace: %s', self.namespace)
for item in tree:
tag = item.tag.split('}')[1]
device_info[tag] = item.text
return device_info
except AttributeError as err:
_LOGGING.error('Entire response: %s', response.text)
_LOGGING.error('There was a problem: %s', err)
return None
def watchdog_handler(self):
"""Take care of threads if wachdog expires."""
_LOGGING.debug('%s Watchdog expired. Resetting connection.', self.name)
self.watchdog.stop()
self.reset_thrd.set()
def disconnect(self):
"""Disconnect from event stream."""
_LOGGING.debug('Disconnecting from stream: %s', self.name)
self.kill_thrd.set()
self.thrd.join()
_LOGGING.debug('Event stream thread for %s is stopped', self.name)
self.kill_thrd.clear()
def start_stream(self):
"""Start thread to process event stream."""
# self.watchdog.start()
self.thrd.start()
def alert_stream(self, reset_event, kill_event):
"""Open event stream."""
_LOGGING.debug('Stream Thread Started: %s, %s', self.name, self.cam_id)
start_event = False
parse_string = ""
fail_count = 0
url = '%s/ISAPI/Event/notification/alertStream' % self.root_url
# pylint: disable=too-many-nested-blocks
while True:
try:
stream = self.hik_request.get(url, stream=True,
timeout=(CONNECT_TIMEOUT,
READ_TIMEOUT))
if stream.status_code == requests.codes.not_found:
# Try alternate URL for stream
url = '%s/Event/notification/alertStream' % self.root_url
stream = self.hik_request.get(url, stream=True)
if stream.status_code != requests.codes.ok:
                    raise ValueError('Connection unsuccessful.')
else:
_LOGGING.debug('%s Connection Successful.', self.name)
fail_count = 0
self.watchdog.start()
for line in stream.iter_lines():
# _LOGGING.debug('Processing line from %s', self.name)
# filter out keep-alive new lines
if line:
str_line = line.decode("utf-8", "ignore")
                        # New events start with --boundary
if str_line.find('<EventNotificationAlert') != -1:
# Start of event message
start_event = True
parse_string += str_line
elif str_line.find('</EventNotificationAlert>') != -1:
                            # Message end found
parse_string += str_line
start_event = False
if parse_string:
tree = ET.fromstring(parse_string)
self.process_stream(tree)
self.update_stale()
parse_string = ""
else:
if start_event:
parse_string += str_line
if kill_event.is_set():
# We were asked to stop the thread so lets do so.
break
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
if kill_event.is_set():
# We were asked to stop the thread so lets do so.
_LOGGING.debug('Stopping event stream thread for %s',
self.name)
self.watchdog.stop()
self.hik_request.close()
return
elif reset_event.is_set():
# We need to reset the connection.
raise ValueError('Watchdog failed.')
except (ValueError,
requests.exceptions.ConnectionError,
requests.exceptions.ChunkedEncodingError) as err:
fail_count += 1
reset_event.clear()
_LOGGING.warning('%s Connection Failed (count=%d). Waiting %ss. Err: %s',
self.name, fail_count, (fail_count * 5) + 5, err)
parse_string = ""
self.watchdog.stop()
self.hik_request.close()
time.sleep(5)
self.update_stale()
time.sleep(fail_count * 5)
continue
def process_stream(self, tree):
"""Process incoming event stream packets."""
try:
etype = SENSOR_MAP[tree.find(
self.element_query('eventType')).text.lower()]
estate = tree.find(
self.element_query('eventState')).text
echid = tree.find(
self.element_query('channelID'))
if echid is None:
# Some devices use a different key
echid = tree.find(
self.element_query('dynChannelID'))
echid = int(echid.text)
ecount = tree.find(
self.element_query('activePostCount')).text
except (AttributeError, KeyError, IndexError) as err:
_LOGGING.error('Problem finding attribute: %s', err)
return
# Take care of keep-alive
if len(etype) > 0 and etype == 'Video Loss':
self.watchdog.pet()
# Track state if it's in the event list.
if len(etype) > 0:
state = self.fetch_attributes(etype, echid)
if state:
# Determine if state has changed
# If so, publish, otherwise do nothing
estate = (estate == 'active')
old_state = state[0]
attr = [estate, echid, int(ecount),
datetime.datetime.now()]
self.update_attributes(etype, echid, attr)
if estate != old_state:
self.publish_changes(etype, echid)
self.watchdog.pet()
def update_stale(self):
"""Update stale active statuses"""
# Some events don't post an inactive XML, only active.
# If we don't get an active update for 5 seconds we can
# assume the event is no longer active and update accordingly.
for etype, echannels in self.event_states.items():
for eprop in echannels:
if eprop[3] is not None:
sec_elap = ((datetime.datetime.now()-eprop[3])
.total_seconds())
# print('Seconds since last update: {}'.format(sec_elap))
if sec_elap > 5 and eprop[0] is True:
_LOGGING.debug('Updating stale event %s on CH(%s)',
etype, eprop[1])
attr = [False, eprop[1], eprop[2],
datetime.datetime.now()]
self.update_attributes(etype, eprop[1], attr)
self.publish_changes(etype, eprop[1])
def publish_changes(self, etype, echid):
"""Post updates for specified event type."""
_LOGGING.debug('%s Update: %s, %s',
self.name, etype, self.fetch_attributes(etype, echid))
signal = 'ValueChanged.{}'.format(self.cam_id)
sender = '{}.{}'.format(etype, echid)
if dispatcher:
dispatcher.send(signal=signal, sender=sender)
self._do_update_callback('{}.{}.{}'.format(self.cam_id, etype, echid))
def fetch_attributes(self, event, channel):
"""Returns attribute list for a given event/channel."""
try:
for sensor in self.event_states[event]:
if sensor[1] == int(channel):
return sensor
except KeyError:
return None
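The reconnect handling in `alert_stream` above sleeps a fixed 5 seconds and then `fail_count * 5` seconds more, so the wait grows linearly with consecutive failures. A minimal sketch of that arithmetic; `backoff_seconds` is an illustrative helper name, not part of pyHik:

```python
# Backoff schedule used by alert_stream above: after the Nth consecutive
# failure the thread waits 5 + N * 5 seconds before reconnecting.
# backoff_seconds() is an illustrative helper, not a pyHik API.
def backoff_seconds(fail_count):
    return 5 + fail_count * 5

print([backoff_seconds(n) for n in (1, 2, 3)])  # [10, 15, 20]
```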
|
mezz64/pyHik | pyhik/watchdog.py | Watchdog.start | python | def start(self):
self._timer = Timer(self.time, self.handler)
self._timer.daemon = True
self._timer.start()
return | Starts the watchdog timer. | train | https://github.com/mezz64/pyHik/blob/1e7afca926e2b045257a43cbf8b1236a435493c2/pyhik/watchdog.py#L21-L26 | null | class Watchdog(object):
""" Watchdog timer class. """
def __init__(self, timeout, handler):
""" Initialize watchdog variables. """
self.time = timeout
self.handler = handler
return
def pet(self):
""" Reset watchdog timer. """
self.stop()
self.start()
return
def stop(self):
""" Stops the watchdog timer. """
self._timer.cancel()
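pyHik drives this Watchdog from its event-stream thread: `start()` arms a one-shot `threading.Timer`, `pet()` re-arms it, and the handler only fires if no `pet()` arrives within the timeout. A self-contained sketch (the class body is reproduced from above so the snippet runs on its own; timings are illustrative):

```python
# Self-contained sketch of driving the Watchdog class above.
from threading import Timer
import time

class Watchdog(object):
    """Watchdog timer: calls handler unless pet() arrives in time."""
    def __init__(self, timeout, handler):
        self.time = timeout
        self.handler = handler

    def start(self):
        self._timer = Timer(self.time, self.handler)
        self._timer.daemon = True
        self._timer.start()

    def pet(self):
        # Reset: cancel the pending timer and arm a fresh one.
        self.stop()
        self.start()

    def stop(self):
        self._timer.cancel()

fired = []
dog = Watchdog(0.05, lambda: fired.append(True))
dog.start()
dog.pet()          # resets the timer, as the event stream does on keep-alives
time.sleep(0.2)    # let the watchdog expire once
dog.stop()
print(fired)       # [True]
```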
|
mezz64/pyHik | examples/basic_usage.py | main | python | def main():
"""Main function"""
cam = HikCamObject('http://XXX.XXX.XXX.XXX', 80, 'user', 'password')
entities = []
for sensor, channel_list in cam.sensors.items():
for channel in channel_list:
entities.append(HikSensor(sensor, channel[1], cam)) | Main function | train | https://github.com/mezz64/pyHik/blob/1e7afca926e2b045257a43cbf8b1236a435493c2/examples/basic_usage.py#L97-L105 | null | """
Sample program for hikvision api.
"""
import logging
import pyhik.hikvision as hikvision
logging.basicConfig(filename='out.log', filemode='w', level=logging.DEBUG, format='%(asctime)s - %(levelname)s - %(message)s', datefmt='%m/%d/%Y %I:%M:%S %p')
class HikCamObject(object):
"""Representation of HIk camera."""
def __init__(self, url, port, user, passw):
"""initalize camera"""
# Establish camera
self.cam = hikvision.HikCamera(url, port, user, passw)
self._name = self.cam.get_name
self.motion = self.cam.current_motion_detection_state
# Start event stream
self.cam.start_stream()
self._event_states = self.cam.current_event_states
self._id = self.cam.get_id
print('NAME: {}'.format(self._name))
print('ID: {}'.format(self._id))
print('{}'.format(self._event_states))
        print('Motion Detect State: {}'.format(self.motion))
@property
def sensors(self):
"""Return list of available sensors and their states."""
return self.cam.current_event_states
def get_attributes(self, sensor, channel):
"""Return attribute list for sensor/channel."""
return self.cam.fetch_attributes(sensor, channel)
def stop_hik(self):
"""Shutdown Hikvision subscriptions and subscription thread on exit."""
self.cam.disconnect()
def flip_motion(self, value):
"""Toggle motion detection"""
if value:
self.cam.enable_motion_detection()
else:
self.cam.disable_motion_detection()
class HikSensor(object):
""" Hik camera sensor."""
def __init__(self, sensor, channel, cam):
"""Init"""
self._cam = cam
self._name = "{} {} {}".format(self._cam.cam.name, sensor, channel)
self._id = "{}.{}.{}".format(self._cam.cam.cam_id, sensor, channel)
self._sensor = sensor
self._channel = channel
self._cam.cam.add_update_callback(self.update_callback, self._id)
def _sensor_state(self):
"""Extract sensor state."""
return self._cam.get_attributes(self._sensor, self._channel)[0]
def _sensor_last_update(self):
"""Extract sensor last update time."""
return self._cam.get_attributes(self._sensor, self._channel)[3]
@property
def name(self):
"""Return the name of the Hikvision sensor."""
return self._name
@property
def unique_id(self):
"""Return an unique ID."""
return '{}.{}'.format(self.__class__, self._id)
@property
def is_on(self):
"""Return true if sensor is on."""
return self._sensor_state()
def update_callback(self, msg):
""" get updates. """
print('Callback: {}'.format(msg))
print('{}:{} @ {}'.format(self.name, self._sensor_state(), self._sensor_last_update()))
def main():
"""Main function"""
cam = HikCamObject('http://XXX.XXX.XXX.XXX', 80, 'user', 'password')
entities = []
for sensor, channel_list in cam.sensors.items():
for channel in channel_list:
entities.append(HikSensor(sensor, channel[1], cam))
main()
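basic_usage.py above relies on pyHik's per-sensor callback registration: each `HikSensor` passes `add_update_callback` an id of the form `"<cam_id>.<event type>.<channel>"`, the same id `publish_changes` builds when an event flips state. The dispatch below is a simplified stand-in for illustration, not the actual pyHik implementation:

```python
# Simplified stand-in for the callback pattern used above: each sensor
# registers a callback under its "<cam_id>.<event type>.<channel>" id and
# the camera invokes the matching one when an event changes state.
class FakeCam:
    def __init__(self):
        self._callbacks = {}  # sensor id -> callable

    def add_update_callback(self, callback, sensor_id):
        self._callbacks[sensor_id] = callback

    def _do_update_callback(self, msg):
        if msg in self._callbacks:
            self._callbacks[msg](msg)

updates = []
cam = FakeCam()
cam.add_update_callback(updates.append, 'cam1.Motion.1')
cam._do_update_callback('cam1.Motion.1')   # matching id -> callback runs
cam._do_update_callback('cam1.Motion.2')   # unregistered id -> ignored
print(updates)  # ['cam1.Motion.1']
```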
|
mezz64/pyHik | examples/basic_usage.py | HikCamObject.flip_motion | python | def flip_motion(self, value):
"""Toggle motion detection"""
if value:
self.cam.enable_motion_detection()
else:
self.cam.disable_motion_detection() | Toggle motion detection | train | https://github.com/mezz64/pyHik/blob/1e7afca926e2b045257a43cbf8b1236a435493c2/examples/basic_usage.py#L47-L52 | null | class HikCamObject(object):
"""Representation of HIk camera."""
def __init__(self, url, port, user, passw):
"""initalize camera"""
# Establish camera
self.cam = hikvision.HikCamera(url, port, user, passw)
self._name = self.cam.get_name
self.motion = self.cam.current_motion_detection_state
# Start event stream
self.cam.start_stream()
self._event_states = self.cam.current_event_states
self._id = self.cam.get_id
print('NAME: {}'.format(self._name))
print('ID: {}'.format(self._id))
print('{}'.format(self._event_states))
        print('Motion Detect State: {}'.format(self.motion))
@property
def sensors(self):
"""Return list of available sensors and their states."""
return self.cam.current_event_states
def get_attributes(self, sensor, channel):
"""Return attribute list for sensor/channel."""
return self.cam.fetch_attributes(sensor, channel)
def stop_hik(self):
"""Shutdown Hikvision subscriptions and subscription thread on exit."""
self.cam.disconnect()
def flip_motion(self, value):
"""Toggle motion detection"""
if value:
self.cam.enable_motion_detection()
else:
self.cam.disable_motion_detection()
|
mezz64/pyHik | examples/basic_usage.py | HikSensor.update_callback | python | def update_callback(self, msg):
""" get updates. """
print('Callback: {}'.format(msg))
print('{}:{} @ {}'.format(self.name, self._sensor_state(), self._sensor_last_update())) | get updates. | train | https://github.com/mezz64/pyHik/blob/1e7afca926e2b045257a43cbf8b1236a435493c2/examples/basic_usage.py#L91-L94 | [
"def _sensor_state(self):\n \"\"\"Extract sensor state.\"\"\"\n return self._cam.get_attributes(self._sensor, self._channel)[0]\n",
"def _sensor_last_update(self):\n \"\"\"Extract sensor last update time.\"\"\"\n return self._cam.get_attributes(self._sensor, self._channel)[3]\n"
] | class HikSensor(object):
""" Hik camera sensor."""
def __init__(self, sensor, channel, cam):
"""Init"""
self._cam = cam
self._name = "{} {} {}".format(self._cam.cam.name, sensor, channel)
self._id = "{}.{}.{}".format(self._cam.cam.cam_id, sensor, channel)
self._sensor = sensor
self._channel = channel
self._cam.cam.add_update_callback(self.update_callback, self._id)
def _sensor_state(self):
"""Extract sensor state."""
return self._cam.get_attributes(self._sensor, self._channel)[0]
def _sensor_last_update(self):
"""Extract sensor last update time."""
return self._cam.get_attributes(self._sensor, self._channel)[3]
@property
def name(self):
"""Return the name of the Hikvision sensor."""
return self._name
@property
def unique_id(self):
"""Return an unique ID."""
return '{}.{}'.format(self.__class__, self._id)
@property
def is_on(self):
"""Return true if sensor is on."""
return self._sensor_state()
def update_callback(self, msg):
""" get updates. """
print('Callback: {}'.format(msg))
print('{}:{} @ {}'.format(self.name, self._sensor_state(), self._sensor_last_update()))
|
taxjar/taxjar-python | taxjar/client.py | Client.rates_for_location | python | def rates_for_location(self, postal_code, location_deets=None):
request = self._get("rates/" + postal_code, location_deets)
return self.responder(request) | Shows the sales tax rates for a given location. | train | https://github.com/taxjar/taxjar-python/blob/be9b30d7dc968d24e066c7c133849fee180f8d95/taxjar/client.py#L31-L34 | [
"def _get(self, endpoint, data=None):\n if data is None:\n data = {}\n return self._request(requests.get, endpoint, {'params': data})\n"
] | class Client(object):
"""TaxJar Python Client"""
def __init__(self, api_key, api_url="", options=None, responder=TaxJarResponse.from_request):
if options is None:
options = {}
self.api_key = api_key
self.api_url = api_url if api_url else taxjar.DEFAULT_API_URL
self.api_url += "/" + taxjar.API_VERSION + "/"
self.headers = options.get('headers', {})
self.timeout = options.get('timeout', 5)
self.responder = responder
def set_api_config(self, key, value):
        if key == 'api_url':
value += "/" + taxjar.API_VERSION + "/"
setattr(self, key, value)
def get_api_config(self, key):
return getattr(self, key)
def categories(self):
"""Lists all tax categories."""
request = self._get('categories')
return self.responder(request)
def tax_for_order(self, order_deets):
"""Shows the sales tax that should be collected for a given order."""
request = self._post('taxes', order_deets)
return self.responder(request)
def list_orders(self, params=None):
"""Lists existing order transactions."""
request = self._get('transactions/orders', params)
return self.responder(request)
def show_order(self, order_id):
"""Shows an existing order transaction."""
request = self._get('transactions/orders/' + str(order_id))
return self.responder(request)
def create_order(self, order_deets):
"""Creates a new order transaction."""
request = self._post('transactions/orders', order_deets)
return self.responder(request)
def update_order(self, order_id, order_deets):
"""Updates an existing order transaction."""
request = self._put("transactions/orders/" + str(order_id), order_deets)
return self.responder(request)
def delete_order(self, order_id):
"""Deletes an existing order transaction."""
request = self._delete("transactions/orders/" + str(order_id))
return self.responder(request)
def list_refunds(self, params=None):
"""Lists existing refund transactions."""
request = self._get('transactions/refunds', params)
return self.responder(request)
def show_refund(self, refund_id):
"""Shows an existing refund transaction."""
request = self._get('transactions/refunds/' + str(refund_id))
return self.responder(request)
def create_refund(self, refund_deets):
"""Creates a new refund transaction."""
request = self._post('transactions/refunds', refund_deets)
return self.responder(request)
def update_refund(self, refund_id, refund_deets):
"""Updates an existing refund transaction."""
request = self._put('transactions/refunds/' + str(refund_id), refund_deets)
return self.responder(request)
def delete_refund(self, refund_id):
"""Deletes an existing refund transaction."""
request = self._delete('transactions/refunds/' + str(refund_id))
return self.responder(request)
def list_customers(self, params=None):
"""Lists existing customers."""
request = self._get('customers', params)
return self.responder(request)
def show_customer(self, customer_id):
"""Shows an existing customer."""
request = self._get('customers/' + str(customer_id))
return self.responder(request)
def create_customer(self, customer_deets):
"""Creates a new customer."""
request = self._post('customers', customer_deets)
return self.responder(request)
def update_customer(self, customer_id, customer_deets):
"""Updates an existing customer."""
request = self._put("customers/" + str(customer_id), customer_deets)
return self.responder(request)
def delete_customer(self, customer_id):
"""Deletes an existing customer."""
request = self._delete("customers/" + str(customer_id))
return self.responder(request)
def nexus_regions(self):
"""Lists existing nexus locations for a TaxJar account."""
request = self._get('nexus/regions')
return self.responder(request)
def validate_address(self, address_deets):
"""Validates a customer address and returns back a collection of address matches."""
request = self._post('addresses/validate', address_deets)
return self.responder(request)
def validate(self, vat_deets):
"""Validates an existing VAT identification number against VIES."""
request = self._get('validation', vat_deets)
return self.responder(request)
def summary_rates(self):
"""Retrieve minimum and average sales tax rates by region as a backup."""
request = self._get('summary_rates')
return self.responder(request)
def _get(self, endpoint, data=None):
if data is None:
data = {}
return self._request(requests.get, endpoint, {'params': data})
def _post(self, endpoint, data):
return self._request(requests.post, endpoint, {'json': data})
def _put(self, endpoint, data):
return self._request(requests.put, endpoint, {'json': data})
def _delete(self, endpoint):
return self._request(requests.delete, endpoint)
def _request(self, method, endpoint, data=None):
if data is None:
data = {}
try:
data['timeout'] = self.timeout
return method(self._uri(self.api_url, endpoint), headers=self._headers(), **data)
except requests.Timeout as err:
raise TaxJarConnectionError(err)
except requests.ConnectionError as err:
raise TaxJarConnectionError(err)
@staticmethod
def _uri(api_url, endpoint):
return api_url + endpoint
def _default_headers(self):
return {'Authorization': 'Bearer ' + self.api_key,
'User-Agent': 'TaxJarPython/' + taxjar.VERSION}
def _headers(self):
headers = self._default_headers().copy()
headers.update(self.headers)
return headers
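The Client above assembles each request from a few pure pieces: a versioned base URL built in `__init__`, the endpoint join in `_uri`, and bearer-token headers from `_default_headers`. A network-free sketch of that plumbing; the base URL and version string are placeholder assumptions standing in for `taxjar.DEFAULT_API_URL` and `taxjar.API_VERSION`, and "test-key" is a made-up credential:

```python
# Network-free sketch of the Client's URL and header construction above.
# Base URL and version are assumed placeholders; no real requests are made.
def uri(api_url, endpoint):
    # mirrors Client._uri: plain concatenation
    return api_url + endpoint

def default_headers(api_key, version="x.y.z"):
    # mirrors Client._default_headers
    return {'Authorization': 'Bearer ' + api_key,
            'User-Agent': 'TaxJarPython/' + version}

api_url = "https://api.example.com" + "/v2/"   # __init__ appends the version path
print(uri(api_url, "rates/90002"))             # https://api.example.com/v2/rates/90002
print(default_headers("test-key")['Authorization'])  # Bearer test-key
```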
|
taxjar/taxjar-python | taxjar/client.py | Client.tax_for_order | python | def tax_for_order(self, order_deets):
request = self._post('taxes', order_deets)
return self.responder(request) | Shows the sales tax that should be collected for a given order. | train | https://github.com/taxjar/taxjar-python/blob/be9b30d7dc968d24e066c7c133849fee180f8d95/taxjar/client.py#L36-L39 | [
"def _post(self, endpoint, data):\n return self._request(requests.post, endpoint, {'json': data})\n"
] | class Client(object):
"""TaxJar Python Client"""
def __init__(self, api_key, api_url="", options=None, responder=TaxJarResponse.from_request):
if options is None:
options = {}
self.api_key = api_key
self.api_url = api_url if api_url else taxjar.DEFAULT_API_URL
self.api_url += "/" + taxjar.API_VERSION + "/"
self.headers = options.get('headers', {})
self.timeout = options.get('timeout', 5)
self.responder = responder
def set_api_config(self, key, value):
        if key == 'api_url':
value += "/" + taxjar.API_VERSION + "/"
setattr(self, key, value)
def get_api_config(self, key):
return getattr(self, key)
def categories(self):
"""Lists all tax categories."""
request = self._get('categories')
return self.responder(request)
def rates_for_location(self, postal_code, location_deets=None):
"""Shows the sales tax rates for a given location."""
request = self._get("rates/" + postal_code, location_deets)
return self.responder(request)
def list_orders(self, params=None):
"""Lists existing order transactions."""
request = self._get('transactions/orders', params)
return self.responder(request)
def show_order(self, order_id):
"""Shows an existing order transaction."""
request = self._get('transactions/orders/' + str(order_id))
return self.responder(request)
def create_order(self, order_deets):
"""Creates a new order transaction."""
request = self._post('transactions/orders', order_deets)
return self.responder(request)
def update_order(self, order_id, order_deets):
"""Updates an existing order transaction."""
request = self._put("transactions/orders/" + str(order_id), order_deets)
return self.responder(request)
def delete_order(self, order_id):
"""Deletes an existing order transaction."""
request = self._delete("transactions/orders/" + str(order_id))
return self.responder(request)
def list_refunds(self, params=None):
"""Lists existing refund transactions."""
request = self._get('transactions/refunds', params)
return self.responder(request)
def show_refund(self, refund_id):
"""Shows an existing refund transaction."""
request = self._get('transactions/refunds/' + str(refund_id))
return self.responder(request)
def create_refund(self, refund_deets):
"""Creates a new refund transaction."""
request = self._post('transactions/refunds', refund_deets)
return self.responder(request)
def update_refund(self, refund_id, refund_deets):
"""Updates an existing refund transaction."""
request = self._put('transactions/refunds/' + str(refund_id), refund_deets)
return self.responder(request)
def delete_refund(self, refund_id):
"""Deletes an existing refund transaction."""
request = self._delete('transactions/refunds/' + str(refund_id))
return self.responder(request)
def list_customers(self, params=None):
"""Lists existing customers."""
request = self._get('customers', params)
return self.responder(request)
def show_customer(self, customer_id):
"""Shows an existing customer."""
request = self._get('customers/' + str(customer_id))
return self.responder(request)
def create_customer(self, customer_deets):
"""Creates a new customer."""
request = self._post('customers', customer_deets)
return self.responder(request)
def update_customer(self, customer_id, customer_deets):
"""Updates an existing customer."""
request = self._put("customers/" + str(customer_id), customer_deets)
return self.responder(request)
def delete_customer(self, customer_id):
"""Deletes an existing customer."""
request = self._delete("customers/" + str(customer_id))
return self.responder(request)
def nexus_regions(self):
"""Lists existing nexus locations for a TaxJar account."""
request = self._get('nexus/regions')
return self.responder(request)
def validate_address(self, address_deets):
"""Validates a customer address and returns back a collection of address matches."""
request = self._post('addresses/validate', address_deets)
return self.responder(request)
def validate(self, vat_deets):
"""Validates an existing VAT identification number against VIES."""
request = self._get('validation', vat_deets)
return self.responder(request)
def summary_rates(self):
"""Retrieve minimum and average sales tax rates by region as a backup."""
request = self._get('summary_rates')
return self.responder(request)
def _get(self, endpoint, data=None):
if data is None:
data = {}
return self._request(requests.get, endpoint, {'params': data})
def _post(self, endpoint, data):
return self._request(requests.post, endpoint, {'json': data})
def _put(self, endpoint, data):
return self._request(requests.put, endpoint, {'json': data})
def _delete(self, endpoint):
return self._request(requests.delete, endpoint)
def _request(self, method, endpoint, data=None):
if data is None:
data = {}
try:
data['timeout'] = self.timeout
return method(self._uri(self.api_url, endpoint), headers=self._headers(), **data)
except requests.Timeout as err:
raise TaxJarConnectionError(err)
except requests.ConnectionError as err:
raise TaxJarConnectionError(err)
@staticmethod
def _uri(api_url, endpoint):
return api_url + endpoint
def _default_headers(self):
return {'Authorization': 'Bearer ' + self.api_key,
'User-Agent': 'TaxJarPython/' + taxjar.VERSION}
def _headers(self):
headers = self._default_headers().copy()
headers.update(self.headers)
return headers
|
taxjar/taxjar-python | taxjar/client.py | Client.list_orders | python | def list_orders(self, params=None):
request = self._get('transactions/orders', params)
return self.responder(request) | Lists existing order transactions. | train | https://github.com/taxjar/taxjar-python/blob/be9b30d7dc968d24e066c7c133849fee180f8d95/taxjar/client.py#L41-L44 | [
"def _get(self, endpoint, data=None):\n if data is None:\n data = {}\n return self._request(requests.get, endpoint, {'params': data})\n"
] | class Client(object):
"""TaxJar Python Client"""
def __init__(self, api_key, api_url="", options=None, responder=TaxJarResponse.from_request):
if options is None:
options = {}
self.api_key = api_key
self.api_url = api_url if api_url else taxjar.DEFAULT_API_URL
self.api_url += "/" + taxjar.API_VERSION + "/"
self.headers = options.get('headers', {})
self.timeout = options.get('timeout', 5)
self.responder = responder
def set_api_config(self, key, value):
        if key == 'api_url':
value += "/" + taxjar.API_VERSION + "/"
setattr(self, key, value)
def get_api_config(self, key):
return getattr(self, key)
def categories(self):
"""Lists all tax categories."""
request = self._get('categories')
return self.responder(request)
def rates_for_location(self, postal_code, location_deets=None):
"""Shows the sales tax rates for a given location."""
request = self._get("rates/" + postal_code, location_deets)
return self.responder(request)
def tax_for_order(self, order_deets):
"""Shows the sales tax that should be collected for a given order."""
request = self._post('taxes', order_deets)
return self.responder(request)
def show_order(self, order_id):
"""Shows an existing order transaction."""
request = self._get('transactions/orders/' + str(order_id))
return self.responder(request)
def create_order(self, order_deets):
"""Creates a new order transaction."""
request = self._post('transactions/orders', order_deets)
return self.responder(request)
def update_order(self, order_id, order_deets):
"""Updates an existing order transaction."""
request = self._put("transactions/orders/" + str(order_id), order_deets)
return self.responder(request)
def delete_order(self, order_id):
"""Deletes an existing order transaction."""
request = self._delete("transactions/orders/" + str(order_id))
return self.responder(request)
def list_refunds(self, params=None):
"""Lists existing refund transactions."""
request = self._get('transactions/refunds', params)
return self.responder(request)
def show_refund(self, refund_id):
"""Shows an existing refund transaction."""
request = self._get('transactions/refunds/' + str(refund_id))
return self.responder(request)
def create_refund(self, refund_deets):
"""Creates a new refund transaction."""
request = self._post('transactions/refunds', refund_deets)
return self.responder(request)
def update_refund(self, refund_id, refund_deets):
"""Updates an existing refund transaction."""
request = self._put('transactions/refunds/' + str(refund_id), refund_deets)
return self.responder(request)
def delete_refund(self, refund_id):
"""Deletes an existing refund transaction."""
request = self._delete('transactions/refunds/' + str(refund_id))
return self.responder(request)
def list_customers(self, params=None):
"""Lists existing customers."""
request = self._get('customers', params)
return self.responder(request)
def show_customer(self, customer_id):
"""Shows an existing customer."""
request = self._get('customers/' + str(customer_id))
return self.responder(request)
def create_customer(self, customer_deets):
"""Creates a new customer."""
request = self._post('customers', customer_deets)
return self.responder(request)
def update_customer(self, customer_id, customer_deets):
"""Updates an existing customer."""
request = self._put("customers/" + str(customer_id), customer_deets)
return self.responder(request)
def delete_customer(self, customer_id):
"""Deletes an existing customer."""
request = self._delete("customers/" + str(customer_id))
return self.responder(request)
def nexus_regions(self):
"""Lists existing nexus locations for a TaxJar account."""
request = self._get('nexus/regions')
return self.responder(request)
def validate_address(self, address_deets):
"""Validates a customer address and returns back a collection of address matches."""
request = self._post('addresses/validate', address_deets)
return self.responder(request)
def validate(self, vat_deets):
"""Validates an existing VAT identification number against VIES."""
request = self._get('validation', vat_deets)
return self.responder(request)
def summary_rates(self):
"""Retrieve minimum and average sales tax rates by region as a backup."""
request = self._get('summary_rates')
return self.responder(request)
def _get(self, endpoint, data=None):
if data is None:
data = {}
return self._request(requests.get, endpoint, {'params': data})
def _post(self, endpoint, data):
return self._request(requests.post, endpoint, {'json': data})
def _put(self, endpoint, data):
return self._request(requests.put, endpoint, {'json': data})
def _delete(self, endpoint):
return self._request(requests.delete, endpoint)
def _request(self, method, endpoint, data=None):
if data is None:
data = {}
try:
data['timeout'] = self.timeout
return method(self._uri(self.api_url, endpoint), headers=self._headers(), **data)
except requests.Timeout as err:
raise TaxJarConnectionError(err)
except requests.ConnectionError as err:
raise TaxJarConnectionError(err)
@staticmethod
def _uri(api_url, endpoint):
return api_url + endpoint
def _default_headers(self):
return {'Authorization': 'Bearer ' + self.api_key,
'User-Agent': 'TaxJarPython/' + taxjar.VERSION}
def _headers(self):
headers = self._default_headers().copy()
headers.update(self.headers)
return headers
|
taxjar/taxjar-python | taxjar/client.py | Client.show_order | python | def show_order(self, order_id):
request = self._get('transactions/orders/' + str(order_id))
return self.responder(request) | Shows an existing order transaction. | train | https://github.com/taxjar/taxjar-python/blob/be9b30d7dc968d24e066c7c133849fee180f8d95/taxjar/client.py#L46-L49 | [
"def _get(self, endpoint, data=None):\n if data is None:\n data = {}\n return self._request(requests.get, endpoint, {'params': data})\n"
] | class Client(object):
"""TaxJar Python Client"""
def __init__(self, api_key, api_url="", options=None, responder=TaxJarResponse.from_request):
if options is None:
options = {}
self.api_key = api_key
self.api_url = api_url if api_url else taxjar.DEFAULT_API_URL
self.api_url += "/" + taxjar.API_VERSION + "/"
self.headers = options.get('headers', {})
self.timeout = options.get('timeout', 5)
self.responder = responder
def set_api_config(self, key, value):
        if key == 'api_url':
value += "/" + taxjar.API_VERSION + "/"
setattr(self, key, value)
def get_api_config(self, key):
return getattr(self, key)
def categories(self):
"""Lists all tax categories."""
request = self._get('categories')
return self.responder(request)
def rates_for_location(self, postal_code, location_deets=None):
"""Shows the sales tax rates for a given location."""
request = self._get("rates/" + postal_code, location_deets)
return self.responder(request)
def tax_for_order(self, order_deets):
"""Shows the sales tax that should be collected for a given order."""
request = self._post('taxes', order_deets)
return self.responder(request)
def list_orders(self, params=None):
"""Lists existing order transactions."""
request = self._get('transactions/orders', params)
return self.responder(request)
def create_order(self, order_deets):
"""Creates a new order transaction."""
request = self._post('transactions/orders', order_deets)
return self.responder(request)
def update_order(self, order_id, order_deets):
"""Updates an existing order transaction."""
request = self._put("transactions/orders/" + str(order_id), order_deets)
return self.responder(request)
def delete_order(self, order_id):
"""Deletes an existing order transaction."""
request = self._delete("transactions/orders/" + str(order_id))
return self.responder(request)
def list_refunds(self, params=None):
"""Lists existing refund transactions."""
request = self._get('transactions/refunds', params)
return self.responder(request)
def show_refund(self, refund_id):
"""Shows an existing refund transaction."""
request = self._get('transactions/refunds/' + str(refund_id))
return self.responder(request)
def create_refund(self, refund_deets):
"""Creates a new refund transaction."""
request = self._post('transactions/refunds', refund_deets)
return self.responder(request)
def update_refund(self, refund_id, refund_deets):
"""Updates an existing refund transaction."""
request = self._put('transactions/refunds/' + str(refund_id), refund_deets)
return self.responder(request)
def delete_refund(self, refund_id):
"""Deletes an existing refund transaction."""
request = self._delete('transactions/refunds/' + str(refund_id))
return self.responder(request)
def list_customers(self, params=None):
"""Lists existing customers."""
request = self._get('customers', params)
return self.responder(request)
def show_customer(self, customer_id):
"""Shows an existing customer."""
request = self._get('customers/' + str(customer_id))
return self.responder(request)
def create_customer(self, customer_deets):
"""Creates a new customer."""
request = self._post('customers', customer_deets)
return self.responder(request)
def update_customer(self, customer_id, customer_deets):
"""Updates an existing customer."""
request = self._put("customers/" + str(customer_id), customer_deets)
return self.responder(request)
def delete_customer(self, customer_id):
"""Deletes an existing customer."""
request = self._delete("customers/" + str(customer_id))
return self.responder(request)
def nexus_regions(self):
"""Lists existing nexus locations for a TaxJar account."""
request = self._get('nexus/regions')
return self.responder(request)
def validate_address(self, address_deets):
"""Validates a customer address and returns back a collection of address matches."""
request = self._post('addresses/validate', address_deets)
return self.responder(request)
def validate(self, vat_deets):
"""Validates an existing VAT identification number against VIES."""
request = self._get('validation', vat_deets)
return self.responder(request)
def summary_rates(self):
"""Retrieve minimum and average sales tax rates by region as a backup."""
request = self._get('summary_rates')
return self.responder(request)
def _get(self, endpoint, data=None):
if data is None:
data = {}
return self._request(requests.get, endpoint, {'params': data})
def _post(self, endpoint, data):
return self._request(requests.post, endpoint, {'json': data})
def _put(self, endpoint, data):
return self._request(requests.put, endpoint, {'json': data})
def _delete(self, endpoint):
return self._request(requests.delete, endpoint)
def _request(self, method, endpoint, data=None):
if data is None:
data = {}
try:
data['timeout'] = self.timeout
return method(self._uri(self.api_url, endpoint), headers=self._headers(), **data)
except requests.Timeout as err:
raise TaxJarConnectionError(err)
except requests.ConnectionError as err:
raise TaxJarConnectionError(err)
@staticmethod
def _uri(api_url, endpoint):
return api_url + endpoint
def _default_headers(self):
return {'Authorization': 'Bearer ' + self.api_key,
'User-Agent': 'TaxJarPython/' + taxjar.VERSION}
def _headers(self):
headers = self._default_headers().copy()
headers.update(self.headers)
return headers
|
taxjar/taxjar-python | taxjar/client.py | Client.create_order | python | def create_order(self, order_deets):
request = self._post('transactions/orders', order_deets)
return self.responder(request) | Creates a new order transaction. | train | https://github.com/taxjar/taxjar-python/blob/be9b30d7dc968d24e066c7c133849fee180f8d95/taxjar/client.py#L51-L54 | [
"def _post(self, endpoint, data):\n return self._request(requests.post, endpoint, {'json': data})\n"
] | class Client(object):
"""TaxJar Python Client"""
def __init__(self, api_key, api_url="", options=None, responder=TaxJarResponse.from_request):
if options is None:
options = {}
self.api_key = api_key
self.api_url = api_url if api_url else taxjar.DEFAULT_API_URL
self.api_url += "/" + taxjar.API_VERSION + "/"
self.headers = options.get('headers', {})
self.timeout = options.get('timeout', 5)
self.responder = responder
def set_api_config(self, key, value):
if key == 'api_url':
value += "/" + taxjar.API_VERSION + "/"
setattr(self, key, value)
def get_api_config(self, key):
return getattr(self, key)
def categories(self):
"""Lists all tax categories."""
request = self._get('categories')
return self.responder(request)
def rates_for_location(self, postal_code, location_deets=None):
"""Shows the sales tax rates for a given location."""
request = self._get("rates/" + postal_code, location_deets)
return self.responder(request)
def tax_for_order(self, order_deets):
"""Shows the sales tax that should be collected for a given order."""
request = self._post('taxes', order_deets)
return self.responder(request)
def list_orders(self, params=None):
"""Lists existing order transactions."""
request = self._get('transactions/orders', params)
return self.responder(request)
def show_order(self, order_id):
"""Shows an existing order transaction."""
request = self._get('transactions/orders/' + str(order_id))
return self.responder(request)
def update_order(self, order_id, order_deets):
"""Updates an existing order transaction."""
request = self._put("transactions/orders/" + str(order_id), order_deets)
return self.responder(request)
def delete_order(self, order_id):
"""Deletes an existing order transaction."""
request = self._delete("transactions/orders/" + str(order_id))
return self.responder(request)
def list_refunds(self, params=None):
"""Lists existing refund transactions."""
request = self._get('transactions/refunds', params)
return self.responder(request)
def show_refund(self, refund_id):
"""Shows an existing refund transaction."""
request = self._get('transactions/refunds/' + str(refund_id))
return self.responder(request)
def create_refund(self, refund_deets):
"""Creates a new refund transaction."""
request = self._post('transactions/refunds', refund_deets)
return self.responder(request)
def update_refund(self, refund_id, refund_deets):
"""Updates an existing refund transaction."""
request = self._put('transactions/refunds/' + str(refund_id), refund_deets)
return self.responder(request)
def delete_refund(self, refund_id):
"""Deletes an existing refund transaction."""
request = self._delete('transactions/refunds/' + str(refund_id))
return self.responder(request)
def list_customers(self, params=None):
"""Lists existing customers."""
request = self._get('customers', params)
return self.responder(request)
def show_customer(self, customer_id):
"""Shows an existing customer."""
request = self._get('customers/' + str(customer_id))
return self.responder(request)
def create_customer(self, customer_deets):
"""Creates a new customer."""
request = self._post('customers', customer_deets)
return self.responder(request)
def update_customer(self, customer_id, customer_deets):
"""Updates an existing customer."""
request = self._put("customers/" + str(customer_id), customer_deets)
return self.responder(request)
def delete_customer(self, customer_id):
"""Deletes an existing customer."""
request = self._delete("customers/" + str(customer_id))
return self.responder(request)
def nexus_regions(self):
"""Lists existing nexus locations for a TaxJar account."""
request = self._get('nexus/regions')
return self.responder(request)
def validate_address(self, address_deets):
"""Validates a customer address and returns back a collection of address matches."""
request = self._post('addresses/validate', address_deets)
return self.responder(request)
def validate(self, vat_deets):
"""Validates an existing VAT identification number against VIES."""
request = self._get('validation', vat_deets)
return self.responder(request)
def summary_rates(self):
"""Retrieve minimum and average sales tax rates by region as a backup."""
request = self._get('summary_rates')
return self.responder(request)
def _get(self, endpoint, data=None):
if data is None:
data = {}
return self._request(requests.get, endpoint, {'params': data})
def _post(self, endpoint, data):
return self._request(requests.post, endpoint, {'json': data})
def _put(self, endpoint, data):
return self._request(requests.put, endpoint, {'json': data})
def _delete(self, endpoint):
return self._request(requests.delete, endpoint)
def _request(self, method, endpoint, data=None):
if data is None:
data = {}
try:
data['timeout'] = self.timeout
return method(self._uri(self.api_url, endpoint), headers=self._headers(), **data)
except requests.Timeout as err:
raise TaxJarConnectionError(err)
except requests.ConnectionError as err:
raise TaxJarConnectionError(err)
@staticmethod
def _uri(api_url, endpoint):
return api_url + endpoint
def _default_headers(self):
return {'Authorization': 'Bearer ' + self.api_key,
'User-Agent': 'TaxJarPython/' + taxjar.VERSION}
def _headers(self):
headers = self._default_headers().copy()
headers.update(self.headers)
return headers
|
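The `create_order` row above posts to the `transactions/orders` endpoint via `_post`, which ultimately calls the static `_uri` helper. A minimal standalone sketch of how the versioned `api_url` and an endpoint combine into the final request URL (the `DEFAULT_API_URL` and `API_VERSION` constants are assumed to match the `taxjar` package; no network access involved):

```python
# Standalone sketch mirroring Client._uri and the __init__ URL setup.
# DEFAULT_API_URL and API_VERSION are assumed values, not imported.
DEFAULT_API_URL = "https://api.taxjar.com"
API_VERSION = "v2"

def uri(api_url, endpoint):
    # Same logic as the static Client._uri: plain string concatenation.
    return api_url + endpoint

# __init__ appends "/" + API_VERSION + "/" to the base URL.
api_url = DEFAULT_API_URL + "/" + API_VERSION + "/"
print(uri(api_url, "transactions/orders"))
```

Because `_uri` is bare concatenation, the trailing slash added in `__init__` is what keeps the endpoint from fusing with the version segment.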
taxjar/taxjar-python | taxjar/client.py | Client.update_order | python | def update_order(self, order_id, order_deets):
request = self._put("transactions/orders/" + str(order_id), order_deets)
return self.responder(request) | Updates an existing order transaction. | train | https://github.com/taxjar/taxjar-python/blob/be9b30d7dc968d24e066c7c133849fee180f8d95/taxjar/client.py#L56-L59 | [
"def _put(self, endpoint, data):\n return self._request(requests.put, endpoint, {'json': data})\n"
] | class Client(object):
"""TaxJar Python Client"""
def __init__(self, api_key, api_url="", options=None, responder=TaxJarResponse.from_request):
if options is None:
options = {}
self.api_key = api_key
self.api_url = api_url if api_url else taxjar.DEFAULT_API_URL
self.api_url += "/" + taxjar.API_VERSION + "/"
self.headers = options.get('headers', {})
self.timeout = options.get('timeout', 5)
self.responder = responder
def set_api_config(self, key, value):
if key == 'api_url':
value += "/" + taxjar.API_VERSION + "/"
setattr(self, key, value)
def get_api_config(self, key):
return getattr(self, key)
def categories(self):
"""Lists all tax categories."""
request = self._get('categories')
return self.responder(request)
def rates_for_location(self, postal_code, location_deets=None):
"""Shows the sales tax rates for a given location."""
request = self._get("rates/" + postal_code, location_deets)
return self.responder(request)
def tax_for_order(self, order_deets):
"""Shows the sales tax that should be collected for a given order."""
request = self._post('taxes', order_deets)
return self.responder(request)
def list_orders(self, params=None):
"""Lists existing order transactions."""
request = self._get('transactions/orders', params)
return self.responder(request)
def show_order(self, order_id):
"""Shows an existing order transaction."""
request = self._get('transactions/orders/' + str(order_id))
return self.responder(request)
def create_order(self, order_deets):
"""Creates a new order transaction."""
request = self._post('transactions/orders', order_deets)
return self.responder(request)
def delete_order(self, order_id):
"""Deletes an existing order transaction."""
request = self._delete("transactions/orders/" + str(order_id))
return self.responder(request)
def list_refunds(self, params=None):
"""Lists existing refund transactions."""
request = self._get('transactions/refunds', params)
return self.responder(request)
def show_refund(self, refund_id):
"""Shows an existing refund transaction."""
request = self._get('transactions/refunds/' + str(refund_id))
return self.responder(request)
def create_refund(self, refund_deets):
"""Creates a new refund transaction."""
request = self._post('transactions/refunds', refund_deets)
return self.responder(request)
def update_refund(self, refund_id, refund_deets):
"""Updates an existing refund transaction."""
request = self._put('transactions/refunds/' + str(refund_id), refund_deets)
return self.responder(request)
def delete_refund(self, refund_id):
"""Deletes an existing refund transaction."""
request = self._delete('transactions/refunds/' + str(refund_id))
return self.responder(request)
def list_customers(self, params=None):
"""Lists existing customers."""
request = self._get('customers', params)
return self.responder(request)
def show_customer(self, customer_id):
"""Shows an existing customer."""
request = self._get('customers/' + str(customer_id))
return self.responder(request)
def create_customer(self, customer_deets):
"""Creates a new customer."""
request = self._post('customers', customer_deets)
return self.responder(request)
def update_customer(self, customer_id, customer_deets):
"""Updates an existing customer."""
request = self._put("customers/" + str(customer_id), customer_deets)
return self.responder(request)
def delete_customer(self, customer_id):
"""Deletes an existing customer."""
request = self._delete("customers/" + str(customer_id))
return self.responder(request)
def nexus_regions(self):
"""Lists existing nexus locations for a TaxJar account."""
request = self._get('nexus/regions')
return self.responder(request)
def validate_address(self, address_deets):
"""Validates a customer address and returns back a collection of address matches."""
request = self._post('addresses/validate', address_deets)
return self.responder(request)
def validate(self, vat_deets):
"""Validates an existing VAT identification number against VIES."""
request = self._get('validation', vat_deets)
return self.responder(request)
def summary_rates(self):
"""Retrieve minimum and average sales tax rates by region as a backup."""
request = self._get('summary_rates')
return self.responder(request)
def _get(self, endpoint, data=None):
if data is None:
data = {}
return self._request(requests.get, endpoint, {'params': data})
def _post(self, endpoint, data):
return self._request(requests.post, endpoint, {'json': data})
def _put(self, endpoint, data):
return self._request(requests.put, endpoint, {'json': data})
def _delete(self, endpoint):
return self._request(requests.delete, endpoint)
def _request(self, method, endpoint, data=None):
if data is None:
data = {}
try:
data['timeout'] = self.timeout
return method(self._uri(self.api_url, endpoint), headers=self._headers(), **data)
except requests.Timeout as err:
raise TaxJarConnectionError(err)
except requests.ConnectionError as err:
raise TaxJarConnectionError(err)
@staticmethod
def _uri(api_url, endpoint):
return api_url + endpoint
def _default_headers(self):
return {'Authorization': 'Bearer ' + self.api_key,
'User-Agent': 'TaxJarPython/' + taxjar.VERSION}
def _headers(self):
headers = self._default_headers().copy()
headers.update(self.headers)
return headers
|
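`update_order` builds its endpoint by stringifying the id and concatenating it onto the collection path, so integer and string ids behave identically. A small sketch of that path construction:

```python
# Sketch of update_order's endpoint construction: str(order_id) is
# appended to the collection path before the PUT is issued.
def order_endpoint(order_id):
    return "transactions/orders/" + str(order_id)

print(order_endpoint(123))    # int id
print(order_endpoint("123"))  # string id produces the same path
```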
taxjar/taxjar-python | taxjar/client.py | Client.delete_order | python | def delete_order(self, order_id):
request = self._delete("transactions/orders/" + str(order_id))
return self.responder(request) | Deletes an existing order transaction. | train | https://github.com/taxjar/taxjar-python/blob/be9b30d7dc968d24e066c7c133849fee180f8d95/taxjar/client.py#L61-L64 | [
"def _delete(self, endpoint):\n return self._request(requests.delete, endpoint)\n"
] | class Client(object):
"""TaxJar Python Client"""
def __init__(self, api_key, api_url="", options=None, responder=TaxJarResponse.from_request):
if options is None:
options = {}
self.api_key = api_key
self.api_url = api_url if api_url else taxjar.DEFAULT_API_URL
self.api_url += "/" + taxjar.API_VERSION + "/"
self.headers = options.get('headers', {})
self.timeout = options.get('timeout', 5)
self.responder = responder
def set_api_config(self, key, value):
if key == 'api_url':
value += "/" + taxjar.API_VERSION + "/"
setattr(self, key, value)
def get_api_config(self, key):
return getattr(self, key)
def categories(self):
"""Lists all tax categories."""
request = self._get('categories')
return self.responder(request)
def rates_for_location(self, postal_code, location_deets=None):
"""Shows the sales tax rates for a given location."""
request = self._get("rates/" + postal_code, location_deets)
return self.responder(request)
def tax_for_order(self, order_deets):
"""Shows the sales tax that should be collected for a given order."""
request = self._post('taxes', order_deets)
return self.responder(request)
def list_orders(self, params=None):
"""Lists existing order transactions."""
request = self._get('transactions/orders', params)
return self.responder(request)
def show_order(self, order_id):
"""Shows an existing order transaction."""
request = self._get('transactions/orders/' + str(order_id))
return self.responder(request)
def create_order(self, order_deets):
"""Creates a new order transaction."""
request = self._post('transactions/orders', order_deets)
return self.responder(request)
def update_order(self, order_id, order_deets):
"""Updates an existing order transaction."""
request = self._put("transactions/orders/" + str(order_id), order_deets)
return self.responder(request)
def list_refunds(self, params=None):
"""Lists existing refund transactions."""
request = self._get('transactions/refunds', params)
return self.responder(request)
def show_refund(self, refund_id):
"""Shows an existing refund transaction."""
request = self._get('transactions/refunds/' + str(refund_id))
return self.responder(request)
def create_refund(self, refund_deets):
"""Creates a new refund transaction."""
request = self._post('transactions/refunds', refund_deets)
return self.responder(request)
def update_refund(self, refund_id, refund_deets):
"""Updates an existing refund transaction."""
request = self._put('transactions/refunds/' + str(refund_id), refund_deets)
return self.responder(request)
def delete_refund(self, refund_id):
"""Deletes an existing refund transaction."""
request = self._delete('transactions/refunds/' + str(refund_id))
return self.responder(request)
def list_customers(self, params=None):
"""Lists existing customers."""
request = self._get('customers', params)
return self.responder(request)
def show_customer(self, customer_id):
"""Shows an existing customer."""
request = self._get('customers/' + str(customer_id))
return self.responder(request)
def create_customer(self, customer_deets):
"""Creates a new customer."""
request = self._post('customers', customer_deets)
return self.responder(request)
def update_customer(self, customer_id, customer_deets):
"""Updates an existing customer."""
request = self._put("customers/" + str(customer_id), customer_deets)
return self.responder(request)
def delete_customer(self, customer_id):
"""Deletes an existing customer."""
request = self._delete("customers/" + str(customer_id))
return self.responder(request)
def nexus_regions(self):
"""Lists existing nexus locations for a TaxJar account."""
request = self._get('nexus/regions')
return self.responder(request)
def validate_address(self, address_deets):
"""Validates a customer address and returns back a collection of address matches."""
request = self._post('addresses/validate', address_deets)
return self.responder(request)
def validate(self, vat_deets):
"""Validates an existing VAT identification number against VIES."""
request = self._get('validation', vat_deets)
return self.responder(request)
def summary_rates(self):
"""Retrieve minimum and average sales tax rates by region as a backup."""
request = self._get('summary_rates')
return self.responder(request)
def _get(self, endpoint, data=None):
if data is None:
data = {}
return self._request(requests.get, endpoint, {'params': data})
def _post(self, endpoint, data):
return self._request(requests.post, endpoint, {'json': data})
def _put(self, endpoint, data):
return self._request(requests.put, endpoint, {'json': data})
def _delete(self, endpoint):
return self._request(requests.delete, endpoint)
def _request(self, method, endpoint, data=None):
if data is None:
data = {}
try:
data['timeout'] = self.timeout
return method(self._uri(self.api_url, endpoint), headers=self._headers(), **data)
except requests.Timeout as err:
raise TaxJarConnectionError(err)
except requests.ConnectionError as err:
raise TaxJarConnectionError(err)
@staticmethod
def _uri(api_url, endpoint):
return api_url + endpoint
def _default_headers(self):
return {'Authorization': 'Bearer ' + self.api_key,
'User-Agent': 'TaxJarPython/' + taxjar.VERSION}
def _headers(self):
headers = self._default_headers().copy()
headers.update(self.headers)
return headers
|
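`delete_order`, like every method here, funnels through `_request`, which translates the HTTP library's timeout and connection failures into a single `TaxJarConnectionError`. A standalone sketch of that error-translation pattern (`TimeoutStub` stands in for `requests.Timeout`, and `TaxJarConnectionError` is redefined locally rather than imported):

```python
# Sketch of _request's exception translation. Stub classes stand in for
# requests.Timeout and taxjar's TaxJarConnectionError; no network used.
class TimeoutStub(Exception):
    pass

class TaxJarConnectionError(Exception):
    pass

def request(method, url, timeout=5):
    try:
        return method(url, timeout=timeout)
    except TimeoutStub as err:
        # Callers see one client-specific error type, not the
        # underlying HTTP library's exception hierarchy.
        raise TaxJarConnectionError(err)

def flaky(url, timeout):
    raise TimeoutStub("timed out after %ss" % timeout)

try:
    request(flaky, "https://api.taxjar.com/v2/transactions/orders/123")
except TaxJarConnectionError as err:
    print("translated:", err)
```

Wrapping both `requests.Timeout` and `requests.ConnectionError` this way means callers only need one `except` clause for network-level failures.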
taxjar/taxjar-python | taxjar/client.py | Client.list_refunds | python | def list_refunds(self, params=None):
request = self._get('transactions/refunds', params)
return self.responder(request) | Lists existing refund transactions. | train | https://github.com/taxjar/taxjar-python/blob/be9b30d7dc968d24e066c7c133849fee180f8d95/taxjar/client.py#L66-L69 | [
"def _get(self, endpoint, data=None):\n if data is None:\n data = {}\n return self._request(requests.get, endpoint, {'params': data})\n"
] | class Client(object):
"""TaxJar Python Client"""
def __init__(self, api_key, api_url="", options=None, responder=TaxJarResponse.from_request):
if options is None:
options = {}
self.api_key = api_key
self.api_url = api_url if api_url else taxjar.DEFAULT_API_URL
self.api_url += "/" + taxjar.API_VERSION + "/"
self.headers = options.get('headers', {})
self.timeout = options.get('timeout', 5)
self.responder = responder
def set_api_config(self, key, value):
if key == 'api_url':
value += "/" + taxjar.API_VERSION + "/"
setattr(self, key, value)
def get_api_config(self, key):
return getattr(self, key)
def categories(self):
"""Lists all tax categories."""
request = self._get('categories')
return self.responder(request)
def rates_for_location(self, postal_code, location_deets=None):
"""Shows the sales tax rates for a given location."""
request = self._get("rates/" + postal_code, location_deets)
return self.responder(request)
def tax_for_order(self, order_deets):
"""Shows the sales tax that should be collected for a given order."""
request = self._post('taxes', order_deets)
return self.responder(request)
def list_orders(self, params=None):
"""Lists existing order transactions."""
request = self._get('transactions/orders', params)
return self.responder(request)
def show_order(self, order_id):
"""Shows an existing order transaction."""
request = self._get('transactions/orders/' + str(order_id))
return self.responder(request)
def create_order(self, order_deets):
"""Creates a new order transaction."""
request = self._post('transactions/orders', order_deets)
return self.responder(request)
def update_order(self, order_id, order_deets):
"""Updates an existing order transaction."""
request = self._put("transactions/orders/" + str(order_id), order_deets)
return self.responder(request)
def delete_order(self, order_id):
"""Deletes an existing order transaction."""
request = self._delete("transactions/orders/" + str(order_id))
return self.responder(request)
def show_refund(self, refund_id):
"""Shows an existing refund transaction."""
request = self._get('transactions/refunds/' + str(refund_id))
return self.responder(request)
def create_refund(self, refund_deets):
"""Creates a new refund transaction."""
request = self._post('transactions/refunds', refund_deets)
return self.responder(request)
def update_refund(self, refund_id, refund_deets):
"""Updates an existing refund transaction."""
request = self._put('transactions/refunds/' + str(refund_id), refund_deets)
return self.responder(request)
def delete_refund(self, refund_id):
"""Deletes an existing refund transaction."""
request = self._delete('transactions/refunds/' + str(refund_id))
return self.responder(request)
def list_customers(self, params=None):
"""Lists existing customers."""
request = self._get('customers', params)
return self.responder(request)
def show_customer(self, customer_id):
"""Shows an existing customer."""
request = self._get('customers/' + str(customer_id))
return self.responder(request)
def create_customer(self, customer_deets):
"""Creates a new customer."""
request = self._post('customers', customer_deets)
return self.responder(request)
def update_customer(self, customer_id, customer_deets):
"""Updates an existing customer."""
request = self._put("customers/" + str(customer_id), customer_deets)
return self.responder(request)
def delete_customer(self, customer_id):
"""Deletes an existing customer."""
request = self._delete("customers/" + str(customer_id))
return self.responder(request)
def nexus_regions(self):
"""Lists existing nexus locations for a TaxJar account."""
request = self._get('nexus/regions')
return self.responder(request)
def validate_address(self, address_deets):
"""Validates a customer address and returns back a collection of address matches."""
request = self._post('addresses/validate', address_deets)
return self.responder(request)
def validate(self, vat_deets):
"""Validates an existing VAT identification number against VIES."""
request = self._get('validation', vat_deets)
return self.responder(request)
def summary_rates(self):
"""Retrieve minimum and average sales tax rates by region as a backup."""
request = self._get('summary_rates')
return self.responder(request)
def _get(self, endpoint, data=None):
if data is None:
data = {}
return self._request(requests.get, endpoint, {'params': data})
def _post(self, endpoint, data):
return self._request(requests.post, endpoint, {'json': data})
def _put(self, endpoint, data):
return self._request(requests.put, endpoint, {'json': data})
def _delete(self, endpoint):
return self._request(requests.delete, endpoint)
def _request(self, method, endpoint, data=None):
if data is None:
data = {}
try:
data['timeout'] = self.timeout
return method(self._uri(self.api_url, endpoint), headers=self._headers(), **data)
except requests.Timeout as err:
raise TaxJarConnectionError(err)
except requests.ConnectionError as err:
raise TaxJarConnectionError(err)
@staticmethod
def _uri(api_url, endpoint):
return api_url + endpoint
def _default_headers(self):
return {'Authorization': 'Bearer ' + self.api_key,
'User-Agent': 'TaxJarPython/' + taxjar.VERSION}
def _headers(self):
headers = self._default_headers().copy()
headers.update(self.headers)
return headers
|
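`list_refunds` passes its optional `params` dict to `_get`, which hands it to `requests` as the `params=` kwarg; requests then serializes it into a query string. A sketch using `urlencode` to show the effect (URL and filter names are illustrative):

```python
# Sketch of how list_refunds' params dict becomes a query string.
# requests does this internally via params=; urlencode shows the result.
from urllib.parse import urlencode

def refunds_url(params=None):
    base = "https://api.taxjar.com/v2/transactions/refunds"
    if not params:
        return base
    return base + "?" + urlencode(params)

print(refunds_url({'from_transaction_date': '2015/05/01',
                   'to_transaction_date': '2015/05/31'}))
```

Note that `urlencode` percent-encodes the `/` characters in the dates, just as requests would.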
taxjar/taxjar-python | taxjar/client.py | Client.show_refund | python | def show_refund(self, refund_id):
request = self._get('transactions/refunds/' + str(refund_id))
return self.responder(request) | Shows an existing refund transaction. | train | https://github.com/taxjar/taxjar-python/blob/be9b30d7dc968d24e066c7c133849fee180f8d95/taxjar/client.py#L71-L74 | [
"def _get(self, endpoint, data=None):\n if data is None:\n data = {}\n return self._request(requests.get, endpoint, {'params': data})\n"
] | class Client(object):
"""TaxJar Python Client"""
def __init__(self, api_key, api_url="", options=None, responder=TaxJarResponse.from_request):
if options is None:
options = {}
self.api_key = api_key
self.api_url = api_url if api_url else taxjar.DEFAULT_API_URL
self.api_url += "/" + taxjar.API_VERSION + "/"
self.headers = options.get('headers', {})
self.timeout = options.get('timeout', 5)
self.responder = responder
def set_api_config(self, key, value):
if key == 'api_url':
value += "/" + taxjar.API_VERSION + "/"
setattr(self, key, value)
def get_api_config(self, key):
return getattr(self, key)
def categories(self):
"""Lists all tax categories."""
request = self._get('categories')
return self.responder(request)
def rates_for_location(self, postal_code, location_deets=None):
"""Shows the sales tax rates for a given location."""
request = self._get("rates/" + postal_code, location_deets)
return self.responder(request)
def tax_for_order(self, order_deets):
"""Shows the sales tax that should be collected for a given order."""
request = self._post('taxes', order_deets)
return self.responder(request)
def list_orders(self, params=None):
"""Lists existing order transactions."""
request = self._get('transactions/orders', params)
return self.responder(request)
def show_order(self, order_id):
"""Shows an existing order transaction."""
request = self._get('transactions/orders/' + str(order_id))
return self.responder(request)
def create_order(self, order_deets):
"""Creates a new order transaction."""
request = self._post('transactions/orders', order_deets)
return self.responder(request)
def update_order(self, order_id, order_deets):
"""Updates an existing order transaction."""
request = self._put("transactions/orders/" + str(order_id), order_deets)
return self.responder(request)
def delete_order(self, order_id):
"""Deletes an existing order transaction."""
request = self._delete("transactions/orders/" + str(order_id))
return self.responder(request)
def list_refunds(self, params=None):
"""Lists existing refund transactions."""
request = self._get('transactions/refunds', params)
return self.responder(request)
def create_refund(self, refund_deets):
"""Creates a new refund transaction."""
request = self._post('transactions/refunds', refund_deets)
return self.responder(request)
def update_refund(self, refund_id, refund_deets):
"""Updates an existing refund transaction."""
request = self._put('transactions/refunds/' + str(refund_id), refund_deets)
return self.responder(request)
def delete_refund(self, refund_id):
"""Deletes an existing refund transaction."""
request = self._delete('transactions/refunds/' + str(refund_id))
return self.responder(request)
def list_customers(self, params=None):
"""Lists existing customers."""
request = self._get('customers', params)
return self.responder(request)
def show_customer(self, customer_id):
"""Shows an existing customer."""
request = self._get('customers/' + str(customer_id))
return self.responder(request)
def create_customer(self, customer_deets):
"""Creates a new customer."""
request = self._post('customers', customer_deets)
return self.responder(request)
def update_customer(self, customer_id, customer_deets):
"""Updates an existing customer."""
request = self._put("customers/" + str(customer_id), customer_deets)
return self.responder(request)
def delete_customer(self, customer_id):
"""Deletes an existing customer."""
request = self._delete("customers/" + str(customer_id))
return self.responder(request)
def nexus_regions(self):
"""Lists existing nexus locations for a TaxJar account."""
request = self._get('nexus/regions')
return self.responder(request)
def validate_address(self, address_deets):
"""Validates a customer address and returns back a collection of address matches."""
request = self._post('addresses/validate', address_deets)
return self.responder(request)
def validate(self, vat_deets):
"""Validates an existing VAT identification number against VIES."""
request = self._get('validation', vat_deets)
return self.responder(request)
def summary_rates(self):
"""Retrieve minimum and average sales tax rates by region as a backup."""
request = self._get('summary_rates')
return self.responder(request)
def _get(self, endpoint, data=None):
if data is None:
data = {}
return self._request(requests.get, endpoint, {'params': data})
def _post(self, endpoint, data):
return self._request(requests.post, endpoint, {'json': data})
def _put(self, endpoint, data):
return self._request(requests.put, endpoint, {'json': data})
def _delete(self, endpoint):
return self._request(requests.delete, endpoint)
def _request(self, method, endpoint, data=None):
if data is None:
data = {}
try:
data['timeout'] = self.timeout
return method(self._uri(self.api_url, endpoint), headers=self._headers(), **data)
except requests.Timeout as err:
raise TaxJarConnectionError(err)
except requests.ConnectionError as err:
raise TaxJarConnectionError(err)
@staticmethod
def _uri(api_url, endpoint):
return api_url + endpoint
def _default_headers(self):
return {'Authorization': 'Bearer ' + self.api_key,
'User-Agent': 'TaxJarPython/' + taxjar.VERSION}
def _headers(self):
headers = self._default_headers().copy()
headers.update(self.headers)
return headers
|
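Every request, including `show_refund`'s GET, carries headers built by `_headers()`: the defaults (`Authorization`, `User-Agent`) are copied first, then any user-supplied headers from the constructor's `options` are merged in and win on key collisions. A standalone sketch of that merge order (the version string is a placeholder, not the real `taxjar.VERSION`):

```python
# Sketch of _default_headers/_headers: defaults first, then user
# headers override on collisions. Version string is a placeholder.
def default_headers(api_key, version="0.0.0"):
    return {'Authorization': 'Bearer ' + api_key,
            'User-Agent': 'TaxJarPython/' + version}

def merged_headers(api_key, extra):
    headers = default_headers(api_key).copy()
    headers.update(extra)  # user-supplied headers take precedence
    return headers

print(merged_headers("secret", {'X-Request-Id': 'abc-123'}))
```

The `.copy()` matters: it keeps per-request merges from mutating the shared defaults dict.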
taxjar/taxjar-python | taxjar/client.py | Client.create_refund | python | def create_refund(self, refund_deets):
request = self._post('transactions/refunds', refund_deets)
return self.responder(request) | Creates a new refund transaction. | train | https://github.com/taxjar/taxjar-python/blob/be9b30d7dc968d24e066c7c133849fee180f8d95/taxjar/client.py#L76-L79 | [
"def _post(self, endpoint, data):\n return self._request(requests.post, endpoint, {'json': data})\n"
] | class Client(object):
"""TaxJar Python Client"""
def __init__(self, api_key, api_url="", options=None, responder=TaxJarResponse.from_request):
if options is None:
options = {}
self.api_key = api_key
self.api_url = api_url if api_url else taxjar.DEFAULT_API_URL
self.api_url += "/" + taxjar.API_VERSION + "/"
self.headers = options.get('headers', {})
self.timeout = options.get('timeout', 5)
self.responder = responder
def set_api_config(self, key, value):
if key == 'api_url':
value += "/" + taxjar.API_VERSION + "/"
setattr(self, key, value)
def get_api_config(self, key):
return getattr(self, key)
def categories(self):
"""Lists all tax categories."""
request = self._get('categories')
return self.responder(request)
def rates_for_location(self, postal_code, location_deets=None):
"""Shows the sales tax rates for a given location."""
request = self._get("rates/" + postal_code, location_deets)
return self.responder(request)
def tax_for_order(self, order_deets):
"""Shows the sales tax that should be collected for a given order."""
request = self._post('taxes', order_deets)
return self.responder(request)
def list_orders(self, params=None):
"""Lists existing order transactions."""
request = self._get('transactions/orders', params)
return self.responder(request)
def show_order(self, order_id):
"""Shows an existing order transaction."""
request = self._get('transactions/orders/' + str(order_id))
return self.responder(request)
def create_order(self, order_deets):
"""Creates a new order transaction."""
request = self._post('transactions/orders', order_deets)
return self.responder(request)
def update_order(self, order_id, order_deets):
"""Updates an existing order transaction."""
request = self._put("transactions/orders/" + str(order_id), order_deets)
return self.responder(request)
def delete_order(self, order_id):
"""Deletes an existing order transaction."""
request = self._delete("transactions/orders/" + str(order_id))
return self.responder(request)
def list_refunds(self, params=None):
"""Lists existing refund transactions."""
request = self._get('transactions/refunds', params)
return self.responder(request)
def show_refund(self, refund_id):
"""Shows an existing refund transaction."""
request = self._get('transactions/refunds/' + str(refund_id))
return self.responder(request)
def update_refund(self, refund_id, refund_deets):
"""Updates an existing refund transaction."""
request = self._put('transactions/refunds/' + str(refund_id), refund_deets)
return self.responder(request)
def delete_refund(self, refund_id):
"""Deletes an existing refund transaction."""
request = self._delete('transactions/refunds/' + str(refund_id))
return self.responder(request)
def list_customers(self, params=None):
"""Lists existing customers."""
request = self._get('customers', params)
return self.responder(request)
def show_customer(self, customer_id):
"""Shows an existing customer."""
request = self._get('customers/' + str(customer_id))
return self.responder(request)
def create_customer(self, customer_deets):
"""Creates a new customer."""
request = self._post('customers', customer_deets)
return self.responder(request)
def update_customer(self, customer_id, customer_deets):
"""Updates an existing customer."""
request = self._put("customers/" + str(customer_id), customer_deets)
return self.responder(request)
def delete_customer(self, customer_id):
"""Deletes an existing customer."""
request = self._delete("customers/" + str(customer_id))
return self.responder(request)
def nexus_regions(self):
"""Lists existing nexus locations for a TaxJar account."""
request = self._get('nexus/regions')
return self.responder(request)
def validate_address(self, address_deets):
"""Validates a customer address and returns back a collection of address matches."""
request = self._post('addresses/validate', address_deets)
return self.responder(request)
def validate(self, vat_deets):
"""Validates an existing VAT identification number against VIES."""
request = self._get('validation', vat_deets)
return self.responder(request)
def summary_rates(self):
"""Retrieve minimum and average sales tax rates by region as a backup."""
request = self._get('summary_rates')
return self.responder(request)
def _get(self, endpoint, data=None):
if data is None:
data = {}
return self._request(requests.get, endpoint, {'params': data})
def _post(self, endpoint, data):
return self._request(requests.post, endpoint, {'json': data})
def _put(self, endpoint, data):
return self._request(requests.put, endpoint, {'json': data})
def _delete(self, endpoint):
return self._request(requests.delete, endpoint)
def _request(self, method, endpoint, data=None):
if data is None:
data = {}
try:
data['timeout'] = self.timeout
return method(self._uri(self.api_url, endpoint), headers=self._headers(), **data)
except requests.Timeout as err:
raise TaxJarConnectionError(err)
except requests.ConnectionError as err:
raise TaxJarConnectionError(err)
@staticmethod
def _uri(api_url, endpoint):
return api_url + endpoint
def _default_headers(self):
return {'Authorization': 'Bearer ' + self.api_key,
'User-Agent': 'TaxJarPython/' + taxjar.VERSION}
def _headers(self):
headers = self._default_headers().copy()
headers.update(self.headers)
return headers
|
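Every record in this dump embeds the same `Client` class, whose request URLs are composed in two steps: `__init__` appends the API version segment to the base URL, and `_uri` concatenates the endpoint onto it. A minimal stand-alone sketch of that composition — `build_uri` is a name invented here, and the default host mirrors what `taxjar.DEFAULT_API_URL` appears to be:

```python
DEFAULT_API_URL = "https://api.taxjar.com"  # assumed default host
API_VERSION = "v2"                          # assumed version segment

def build_uri(endpoint, api_url=""):
    # Mirrors Client.__init__ plus Client._uri: the version segment is
    # appended to the base URL once, then endpoints are concatenated.
    base = (api_url or DEFAULT_API_URL) + "/" + API_VERSION + "/"
    return base + endpoint

print(build_uri("transactions/refunds/123"))
# https://api.taxjar.com/v2/transactions/refunds/123
```

Because `_uri` is plain string concatenation, endpoint arguments must not carry a leading slash, which is why every method in the class builds paths like `'transactions/refunds/' + str(refund_id)`.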
taxjar/taxjar-python | taxjar/client.py | Client.update_refund | python | def update_refund(self, refund_id, refund_deets):
request = self._put('transactions/refunds/' + str(refund_id), refund_deets)
return self.responder(request) | Updates an existing refund transaction. | train | https://github.com/taxjar/taxjar-python/blob/be9b30d7dc968d24e066c7c133849fee180f8d95/taxjar/client.py#L81-L84 | [
"def _put(self, endpoint, data):\n return self._request(requests.put, endpoint, {'json': data})\n"
] | class Client(object):
"""TaxJar Python Client"""
def __init__(self, api_key, api_url="", options=None, responder=TaxJarResponse.from_request):
if options is None:
options = {}
self.api_key = api_key
self.api_url = api_url if api_url else taxjar.DEFAULT_API_URL
self.api_url += "/" + taxjar.API_VERSION + "/"
self.headers = options.get('headers', {})
self.timeout = options.get('timeout', 5)
self.responder = responder
def set_api_config(self, key, value):
if key == 'api_url':
value += "/" + taxjar.API_VERSION + "/"
setattr(self, key, value)
def get_api_config(self, key):
return getattr(self, key)
def categories(self):
"""Lists all tax categories."""
request = self._get('categories')
return self.responder(request)
def rates_for_location(self, postal_code, location_deets=None):
"""Shows the sales tax rates for a given location."""
request = self._get("rates/" + postal_code, location_deets)
return self.responder(request)
def tax_for_order(self, order_deets):
"""Shows the sales tax that should be collected for a given order."""
request = self._post('taxes', order_deets)
return self.responder(request)
def list_orders(self, params=None):
"""Lists existing order transactions."""
request = self._get('transactions/orders', params)
return self.responder(request)
def show_order(self, order_id):
"""Shows an existing order transaction."""
request = self._get('transactions/orders/' + str(order_id))
return self.responder(request)
def create_order(self, order_deets):
"""Creates a new order transaction."""
request = self._post('transactions/orders', order_deets)
return self.responder(request)
def update_order(self, order_id, order_deets):
"""Updates an existing order transaction."""
request = self._put("transactions/orders/" + str(order_id), order_deets)
return self.responder(request)
def delete_order(self, order_id):
"""Deletes an existing order transaction."""
request = self._delete("transactions/orders/" + str(order_id))
return self.responder(request)
def list_refunds(self, params=None):
"""Lists existing refund transactions."""
request = self._get('transactions/refunds', params)
return self.responder(request)
def show_refund(self, refund_id):
"""Shows an existing refund transaction."""
request = self._get('transactions/refunds/' + str(refund_id))
return self.responder(request)
def create_refund(self, refund_deets):
"""Creates a new refund transaction."""
request = self._post('transactions/refunds', refund_deets)
return self.responder(request)
def delete_refund(self, refund_id):
"""Deletes an existing refund transaction."""
request = self._delete('transactions/refunds/' + str(refund_id))
return self.responder(request)
def list_customers(self, params=None):
"""Lists existing customers."""
request = self._get('customers', params)
return self.responder(request)
def show_customer(self, customer_id):
"""Shows an existing customer."""
request = self._get('customers/' + str(customer_id))
return self.responder(request)
def create_customer(self, customer_deets):
"""Creates a new customer."""
request = self._post('customers', customer_deets)
return self.responder(request)
def update_customer(self, customer_id, customer_deets):
"""Updates an existing customer."""
request = self._put("customers/" + str(customer_id), customer_deets)
return self.responder(request)
def delete_customer(self, customer_id):
"""Deletes an existing customer."""
request = self._delete("customers/" + str(customer_id))
return self.responder(request)
def nexus_regions(self):
"""Lists existing nexus locations for a TaxJar account."""
request = self._get('nexus/regions')
return self.responder(request)
def validate_address(self, address_deets):
"""Validates a customer address and returns back a collection of address matches."""
request = self._post('addresses/validate', address_deets)
return self.responder(request)
def validate(self, vat_deets):
"""Validates an existing VAT identification number against VIES."""
request = self._get('validation', vat_deets)
return self.responder(request)
def summary_rates(self):
"""Retrieve minimum and average sales tax rates by region as a backup."""
request = self._get('summary_rates')
return self.responder(request)
def _get(self, endpoint, data=None):
if data is None:
data = {}
return self._request(requests.get, endpoint, {'params': data})
def _post(self, endpoint, data):
return self._request(requests.post, endpoint, {'json': data})
def _put(self, endpoint, data):
return self._request(requests.put, endpoint, {'json': data})
def _delete(self, endpoint):
return self._request(requests.delete, endpoint)
def _request(self, method, endpoint, data=None):
if data is None:
data = {}
try:
data['timeout'] = self.timeout
return method(self._uri(self.api_url, endpoint), headers=self._headers(), **data)
except requests.Timeout as err:
raise TaxJarConnectionError(err)
except requests.ConnectionError as err:
raise TaxJarConnectionError(err)
@staticmethod
def _uri(api_url, endpoint):
return api_url + endpoint
def _default_headers(self):
return {'Authorization': 'Bearer ' + self.api_key,
'User-Agent': 'TaxJarPython/' + taxjar.VERSION}
def _headers(self):
headers = self._default_headers().copy()
headers.update(self.headers)
return headers
|
taxjar/taxjar-python | taxjar/client.py | Client.delete_refund | python | def delete_refund(self, refund_id):
request = self._delete('transactions/refunds/' + str(refund_id))
return self.responder(request) | Deletes an existing refund transaction. | train | https://github.com/taxjar/taxjar-python/blob/be9b30d7dc968d24e066c7c133849fee180f8d95/taxjar/client.py#L86-L89 | [
"def _delete(self, endpoint):\n return self._request(requests.delete, endpoint)\n"
] | class Client(object):
"""TaxJar Python Client"""
def __init__(self, api_key, api_url="", options=None, responder=TaxJarResponse.from_request):
if options is None:
options = {}
self.api_key = api_key
self.api_url = api_url if api_url else taxjar.DEFAULT_API_URL
self.api_url += "/" + taxjar.API_VERSION + "/"
self.headers = options.get('headers', {})
self.timeout = options.get('timeout', 5)
self.responder = responder
def set_api_config(self, key, value):
if key == 'api_url':
value += "/" + taxjar.API_VERSION + "/"
setattr(self, key, value)
def get_api_config(self, key):
return getattr(self, key)
def categories(self):
"""Lists all tax categories."""
request = self._get('categories')
return self.responder(request)
def rates_for_location(self, postal_code, location_deets=None):
"""Shows the sales tax rates for a given location."""
request = self._get("rates/" + postal_code, location_deets)
return self.responder(request)
def tax_for_order(self, order_deets):
"""Shows the sales tax that should be collected for a given order."""
request = self._post('taxes', order_deets)
return self.responder(request)
def list_orders(self, params=None):
"""Lists existing order transactions."""
request = self._get('transactions/orders', params)
return self.responder(request)
def show_order(self, order_id):
"""Shows an existing order transaction."""
request = self._get('transactions/orders/' + str(order_id))
return self.responder(request)
def create_order(self, order_deets):
"""Creates a new order transaction."""
request = self._post('transactions/orders', order_deets)
return self.responder(request)
def update_order(self, order_id, order_deets):
"""Updates an existing order transaction."""
request = self._put("transactions/orders/" + str(order_id), order_deets)
return self.responder(request)
def delete_order(self, order_id):
"""Deletes an existing order transaction."""
request = self._delete("transactions/orders/" + str(order_id))
return self.responder(request)
def list_refunds(self, params=None):
"""Lists existing refund transactions."""
request = self._get('transactions/refunds', params)
return self.responder(request)
def show_refund(self, refund_id):
"""Shows an existing refund transaction."""
request = self._get('transactions/refunds/' + str(refund_id))
return self.responder(request)
def create_refund(self, refund_deets):
"""Creates a new refund transaction."""
request = self._post('transactions/refunds', refund_deets)
return self.responder(request)
def update_refund(self, refund_id, refund_deets):
"""Updates an existing refund transaction."""
request = self._put('transactions/refunds/' + str(refund_id), refund_deets)
return self.responder(request)
def list_customers(self, params=None):
"""Lists existing customers."""
request = self._get('customers', params)
return self.responder(request)
def show_customer(self, customer_id):
"""Shows an existing customer."""
request = self._get('customers/' + str(customer_id))
return self.responder(request)
def create_customer(self, customer_deets):
"""Creates a new customer."""
request = self._post('customers', customer_deets)
return self.responder(request)
def update_customer(self, customer_id, customer_deets):
"""Updates an existing customer."""
request = self._put("customers/" + str(customer_id), customer_deets)
return self.responder(request)
def delete_customer(self, customer_id):
"""Deletes an existing customer."""
request = self._delete("customers/" + str(customer_id))
return self.responder(request)
def nexus_regions(self):
"""Lists existing nexus locations for a TaxJar account."""
request = self._get('nexus/regions')
return self.responder(request)
def validate_address(self, address_deets):
"""Validates a customer address and returns back a collection of address matches."""
request = self._post('addresses/validate', address_deets)
return self.responder(request)
def validate(self, vat_deets):
"""Validates an existing VAT identification number against VIES."""
request = self._get('validation', vat_deets)
return self.responder(request)
def summary_rates(self):
"""Retrieve minimum and average sales tax rates by region as a backup."""
request = self._get('summary_rates')
return self.responder(request)
def _get(self, endpoint, data=None):
if data is None:
data = {}
return self._request(requests.get, endpoint, {'params': data})
def _post(self, endpoint, data):
return self._request(requests.post, endpoint, {'json': data})
def _put(self, endpoint, data):
return self._request(requests.put, endpoint, {'json': data})
def _delete(self, endpoint):
return self._request(requests.delete, endpoint)
def _request(self, method, endpoint, data=None):
if data is None:
data = {}
try:
data['timeout'] = self.timeout
return method(self._uri(self.api_url, endpoint), headers=self._headers(), **data)
except requests.Timeout as err:
raise TaxJarConnectionError(err)
except requests.ConnectionError as err:
raise TaxJarConnectionError(err)
@staticmethod
def _uri(api_url, endpoint):
return api_url + endpoint
def _default_headers(self):
return {'Authorization': 'Bearer ' + self.api_key,
'User-Agent': 'TaxJarPython/' + taxjar.VERSION}
def _headers(self):
headers = self._default_headers().copy()
headers.update(self.headers)
return headers
|
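Header handling in the class above (`_default_headers` plus `_headers`) merges caller-supplied headers over the defaults, so user options win on key collisions. A hedged sketch of that merge order — `merged_headers` is a hypothetical name and the version string is a placeholder, not the library's actual version:

```python
def merged_headers(api_key, custom_headers, version="1.0"):
    # Mirrors Client._headers: start from the defaults, then let the
    # user-supplied headers override on any colliding key.
    defaults = {
        "Authorization": "Bearer " + api_key,
        "User-Agent": "TaxJarPython/" + version,
    }
    headers = defaults.copy()
    headers.update(custom_headers)
    return headers

print(merged_headers("abc123", {"User-Agent": "my-app/2.0"}))
```

Copying the defaults before calling `update` keeps the default dict untouched, so repeated requests cannot accumulate stale caller headers.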
taxjar/taxjar-python | taxjar/client.py | Client.list_customers | python | def list_customers(self, params=None):
request = self._get('customers', params)
return self.responder(request) | Lists existing customers. | train | https://github.com/taxjar/taxjar-python/blob/be9b30d7dc968d24e066c7c133849fee180f8d95/taxjar/client.py#L91-L94 | [
"def _get(self, endpoint, data=None):\n if data is None:\n data = {}\n return self._request(requests.get, endpoint, {'params': data})\n"
] | class Client(object):
"""TaxJar Python Client"""
def __init__(self, api_key, api_url="", options=None, responder=TaxJarResponse.from_request):
if options is None:
options = {}
self.api_key = api_key
self.api_url = api_url if api_url else taxjar.DEFAULT_API_URL
self.api_url += "/" + taxjar.API_VERSION + "/"
self.headers = options.get('headers', {})
self.timeout = options.get('timeout', 5)
self.responder = responder
def set_api_config(self, key, value):
if key == 'api_url':
value += "/" + taxjar.API_VERSION + "/"
setattr(self, key, value)
def get_api_config(self, key):
return getattr(self, key)
def categories(self):
"""Lists all tax categories."""
request = self._get('categories')
return self.responder(request)
def rates_for_location(self, postal_code, location_deets=None):
"""Shows the sales tax rates for a given location."""
request = self._get("rates/" + postal_code, location_deets)
return self.responder(request)
def tax_for_order(self, order_deets):
"""Shows the sales tax that should be collected for a given order."""
request = self._post('taxes', order_deets)
return self.responder(request)
def list_orders(self, params=None):
"""Lists existing order transactions."""
request = self._get('transactions/orders', params)
return self.responder(request)
def show_order(self, order_id):
"""Shows an existing order transaction."""
request = self._get('transactions/orders/' + str(order_id))
return self.responder(request)
def create_order(self, order_deets):
"""Creates a new order transaction."""
request = self._post('transactions/orders', order_deets)
return self.responder(request)
def update_order(self, order_id, order_deets):
"""Updates an existing order transaction."""
request = self._put("transactions/orders/" + str(order_id), order_deets)
return self.responder(request)
def delete_order(self, order_id):
"""Deletes an existing order transaction."""
request = self._delete("transactions/orders/" + str(order_id))
return self.responder(request)
def list_refunds(self, params=None):
"""Lists existing refund transactions."""
request = self._get('transactions/refunds', params)
return self.responder(request)
def show_refund(self, refund_id):
"""Shows an existing refund transaction."""
request = self._get('transactions/refunds/' + str(refund_id))
return self.responder(request)
def create_refund(self, refund_deets):
"""Creates a new refund transaction."""
request = self._post('transactions/refunds', refund_deets)
return self.responder(request)
def update_refund(self, refund_id, refund_deets):
"""Updates an existing refund transaction."""
request = self._put('transactions/refunds/' + str(refund_id), refund_deets)
return self.responder(request)
def delete_refund(self, refund_id):
"""Deletes an existing refund transaction."""
request = self._delete('transactions/refunds/' + str(refund_id))
return self.responder(request)
def show_customer(self, customer_id):
"""Shows an existing customer."""
request = self._get('customers/' + str(customer_id))
return self.responder(request)
def create_customer(self, customer_deets):
"""Creates a new customer."""
request = self._post('customers', customer_deets)
return self.responder(request)
def update_customer(self, customer_id, customer_deets):
"""Updates an existing customer."""
request = self._put("customers/" + str(customer_id), customer_deets)
return self.responder(request)
def delete_customer(self, customer_id):
"""Deletes an existing customer."""
request = self._delete("customers/" + str(customer_id))
return self.responder(request)
def nexus_regions(self):
"""Lists existing nexus locations for a TaxJar account."""
request = self._get('nexus/regions')
return self.responder(request)
def validate_address(self, address_deets):
"""Validates a customer address and returns back a collection of address matches."""
request = self._post('addresses/validate', address_deets)
return self.responder(request)
def validate(self, vat_deets):
"""Validates an existing VAT identification number against VIES."""
request = self._get('validation', vat_deets)
return self.responder(request)
def summary_rates(self):
"""Retrieve minimum and average sales tax rates by region as a backup."""
request = self._get('summary_rates')
return self.responder(request)
def _get(self, endpoint, data=None):
if data is None:
data = {}
return self._request(requests.get, endpoint, {'params': data})
def _post(self, endpoint, data):
return self._request(requests.post, endpoint, {'json': data})
def _put(self, endpoint, data):
return self._request(requests.put, endpoint, {'json': data})
def _delete(self, endpoint):
return self._request(requests.delete, endpoint)
def _request(self, method, endpoint, data=None):
if data is None:
data = {}
try:
data['timeout'] = self.timeout
return method(self._uri(self.api_url, endpoint), headers=self._headers(), **data)
except requests.Timeout as err:
raise TaxJarConnectionError(err)
except requests.ConnectionError as err:
raise TaxJarConnectionError(err)
@staticmethod
def _uri(api_url, endpoint):
return api_url + endpoint
def _default_headers(self):
return {'Authorization': 'Bearer ' + self.api_key,
'User-Agent': 'TaxJarPython/' + taxjar.VERSION}
def _headers(self):
headers = self._default_headers().copy()
headers.update(self.headers)
return headers
|
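The `_request` helper shown in each record injects the configured timeout into the keyword arguments and re-wraps transport failures as `TaxJarConnectionError`. A self-contained sketch of that pattern — `request_with_timeout` and the `OSError` stand-in for the `requests.Timeout`/`requests.ConnectionError` pair are illustrative assumptions, not the library's API:

```python
class TaxJarConnectionError(Exception):
    """Stand-in for the library's connection error type."""

def request_with_timeout(method, timeout, **data):
    # Mirrors Client._request: the timeout is folded into the kwargs
    # and transport errors are re-raised as a library-specific type.
    try:
        data["timeout"] = timeout
        return method(**data)
    except OSError as err:  # requests.Timeout / ConnectionError stand-in
        raise TaxJarConnectionError(err)

def flaky(**kwargs):
    raise OSError("connection refused")

try:
    request_with_timeout(flaky, 5)
except TaxJarConnectionError as e:
    print("wrapped:", e)
```

Wrapping both exception types into one domain error lets callers catch a single class instead of depending on `requests` internals.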
taxjar/taxjar-python | taxjar/client.py | Client.show_customer | python | def show_customer(self, customer_id):
request = self._get('customers/' + str(customer_id))
return self.responder(request) | Shows an existing customer. | train | https://github.com/taxjar/taxjar-python/blob/be9b30d7dc968d24e066c7c133849fee180f8d95/taxjar/client.py#L96-L99 | [
"def _get(self, endpoint, data=None):\n if data is None:\n data = {}\n return self._request(requests.get, endpoint, {'params': data})\n"
] | class Client(object):
"""TaxJar Python Client"""
def __init__(self, api_key, api_url="", options=None, responder=TaxJarResponse.from_request):
if options is None:
options = {}
self.api_key = api_key
self.api_url = api_url if api_url else taxjar.DEFAULT_API_URL
self.api_url += "/" + taxjar.API_VERSION + "/"
self.headers = options.get('headers', {})
self.timeout = options.get('timeout', 5)
self.responder = responder
def set_api_config(self, key, value):
if key == 'api_url':
value += "/" + taxjar.API_VERSION + "/"
setattr(self, key, value)
def get_api_config(self, key):
return getattr(self, key)
def categories(self):
"""Lists all tax categories."""
request = self._get('categories')
return self.responder(request)
def rates_for_location(self, postal_code, location_deets=None):
"""Shows the sales tax rates for a given location."""
request = self._get("rates/" + postal_code, location_deets)
return self.responder(request)
def tax_for_order(self, order_deets):
"""Shows the sales tax that should be collected for a given order."""
request = self._post('taxes', order_deets)
return self.responder(request)
def list_orders(self, params=None):
"""Lists existing order transactions."""
request = self._get('transactions/orders', params)
return self.responder(request)
def show_order(self, order_id):
"""Shows an existing order transaction."""
request = self._get('transactions/orders/' + str(order_id))
return self.responder(request)
def create_order(self, order_deets):
"""Creates a new order transaction."""
request = self._post('transactions/orders', order_deets)
return self.responder(request)
def update_order(self, order_id, order_deets):
"""Updates an existing order transaction."""
request = self._put("transactions/orders/" + str(order_id), order_deets)
return self.responder(request)
def delete_order(self, order_id):
"""Deletes an existing order transaction."""
request = self._delete("transactions/orders/" + str(order_id))
return self.responder(request)
def list_refunds(self, params=None):
"""Lists existing refund transactions."""
request = self._get('transactions/refunds', params)
return self.responder(request)
def show_refund(self, refund_id):
"""Shows an existing refund transaction."""
request = self._get('transactions/refunds/' + str(refund_id))
return self.responder(request)
def create_refund(self, refund_deets):
"""Creates a new refund transaction."""
request = self._post('transactions/refunds', refund_deets)
return self.responder(request)
def update_refund(self, refund_id, refund_deets):
"""Updates an existing refund transaction."""
request = self._put('transactions/refunds/' + str(refund_id), refund_deets)
return self.responder(request)
def delete_refund(self, refund_id):
"""Deletes an existing refund transaction."""
request = self._delete('transactions/refunds/' + str(refund_id))
return self.responder(request)
def list_customers(self, params=None):
"""Lists existing customers."""
request = self._get('customers', params)
return self.responder(request)
def create_customer(self, customer_deets):
"""Creates a new customer."""
request = self._post('customers', customer_deets)
return self.responder(request)
def update_customer(self, customer_id, customer_deets):
"""Updates an existing customer."""
request = self._put("customers/" + str(customer_id), customer_deets)
return self.responder(request)
def delete_customer(self, customer_id):
"""Deletes an existing customer."""
request = self._delete("customers/" + str(customer_id))
return self.responder(request)
def nexus_regions(self):
"""Lists existing nexus locations for a TaxJar account."""
request = self._get('nexus/regions')
return self.responder(request)
def validate_address(self, address_deets):
"""Validates a customer address and returns back a collection of address matches."""
request = self._post('addresses/validate', address_deets)
return self.responder(request)
def validate(self, vat_deets):
"""Validates an existing VAT identification number against VIES."""
request = self._get('validation', vat_deets)
return self.responder(request)
def summary_rates(self):
"""Retrieve minimum and average sales tax rates by region as a backup."""
request = self._get('summary_rates')
return self.responder(request)
def _get(self, endpoint, data=None):
if data is None:
data = {}
return self._request(requests.get, endpoint, {'params': data})
def _post(self, endpoint, data):
return self._request(requests.post, endpoint, {'json': data})
def _put(self, endpoint, data):
return self._request(requests.put, endpoint, {'json': data})
def _delete(self, endpoint):
return self._request(requests.delete, endpoint)
def _request(self, method, endpoint, data=None):
if data is None:
data = {}
try:
data['timeout'] = self.timeout
return method(self._uri(self.api_url, endpoint), headers=self._headers(), **data)
except requests.Timeout as err:
raise TaxJarConnectionError(err)
except requests.ConnectionError as err:
raise TaxJarConnectionError(err)
@staticmethod
def _uri(api_url, endpoint):
return api_url + endpoint
def _default_headers(self):
return {'Authorization': 'Bearer ' + self.api_key,
'User-Agent': 'TaxJarPython/' + taxjar.VERSION}
def _headers(self):
headers = self._default_headers().copy()
headers.update(self.headers)
return headers
|
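One detail worth noting in the class source: string keys should be compared with `==`, not `is` — identity comparison on string literals only works by accident of CPython interning. A minimal stand-in for the `set_api_config`/`get_api_config` pair using the equality check — `Config` is a hypothetical reduction of the client, and the sandbox URL is invented for illustration:

```python
API_VERSION = "v2"  # assumed version segment

class Config:
    def set_api_config(self, key, value):
        # Equality, not identity: `is` on strings is an interning
        # accident and can silently fail for dynamically built keys.
        if key == "api_url":
            value += "/" + API_VERSION + "/"
        setattr(self, key, value)

    def get_api_config(self, key):
        return getattr(self, key)

cfg = Config()
cfg.set_api_config("api_url", "https://api.sandbox.example")
print(cfg.get_api_config("api_url"))  # https://api.sandbox.example/v2/
```

Only the `api_url` key gets the version suffix; every other key (such as `timeout`) is stored verbatim.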
taxjar/taxjar-python | taxjar/client.py | Client.create_customer | python | def create_customer(self, customer_deets):
request = self._post('customers', customer_deets)
return self.responder(request) | Creates a new customer. | train | https://github.com/taxjar/taxjar-python/blob/be9b30d7dc968d24e066c7c133849fee180f8d95/taxjar/client.py#L101-L104 | [
"def _post(self, endpoint, data):\n return self._request(requests.post, endpoint, {'json': data})\n"
] | class Client(object):
"""TaxJar Python Client"""
def __init__(self, api_key, api_url="", options=None, responder=TaxJarResponse.from_request):
if options is None:
options = {}
self.api_key = api_key
self.api_url = api_url if api_url else taxjar.DEFAULT_API_URL
self.api_url += "/" + taxjar.API_VERSION + "/"
self.headers = options.get('headers', {})
self.timeout = options.get('timeout', 5)
self.responder = responder
def set_api_config(self, key, value):
if key == 'api_url':
value += "/" + taxjar.API_VERSION + "/"
setattr(self, key, value)
def get_api_config(self, key):
return getattr(self, key)
def categories(self):
"""Lists all tax categories."""
request = self._get('categories')
return self.responder(request)
def rates_for_location(self, postal_code, location_deets=None):
"""Shows the sales tax rates for a given location."""
request = self._get("rates/" + postal_code, location_deets)
return self.responder(request)
def tax_for_order(self, order_deets):
"""Shows the sales tax that should be collected for a given order."""
request = self._post('taxes', order_deets)
return self.responder(request)
def list_orders(self, params=None):
"""Lists existing order transactions."""
request = self._get('transactions/orders', params)
return self.responder(request)
def show_order(self, order_id):
"""Shows an existing order transaction."""
request = self._get('transactions/orders/' + str(order_id))
return self.responder(request)
def create_order(self, order_deets):
"""Creates a new order transaction."""
request = self._post('transactions/orders', order_deets)
return self.responder(request)
def update_order(self, order_id, order_deets):
"""Updates an existing order transaction."""
request = self._put("transactions/orders/" + str(order_id), order_deets)
return self.responder(request)
def delete_order(self, order_id):
"""Deletes an existing order transaction."""
request = self._delete("transactions/orders/" + str(order_id))
return self.responder(request)
def list_refunds(self, params=None):
"""Lists existing refund transactions."""
request = self._get('transactions/refunds', params)
return self.responder(request)
def show_refund(self, refund_id):
"""Shows an existing refund transaction."""
request = self._get('transactions/refunds/' + str(refund_id))
return self.responder(request)
def create_refund(self, refund_deets):
"""Creates a new refund transaction."""
request = self._post('transactions/refunds', refund_deets)
return self.responder(request)
def update_refund(self, refund_id, refund_deets):
"""Updates an existing refund transaction."""
request = self._put('transactions/refunds/' + str(refund_id), refund_deets)
return self.responder(request)
def delete_refund(self, refund_id):
"""Deletes an existing refund transaction."""
request = self._delete('transactions/refunds/' + str(refund_id))
return self.responder(request)
def list_customers(self, params=None):
"""Lists existing customers."""
request = self._get('customers', params)
return self.responder(request)
def show_customer(self, customer_id):
"""Shows an existing customer."""
request = self._get('customers/' + str(customer_id))
return self.responder(request)
def update_customer(self, customer_id, customer_deets):
"""Updates an existing customer."""
request = self._put("customers/" + str(customer_id), customer_deets)
return self.responder(request)
def delete_customer(self, customer_id):
"""Deletes an existing customer."""
request = self._delete("customers/" + str(customer_id))
return self.responder(request)
def nexus_regions(self):
"""Lists existing nexus locations for a TaxJar account."""
request = self._get('nexus/regions')
return self.responder(request)
def validate_address(self, address_deets):
"""Validates a customer address and returns back a collection of address matches."""
request = self._post('addresses/validate', address_deets)
return self.responder(request)
def validate(self, vat_deets):
"""Validates an existing VAT identification number against VIES."""
request = self._get('validation', vat_deets)
return self.responder(request)
def summary_rates(self):
"""Retrieve minimum and average sales tax rates by region as a backup."""
request = self._get('summary_rates')
return self.responder(request)
def _get(self, endpoint, data=None):
if data is None:
data = {}
return self._request(requests.get, endpoint, {'params': data})
def _post(self, endpoint, data):
return self._request(requests.post, endpoint, {'json': data})
def _put(self, endpoint, data):
return self._request(requests.put, endpoint, {'json': data})
def _delete(self, endpoint):
return self._request(requests.delete, endpoint)
def _request(self, method, endpoint, data=None):
if data is None:
data = {}
try:
data['timeout'] = self.timeout
return method(self._uri(self.api_url, endpoint), headers=self._headers(), **data)
except requests.Timeout as err:
raise TaxJarConnectionError(err)
except requests.ConnectionError as err:
raise TaxJarConnectionError(err)
@staticmethod
def _uri(api_url, endpoint):
return api_url + endpoint
def _default_headers(self):
return {'Authorization': 'Bearer ' + self.api_key,
'User-Agent': 'TaxJarPython/' + taxjar.VERSION}
def _headers(self):
headers = self._default_headers().copy()
headers.update(self.headers)
return headers
|
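The Client code above composes every request from three small helpers: `_uri` joins the versioned base URL to an endpoint, `_default_headers` builds bearer auth plus a versioned User-Agent, and `_headers` lets per-instance headers override the defaults. A minimal standalone sketch of that composition (no network; `TEST_KEY` and the version string are placeholders, not real credentials):

```python
def default_headers(api_key, version="0.0.0"):
    # Mirrors Client._default_headers: bearer token plus a versioned User-Agent.
    return {"Authorization": "Bearer " + api_key,
            "User-Agent": "TaxJarPython/" + version}

def merged_headers(api_key, extra=None):
    # Mirrors Client._headers: per-instance headers win over the defaults.
    headers = default_headers(api_key).copy()
    headers.update(extra or {})
    return headers

def uri(api_url, endpoint):
    # Mirrors the static Client._uri: plain concatenation, so the base URL
    # must already end with "/<API_VERSION>/" (the constructor appends it).
    return api_url + endpoint

print(uri("https://api.taxjar.com/v2/", "customers/123"))
# -> https://api.taxjar.com/v2/customers/123
```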
taxjar/taxjar-python | taxjar/client.py | Client.update_customer | python | def update_customer(self, customer_id, customer_deets):
request = self._put("customers/" + str(customer_id), customer_deets)
return self.responder(request) | Updates an existing customer. | train | https://github.com/taxjar/taxjar-python/blob/be9b30d7dc968d24e066c7c133849fee180f8d95/taxjar/client.py#L106-L109 | [
"def _put(self, endpoint, data):\n return self._request(requests.put, endpoint, {'json': data})\n"
] | class Client(object):
"""TaxJar Python Client"""
def __init__(self, api_key, api_url="", options=None, responder=TaxJarResponse.from_request):
if options is None:
options = {}
self.api_key = api_key
self.api_url = api_url if api_url else taxjar.DEFAULT_API_URL
self.api_url += "/" + taxjar.API_VERSION + "/"
self.headers = options.get('headers', {})
self.timeout = options.get('timeout', 5)
self.responder = responder
def set_api_config(self, key, value):
if key == 'api_url':
value += "/" + taxjar.API_VERSION + "/"
setattr(self, key, value)
def get_api_config(self, key):
return getattr(self, key)
def categories(self):
"""Lists all tax categories."""
request = self._get('categories')
return self.responder(request)
def rates_for_location(self, postal_code, location_deets=None):
"""Shows the sales tax rates for a given location."""
request = self._get("rates/" + postal_code, location_deets)
return self.responder(request)
def tax_for_order(self, order_deets):
"""Shows the sales tax that should be collected for a given order."""
request = self._post('taxes', order_deets)
return self.responder(request)
def list_orders(self, params=None):
"""Lists existing order transactions."""
request = self._get('transactions/orders', params)
return self.responder(request)
def show_order(self, order_id):
"""Shows an existing order transaction."""
request = self._get('transactions/orders/' + str(order_id))
return self.responder(request)
def create_order(self, order_deets):
"""Creates a new order transaction."""
request = self._post('transactions/orders', order_deets)
return self.responder(request)
def update_order(self, order_id, order_deets):
"""Updates an existing order transaction."""
request = self._put("transactions/orders/" + str(order_id), order_deets)
return self.responder(request)
def delete_order(self, order_id):
"""Deletes an existing order transaction."""
request = self._delete("transactions/orders/" + str(order_id))
return self.responder(request)
def list_refunds(self, params=None):
"""Lists existing refund transactions."""
request = self._get('transactions/refunds', params)
return self.responder(request)
def show_refund(self, refund_id):
"""Shows an existing refund transaction."""
request = self._get('transactions/refunds/' + str(refund_id))
return self.responder(request)
def create_refund(self, refund_deets):
"""Creates a new refund transaction."""
request = self._post('transactions/refunds', refund_deets)
return self.responder(request)
def update_refund(self, refund_id, refund_deets):
"""Updates an existing refund transaction."""
request = self._put('transactions/refunds/' + str(refund_id), refund_deets)
return self.responder(request)
def delete_refund(self, refund_id):
"""Deletes an existing refund transaction."""
request = self._delete('transactions/refunds/' + str(refund_id))
return self.responder(request)
def list_customers(self, params=None):
"""Lists existing customers."""
request = self._get('customers', params)
return self.responder(request)
def show_customer(self, customer_id):
"""Shows an existing customer."""
request = self._get('customers/' + str(customer_id))
return self.responder(request)
def create_customer(self, customer_deets):
"""Creates a new customer."""
request = self._post('customers', customer_deets)
return self.responder(request)
def delete_customer(self, customer_id):
"""Deletes an existing customer."""
request = self._delete("customers/" + str(customer_id))
return self.responder(request)
def nexus_regions(self):
"""Lists existing nexus locations for a TaxJar account."""
request = self._get('nexus/regions')
return self.responder(request)
def validate_address(self, address_deets):
"""Validates a customer address and returns back a collection of address matches."""
request = self._post('addresses/validate', address_deets)
return self.responder(request)
def validate(self, vat_deets):
"""Validates an existing VAT identification number against VIES."""
request = self._get('validation', vat_deets)
return self.responder(request)
def summary_rates(self):
"""Retrieve minimum and average sales tax rates by region as a backup."""
request = self._get('summary_rates')
return self.responder(request)
def _get(self, endpoint, data=None):
if data is None:
data = {}
return self._request(requests.get, endpoint, {'params': data})
def _post(self, endpoint, data):
return self._request(requests.post, endpoint, {'json': data})
def _put(self, endpoint, data):
return self._request(requests.put, endpoint, {'json': data})
def _delete(self, endpoint):
return self._request(requests.delete, endpoint)
def _request(self, method, endpoint, data=None):
if data is None:
data = {}
try:
data['timeout'] = self.timeout
return method(self._uri(self.api_url, endpoint), headers=self._headers(), **data)
except requests.Timeout as err:
raise TaxJarConnectionError(err)
except requests.ConnectionError as err:
raise TaxJarConnectionError(err)
@staticmethod
def _uri(api_url, endpoint):
return api_url + endpoint
def _default_headers(self):
return {'Authorization': 'Bearer ' + self.api_key,
'User-Agent': 'TaxJarPython/' + taxjar.VERSION}
def _headers(self):
headers = self._default_headers().copy()
headers.update(self.headers)
return headers
|
taxjar/taxjar-python | taxjar/client.py | Client.delete_customer | python | def delete_customer(self, customer_id):
request = self._delete("customers/" + str(customer_id))
return self.responder(request) | Deletes an existing customer. | train | https://github.com/taxjar/taxjar-python/blob/be9b30d7dc968d24e066c7c133849fee180f8d95/taxjar/client.py#L111-L114 | [
"def _delete(self, endpoint):\n return self._request(requests.delete, endpoint)\n"
] | class Client(object):
"""TaxJar Python Client"""
def __init__(self, api_key, api_url="", options=None, responder=TaxJarResponse.from_request):
if options is None:
options = {}
self.api_key = api_key
self.api_url = api_url if api_url else taxjar.DEFAULT_API_URL
self.api_url += "/" + taxjar.API_VERSION + "/"
self.headers = options.get('headers', {})
self.timeout = options.get('timeout', 5)
self.responder = responder
def set_api_config(self, key, value):
if key == 'api_url':
value += "/" + taxjar.API_VERSION + "/"
setattr(self, key, value)
def get_api_config(self, key):
return getattr(self, key)
def categories(self):
"""Lists all tax categories."""
request = self._get('categories')
return self.responder(request)
def rates_for_location(self, postal_code, location_deets=None):
"""Shows the sales tax rates for a given location."""
request = self._get("rates/" + postal_code, location_deets)
return self.responder(request)
def tax_for_order(self, order_deets):
"""Shows the sales tax that should be collected for a given order."""
request = self._post('taxes', order_deets)
return self.responder(request)
def list_orders(self, params=None):
"""Lists existing order transactions."""
request = self._get('transactions/orders', params)
return self.responder(request)
def show_order(self, order_id):
"""Shows an existing order transaction."""
request = self._get('transactions/orders/' + str(order_id))
return self.responder(request)
def create_order(self, order_deets):
"""Creates a new order transaction."""
request = self._post('transactions/orders', order_deets)
return self.responder(request)
def update_order(self, order_id, order_deets):
"""Updates an existing order transaction."""
request = self._put("transactions/orders/" + str(order_id), order_deets)
return self.responder(request)
def delete_order(self, order_id):
"""Deletes an existing order transaction."""
request = self._delete("transactions/orders/" + str(order_id))
return self.responder(request)
def list_refunds(self, params=None):
"""Lists existing refund transactions."""
request = self._get('transactions/refunds', params)
return self.responder(request)
def show_refund(self, refund_id):
"""Shows an existing refund transaction."""
request = self._get('transactions/refunds/' + str(refund_id))
return self.responder(request)
def create_refund(self, refund_deets):
"""Creates a new refund transaction."""
request = self._post('transactions/refunds', refund_deets)
return self.responder(request)
def update_refund(self, refund_id, refund_deets):
"""Updates an existing refund transaction."""
request = self._put('transactions/refunds/' + str(refund_id), refund_deets)
return self.responder(request)
def delete_refund(self, refund_id):
"""Deletes an existing refund transaction."""
request = self._delete('transactions/refunds/' + str(refund_id))
return self.responder(request)
def list_customers(self, params=None):
"""Lists existing customers."""
request = self._get('customers', params)
return self.responder(request)
def show_customer(self, customer_id):
"""Shows an existing customer."""
request = self._get('customers/' + str(customer_id))
return self.responder(request)
def create_customer(self, customer_deets):
"""Creates a new customer."""
request = self._post('customers', customer_deets)
return self.responder(request)
def update_customer(self, customer_id, customer_deets):
"""Updates an existing customer."""
request = self._put("customers/" + str(customer_id), customer_deets)
return self.responder(request)
def nexus_regions(self):
"""Lists existing nexus locations for a TaxJar account."""
request = self._get('nexus/regions')
return self.responder(request)
def validate_address(self, address_deets):
"""Validates a customer address and returns back a collection of address matches."""
request = self._post('addresses/validate', address_deets)
return self.responder(request)
def validate(self, vat_deets):
"""Validates an existing VAT identification number against VIES."""
request = self._get('validation', vat_deets)
return self.responder(request)
def summary_rates(self):
"""Retrieve minimum and average sales tax rates by region as a backup."""
request = self._get('summary_rates')
return self.responder(request)
def _get(self, endpoint, data=None):
if data is None:
data = {}
return self._request(requests.get, endpoint, {'params': data})
def _post(self, endpoint, data):
return self._request(requests.post, endpoint, {'json': data})
def _put(self, endpoint, data):
return self._request(requests.put, endpoint, {'json': data})
def _delete(self, endpoint):
return self._request(requests.delete, endpoint)
def _request(self, method, endpoint, data=None):
if data is None:
data = {}
try:
data['timeout'] = self.timeout
return method(self._uri(self.api_url, endpoint), headers=self._headers(), **data)
except requests.Timeout as err:
raise TaxJarConnectionError(err)
except requests.ConnectionError as err:
raise TaxJarConnectionError(err)
@staticmethod
def _uri(api_url, endpoint):
return api_url + endpoint
def _default_headers(self):
return {'Authorization': 'Bearer ' + self.api_key,
'User-Agent': 'TaxJarPython/' + taxjar.VERSION}
def _headers(self):
headers = self._default_headers().copy()
headers.update(self.headers)
return headers
|
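All HTTP verbs in the Client class funnel through `_request`, which injects the configured timeout and converts both `requests.Timeout` and `requests.ConnectionError` into a single `TaxJarConnectionError`. A stdlib-only sketch of that wrapping pattern (the exception classes here are placeholders standing in for `requests` and the library's own exception type):

```python
class Timeout(Exception):
    """Placeholder for requests.Timeout."""

class ConnectionFailed(Exception):
    """Placeholder for requests.ConnectionError."""

class TaxJarConnectionError(Exception):
    """Placeholder for the library's connection error."""

def request(method, url, timeout=5, **data):
    # Mirrors Client._request: add the timeout, then normalize transport
    # failures into one library-level exception that callers can catch.
    try:
        return method(url, timeout=timeout, **data)
    except (Timeout, ConnectionFailed) as err:
        raise TaxJarConnectionError(err)
```

The design choice is that callers never need to import the HTTP library's exceptions; a single `except TaxJarConnectionError` covers both timeouts and connection failures.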
taxjar/taxjar-python | taxjar/client.py | Client.validate_address | python | def validate_address(self, address_deets):
request = self._post('addresses/validate', address_deets)
return self.responder(request) | Validates a customer address and returns a collection of address matches. | train | https://github.com/taxjar/taxjar-python/blob/be9b30d7dc968d24e066c7c133849fee180f8d95/taxjar/client.py#L121-L124 | [
"def _post(self, endpoint, data):\n return self._request(requests.post, endpoint, {'json': data})\n"
] | class Client(object):
"""TaxJar Python Client"""
def __init__(self, api_key, api_url="", options=None, responder=TaxJarResponse.from_request):
if options is None:
options = {}
self.api_key = api_key
self.api_url = api_url if api_url else taxjar.DEFAULT_API_URL
self.api_url += "/" + taxjar.API_VERSION + "/"
self.headers = options.get('headers', {})
self.timeout = options.get('timeout', 5)
self.responder = responder
def set_api_config(self, key, value):
if key == 'api_url':
value += "/" + taxjar.API_VERSION + "/"
setattr(self, key, value)
def get_api_config(self, key):
return getattr(self, key)
def categories(self):
"""Lists all tax categories."""
request = self._get('categories')
return self.responder(request)
def rates_for_location(self, postal_code, location_deets=None):
"""Shows the sales tax rates for a given location."""
request = self._get("rates/" + postal_code, location_deets)
return self.responder(request)
def tax_for_order(self, order_deets):
"""Shows the sales tax that should be collected for a given order."""
request = self._post('taxes', order_deets)
return self.responder(request)
def list_orders(self, params=None):
"""Lists existing order transactions."""
request = self._get('transactions/orders', params)
return self.responder(request)
def show_order(self, order_id):
"""Shows an existing order transaction."""
request = self._get('transactions/orders/' + str(order_id))
return self.responder(request)
def create_order(self, order_deets):
"""Creates a new order transaction."""
request = self._post('transactions/orders', order_deets)
return self.responder(request)
def update_order(self, order_id, order_deets):
"""Updates an existing order transaction."""
request = self._put("transactions/orders/" + str(order_id), order_deets)
return self.responder(request)
def delete_order(self, order_id):
"""Deletes an existing order transaction."""
request = self._delete("transactions/orders/" + str(order_id))
return self.responder(request)
def list_refunds(self, params=None):
"""Lists existing refund transactions."""
request = self._get('transactions/refunds', params)
return self.responder(request)
def show_refund(self, refund_id):
"""Shows an existing refund transaction."""
request = self._get('transactions/refunds/' + str(refund_id))
return self.responder(request)
def create_refund(self, refund_deets):
"""Creates a new refund transaction."""
request = self._post('transactions/refunds', refund_deets)
return self.responder(request)
def update_refund(self, refund_id, refund_deets):
"""Updates an existing refund transaction."""
request = self._put('transactions/refunds/' + str(refund_id), refund_deets)
return self.responder(request)
def delete_refund(self, refund_id):
"""Deletes an existing refund transaction."""
request = self._delete('transactions/refunds/' + str(refund_id))
return self.responder(request)
def list_customers(self, params=None):
"""Lists existing customers."""
request = self._get('customers', params)
return self.responder(request)
def show_customer(self, customer_id):
"""Shows an existing customer."""
request = self._get('customers/' + str(customer_id))
return self.responder(request)
def create_customer(self, customer_deets):
"""Creates a new customer."""
request = self._post('customers', customer_deets)
return self.responder(request)
def update_customer(self, customer_id, customer_deets):
"""Updates an existing customer."""
request = self._put("customers/" + str(customer_id), customer_deets)
return self.responder(request)
def delete_customer(self, customer_id):
"""Deletes an existing customer."""
request = self._delete("customers/" + str(customer_id))
return self.responder(request)
def nexus_regions(self):
"""Lists existing nexus locations for a TaxJar account."""
request = self._get('nexus/regions')
return self.responder(request)
def validate(self, vat_deets):
"""Validates an existing VAT identification number against VIES."""
request = self._get('validation', vat_deets)
return self.responder(request)
def summary_rates(self):
"""Retrieve minimum and average sales tax rates by region as a backup."""
request = self._get('summary_rates')
return self.responder(request)
def _get(self, endpoint, data=None):
if data is None:
data = {}
return self._request(requests.get, endpoint, {'params': data})
def _post(self, endpoint, data):
return self._request(requests.post, endpoint, {'json': data})
def _put(self, endpoint, data):
return self._request(requests.put, endpoint, {'json': data})
def _delete(self, endpoint):
return self._request(requests.delete, endpoint)
def _request(self, method, endpoint, data=None):
if data is None:
data = {}
try:
data['timeout'] = self.timeout
return method(self._uri(self.api_url, endpoint), headers=self._headers(), **data)
except requests.Timeout as err:
raise TaxJarConnectionError(err)
except requests.ConnectionError as err:
raise TaxJarConnectionError(err)
@staticmethod
def _uri(api_url, endpoint):
return api_url + endpoint
def _default_headers(self):
return {'Authorization': 'Bearer ' + self.api_key,
'User-Agent': 'TaxJarPython/' + taxjar.VERSION}
def _headers(self):
headers = self._default_headers().copy()
headers.update(self.headers)
return headers
|
taxjar/taxjar-python | taxjar/client.py | Client.validate | python | def validate(self, vat_deets):
request = self._get('validation', vat_deets)
return self.responder(request) | Validates an existing VAT identification number against VIES. | train | https://github.com/taxjar/taxjar-python/blob/be9b30d7dc968d24e066c7c133849fee180f8d95/taxjar/client.py#L126-L129 | [
"def _get(self, endpoint, data=None):\n if data is None:\n data = {}\n return self._request(requests.get, endpoint, {'params': data})\n"
] | class Client(object):
"""TaxJar Python Client"""
def __init__(self, api_key, api_url="", options=None, responder=TaxJarResponse.from_request):
if options is None:
options = {}
self.api_key = api_key
self.api_url = api_url if api_url else taxjar.DEFAULT_API_URL
self.api_url += "/" + taxjar.API_VERSION + "/"
self.headers = options.get('headers', {})
self.timeout = options.get('timeout', 5)
self.responder = responder
def set_api_config(self, key, value):
if key == 'api_url':
value += "/" + taxjar.API_VERSION + "/"
setattr(self, key, value)
def get_api_config(self, key):
return getattr(self, key)
def categories(self):
"""Lists all tax categories."""
request = self._get('categories')
return self.responder(request)
def rates_for_location(self, postal_code, location_deets=None):
"""Shows the sales tax rates for a given location."""
request = self._get("rates/" + postal_code, location_deets)
return self.responder(request)
def tax_for_order(self, order_deets):
"""Shows the sales tax that should be collected for a given order."""
request = self._post('taxes', order_deets)
return self.responder(request)
def list_orders(self, params=None):
"""Lists existing order transactions."""
request = self._get('transactions/orders', params)
return self.responder(request)
def show_order(self, order_id):
"""Shows an existing order transaction."""
request = self._get('transactions/orders/' + str(order_id))
return self.responder(request)
def create_order(self, order_deets):
"""Creates a new order transaction."""
request = self._post('transactions/orders', order_deets)
return self.responder(request)
def update_order(self, order_id, order_deets):
"""Updates an existing order transaction."""
request = self._put("transactions/orders/" + str(order_id), order_deets)
return self.responder(request)
def delete_order(self, order_id):
"""Deletes an existing order transaction."""
request = self._delete("transactions/orders/" + str(order_id))
return self.responder(request)
def list_refunds(self, params=None):
"""Lists existing refund transactions."""
request = self._get('transactions/refunds', params)
return self.responder(request)
def show_refund(self, refund_id):
"""Shows an existing refund transaction."""
request = self._get('transactions/refunds/' + str(refund_id))
return self.responder(request)
def create_refund(self, refund_deets):
"""Creates a new refund transaction."""
request = self._post('transactions/refunds', refund_deets)
return self.responder(request)
def update_refund(self, refund_id, refund_deets):
"""Updates an existing refund transaction."""
request = self._put('transactions/refunds/' + str(refund_id), refund_deets)
return self.responder(request)
def delete_refund(self, refund_id):
"""Deletes an existing refund transaction."""
request = self._delete('transactions/refunds/' + str(refund_id))
return self.responder(request)
def list_customers(self, params=None):
"""Lists existing customers."""
request = self._get('customers', params)
return self.responder(request)
def show_customer(self, customer_id):
"""Shows an existing customer."""
request = self._get('customers/' + str(customer_id))
return self.responder(request)
def create_customer(self, customer_deets):
"""Creates a new customer."""
request = self._post('customers', customer_deets)
return self.responder(request)
def update_customer(self, customer_id, customer_deets):
"""Updates an existing customer."""
request = self._put("customers/" + str(customer_id), customer_deets)
return self.responder(request)
def delete_customer(self, customer_id):
"""Deletes an existing customer."""
request = self._delete("customers/" + str(customer_id))
return self.responder(request)
def nexus_regions(self):
"""Lists existing nexus locations for a TaxJar account."""
request = self._get('nexus/regions')
return self.responder(request)
def validate_address(self, address_deets):
"""Validates a customer address and returns back a collection of address matches."""
request = self._post('addresses/validate', address_deets)
return self.responder(request)
def summary_rates(self):
"""Retrieve minimum and average sales tax rates by region as a backup."""
request = self._get('summary_rates')
return self.responder(request)
def _get(self, endpoint, data=None):
if data is None:
data = {}
return self._request(requests.get, endpoint, {'params': data})
def _post(self, endpoint, data):
return self._request(requests.post, endpoint, {'json': data})
def _put(self, endpoint, data):
return self._request(requests.put, endpoint, {'json': data})
def _delete(self, endpoint):
return self._request(requests.delete, endpoint)
def _request(self, method, endpoint, data=None):
if data is None:
data = {}
try:
data['timeout'] = self.timeout
return method(self._uri(self.api_url, endpoint), headers=self._headers(), **data)
except requests.Timeout as err:
raise TaxJarConnectionError(err)
except requests.ConnectionError as err:
raise TaxJarConnectionError(err)
@staticmethod
def _uri(api_url, endpoint):
return api_url + endpoint
def _default_headers(self):
return {'Authorization': 'Bearer ' + self.api_key,
'User-Agent': 'TaxJarPython/' + taxjar.VERSION}
def _headers(self):
headers = self._default_headers().copy()
headers.update(self.headers)
return headers
|
sashahart/vex | vex/options.py | make_arg_parser | python | def make_arg_parser():
parser = argparse.ArgumentParser(
formatter_class=argparse.RawTextHelpFormatter,
usage="vex [OPTIONS] VIRTUALENV_NAME COMMAND_TO_RUN ...",
)
make = parser.add_argument_group(title='To make a new virtualenv')
make.add_argument(
'-m', '--make',
action="store_true",
help="make named virtualenv before running command"
)
make.add_argument(
'--python',
help="specify which python for virtualenv to be made",
action="store",
default=None,
)
make.add_argument(
'--site-packages',
help="allow site package imports from new virtualenv",
action="store_true",
)
make.add_argument(
'--always-copy',
help="use copies instead of symlinks in new virtualenv",
action="store_true",
)
remove = parser.add_argument_group(title='To remove a virtualenv')
remove.add_argument(
'-r', '--remove',
action="store_true",
help="remove the named virtualenv after running command"
)
parser.add_argument(
"--path",
metavar="DIR",
help="absolute path to virtualenv to use",
action="store"
)
parser.add_argument(
'--cwd',
metavar="DIR",
action="store",
default='.',
help="path to run command in (default: '.' aka $PWD)",
)
parser.add_argument(
"--config",
metavar="FILE",
default=None,
action="store",
help="path to config file to read (default: '~/.vexrc')"
)
parser.add_argument(
'--shell-config',
metavar="SHELL",
dest="shell_to_configure",
action="store",
default=None,
help="print optional config for the specified shell"
)
parser.add_argument(
'--list',
metavar="PREFIX",
nargs="?",
const="",
default=None,
help="print a list of available virtualenvs [matching PREFIX]",
action="store"
)
parser.add_argument(
'--version',
help="print the version of vex that is being run",
action="store_true"
)
parser.add_argument(
"rest",
nargs=argparse.REMAINDER,
help=argparse.SUPPRESS)
return parser | Return a standard ArgumentParser object. | train | https://github.com/sashahart/vex/blob/b7680c40897b8cbe6aae55ec9812b4fb11738192/vex/options.py#L5-L90 | null | import argparse
from vex import exceptions
def get_options(argv):
"""Called to parse the given list as command-line arguments.
:returns:
an options object as returned by argparse.
"""
arg_parser = make_arg_parser()
options, unknown = arg_parser.parse_known_args(argv)
if unknown:
arg_parser.print_help()
raise exceptions.UnknownArguments(
"unknown args: {0!r}".format(unknown))
options.print_help = arg_parser.print_help
return options
|
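The vex parser above relies on `nargs=argparse.REMAINDER` so that everything from the virtualenv name onward — including flags intended for the wrapped command — lands in `rest` untouched. A standalone sketch of that pattern (a trimmed parser, not vex's full option set):

```python
import argparse

# Minimal reproduction of the vex argument layout: one of vex's own
# options, then REMAINDER swallowing the virtualenv name and command.
parser = argparse.ArgumentParser(
    usage="vex [OPTIONS] VIRTUALENV_NAME COMMAND_TO_RUN ...")
parser.add_argument("-m", "--make", action="store_true")
parser.add_argument("rest", nargs=argparse.REMAINDER)

options = parser.parse_args(["--make", "myenv", "pip", "--version"])
print(options.make, options.rest)
# -> True ['myenv', 'pip', '--version']
```

Note that `--version` is not parsed as an option: once REMAINDER starts consuming at `myenv`, even dash-prefixed tokens pass through verbatim, which is exactly what lets vex forward arbitrary flags to the wrapped command.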