repository_name | func_path_in_repository | func_name | language | func_code_string | func_documentation_string | split_name | func_code_url | called_functions | enclosing_scope
|---|---|---|---|---|---|---|---|---|---|
hwmrocker/smtplibaio | smtplibaio/smtp.py | SMTP.data | python | async def data(self, email_message):
code, message = await self.do_cmd("DATA", success=(354,))
email_message = SMTP.prepare_message(email_message)
self.writer.write(email_message) # write is non-blocking.
await self.writer.drain() # don't forget to drain.
code, message = await self.reader.read_reply()
return code, message | Sends an SMTP 'DATA' command. - Transmits the message to the server.
If ``email_message`` is a bytes object, sends it as it is. Otherwise,
makes all the required changes so it can be safely transmitted to the
SMTP server.
For further details, please check out `RFC 5321 § 4.1.1.4`_.
Args:
email_message (str or bytes): Message to be sent.
Raises:
ConnectionError subclass: If the connection to the server is
unexpectedly lost.
SMTPCommandFailedError: If the DATA command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server last
response (the one the server sent after all data were sent by
the client).
.. seealso: :meth:`SMTP.prepare_message`
.. _`RFC 5321 § 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4 | train | https://github.com/hwmrocker/smtplibaio/blob/84ce8e45b7e706476739d0efcb416c18ecabbbb6/smtplibaio/smtp.py#L580-L616 | [
"async def do_cmd(self, *args, success=None):\n \"\"\"\n Sends the given command to the server.\n\n Args:\n *args: Command and arguments to be sent to the server.\n\n Raises:\n ConnectionResetError: If the connection with the server is\n unexpectedely lost.\n SMTPCommandF... | class SMTP:
"""
SMTP or ESMTP client.
This should follow RFC 5321 (SMTP), RFC 1869 (ESMTP), RFC 2554 (SMTP
Authentication) and RFC 2487 (Secure SMTP over TLS).
Attributes:
hostname (str): Hostname of the SMTP server we are connecting to.
port (int): Port on which the SMTP server listens for connections.
timeout (int): Not used.
last_helo_response ((int or None, str or None)): A (code, message)
2-tuple containing the last *HELO* response.
last_ehlo_response ((int or None, str or None)): A (code, message)
2-tuple containing the last *EHLO* response.
supports_esmtp (bool): True if the server supports ESMTP (set after
an *EHLO* command), False otherwise.
esmtp_extensions (dict): ESMTP extensions and parameters supported by
the SMTP server (set after an *EHLO* command).
auth_mechanisms (list of str): Authentication mechanisms supported by
the SMTP server.
ssl_context (bool): Always False. (Used in SMTP_SSL subclass)
reader (:class:`streams.SMTPStreamReader`): SMTP stream reader, used
to read server responses.
writer (:class:`streams.SMTPStreamWriter`): SMTP stream writer, used
to send commands to the server.
transport (:class:`asyncio.BaseTransport`): Communication channel
abstraction between client and server.
loop (:class:`asyncio.BaseEventLoop`): Event loop to use.
use_aioopenssl (bool): If True, the connection is made using the
aioopenssl module. Defaults to False.
_fqdn (str): Client FQDN. Used to identify the client to the
server.
Class Attributes:
_default_port (int): Default port to use. Defaults to 25.
_supported_auth_mechanisms (dict): Dict containing the information
about supported authentication mechanisms, ordered by preference
of use. Each entry consists of:
- The authentication mechanism name, in lowercase, as given by
SMTP servers.
- The name of the method to call to authenticate using the
mechanism.
"""
_default_port = 25
_supported_auth_mechanisms = {
"cram-md5": "_auth_cram_md5",
"plain": "_auth_plain",
"login": "_auth_login",
}
def __init__(
self,
hostname="localhost",
port=_default_port,
fqdn=None,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
loop=None,
use_aioopenssl=False,
):
"""
Initializes a new :class:`SMTP` instance.
Args:
hostname (str): Hostname of the SMTP server to connect to.
port (int): Port to use to connect to the SMTP server.
fqdn (str or None): Client Fully Qualified Domain Name. This is
used to identify the client to the server.
timeout (int): Not used.
loop (:class:`asyncio.BaseEventLoop`): Event loop to use.
use_aioopenssl (bool): Use the aioopenssl module to open
the connection. This is mandatory if you plan on using
STARTTLS.
"""
self.hostname = hostname
try:
self.port = int(port)
except ValueError:
self.port = self.__class__._default_port
self.timeout = timeout
self._fqdn = fqdn
self.loop = loop or asyncio.get_event_loop()
self.use_aioopenssl = use_aioopenssl
self.reset_state()
@property
def fqdn(self):
"""
Returns the string used to identify the client when initiating an SMTP
session.
RFC 5321 `§ 4.1.1.1`_ and `§ 4.1.3`_ tell us what to do:
- Use the client FQDN ;
- If it isn't available, we SHOULD fall back to an address literal.
Returns:
str: The value that should be used as the client FQDN.
.. _`§ 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
.. _`§ 4.1.3`: https://tools.ietf.org/html/rfc5321#section-4.1.3
"""
if self._fqdn is None:
# Let's try to retrieve it:
self._fqdn = socket.getfqdn()
if "." not in self._fqdn:
try:
info = socket.getaddrinfo(
host="localhost", port=None, proto=socket.IPPROTO_TCP
)
except socket.gaierror:
addr = "127.0.0.1"
else:
# We only consider the first returned result and we're
# only interested in getting the IP(v4 or v6) address:
addr = info[0][4][0]
self._fqdn = "[{}]".format(addr)
return self._fqdn
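The fallback described in the docstring (client FQDN if available, otherwise a bracketed address literal) can be sketched in isolation. The resolved values depend on the local machine, so only the shape of the result is fixed:

```python
import socket

fqdn = socket.getfqdn()
if "." not in fqdn:
    # No usable FQDN: fall back to an address literal such as "[127.0.0.1]".
    try:
        info = socket.getaddrinfo(
            host="localhost", port=None, proto=socket.IPPROTO_TCP
        )
    except socket.gaierror:
        addr = "127.0.0.1"
    else:
        addr = info[0][4][0]  # first result, IP (v4 or v6) address only
    fqdn = "[{}]".format(addr)

print(fqdn)
```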
def reset_state(self):
"""
Resets some attributes to their default values.
This is especially useful when initializing a newly created
:class:`SMTP` instance and when closing an existing SMTP session.
It allows us to use the same SMTP instance and connect several times.
"""
self.last_helo_response = (None, None)
self.last_ehlo_response = (None, None)
self.supports_esmtp = False
self.esmtp_extensions = {}
self.auth_mechanisms = []
self.ssl_context = False
self.reader = None
self.writer = None
self.transport = None
async def __aenter__(self):
"""
Enters the asynchronous context manager.
Also tries to connect to the server.
Raises:
SMTPConnectionRefusedError: If the connection between client and
SMTP server can not be established.
.. seealso:: :meth:`SMTP.connect`
"""
await self.connect()
return self
async def __aexit__(self, *args):
"""
Exits the asynchronous context manager.
Closes the connection and resets instance attributes.
.. seealso:: :meth:`SMTP.quit`
"""
await self.quit()
async def connect(self):
"""
Connects to the server.
.. note:: This method is automatically invoked by
:meth:`SMTP.__aenter__`. The code is mostly borrowed from the
:func:`asyncio.streams.open_connection` source code.
Raises:
ConnectionError subclass: If the connection between client and
SMTP server can not be established.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
# First build the reader:
self.reader = SMTPStreamReader(loop=self.loop)
# Then build the protocol:
protocol = asyncio.StreamReaderProtocol(self.reader, loop=self.loop)
# With the just-built reader and protocol, create the connection and
# get the transport stream:
conn = {
"protocol_factory": lambda: protocol,
"host": self.hostname,
"port": self.port,
}
if self.use_aioopenssl:
conn.update(
{
"use_starttls": not self.ssl_context,
"ssl_context_factory": lambda transport: self.ssl_context,
"server_hostname": self.hostname, # For SSL
}
)
import aioopenssl
# This may raise a ConnectionError exception, which we let bubble up.
self.transport, _ = await aioopenssl.create_starttls_connection(
self.loop, **conn
)
# HACK: aioopenssl transports don't implement is_closing, and thus drain() fails...
self.transport.is_closing = lambda: False
else:
conn["ssl"] = self.ssl_context
# This may raise a ConnectionError exception, which we let bubble up.
self.transport, _ = await self.loop.create_connection(**conn)
# If the connection has been established, build the writer:
self.writer = SMTPStreamWriter(self.transport, protocol, self.reader, self.loop)
code, message = await self.reader.read_reply()
if code != 220:
raise ConnectionRefusedError(code, message)
return code, message
async def do_cmd(self, *args, success=None):
"""
Sends the given command to the server.
Args:
*args: Command and arguments to be sent to the server.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
if success is None:
success = (250,)
cmd = " ".join(args)
await self.writer.send_command(cmd)
code, message = await self.reader.read_reply()
if code not in success:
raise SMTPCommandFailedError(code, message, cmd)
return code, message
async def helo(self, from_host=None):
"""
Sends an SMTP 'HELO' command. - Identifies the client and starts the
session.
If the given ``from_host`` is None, it defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our HELO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("HELO", from_host)
self.last_helo_response = (code, message)
return code, message
async def ehlo(self, from_host=None):
"""
Sends an SMTP 'EHLO' command. - Identifies the client and starts the
session.
If the given ``from_host`` is None, it defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("EHLO", from_host)
self.last_ehlo_response = (code, message)
extns, auths = SMTP.parse_esmtp_extensions(message)
self.esmtp_extensions = extns
self.auth_mechanisms = auths
self.supports_esmtp = True
return code, message
async def help(self, command_name=None):
"""
Sends an SMTP 'HELP' command.
For further details please check out `RFC 5321 § 4.1.1.8`_.
Args:
command_name (str or None, optional): Name of a command for which
you want help. For example, if you want to get help about the
'*RSET*' command, you'd call ``help('RSET')``.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the HELP command fails.
Returns:
Help text as given by the server.
.. _`RFC 5321 § 4.1.1.8`: https://tools.ietf.org/html/rfc5321#section-4.1.1.8
"""
if command_name is None:
command_name = ""
code, message = await self.do_cmd("HELP", command_name)
return message
async def rset(self):
"""
Sends an SMTP 'RSET' command. - Resets the session.
For further details, please check out `RFC 5321 § 4.1.1.5`_.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the RSET command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.5`: https://tools.ietf.org/html/rfc5321#section-4.1.1.5
"""
return await self.do_cmd("RSET")
async def noop(self):
"""
Sends an SMTP 'NOOP' command. - Doesn't do anything.
For further details, please check out `RFC 5321 § 4.1.1.9`_.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the NOOP command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.9`: https://tools.ietf.org/html/rfc5321#section-4.1.1.9
"""
return await self.do_cmd("NOOP")
async def vrfy(self, address):
"""
Sends an SMTP 'VRFY' command. - Tests the validity of the given address.
For further details, please check out `RFC 5321 § 4.1.1.6`_.
Args:
address (str): E-mail address to be checked.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the VRFY command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.6`: https://tools.ietf.org/html/rfc5321#section-4.1.1.6
"""
return await self.do_cmd("VRFY", address)
async def expn(self, address):
"""
Sends an SMTP 'EXPN' command. - Expands a mailing-list.
For further details, please check out `RFC 5321 § 4.1.1.7`_.
Args:
address (str): E-mail address to expand.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the EXPN command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.7`: https://tools.ietf.org/html/rfc5321#section-4.1.1.7
"""
return await self.do_cmd("EXPN", address)
async def mail(self, sender, options=None):
"""
Sends an SMTP 'MAIL' command. - Starts the mail transfer session.
For further details, please check out `RFC 5321 § 4.1.1.2`_ and
`§ 3.3`_.
Args:
sender (str): Sender mailbox (used as reverse-path).
options (list of str or None, optional): Additional options to send
along with the *MAIL* command.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the MAIL command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.2`: https://tools.ietf.org/html/rfc5321#section-4.1.1.2
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
from_addr = "FROM:{}".format(quoteaddr(sender))
code, message = await self.do_cmd("MAIL", from_addr, *options)
return code, message
async def rcpt(self, recipient, options=None):
"""
Sends an SMTP 'RCPT' command. - Indicates a recipient for the e-mail.
For further details, please check out `RFC 5321 § 4.1.1.3`_ and
`§ 3.3`_.
Args:
recipient (str): E-mail address of one recipient.
options (list of str or None, optional): Additional options to send
along with the *RCPT* command.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the RCPT command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.3`: https://tools.ietf.org/html/rfc5321#section-4.1.1.3
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
to_addr = "TO:{}".format(quoteaddr(recipient))
code, message = await self.do_cmd("RCPT", to_addr, *options)
return code, message
async def quit(self):
"""
Sends an SMTP 'QUIT' command. - Ends the session.
For further details, please check out `RFC 5321 § 4.1.1.10`_.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response. If the connection is already closed when calling this
method, returns (-1, None).
.. _`RFC 5321 § 4.1.1.10`: https://tools.ietf.org/html/rfc5321#section-4.1.1.10
"""
code = -1
message = None
try:
code, message = await self.do_cmd("QUIT")
except ConnectionError:
# We voluntarily ignore this kind of exception since... the
# connection seems already closed.
pass
except SMTPCommandFailedError:
pass
await self.close()
return code, message
async def auth(self, username, password):
"""
Tries to authenticate user against the SMTP server.
Args:
username (str): Username to authenticate with.
password (str): Password to use along with the given ``username``.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
SMTPLoginError: If the authentication failed (either because all
attempts failed or because there was no suitable authentication
mechanism).
Returns:
(int, str): A (code, message) 2-tuple containing the last server
response.
"""
# EHLO/HELO is required:
await self.ehlo_or_helo_if_needed()
errors = [] # To store SMTPAuthenticationErrors
code = message = None
# Try to authenticate using all mechanisms supported by both
# server and client (and only these):
for auth, meth in self.__class__._supported_auth_mechanisms.items():
if auth in self.auth_mechanisms:
auth_func = getattr(self, meth)
try:
code, message = await auth_func(username, password)
except SMTPAuthenticationError as e:
errors.append(e)
else:
break
else:
if not errors:
err = "Could not find any suitable authentication mechanism."
errors.append(SMTPAuthenticationError(-1, err))
raise SMTPLoginError(errors)
return code, message
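The mechanism selection above boils down to: walk the client's preference-ordered dict and take the first entry the server also advertised. A standalone sketch (the server's list below is made up):

```python
# Client preference order; dicts preserve insertion order (Python 3.7+).
_supported_auth_mechanisms = {
    "cram-md5": "_auth_cram_md5",
    "plain": "_auth_plain",
    "login": "_auth_login",
}

# Hypothetical list advertised by the server in its EHLO reply:
server_mechanisms = ["login", "plain"]

# Pick the first client-preferred mechanism the server also supports:
chosen = next(
    (meth for auth, meth in _supported_auth_mechanisms.items()
     if auth in server_mechanisms),
    None,
)
# chosen == "_auth_plain" (cram-md5 is preferred but not offered)
```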
async def starttls(self, context=None):
"""
Upgrades the connection to the SMTP server into TLS mode.
If there has been no previous EHLO or HELO command this session, this
method tries ESMTP EHLO first.
If the server supports SSL/TLS, this will encrypt the rest of the SMTP
session.
Raises:
SMTPCommandNotSupportedError: If the server does not support STARTTLS.
SMTPCommandFailedError: If the STARTTLS command fails
BadImplementationError: If the connection does not use aioopenssl.
Args:
context (:obj:`OpenSSL.SSL.Context`): SSL context
Returns:
(int, message): A (code, message) 2-tuple containing the server
response.
"""
if not self.use_aioopenssl:
raise BadImplementationError("This connection does not use aioopenssl")
import aioopenssl
import OpenSSL
await self.ehlo_or_helo_if_needed()
if "starttls" not in self.esmtp_extensions:
raise SMTPCommandNotSupportedError("STARTTLS")
code, message = await self.do_cmd("STARTTLS", success=(220,))
# Don't check for code, do_cmd did it
if context is None:
context = OpenSSL.SSL.Context(OpenSSL.SSL.TLSv1_2_METHOD)
await self.transport.starttls(ssl_context=context)
# RFC 3207:
# The client MUST discard any knowledge obtained from
# the server, such as the list of SMTP service extensions,
# which was not obtained from the TLS negotiation itself.
# FIXME: wouldn't it be better to use reset_state here ?
# And reset self.reader, self.writer and self.transport just after
# Maybe also self.ssl_context ?
self.last_ehlo_response = (None, None)
self.last_helo_response = (None, None)
self.supports_esmtp = False
self.esmtp_extensions = {}
self.auth_mechanisms = []
return (code, message)
async def sendmail(
self, sender, recipients, message, mail_options=None, rcpt_options=None
):
"""
Performs an entire e-mail transaction.
Example:
>>> try:
>>> async with SMTP() as client:
>>> try:
>>> r = await client.sendmail(sender, recipients, message)
>>> except SMTPException:
>>> print("Error while sending message.")
>>> else:
>>> print("Result: {}.".format(r))
>>> except ConnectionError as e:
>>> print(e)
Result: {}.
Args:
sender (str): E-mail address of the sender.
recipients (list of str or str): E-mail(s) address(es) of the
recipient(s).
message (str or bytes): Message body.
mail_options (list of str): ESMTP options (such as *8BITMIME*) to
send along the *MAIL* command.
rcpt_options (list of str): ESMTP options (such as *DSN*) to
send along all the *RCPT* commands.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
SMTPCommandFailedError: If the server refuses our MAIL command.
SMTPCommandFailedError: If the server refuses our DATA command.
SMTPNoRecipientError: If the server refuses all given
recipients.
Returns:
dict: A dict containing an entry for each recipient that was
refused. Each entry is associated with a (code, message)
2-tuple containing the error code and message, as returned by
the server.
When everything runs smoothly, the returned dict is empty.
.. note:: The connection remains open after. It's your responsibility
to close it. A good practice is to use the asynchronous context
manager instead. See :meth:`SMTP.__aenter__` for further details.
"""
# Make sure `recipients` is a list:
if isinstance(recipients, str):
recipients = [recipients]
# Set some defaults values:
if mail_options is None:
mail_options = []
if rcpt_options is None:
rcpt_options = []
# EHLO or HELO is required:
await self.ehlo_or_helo_if_needed()
if self.supports_esmtp:
if "size" in self.esmtp_extensions:
mail_options.append("size={}".format(len(message)))
await self.mail(sender, mail_options)
errors = []
for recipient in recipients:
try:
await self.rcpt(recipient, rcpt_options)
except SMTPCommandFailedError as e:
errors.append(e)
if len(recipients) == len(errors):
# The server refused all our recipients:
raise SMTPNoRecipientError(errors)
await self.data(message)
# If we got here then somebody got our mail:
return errors
async def send_mail(
self, sender, recipients, message, mail_options=None, rcpt_options=None
):
"""
Alias for :meth:`SMTP.sendmail`.
"""
return await self.sendmail(
sender, recipients, message, mail_options, rcpt_options
)
async def ehlo_or_helo_if_needed(self):
"""
Calls :meth:`SMTP.ehlo` and/or :meth:`SMTP.helo` if needed.
If there hasn't been any previous *EHLO* or *HELO* command this
session, tries to initiate the session. *EHLO* is tried first.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
"""
no_helo = self.last_helo_response == (None, None)
no_ehlo = self.last_ehlo_response == (None, None)
if no_helo and no_ehlo:
try:
# First we try EHLO:
await self.ehlo()
except SMTPCommandFailedError:
# EHLO failed, let's try HELO:
await self.helo()
async def close(self):
"""
Cleans up after the connection to the SMTP server has been closed
(voluntarily or not).
"""
if self.writer is not None:
# Close the transport:
try:
self.writer.close()
except OSError as exc:
if exc.errno != errno.ENOTCONN:
raise
self.reset_state()
async def _auth_cram_md5(self, username, password):
"""
Performs an authentication attempt using the CRAM-MD5 mechanism.
Protocol:
1. Send 'AUTH CRAM-MD5' to server ;
2. If the server replies with a 334 return code, we can go on:
1) The challenge (sent by the server) is base64-decoded ;
2) The decoded challenge is hashed using HMAC-MD5 and the user
password as key (shared secret) ;
3) The hashed challenge is converted to a string of lowercase
hexadecimal digits ;
4) The username and a space character are prepended to the hex
digits ;
5) The concatenation is base64-encoded and sent to the server.
6) If the server replies with a return code of 235, user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "CRAM-MD5"
code, message = await self.do_cmd("AUTH", mechanism, success=(334,))
decoded_challenge = base64.b64decode(message)
challenge_hash = hmac.new(
key=password.encode("utf-8"), msg=decoded_challenge, digestmod="md5"
)
hex_hash = challenge_hash.hexdigest()
response = "{} {}".format(username, hex_hash)
encoded_response = SMTP.b64enc(response)
try:
code, message = await self.do_cmd(encoded_response, success=(235, 503))
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
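The challenge/response computation from steps 2.1-2.5 of the docstring can be run offline. The challenge and credentials below are made up for illustration:

```python
import base64
import hmac

# Hypothetical challenge, base64-encoded as it would arrive on the wire:
challenge = base64.b64encode(b"<1896.697170952@postoffice.example.net>")
username, password = "alice", "secret"  # made-up credentials

decoded = base64.b64decode(challenge)                    # 2.1: decode challenge
digest = hmac.new(password.encode("utf-8"), decoded,
                  digestmod="md5").hexdigest()           # 2.2-2.3: HMAC-MD5, hex
response = "{} {}".format(username, digest)              # 2.4: prepend username
encoded_response = base64.b64encode(
    response.encode("utf-8")).decode("utf-8")            # 2.5: base64-encode
```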
async def _auth_login(self, username, password):
"""
Performs an authentication attempt using the LOGIN mechanism.
Protocol:
1. The username is base64-encoded ;
2. The string 'AUTH LOGIN' and a space character are prepended to
the base64-encoded username and sent to the server ;
3. If the server replies with a 334 return code, we can go on:
1) The password is base64-encoded and sent to the server ;
2) If the server replies with a 235 return code, the user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "LOGIN"
code, message = await self.do_cmd(
"AUTH", mechanism, SMTP.b64enc(username), success=(334,)
)
try:
code, message = await self.do_cmd(SMTP.b64enc(password), success=(235, 503))
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
async def _auth_plain(self, username, password):
"""
Performs an authentication attempt using the PLAIN mechanism.
Protocol:
1. Format the username and password in a suitable way ;
2. The formatted string is base64-encoded ;
3. The string 'AUTH PLAIN' and a space character are prepended to
the base64-encoded username and password and sent to the
server ;
4. If the server replies with a 235 return code, user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "PLAIN"
credentials = "\0{}\0{}".format(username, password)
encoded_credentials = SMTP.b64enc(credentials)
try:
code, message = await self.do_cmd(
"AUTH", mechanism, encoded_credentials, success=(235, 503)
)
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
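The credential formatting used above follows the SASL PLAIN layout (RFC 4616): an empty authorization identity, then the username and password, NUL-separated. With made-up credentials:

```python
import base64

# Made-up credentials, for illustration only:
username, password = "alice", "secret"

# NUL authzid NUL authcid NUL password (authzid left empty here):
credentials = "\0{}\0{}".format(username, password)
encoded = base64.b64encode(credentials.encode("utf-8")).decode("utf-8")
# encoded == "AGFsaWNlAHNlY3JldA=="
```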
@staticmethod
def parse_esmtp_extensions(message):
"""
Parses the response given by an ESMTP server after a *EHLO* command.
The response is parsed to build:
- A dict of supported ESMTP extensions (with parameters, if any).
- A list of supported authentication methods.
Returns:
(dict, list): A (extensions, auth_mechanisms) 2-tuple containing
the supported extensions and authentication methods.
"""
extns = {}
auths = []
oldstyle_auth_regex = re.compile(r"auth=(?P<auth>.*)", re.IGNORECASE)
extension_regex = re.compile(
r"(?P<feature>[a-z0-9][a-z0-9\-]*) ?", re.IGNORECASE
)
lines = message.splitlines()
for line in lines[1:]:
# To be able to communicate with as many SMTP servers as possible,
# we have to take the old-style auth advertisement into account.
match = oldstyle_auth_regex.match(line)
if match:
auth = match.group("auth")[0]
auth = auth.lower().strip()
if auth not in auths:
auths.append(auth)
# RFC 1869 requires a space between EHLO keyword and parameters.
# It's actually stricter, in that only spaces are allowed between
# parameters, but we're not going to check for that here.
# Note that the space isn't present if there are no parameters.
match = extension_regex.match(line)
if match:
feature = match.group("feature").lower()
params = match.string[match.end("feature") :].strip()
extns[feature] = params
if feature == "auth":
auths.extend([param.strip().lower() for param in params.split()])
return extns, auths
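A standalone sketch of the extension parsing above, run on a hypothetical multi-line EHLO reply (server name and values are made up; the old-style ``AUTH=`` branch is omitted for brevity):

```python
import re

def parse_esmtp_extensions(message):
    # Standalone sketch of the parsing logic above.
    extns, auths = {}, []
    extension_regex = re.compile(
        r"(?P<feature>[a-z0-9][a-z0-9\-]*) ?", re.IGNORECASE
    )
    for line in message.splitlines()[1:]:  # skip the greeting line
        match = extension_regex.match(line)
        if match:
            feature = match.group("feature").lower()
            params = match.string[match.end("feature"):].strip()
            extns[feature] = params
            if feature == "auth":
                auths.extend(p.strip().lower() for p in params.split())
    return extns, auths

reply = "smtp.example.org\nSIZE 35882577\n8BITMIME\nAUTH LOGIN PLAIN CRAM-MD5\nSTARTTLS"
extns, auths = parse_esmtp_extensions(reply)
# extns  == {"size": "35882577", "8bitmime": "",
#            "auth": "LOGIN PLAIN CRAM-MD5", "starttls": ""}
# auths  == ["login", "plain", "cram-md5"]
```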
@staticmethod
def prepare_message(message):
"""
Returns the given message encoded in ascii with a format suitable for
SMTP transmission:
- Makes sure the message is ASCII encoded ;
- Normalizes line endings to '\r\n' ;
- Adds a (second) period at the beginning of lines that start
with a period ;
- Makes sure the message ends with '\r\n.\r\n'.
For further details, please check out RFC 5321 `§ 4.1.1.4`_
and `§ 4.5.2`_.
.. _`§ 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4
.. _`§ 4.5.2`: https://tools.ietf.org/html/rfc5321#section-4.5.2
"""
if isinstance(message, bytes):
bytes_message = message
else:
bytes_message = message.encode("ascii")
# The original algorithm uses regexes to do this stuff.
# This one is -IMHO- more pythonic and it is slightly faster.
#
# Another version is even faster, but I chose to keep something
# more pythonic and readable.
# FYI, the fastest way to do all this stuff seems to be
# (according to my benchmarks):
#
# bytes_message.replace(b"\r\n", b"\n") \
# .replace(b"\r", b"\n") \
# .replace(b"\n", b"\r\n")
#
# DOT_LINE_REGEX = re.compile(rb"^\.", re.MULTILINE)
# bytes_message = DOT_LINE_REGEX.sub(b"..", bytes_message)
#
# if not bytes_message.endswith(b"\r\n"):
# bytes_message += b"\r\n"
#
# bytes_message += b"\r\n.\r\n"
lines = []
for line in bytes_message.splitlines():
if line.startswith(b"."):
line = line.replace(b".", b"..", 1)
lines.append(line)
# Recompose the message with <CRLF> only:
bytes_message = b"\r\n".join(lines)
# Make sure message ends with <CRLF>.<CRLF>:
bytes_message += b"\r\n.\r\n"
return bytes_message
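The same normalization can be exercised standalone, on a message that hits both the dot-stuffing rule and the line-ending rule:

```python
def prepare_message(message):
    # Standalone copy of the normalization above.
    bytes_message = message if isinstance(message, bytes) else message.encode("ascii")
    lines = []
    for line in bytes_message.splitlines():
        if line.startswith(b"."):
            line = line.replace(b".", b"..", 1)  # dot-stuffing (RFC 5321 § 4.5.2)
        lines.append(line)
    # Recompose with <CRLF> only and terminate with <CRLF>.<CRLF>:
    return b"\r\n".join(lines) + b"\r\n.\r\n"

prepared = prepare_message("Hello\n.hidden dot\r\nBye")
# prepared == b"Hello\r\n..hidden dot\r\nBye\r\n.\r\n"
```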
@staticmethod
def b64enc(s):
"""
Base64-encodes the given string and returns it as a :obj:`str`.
This is a simple helper function that takes a str, base64-encodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
s (str): String to be converted to base64.
Returns:
str: A base64-encoded string.
"""
return base64.b64encode(s.encode("utf-8")).decode("utf-8")
@staticmethod
def b64dec(b):
"""
Base64-decodes the given :obj:`bytes` and converts it to a :obj:`str`.
This is a simple helper function that takes a bytes, base64-decodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
b (bytes): A base64-encoded bytes.
Returns:
str: A base64-decoded string.
"""
return base64.b64decode(b).decode("utf-8")
|
hwmrocker/smtplibaio | smtplibaio/smtp.py | SMTP.auth | python | async def auth(self, username, password):
# EHLO/HELO is required:
await self.ehlo_or_helo_if_needed()
errors = [] # To store SMTPAuthenticationErrors
code = message = None
# Try to authenticate using all mechanisms supported by both
# server and client (and only these):
for auth, meth in self.__class__._supported_auth_mechanisms.items():
if auth in self.auth_mechanisms:
auth_func = getattr(self, meth)
try:
code, message = await auth_func(username, password)
except SMTPAuthenticationError as e:
errors.append(e)
else:
break
else:
if not errors:
err = "Could not find any suitable authentication mechanism."
errors.append(SMTPAuthenticationError(-1, err))
raise SMTPLoginError(errors)
return code, message | Tries to authenticate user against the SMTP server.
Args:
username (str): Username to authenticate with.
password (str): Password to use along with the given ``username``.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedely lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
SMTPLoginError: If the authentication failed (either because all
attempts failed or because there was no suitable authentication
mechanism).
Returns:
(int, str): A (code, message) 2-tuple containing the last server
response. | train | https://github.com/hwmrocker/smtplibaio/blob/84ce8e45b7e706476739d0efcb416c18ecabbbb6/smtplibaio/smtp.py#L618-L664 | [
"async def ehlo_or_helo_if_needed(self):\n \"\"\"\n Calls :meth:`SMTP.ehlo` and/or :meth:`SMTP.helo` if needed.\n\n If there hasn't been any previous *EHLO* or *HELO* command this\n session, tries to initiate the session. *EHLO* is tried first.\n\n Raises:\n ConnectionResetError: If the connec... | class SMTP:
"""
SMTP or ESMTP client.
This should follow RFC 5321 (SMTP), RFC 1869 (ESMTP), RFC 2554 (SMTP
Authentication) and RFC 2487 (Secure SMTP over TLS).
Attributes:
hostname (str): Hostname of the SMTP server we are connecting to.
port (int): Port on which the SMTP server listens for connections.
timeout (int): Not used.
last_helo_response ((int or None, str or None)): A (code, message)
2-tuple containing the last *HELO* response.
last_ehlo_response ((int or None, str or None)): A (code, message)
2-tuple containing the last *EHLO* response.
        supports_esmtp (bool): True if the server supports ESMTP (set after
            an *EHLO* command), False otherwise.
esmtp_extensions (dict): ESMTP extensions and parameters supported by
the SMTP server (set after a *EHLO* command).
auth_mechanisms (list of str): Authentication mechanisms supported by
the SMTP server.
ssl_context (bool): Always False. (Used in SMTP_SSL subclass)
reader (:class:`streams.SMTPStreamReader`): SMTP stream reader, used
to read server responses.
writer (:class:`streams.SMTPStreamWriter`): SMTP stream writer, used
to send commands to the server.
transport (:class:`asyncio.BaseTransport`): Communication channel
abstraction between client and server.
loop (:class:`asyncio.BaseEventLoop`): Event loop to use.
use_aioopenssl (bool): If True, the connection is made using the
aioopenssl module. Defaults to False.
_fqdn (str): Client FQDN. Used to identify the client to the
server.
Class Attributes:
_default_port (int): Default port to use. Defaults to 25.
_supported_auth_mechanisms (dict): Dict containing the information
about supported authentication mechanisms, ordered by preference
            of use. Each entry consists of:
- The authentication mechanism name, in lowercase, as given by
SMTP servers.
- The name of the method to call to authenticate using the
mechanism.
"""
_default_port = 25
_supported_auth_mechanisms = {
"cram-md5": "_auth_cram_md5",
"plain": "_auth_plain",
"login": "_auth_login",
}
def __init__(
self,
hostname="localhost",
port=_default_port,
fqdn=None,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
loop=None,
use_aioopenssl=False,
):
"""
Initializes a new :class:`SMTP` instance.
Args:
hostname (str): Hostname of the SMTP server to connect to.
port (int): Port to use to connect to the SMTP server.
fqdn (str or None): Client Fully Qualified Domain Name. This is
used to identify the client to the server.
timeout (int): Not used.
loop (:class:`asyncio.BaseEventLoop`): Event loop to use.
use_aioopenssl (bool): Use the aioopenssl module to open
the connection. This is mandatory if you plan on using
STARTTLS.
"""
self.hostname = hostname
try:
self.port = int(port)
except ValueError:
self.port = self.__class__._default_port
self.timeout = timeout
self._fqdn = fqdn
self.loop = loop or asyncio.get_event_loop()
self.use_aioopenssl = use_aioopenssl
self.reset_state()
@property
def fqdn(self):
"""
Returns the string used to identify the client when initiating a SMTP
session.
RFC 5321 `§ 4.1.1.1`_ and `§ 4.1.3`_ tell us what to do:
- Use the client FQDN ;
- If it isn't available, we SHOULD fall back to an address literal.
Returns:
str: The value that should be used as the client FQDN.
.. _`§ 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
        .. _`§ 4.1.3`: https://tools.ietf.org/html/rfc5321#section-4.1.3
"""
if self._fqdn is None:
# Let's try to retrieve it:
self._fqdn = socket.getfqdn()
if "." not in self._fqdn:
try:
info = socket.getaddrinfo(
host="localhost", port=None, proto=socket.IPPROTO_TCP
)
except socket.gaierror:
addr = "127.0.0.1"
else:
# We only consider the first returned result and we're
# only interested in getting the IP(v4 or v6) address:
addr = info[0][4][0]
self._fqdn = "[{}]".format(addr)
return self._fqdn
def reset_state(self):
"""
Resets some attributes to their default values.
This is especially useful when initializing a newly created
:class:`SMTP` instance and when closing an existing SMTP session.
It allows us to use the same SMTP instance and connect several times.
"""
self.last_helo_response = (None, None)
self.last_ehlo_response = (None, None)
self.supports_esmtp = False
self.esmtp_extensions = {}
self.auth_mechanisms = []
self.ssl_context = False
self.reader = None
self.writer = None
self.transport = None
async def __aenter__(self):
"""
Enters the asynchronous context manager.
Also tries to connect to the server.
Raises:
SMTPConnectionRefusedError: If the connection between client and
SMTP server can not be established.
.. seealso:: :meth:`SMTP.connect`
"""
await self.connect()
return self
async def __aexit__(self, *args):
"""
Exits the asynchronous context manager.
Closes the connection and resets instance attributes.
.. seealso:: :meth:`SMTP.quit`
"""
await self.quit()
async def connect(self):
"""
Connects to the server.
.. note:: This method is automatically invoked by
:meth:`SMTP.__aenter__`. The code is mostly borrowed from the
:func:`asyncio.streams.open_connection` source code.
Raises:
ConnectionError subclass: If the connection between client and
SMTP server can not be established.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
# First build the reader:
self.reader = SMTPStreamReader(loop=self.loop)
# Then build the protocol:
protocol = asyncio.StreamReaderProtocol(self.reader, loop=self.loop)
# With the just-built reader and protocol, create the connection and
# get the transport stream:
conn = {
"protocol_factory": lambda: protocol,
"host": self.hostname,
"port": self.port,
}
if self.use_aioopenssl:
conn.update(
{
"use_starttls": not self.ssl_context,
"ssl_context_factory": lambda transport: self.ssl_context,
"server_hostname": self.hostname, # For SSL
}
)
import aioopenssl
# This may raise a ConnectionError exception, which we let bubble up.
self.transport, _ = await aioopenssl.create_starttls_connection(
self.loop, **conn
)
# HACK: aioopenssl transports don't implement is_closing, and thus drain() fails...
self.transport.is_closing = lambda: False
else:
conn["ssl"] = self.ssl_context
# This may raise a ConnectionError exception, which we let bubble up.
self.transport, _ = await self.loop.create_connection(**conn)
# If the connection has been established, build the writer:
self.writer = SMTPStreamWriter(self.transport, protocol, self.reader, self.loop)
code, message = await self.reader.read_reply()
if code != 220:
raise ConnectionRefusedError(code, message)
return code, message
async def do_cmd(self, *args, success=None):
"""
Sends the given command to the server.
Args:
*args: Command and arguments to be sent to the server.
Raises:
ConnectionResetError: If the connection with the server is
                unexpectedly lost.
SMTPCommandFailedError: If the command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
if success is None:
success = (250,)
cmd = " ".join(args)
await self.writer.send_command(cmd)
code, message = await self.reader.read_reply()
if code not in success:
raise SMTPCommandFailedError(code, message, cmd)
return code, message
async def helo(self, from_host=None):
"""
Sends a SMTP 'HELO' command. - Identifies the client and starts the
session.
If given ``from_host`` is None, defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
                unexpectedly lost.
SMTPCommandFailedError: If the server refuses our HELO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("HELO", from_host)
self.last_helo_response = (code, message)
return code, message
async def ehlo(self, from_host=None):
"""
Sends a SMTP 'EHLO' command. - Identifies the client and starts the
session.
        If given ``from_host`` is None, defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
                unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("EHLO", from_host)
self.last_ehlo_response = (code, message)
extns, auths = SMTP.parse_esmtp_extensions(message)
self.esmtp_extensions = extns
self.auth_mechanisms = auths
self.supports_esmtp = True
return code, message
async def help(self, command_name=None):
"""
Sends a SMTP 'HELP' command.
For further details please check out `RFC 5321 § 4.1.1.8`_.
Args:
command_name (str or None, optional): Name of a command for which
you want help. For example, if you want to get help about the
'*RSET*' command, you'd call ``help('RSET')``.
Raises:
ConnectionResetError: If the connection with the server is
                unexpectedly lost.
SMTPCommandFailedError: If the HELP command fails.
Returns:
Help text as given by the server.
.. _`RFC 5321 § 4.1.1.8`: https://tools.ietf.org/html/rfc5321#section-4.1.1.8
"""
if command_name is None:
command_name = ""
code, message = await self.do_cmd("HELP", command_name)
return message
async def rset(self):
"""
Sends a SMTP 'RSET' command. - Resets the session.
For further details, please check out `RFC 5321 § 4.1.1.5`_.
Raises:
ConnectionResetError: If the connection with the server is
                unexpectedly lost.
SMTPCommandFailedError: If the RSET command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.5`: https://tools.ietf.org/html/rfc5321#section-4.1.1.5
"""
return await self.do_cmd("RSET")
async def noop(self):
"""
Sends a SMTP 'NOOP' command. - Doesn't do anything.
For further details, please check out `RFC 5321 § 4.1.1.9`_.
Raises:
ConnectionResetError: If the connection with the server is
                unexpectedly lost.
SMTPCommandFailedError: If the NOOP command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.9`: https://tools.ietf.org/html/rfc5321#section-4.1.1.9
"""
return await self.do_cmd("NOOP")
async def vrfy(self, address):
"""
Sends a SMTP 'VRFY' command. - Tests the validity of the given address.
For further details, please check out `RFC 5321 § 4.1.1.6`_.
Args:
address (str): E-mail address to be checked.
Raises:
ConnectionResetError: If the connection with the server is
                unexpectedly lost.
SMTPCommandFailedError: If the VRFY command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.6`: https://tools.ietf.org/html/rfc5321#section-4.1.1.6
"""
return await self.do_cmd("VRFY", address)
async def expn(self, address):
"""
Sends a SMTP 'EXPN' command. - Expands a mailing-list.
For further details, please check out `RFC 5321 § 4.1.1.7`_.
Args:
address (str): E-mail address to expand.
Raises:
ConnectionResetError: If the connection with the server is
                unexpectedly lost.
SMTPCommandFailedError: If the EXPN command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.7`: https://tools.ietf.org/html/rfc5321#section-4.1.1.7
"""
return await self.do_cmd("EXPN", address)
async def mail(self, sender, options=None):
"""
Sends a SMTP 'MAIL' command. - Starts the mail transfer session.
For further details, please check out `RFC 5321 § 4.1.1.2`_ and
`§ 3.3`_.
Args:
sender (str): Sender mailbox (used as reverse-path).
options (list of str or None, optional): Additional options to send
along with the *MAIL* command.
Raises:
ConnectionResetError: If the connection with the server is
                unexpectedly lost.
SMTPCommandFailedError: If the MAIL command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.2`: https://tools.ietf.org/html/rfc5321#section-4.1.1.2
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
from_addr = "FROM:{}".format(quoteaddr(sender))
code, message = await self.do_cmd("MAIL", from_addr, *options)
return code, message
async def rcpt(self, recipient, options=None):
"""
Sends a SMTP 'RCPT' command. - Indicates a recipient for the e-mail.
For further details, please check out `RFC 5321 § 4.1.1.3`_ and
`§ 3.3`_.
Args:
recipient (str): E-mail address of one recipient.
options (list of str or None, optional): Additional options to send
along with the *RCPT* command.
Raises:
ConnectionResetError: If the connection with the server is
                unexpectedly lost.
SMTPCommandFailedError: If the RCPT command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.3`: https://tools.ietf.org/html/rfc5321#section-4.1.1.3
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
to_addr = "TO:{}".format(quoteaddr(recipient))
code, message = await self.do_cmd("RCPT", to_addr, *options)
return code, message
async def quit(self):
"""
Sends a SMTP 'QUIT' command. - Ends the session.
For further details, please check out `RFC 5321 § 4.1.1.10`_.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response. If the connection is already closed when calling this
method, returns (-1, None).
.. _`RFC 5321 § 4.1.1.10`: https://tools.ietf.org/html/rfc5321#section-4.1.1.10
"""
code = -1
message = None
try:
code, message = await self.do_cmd("QUIT")
except ConnectionError:
# We voluntarily ignore this kind of exceptions since... the
# connection seems already closed.
pass
except SMTPCommandFailedError:
pass
await self.close()
return code, message
async def data(self, email_message):
"""
Sends a SMTP 'DATA' command. - Transmits the message to the server.
If ``email_message`` is a bytes object, sends it as it is. Else,
        makes all the required changes so it can be safely transmitted to the
        SMTP server.
For further details, please check out `RFC 5321 § 4.1.1.4`_.
Args:
email_message (str or bytes): Message to be sent.
Raises:
ConnectionError subclass: If the connection to the server is
                unexpectedly lost.
SMTPCommandFailedError: If the DATA command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server last
response (the one the server sent after all data were sent by
the client).
        .. seealso:: :meth:`SMTP.prepare_message`
.. _`RFC 5321 § 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4
"""
code, message = await self.do_cmd("DATA", success=(354,))
email_message = SMTP.prepare_message(email_message)
self.writer.write(email_message) # write is non-blocking.
await self.writer.drain() # don't forget to drain.
code, message = await self.reader.read_reply()
return code, message
async def starttls(self, context=None):
"""
Upgrades the connection to the SMTP server into TLS mode.
If there has been no previous EHLO or HELO command this session, this
method tries ESMTP EHLO first.
If the server supports SSL/TLS, this will encrypt the rest of the SMTP
session.
Raises:
SMTPCommandNotSupportedError: If the server does not support STARTTLS.
            SMTPCommandFailedError: If the STARTTLS command fails.
BadImplementationError: If the connection does not use aioopenssl.
Args:
context (:obj:`OpenSSL.SSL.Context`): SSL context
Returns:
            (int, str): A (code, message) 2-tuple containing the server
response.
"""
if not self.use_aioopenssl:
raise BadImplementationError("This connection does not use aioopenssl")
import aioopenssl
import OpenSSL
await self.ehlo_or_helo_if_needed()
if "starttls" not in self.esmtp_extensions:
raise SMTPCommandNotSupportedError("STARTTLS")
code, message = await self.do_cmd("STARTTLS", success=(220,))
# Don't check for code, do_cmd did it
if context is None:
context = OpenSSL.SSL.Context(OpenSSL.SSL.TLSv1_2_METHOD)
await self.transport.starttls(ssl_context=context)
# RFC 3207:
# The client MUST discard any knowledge obtained from
# the server, such as the list of SMTP service extensions,
# which was not obtained from the TLS negotiation itself.
# FIXME: wouldn't it be better to use reset_state here ?
# And reset self.reader, self.writer and self.transport just after
# Maybe also self.ssl_context ?
self.last_ehlo_response = (None, None)
self.last_helo_response = (None, None)
self.supports_esmtp = False
self.esmtp_extensions = {}
self.auth_mechanisms = []
return (code, message)
async def sendmail(
self, sender, recipients, message, mail_options=None, rcpt_options=None
):
"""
Performs an entire e-mail transaction.
Example:
            >>> try:
            >>>     async with SMTP() as client:
            >>>         try:
            >>>             r = await client.sendmail(sender, recipients, message)
            >>>         except SMTPException:
            >>>             print("Error while sending message.")
            >>>         else:
            >>>             print("Result: {}.".format(r))
            >>> except ConnectionError as e:
            >>>     print(e)
            Result: {}.
Args:
sender (str): E-mail address of the sender.
recipients (list of str or str): E-mail(s) address(es) of the
recipient(s).
message (str or bytes): Message body.
mail_options (list of str): ESMTP options (such as *8BITMIME*) to
send along the *MAIL* command.
rcpt_options (list of str): ESMTP options (such as *DSN*) to
send along all the *RCPT* commands.
Raises:
ConnectionResetError: If the connection with the server is
                unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
SMTPCommandFailedError: If the server refuses our MAIL command.
SMTPCommandFailedError: If the server refuses our DATA command.
SMTPNoRecipientError: If the server refuses all given
recipients.
Returns:
dict: A dict containing an entry for each recipient that was
refused. Each entry is associated with a (code, message)
2-tuple containing the error code and message, as returned by
the server.
                When everything runs smoothly, the returned dict is empty.
.. note:: The connection remains open after. It's your responsibility
to close it. A good practice is to use the asynchronous context
manager instead. See :meth:`SMTP.__aenter__` for further details.
"""
# Make sure `recipients` is a list:
if isinstance(recipients, str):
recipients = [recipients]
# Set some defaults values:
if mail_options is None:
mail_options = []
if rcpt_options is None:
rcpt_options = []
# EHLO or HELO is required:
await self.ehlo_or_helo_if_needed()
if self.supports_esmtp:
if "size" in self.esmtp_extensions:
mail_options.append("size={}".format(len(message)))
await self.mail(sender, mail_options)
errors = []
for recipient in recipients:
try:
await self.rcpt(recipient, rcpt_options)
except SMTPCommandFailedError as e:
errors.append(e)
if len(recipients) == len(errors):
# The server refused all our recipients:
raise SMTPNoRecipientError(errors)
await self.data(message)
# If we got here then somebody got our mail:
return errors
async def send_mail(
self, sender, recipients, message, mail_options=None, rcpt_options=None
):
"""
Alias for :meth:`SMTP.sendmail`.
"""
return await self.sendmail(
sender, recipients, message, mail_options, rcpt_options
)
async def ehlo_or_helo_if_needed(self):
"""
Calls :meth:`SMTP.ehlo` and/or :meth:`SMTP.helo` if needed.
If there hasn't been any previous *EHLO* or *HELO* command this
session, tries to initiate the session. *EHLO* is tried first.
Raises:
ConnectionResetError: If the connection with the server is
                unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
"""
no_helo = self.last_helo_response == (None, None)
no_ehlo = self.last_ehlo_response == (None, None)
if no_helo and no_ehlo:
try:
# First we try EHLO:
await self.ehlo()
except SMTPCommandFailedError:
# EHLO failed, let's try HELO:
await self.helo()
async def close(self):
"""
Cleans up after the connection to the SMTP server has been closed
(voluntarily or not).
"""
if self.writer is not None:
# Close the transport:
try:
self.writer.close()
except OSError as exc:
if exc.errno != errno.ENOTCONN:
raise
self.reset_state()
async def _auth_cram_md5(self, username, password):
"""
        Performs an authentication attempt using the CRAM-MD5 mechanism.
Protocol:
1. Send 'AUTH CRAM-MD5' to server ;
2. If the server replies with a 334 return code, we can go on:
1) The challenge (sent by the server) is base64-decoded ;
2) The decoded challenge is hashed using HMAC-MD5 and the user
password as key (shared secret) ;
3) The hashed challenge is converted to a string of lowercase
hexadecimal digits ;
4) The username and a space character are prepended to the hex
digits ;
5) The concatenation is base64-encoded and sent to the server.
6) If the server replies with a return code of 235, user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
                unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "CRAM-MD5"
code, message = await self.do_cmd("AUTH", mechanism, success=(334,))
decoded_challenge = base64.b64decode(message)
challenge_hash = hmac.new(
key=password.encode("utf-8"), msg=decoded_challenge, digestmod="md5"
)
hex_hash = challenge_hash.hexdigest()
response = "{} {}".format(username, hex_hash)
encoded_response = SMTP.b64enc(response)
try:
code, message = await self.do_cmd(encoded_response, success=(235, 503))
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
async def _auth_login(self, username, password):
"""
Performs an authentication attempt using the LOGIN mechanism.
Protocol:
1. The username is base64-encoded ;
2. The string 'AUTH LOGIN' and a space character are prepended to
the base64-encoded username and sent to the server ;
3. If the server replies with a 334 return code, we can go on:
1) The password is base64-encoded and sent to the server ;
2) If the server replies with a 235 return code, the user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
                unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "LOGIN"
code, message = await self.do_cmd(
"AUTH", mechanism, SMTP.b64enc(username), success=(334,)
)
try:
code, message = await self.do_cmd(SMTP.b64enc(password), success=(235, 503))
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
async def _auth_plain(self, username, password):
"""
Performs an authentication attempt using the PLAIN mechanism.
Protocol:
1. Format the username and password in a suitable way ;
2. The formatted string is base64-encoded ;
3. The string 'AUTH PLAIN' and a space character are prepended to
the base64-encoded username and password and sent to the
server ;
4. If the server replies with a 235 return code, user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
                unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "PLAIN"
credentials = "\0{}\0{}".format(username, password)
encoded_credentials = SMTP.b64enc(credentials)
try:
code, message = await self.do_cmd(
"AUTH", mechanism, encoded_credentials, success=(235, 503)
)
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
@staticmethod
def parse_esmtp_extensions(message):
"""
Parses the response given by an ESMTP server after a *EHLO* command.
The response is parsed to build:
- A dict of supported ESMTP extensions (with parameters, if any).
- A list of supported authentication methods.
Returns:
(dict, list): A (extensions, auth_mechanisms) 2-tuple containing
the supported extensions and authentication methods.
"""
extns = {}
auths = []
oldstyle_auth_regex = re.compile(r"auth=(?P<auth>.*)", re.IGNORECASE)
extension_regex = re.compile(
r"(?P<feature>[a-z0-9][a-z0-9\-]*) ?", re.IGNORECASE
)
lines = message.splitlines()
for line in lines[1:]:
# To be able to communicate with as many SMTP servers as possible,
# we have to take the old-style auth advertisement into account.
match = oldstyle_auth_regex.match(line)
if match:
                # Old-style advertisements may list several space-separated
                # mechanisms, e.g. "AUTH=PLAIN LOGIN":
                for auth in match.group("auth").split():
                    auth = auth.lower().strip()
                    if auth not in auths:
                        auths.append(auth)
# RFC 1869 requires a space between EHLO keyword and parameters.
# It's actually stricter, in that only spaces are allowed between
            # parameters, but we're not going to check for that here.
# Note that the space isn't present if there are no parameters.
match = extension_regex.match(line)
if match:
feature = match.group("feature").lower()
params = match.string[match.end("feature") :].strip()
extns[feature] = params
if feature == "auth":
auths.extend([param.strip().lower() for param in params.split()])
return extns, auths
@staticmethod
def prepare_message(message):
"""
Returns the given message encoded in ascii with a format suitable for
SMTP transmission:
- Makes sure the message is ASCII encoded ;
- Normalizes line endings to '\r\n' ;
- Adds a (second) period at the beginning of lines that start
with a period ;
- Makes sure the message ends with '\r\n.\r\n'.
For further details, please check out RFC 5321 `§ 4.1.1.4`_
and `§ 4.5.2`_.
        .. _`§ 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4
.. _`§ 4.5.2`: https://tools.ietf.org/html/rfc5321#section-4.5.2
"""
if isinstance(message, bytes):
bytes_message = message
else:
bytes_message = message.encode("ascii")
# The original algorithm uses regexes to do this stuff.
# This one is -IMHO- more pythonic and it is slightly faster.
#
# Another version is even faster, but I chose to keep something
# more pythonic and readable.
# FYI, the fastest way to do all this stuff seems to be
# (according to my benchmarks):
#
# bytes_message.replace(b"\r\n", b"\n") \
# .replace(b"\r", b"\n") \
# .replace(b"\n", b"\r\n")
#
# DOT_LINE_REGEX = re.compile(rb"^\.", re.MULTILINE)
# bytes_message = DOT_LINE_REGEX.sub(b"..", bytes_message)
#
# if not bytes_message.endswith(b"\r\n"):
# bytes_message += b"\r\n"
#
# bytes_message += b"\r\n.\r\n"
lines = []
for line in bytes_message.splitlines():
if line.startswith(b"."):
line = line.replace(b".", b"..", 1)
lines.append(line)
# Recompose the message with <CRLF> only:
bytes_message = b"\r\n".join(lines)
# Make sure message ends with <CRLF>.<CRLF>:
bytes_message += b"\r\n.\r\n"
return bytes_message
@staticmethod
def b64enc(s):
"""
Base64-encodes the given string and returns it as a :obj:`str`.
This is a simple helper function that takes a str, base64-encodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
s (str): String to be converted to base64.
Returns:
str: A base64-encoded string.
"""
return base64.b64encode(s.encode("utf-8")).decode("utf-8")
@staticmethod
def b64dec(b):
"""
Base64-decodes the given :obj:`bytes` and converts it to a :obj:`str`.
This is a simple helper function that takes a bytes, base64-decodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
b (bytes): A base64-encoded bytes.
Returns:
str: A base64-decoded string.
"""
return base64.b64decode(b).decode("utf-8")
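# ---------------------------------------------------------------------------
# Illustrative sketch (not part of the library): the normalisation performed
# by SMTP.prepare_message() above -- <CRLF> line endings, dot-stuffing, and
# the final <CRLF>.<CRLF> terminator required by RFC 5321 § 4.5.2. The helper
# name is hypothetical:
def _demo_prepare(message: bytes) -> bytes:
    lines = []
    for line in message.splitlines():
        if line.startswith(b"."):
            line = b"." + line  # dot-stuffing: double a leading period
        lines.append(line)
    # Rejoin with <CRLF> only, then terminate with <CRLF>.<CRLF>:
    return b"\r\n".join(lines) + b"\r\n.\r\n"

# _demo_prepare(b"Hi\n.hidden dot\n") yields b"Hi\r\n..hidden dot\r\n.\r\n":
# mixed line endings are normalised and the leading dot is doubled so the
# server does not mistake it for the end-of-data marker.
# ---------------------------------------------------------------------------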
|
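# ---------------------------------------------------------------------------
# Illustrative sketch (not part of the library): the client side of CRAM-MD5
# as implemented by _auth_cram_md5() above -- HMAC-MD5 over the base64-decoded
# challenge, keyed with the password, hex-encoded, prefixed with the username
# and base64-encoded again. The helper name is hypothetical:
import base64
import hmac

def _demo_cram_md5_response(username: str, password: str, b64_challenge: str) -> str:
    challenge = base64.b64decode(b64_challenge)
    digest = hmac.new(password.encode("utf-8"), challenge, "md5").hexdigest()
    reply = "{} {}".format(username, digest)
    return base64.b64encode(reply.encode("utf-8")).decode("utf-8")

# The decoded reply always has the shape "<username> <32 lowercase hex digits>".
# ---------------------------------------------------------------------------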
hwmrocker/smtplibaio | smtplibaio/smtp.py | SMTP.starttls | python | async def starttls(self, context=None):
if not self.use_aioopenssl:
raise BadImplementationError("This connection does not use aioopenssl")
import aioopenssl
import OpenSSL
await self.ehlo_or_helo_if_needed()
if "starttls" not in self.esmtp_extensions:
raise SMTPCommandNotSupportedError("STARTTLS")
code, message = await self.do_cmd("STARTTLS", success=(220,))
# Don't check for code, do_cmd did it
if context is None:
context = OpenSSL.SSL.Context(OpenSSL.SSL.TLSv1_2_METHOD)
await self.transport.starttls(ssl_context=context)
# RFC 3207:
# The client MUST discard any knowledge obtained from
# the server, such as the list of SMTP service extensions,
# which was not obtained from the TLS negotiation itself.
# FIXME: wouldn't it be better to use reset_state here ?
# And reset self.reader, self.writer and self.transport just after
# Maybe also self.ssl_context ?
self.last_ehlo_response = (None, None)
self.last_helo_response = (None, None)
self.supports_esmtp = False
self.esmtp_extensions = {}
self.auth_mechanisms = []
return (code, message) | Upgrades the connection to the SMTP server into TLS mode.
If there has been no previous EHLO or HELO command this session, this
method tries ESMTP EHLO first.
If the server supports SSL/TLS, this will encrypt the rest of the SMTP
session.
Raises:
SMTPCommandNotSupportedError: If the server does not support STARTTLS.
SMTPCommandFailedError: If the STARTTLS command fails
BadImplementationError: If the connection does not use aioopenssl.
Args:
context (:obj:`OpenSSL.SSL.Context`): SSL context
Returns:
(int, message): A (code, message) 2-tuple containing the server
response. | train | https://github.com/hwmrocker/smtplibaio/blob/84ce8e45b7e706476739d0efcb416c18ecabbbb6/smtplibaio/smtp.py#L666-L721 | [
"async def do_cmd(self, *args, success=None):\n \"\"\"\n Sends the given command to the server.\n\n Args:\n *args: Command and arguments to be sent to the server.\n\n Raises:\n ConnectionResetError: If the connection with the server is\n unexpectedely lost.\n SMTPCommandF... | class SMTP:
"""
SMTP or ESMTP client.
This should follow RFC 5321 (SMTP), RFC 1869 (ESMTP), RFC 2554 (SMTP
Authentication) and RFC 2487 (Secure SMTP over TLS).
Attributes:
hostname (str): Hostname of the SMTP server we are connecting to.
port (int): Port on which the SMTP server listens for connections.
timeout (int): Not used.
last_helo_response ((int or None, str or None)): A (code, message)
2-tuple containing the last *HELO* response.
last_ehlo_response ((int or None, str or None)): A (code, message)
2-tuple containing the last *EHLO* response.
supports_esmtp (bool): True if the server supports ESMTP (set after a
*EHLO* command, False otherwise.
esmtp_extensions (dict): ESMTP extensions and parameters supported by
the SMTP server (set after a *EHLO* command).
auth_mechanisms (list of str): Authentication mechanisms supported by
the SMTP server.
ssl_context (bool): Always False. (Used in SMTP_SSL subclass)
reader (:class:`streams.SMTPStreamReader`): SMTP stream reader, used
to read server responses.
writer (:class:`streams.SMTPStreamWriter`): SMTP stream writer, used
to send commands to the server.
transport (:class:`asyncio.BaseTransport`): Communication channel
abstraction between client and server.
loop (:class:`asyncio.BaseEventLoop`): Event loop to use.
use_aioopenssl (bool): If True, the connection is made using the
aioopenssl module. Defaults to False.
_fqdn (str): Client FQDN. Used to identify the client to the
server.
Class Attributes:
_default_port (int): Default port to use. Defaults to 25.
_supported_auth_mechanisms (dict): Dict containing the information
about supported authentication mechanisms, ordered by preference
of use. The entries consist in :
- The authentication mechanism name, in lowercase, as given by
SMTP servers.
- The name of the method to call to authenticate using the
mechanism.
"""
_default_port = 25
_supported_auth_mechanisms = {
"cram-md5": "_auth_cram_md5",
"plain": "_auth_plain",
"login": "_auth_login",
}
def __init__(
self,
hostname="localhost",
port=_default_port,
fqdn=None,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
loop=None,
use_aioopenssl=False,
):
"""
Initializes a new :class:`SMTP` instance.
Args:
hostname (str): Hostname of the SMTP server to connect to.
port (int): Port to use to connect to the SMTP server.
fqdn (str or None): Client Fully Qualified Domain Name. This is
used to identify the client to the server.
timeout (int): Not used.
loop (:class:`asyncio.BaseEventLoop`): Event loop to use.
use_aioopenssl (bool): Use the aioopenssl module to open
the connection. This is mandatory if you plan on using
STARTTLS.
"""
self.hostname = hostname
try:
self.port = int(port)
except ValueError:
self.port = self.__class__._default_port
self.timeout = timeout
self._fqdn = fqdn
self.loop = loop or asyncio.get_event_loop()
self.use_aioopenssl = use_aioopenssl
self.reset_state()
@property
def fqdn(self):
"""
Returns the string used to identify the client when initiating a SMTP
session.
RFC 5321 `§ 4.1.1.1`_ and `§ 4.1.3`_ tell us what to do:
- Use the client FQDN ;
- If it isn't available, we SHOULD fall back to an address literal.
Returns:
str: The value that should be used as the client FQDN.
.. _`§ 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
.. _`§ 4.1.3`: https://tools.ietf.org/html/rfc5321#section-4.1.3
"""
if self._fqdn is None:
# Let's try to retrieve it:
self._fqdn = socket.getfqdn()
if "." not in self._fqdn:
try:
info = socket.getaddrinfo(
host="localhost", port=None, proto=socket.IPPROTO_TCP
)
except socket.gaierror:
addr = "127.0.0.1"
else:
# We only consider the first returned result and we're
# only interested in getting the IP(v4 or v6) address:
addr = info[0][4][0]
self._fqdn = "[{}]".format(addr)
return self._fqdn
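The fallback described above can be sketched as a standalone helper (`client_identity` is a hypothetical name, not part of smtplibaio): use the name when it looks like an FQDN, otherwise fall back to an RFC 5321 address literal.

```python
import socket

def client_identity(fqdn=None):
    """Sketch of the HELO/EHLO identity fallback described above."""
    name = fqdn or socket.getfqdn()
    if "." in name:
        # A dotted name looks like a proper FQDN: use it as-is.
        return name
    try:
        info = socket.getaddrinfo(host="localhost", port=None,
                                  proto=socket.IPPROTO_TCP)
        addr = info[0][4][0]  # First result; keep only the IP address.
    except socket.gaierror:
        addr = "127.0.0.1"
    return "[{}]".format(addr)  # RFC 5321 address literal.
```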
def reset_state(self):
"""
Resets some attributes to their default values.
This is especially useful when initializing a newly created
:class:`SMTP` instance and when closing an existing SMTP session.
It allows us to use the same SMTP instance and connect several times.
"""
self.last_helo_response = (None, None)
self.last_ehlo_response = (None, None)
self.supports_esmtp = False
self.esmtp_extensions = {}
self.auth_mechanisms = []
self.ssl_context = False
self.reader = None
self.writer = None
self.transport = None
async def __aenter__(self):
"""
Enters the asynchronous context manager.
Also tries to connect to the server.
Raises:
SMTPConnectionRefusedError: If the connection between client and
SMTP server can not be established.
.. seealso:: :meth:`SMTP.connect`
"""
await self.connect()
return self
async def __aexit__(self, *args):
"""
Exits the asynchronous context manager.
Closes the connection and resets instance attributes.
.. seealso:: :meth:`SMTP.quit`
"""
await self.quit()
async def connect(self):
"""
Connects to the server.
.. note:: This method is automatically invoked by
:meth:`SMTP.__aenter__`. The code is mostly borrowed from the
:func:`asyncio.streams.open_connection` source code.
Raises:
ConnectionError subclass: If the connection between client and
SMTP server can not be established.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
# First build the reader:
self.reader = SMTPStreamReader(loop=self.loop)
# Then build the protocol:
protocol = asyncio.StreamReaderProtocol(self.reader, loop=self.loop)
# With the just-built reader and protocol, create the connection and
# get the transport stream:
conn = {
"protocol_factory": lambda: protocol,
"host": self.hostname,
"port": self.port,
}
if self.use_aioopenssl:
conn.update(
{
"use_starttls": not self.ssl_context,
"ssl_context_factory": lambda transport: self.ssl_context,
"server_hostname": self.hostname, # For SSL
}
)
import aioopenssl
# This may raise a ConnectionError exception, which we let bubble up.
self.transport, _ = await aioopenssl.create_starttls_connection(
self.loop, **conn
)
# HACK: aioopenssl transports don't implement is_closing, and thus drain() fails...
self.transport.is_closing = lambda: False
else:
conn["ssl"] = self.ssl_context
# This may raise a ConnectionError exception, which we let bubble up.
self.transport, _ = await self.loop.create_connection(**conn)
# If the connection has been established, build the writer:
self.writer = SMTPStreamWriter(self.transport, protocol, self.reader, self.loop)
code, message = await self.reader.read_reply()
if code != 220:
raise ConnectionRefusedError(code, message)
return code, message
async def do_cmd(self, *args, success=None):
"""
Sends the given command to the server.
Args:
*args: Command and arguments to be sent to the server.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
if success is None:
success = (250,)
cmd = " ".join(args)
await self.writer.send_command(cmd)
code, message = await self.reader.read_reply()
if code not in success:
raise SMTPCommandFailedError(code, message, cmd)
return code, message
async def helo(self, from_host=None):
"""
Sends a SMTP 'HELO' command. - Identifies the client and starts the
session.
If given ``from_host`` is None, defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our HELO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("HELO", from_host)
self.last_helo_response = (code, message)
return code, message
async def ehlo(self, from_host=None):
"""
Sends a SMTP 'EHLO' command. - Identifies the client and starts the
session.
If given ``from_host`` is None, defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("EHLO", from_host)
self.last_ehlo_response = (code, message)
extns, auths = SMTP.parse_esmtp_extensions(message)
self.esmtp_extensions = extns
self.auth_mechanisms = auths
self.supports_esmtp = True
return code, message
async def help(self, command_name=None):
"""
Sends a SMTP 'HELP' command.
For further details please check out `RFC 5321 § 4.1.1.8`_.
Args:
command_name (str or None, optional): Name of a command for which
you want help. For example, if you want to get help about the
'*RSET*' command, you'd call ``help('RSET')``.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the HELP command fails.
Returns:
Help text as given by the server.
.. _`RFC 5321 § 4.1.1.8`: https://tools.ietf.org/html/rfc5321#section-4.1.1.8
"""
if command_name is None:
command_name = ""
code, message = await self.do_cmd("HELP", command_name)
return message
async def rset(self):
"""
Sends a SMTP 'RSET' command. - Resets the session.
For further details, please check out `RFC 5321 § 4.1.1.5`_.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the RSET command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.5`: https://tools.ietf.org/html/rfc5321#section-4.1.1.5
"""
return await self.do_cmd("RSET")
async def noop(self):
"""
Sends a SMTP 'NOOP' command. - Doesn't do anything.
For further details, please check out `RFC 5321 § 4.1.1.9`_.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the NOOP command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.9`: https://tools.ietf.org/html/rfc5321#section-4.1.1.9
"""
return await self.do_cmd("NOOP")
async def vrfy(self, address):
"""
Sends a SMTP 'VRFY' command. - Tests the validity of the given address.
For further details, please check out `RFC 5321 § 4.1.1.6`_.
Args:
address (str): E-mail address to be checked.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the VRFY command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.6`: https://tools.ietf.org/html/rfc5321#section-4.1.1.6
"""
return await self.do_cmd("VRFY", address)
async def expn(self, address):
"""
Sends a SMTP 'EXPN' command. - Expands a mailing-list.
For further details, please check out `RFC 5321 § 4.1.1.7`_.
Args:
address (str): E-mail address to expand.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the EXPN command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.7`: https://tools.ietf.org/html/rfc5321#section-4.1.1.7
"""
return await self.do_cmd("EXPN", address)
async def mail(self, sender, options=None):
"""
Sends a SMTP 'MAIL' command. - Starts the mail transfer session.
For further details, please check out `RFC 5321 § 4.1.1.2`_ and
`§ 3.3`_.
Args:
sender (str): Sender mailbox (used as reverse-path).
options (list of str or None, optional): Additional options to send
along with the *MAIL* command.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the MAIL command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.2`: https://tools.ietf.org/html/rfc5321#section-4.1.1.2
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
from_addr = "FROM:{}".format(quoteaddr(sender))
code, message = await self.do_cmd("MAIL", from_addr, *options)
return code, message
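The resulting command line is easy to verify offline. `quoteaddr` is imported from elsewhere in the module; the stand-in below only assumes it wraps a bare address in angle brackets:

```python
def quote_addr(address):
    # Minimal stand-in for the module's quoteaddr: wrap bare addresses in <>.
    return address if address.startswith("<") else "<{}>".format(address)

def mail_command(sender, options=()):
    # Mirrors do_cmd("MAIL", from_addr, *options): space-joined arguments.
    return " ".join(["MAIL", "FROM:{}".format(quote_addr(sender)), *options])
```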
async def rcpt(self, recipient, options=None):
"""
Sends a SMTP 'RCPT' command. - Indicates a recipient for the e-mail.
For further details, please check out `RFC 5321 § 4.1.1.3`_ and
`§ 3.3`_.
Args:
recipient (str): E-mail address of one recipient.
options (list of str or None, optional): Additional options to send
along with the *RCPT* command.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the RCPT command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.3`: https://tools.ietf.org/html/rfc5321#section-4.1.1.3
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
to_addr = "TO:{}".format(quoteaddr(recipient))
code, message = await self.do_cmd("RCPT", to_addr, *options)
return code, message
async def quit(self):
"""
Sends a SMTP 'QUIT' command. - Ends the session.
For further details, please check out `RFC 5321 § 4.1.1.10`_.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response. If the connection is already closed when calling this
method, returns (-1, None).
.. _`RFC 5321 § 4.1.1.10`: https://tools.ietf.org/html/rfc5321#section-4.1.1.10
"""
code = -1
message = None
try:
code, message = await self.do_cmd("QUIT")
except ConnectionError:
# We voluntarily ignore this kind of exceptions since... the
# connection seems already closed.
pass
except SMTPCommandFailedError:
pass
await self.close()
return code, message
async def data(self, email_message):
"""
Sends a SMTP 'DATA' command. - Transmits the message to the server.
If ``email_message`` is a bytes object, sends it as it is. Else,
makes all the required changes so it can be safely transmitted to the
SMTP server.
For further details, please check out `RFC 5321 § 4.1.1.4`_.
Args:
email_message (str or bytes): Message to be sent.
Raises:
ConnectionError subclass: If the connection to the server is
unexpectedly lost.
SMTPCommandFailedError: If the DATA command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server last
response (the one the server sent after all data were sent by
the client).
.. seealso:: :meth:`SMTP.prepare_message`
.. _`RFC 5321 § 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4
"""
code, message = await self.do_cmd("DATA", success=(354,))
email_message = SMTP.prepare_message(email_message)
self.writer.write(email_message) # write is non-blocking.
await self.writer.drain() # don't forget to drain.
code, message = await self.reader.read_reply()
return code, message
async def auth(self, username, password):
"""
Tries to authenticate user against the SMTP server.
Args:
username (str): Username to authenticate with.
password (str): Password to use along with the given ``username``.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
SMTPLoginError: If the authentication failed (either because all
attempts failed or because there was no suitable authentication
mechanism).
Returns:
(int, str): A (code, message) 2-tuple containing the last server
response.
"""
# EHLO/HELO is required:
await self.ehlo_or_helo_if_needed()
errors = [] # To store SMTPAuthenticationErrors
code = message = None
# Try to authenticate using all mechanisms supported by both
# server and client (and only these):
for auth, meth in self.__class__._supported_auth_mechanisms.items():
if auth in self.auth_mechanisms:
auth_func = getattr(self, meth)
try:
code, message = await auth_func(username, password)
except SMTPAuthenticationError as e:
errors.append(e)
else:
break
else:
if not errors:
err = "Could not find any suitable authentication mechanism."
errors.append(SMTPAuthenticationError(-1, err))
raise SMTPLoginError(errors)
return code, message
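The selection loop above boils down to an ordered intersection: dict insertion order (guaranteed since Python 3.7) encodes the client-side preference, and anything the server did not advertise is skipped. A minimal sketch:

```python
client_prefs = {"cram-md5": "_auth_cram_md5",
                "plain": "_auth_plain",
                "login": "_auth_login"}
server_mechs = ["login", "plain"]  # e.g. parsed from an EHLO reply

# Mechanisms are tried in client-preference order, skipping anything
# the server did not advertise:
to_try = [mech for mech in client_prefs if mech in server_mechs]
```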
async def sendmail(
self, sender, recipients, message, mail_options=None, rcpt_options=None
):
"""
Performs an entire e-mail transaction.
Example:
>>> try:
>>> async with SMTP() as client:
>>> try:
>>> r = await client.sendmail(sender, recipients, message)
>>> except SMTPException:
>>> print("Error while sending message.")
>>> else:
>>> print("Result: {}.".format(r))
>>> except ConnectionError as e:
>>> print(e)
Result: [].
Args:
sender (str): E-mail address of the sender.
recipients (list of str or str): E-mail(s) address(es) of the
recipient(s).
message (str or bytes): Message body.
mail_options (list of str): ESMTP options (such as *8BITMIME*) to
send along the *MAIL* command.
rcpt_options (list of str): ESMTP options (such as *DSN*) to
send along all the *RCPT* commands.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
SMTPCommandFailedError: If the server refuses our MAIL command.
SMTPCommandFailedError: If the server refuses our DATA command.
SMTPNoRecipientError: If the server refuses all given
recipients.
Returns:
list: A list with a :class:`SMTPCommandFailedError` entry for each
recipient that was refused. Each exception carries the error
code and message, as returned by the server.
When everything runs smoothly, the returned list is empty.
.. note:: The connection remains open afterwards. It's your responsibility
to close it. A good practice is to use the asynchronous context
manager instead. See :meth:`SMTP.__aenter__` for further details.
"""
# Make sure `recipients` is a list:
if isinstance(recipients, str):
recipients = [recipients]
# Set some defaults values:
if mail_options is None:
mail_options = []
if rcpt_options is None:
rcpt_options = []
# EHLO or HELO is required:
await self.ehlo_or_helo_if_needed()
if self.supports_esmtp:
if "size" in self.esmtp_extensions:
mail_options.append("size={}".format(len(message)))
await self.mail(sender, mail_options)
errors = []
for recipient in recipients:
try:
await self.rcpt(recipient, rcpt_options)
except SMTPCommandFailedError as e:
errors.append(e)
if len(recipients) == len(errors):
# The server refused all our recipients:
raise SMTPNoRecipientError(errors)
await self.data(message)
# If we got here then somebody got our mail:
return errors
async def send_mail(
self, sender, recipients, message, mail_options=None, rcpt_options=None
):
"""
Alias for :meth:`SMTP.sendmail`.
"""
return await self.sendmail(
sender, recipients, message, mail_options, rcpt_options
)
async def ehlo_or_helo_if_needed(self):
"""
Calls :meth:`SMTP.ehlo` and/or :meth:`SMTP.helo` if needed.
If there hasn't been any previous *EHLO* or *HELO* command this
session, tries to initiate the session. *EHLO* is tried first.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
"""
no_helo = self.last_helo_response == (None, None)
no_ehlo = self.last_ehlo_response == (None, None)
if no_helo and no_ehlo:
try:
# First we try EHLO:
await self.ehlo()
except SMTPCommandFailedError:
# EHLO failed, let's try HELO:
await self.helo()
async def close(self):
"""
Cleans up after the connection to the SMTP server has been closed
(voluntarily or not).
"""
if self.writer is not None:
# Close the transport:
try:
self.writer.close()
except OSError as exc:
if exc.errno != errno.ENOTCONN:
raise
self.reset_state()
async def _auth_cram_md5(self, username, password):
"""
Performs an authentication attempt using the CRAM-MD5 mechanism.
Protocol:
1. Send 'AUTH CRAM-MD5' to server ;
2. If the server replies with a 334 return code, we can go on:
1) The challenge (sent by the server) is base64-decoded ;
2) The decoded challenge is hashed using HMAC-MD5 and the user
password as key (shared secret) ;
3) The hashed challenge is converted to a string of lowercase
hexadecimal digits ;
4) The username and a space character are prepended to the hex
digits ;
5) The concatenation is base64-encoded and sent to the server.
6) If the server replies with a return code of 235, user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "CRAM-MD5"
code, message = await self.do_cmd("AUTH", mechanism, success=(334,))
decoded_challenge = base64.b64decode(message)
challenge_hash = hmac.new(
key=password.encode("utf-8"), msg=decoded_challenge, digestmod="md5"
)
hex_hash = challenge_hash.hexdigest()
response = "{} {}".format(username, hex_hash)
encoded_response = SMTP.b64enc(response)
try:
code, message = await self.do_cmd(encoded_response, success=(235, 503))
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
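The challenge-response computation (steps 2.1 to 2.5) can be exercised standalone; the username, secret, and challenge below follow the example in RFC 2195:

```python
import base64
import hmac

def cram_md5_response(username, password, b64_challenge):
    # Decode the server challenge, HMAC-MD5 it with the password as key,
    # hex-encode the digest, prepend "username ", base64-encode the result.
    challenge = base64.b64decode(b64_challenge)
    digest = hmac.new(password.encode("utf-8"), challenge, "md5").hexdigest()
    return base64.b64encode(
        "{} {}".format(username, digest).encode("utf-8")
    ).decode("ascii")
```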
async def _auth_login(self, username, password):
"""
Performs an authentication attempt using the LOGIN mechanism.
Protocol:
1. The username is base64-encoded ;
2. The string 'AUTH LOGIN' and a space character are prepended to
the base64-encoded username and sent to the server ;
3. If the server replies with a 334 return code, we can go on:
1) The password is base64-encoded and sent to the server ;
2) If the server replies with a 235 return code, the user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "LOGIN"
code, message = await self.do_cmd(
"AUTH", mechanism, SMTP.b64enc(username), success=(334,)
)
try:
code, message = await self.do_cmd(SMTP.b64enc(password), success=(235, 503))
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
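The two client messages of the LOGIN exchange are plain base64 wrappers; a sketch with a hypothetical `login_exchange` helper:

```python
import base64

def login_exchange(username, password):
    # First command carries the username; the second line, sent after the
    # server's 334 reply, carries the password.
    enc = lambda s: base64.b64encode(s.encode("utf-8")).decode("utf-8")
    return "AUTH LOGIN {}".format(enc(username)), enc(password)
```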
async def _auth_plain(self, username, password):
"""
Performs an authentication attempt using the PLAIN mechanism.
Protocol:
1. Format the username and password in a suitable way ;
2. The formatted string is base64-encoded ;
3. The string 'AUTH PLAIN' and a space character are prepended to
the base64-encoded username and password and sent to the
server ;
4. If the server replies with a 235 return code, user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "PLAIN"
credentials = "\0{}\0{}".format(username, password)
encoded_credentials = SMTP.b64enc(credentials)
try:
code, message = await self.do_cmd(
"AUTH", mechanism, encoded_credentials, success=(235, 503)
)
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
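The PLAIN credential string leaves the authorization identity empty and separates the username and password with NUL bytes; a minimal sketch:

```python
import base64

def plain_credentials(username, password):
    # authzid is left empty; authcid and password are NUL-separated.
    raw = "\0{}\0{}".format(username, password)
    return base64.b64encode(raw.encode("utf-8")).decode("utf-8")
```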
@staticmethod
def parse_esmtp_extensions(message):
"""
Parses the response given by an ESMTP server after a *EHLO* command.
The response is parsed to build:
- A dict of supported ESMTP extensions (with parameters, if any).
- A list of supported authentication methods.
Returns:
(dict, list): A (extensions, auth_mechanisms) 2-tuple containing
the supported extensions and authentication methods.
"""
extns = {}
auths = []
oldstyle_auth_regex = re.compile(r"auth=(?P<auth>.*)", re.IGNORECASE)
extension_regex = re.compile(
r"(?P<feature>[a-z0-9][a-z0-9\-]*) ?", re.IGNORECASE
)
lines = message.splitlines()
for line in lines[1:]:
# To be able to communicate with as many SMTP servers as possible,
# we have to take the old-style auth advertisement into account.
match = oldstyle_auth_regex.match(line)
if match:
# An old-style advertisement lists mechanisms as 'AUTH=<mech> ...':
for auth in match.group("auth").split():
auth = auth.lower().strip()
if auth not in auths:
auths.append(auth)
# RFC 1869 requires a space between EHLO keyword and parameters.
# It's actually stricter, in that only spaces are allowed between
parameters, but we're not going to check for that here.
# Note that the space isn't present if there are no parameters.
match = extension_regex.match(line)
if match:
feature = match.group("feature").lower()
params = match.string[match.end("feature") :].strip()
extns[feature] = params
if feature == "auth":
auths.extend([param.strip().lower() for param in params.split()])
return extns, auths
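The keyword parsing can be replayed on a plausible (illustrative) EHLO reply with a standalone copy of the loop body:

```python
import re

def parse_ehlo_lines(message):
    # Same idea as above: skip the greeting line, then read
    # "KEYWORD[ params]" pairs; AUTH params are the mechanisms.
    extns, auths = {}, []
    keyword_re = re.compile(r"(?P<feature>[a-z0-9][a-z0-9\-]*) ?", re.IGNORECASE)
    for line in message.splitlines()[1:]:
        match = keyword_re.match(line)
        if not match:
            continue
        feature = match.group("feature").lower()
        params = line[match.end("feature"):].strip()
        extns[feature] = params
        if feature == "auth":
            auths.extend(p.strip().lower() for p in params.split())
    return extns, auths
```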
@staticmethod
def prepare_message(message):
"""
Returns the given message encoded in ascii with a format suitable for
SMTP transmission:
- Makes sure the message is ASCII encoded ;
- Normalizes line endings to '\r\n' ;
- Adds a (second) period at the beginning of lines that start
with a period ;
- Makes sure the message ends with '\r\n.\r\n'.
For further details, please check out RFC 5321 `§ 4.1.1.4`_
and `§ 4.5.2`_.
.. _`§ 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4
.. _`§ 4.5.2`: https://tools.ietf.org/html/rfc5321#section-4.5.2
"""
if isinstance(message, bytes):
bytes_message = message
else:
bytes_message = message.encode("ascii")
# The original algorithm uses regexes to do this stuff.
# This one is -IMHO- more pythonic and it is slightly faster.
#
# Another version is even faster, but I chose to keep something
# more pythonic and readable.
# FYI, the fastest way to do all this stuff seems to be
# (according to my benchmarks):
#
# bytes_message.replace(b"\r\n", b"\n") \
# .replace(b"\r", b"\n") \
# .replace(b"\n", b"\r\n")
#
# DOT_LINE_REGEX = re.compile(rb"^\.", re.MULTILINE)
# bytes_message = DOT_LINE_REGEX.sub(b"..", bytes_message)
#
# if not bytes_message.endswith(b"\r\n"):
# bytes_message += b"\r\n"
#
# bytes_message += b"\r\n.\r\n"
lines = []
for line in bytes_message.splitlines():
if line.startswith(b"."):
line = line.replace(b".", b"..", 1)
lines.append(line)
# Recompose the message with <CRLF> only:
bytes_message = b"\r\n".join(lines)
# Make sure message ends with <CRLF>.<CRLF>:
bytes_message += b"\r\n.\r\n"
return bytes_message
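The normalization is easy to check in isolation; note the doubled leading dot and the <CRLF>.<CRLF> terminator (standalone sketch, mirroring the logic above):

```python
def prepare_for_data(message):
    # ASCII bytes, CRLF line endings, leading dots doubled,
    # <CRLF>.<CRLF> appended at the end.
    data = message if isinstance(message, bytes) else message.encode("ascii")
    lines = [b".." + line[1:] if line.startswith(b".") else line
             for line in data.splitlines()]
    return b"\r\n".join(lines) + b"\r\n.\r\n"
```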
@staticmethod
def b64enc(s):
"""
Base64-encodes the given string and returns it as a :obj:`str`.
This is a simple helper function that takes a str, base64-encodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
s (str): String to be converted to base64.
Returns:
str: A base64-encoded string.
"""
return base64.b64encode(s.encode("utf-8")).decode("utf-8")
@staticmethod
def b64dec(b):
"""
Base64-decodes the given :obj:`bytes` and converts it to a :obj:`str`.
This is a simple helper function that takes a bytes, base64-decodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
b (bytes): A base64-encoded bytes.
Returns:
str: A base64-decoded string.
"""
return base64.b64decode(b).decode("utf-8")
"""
Sends the given command to the server.
Args:
*args: Command and arguments to be sent to the server.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
if success is None:
success = (250,)
cmd = " ".join(args)
await self.writer.send_command(cmd)
code, message = await self.reader.read_reply()
if code not in success:
raise SMTPCommandFailedError(code, message, cmd)
return code, message
async def helo(self, from_host=None):
"""
Sends a SMTP 'HELO' command. - Identifies the client and starts the
session.
If given ``from_host`` is None, defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our HELO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("HELO", from_host)
self.last_helo_response = (code, message)
return code, message
async def ehlo(self, from_host=None):
"""
Sends a SMTP 'EHLO' command. - Identifies the client and starts the
session.
If given ``from_host`` is None, defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("EHLO", from_host)
self.last_ehlo_response = (code, message)
extns, auths = SMTP.parse_esmtp_extensions(message)
self.esmtp_extensions = extns
self.auth_mechanisms = auths
self.supports_esmtp = True
return code, message
async def help(self, command_name=None):
"""
Sends a SMTP 'HELP' command.
For further details please check out `RFC 5321 § 4.1.1.8`_.
Args:
command_name (str or None, optional): Name of a command for which
you want help. For example, if you want to get help about the
'*RSET*' command, you'd call ``help('RSET')``.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the HELP command fails.
Returns:
Help text as given by the server.
.. _`RFC 5321 § 4.1.1.8`: https://tools.ietf.org/html/rfc5321#section-4.1.1.8
"""
if command_name is None:
command_name = ""
code, message = await self.do_cmd("HELP", command_name)
return message
async def rset(self):
"""
Sends a SMTP 'RSET' command. - Resets the session.
For further details, please check out `RFC 5321 § 4.1.1.5`_.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the RSET command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.5`: https://tools.ietf.org/html/rfc5321#section-4.1.1.5
"""
return await self.do_cmd("RSET")
async def noop(self):
"""
Sends a SMTP 'NOOP' command. - Doesn't do anything.
For further details, please check out `RFC 5321 § 4.1.1.9`_.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the NOOP command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.9`: https://tools.ietf.org/html/rfc5321#section-4.1.1.9
"""
return await self.do_cmd("NOOP")
async def vrfy(self, address):
"""
Sends a SMTP 'VRFY' command. - Tests the validity of the given address.
For further details, please check out `RFC 5321 § 4.1.1.6`_.
Args:
address (str): E-mail address to be checked.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the VRFY command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.6`: https://tools.ietf.org/html/rfc5321#section-4.1.1.6
"""
return await self.do_cmd("VRFY", address)
async def expn(self, address):
"""
Sends a SMTP 'EXPN' command. - Expands a mailing-list.
For further details, please check out `RFC 5321 § 4.1.1.7`_.
Args:
address (str): E-mail address to expand.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the EXPN command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.7`: https://tools.ietf.org/html/rfc5321#section-4.1.1.7
"""
return await self.do_cmd("EXPN", address)
async def mail(self, sender, options=None):
"""
Sends a SMTP 'MAIL' command. - Starts the mail transfer session.
For further details, please check out `RFC 5321 § 4.1.1.2`_ and
`§ 3.3`_.
Args:
sender (str): Sender mailbox (used as reverse-path).
options (list of str or None, optional): Additional options to send
along with the *MAIL* command.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the MAIL command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.2`: https://tools.ietf.org/html/rfc5321#section-4.1.1.2
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
from_addr = "FROM:{}".format(quoteaddr(sender))
code, message = await self.do_cmd("MAIL", from_addr, *options)
return code, message
async def rcpt(self, recipient, options=None):
"""
Sends a SMTP 'RCPT' command. - Indicates a recipient for the e-mail.
For further details, please check out `RFC 5321 § 4.1.1.3`_ and
`§ 3.3`_.
Args:
recipient (str): E-mail address of one recipient.
options (list of str or None, optional): Additional options to send
along with the *RCPT* command.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the RCPT command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.3`: https://tools.ietf.org/html/rfc5321#section-4.1.1.3
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
to_addr = "TO:{}".format(quoteaddr(recipient))
code, message = await self.do_cmd("RCPT", to_addr, *options)
return code, message
async def quit(self):
"""
Sends a SMTP 'QUIT' command. - Ends the session.
For further details, please check out `RFC 5321 § 4.1.1.10`_.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response. If the connection is already closed when calling this
method, returns (-1, None).
.. _`RFC 5321 § 4.1.1.10`: https://tools.ietf.org/html/rfc5321#section-4.1.1.10
"""
code = -1
message = None
try:
code, message = await self.do_cmd("QUIT")
except ConnectionError:
# We voluntarily ignore this kind of exceptions since... the
# connection seems already closed.
pass
except SMTPCommandFailedError:
pass
await self.close()
return code, message
async def data(self, email_message):
"""
Sends a SMTP 'DATA' command. - Transmits the message to the server.
If ``email_message`` is a bytes object, only its line endings and
leading dots are normalized. Else, all the required changes are made
so it can be safely transmitted to the SMTP server.
For further details, please check out `RFC 5321 § 4.1.1.4`_.
Args:
email_message (str or bytes): Message to be sent.
Raises:
ConnectionError subclass: If the connection to the server is
unexpectedly lost.
SMTPCommandFailedError: If the DATA command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server last
response (the one the server sent after all data were sent by
the client).
.. seealso: :meth:`SMTP.prepare_message`
.. _`RFC 5321 § 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4
"""
code, message = await self.do_cmd("DATA", success=(354,))
email_message = SMTP.prepare_message(email_message)
self.writer.write(email_message) # write is non-blocking.
await self.writer.drain() # don't forget to drain.
code, message = await self.reader.read_reply()
return code, message
async def auth(self, username, password):
"""
Tries to authenticate user against the SMTP server.
Args:
username (str): Username to authenticate with.
password (str): Password to use along with the given ``username``.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
SMTPLoginError: If the authentication failed (either because all
attempts failed or because there was no suitable authentication
mechanism).
Returns:
(int, str): A (code, message) 2-tuple containing the last server
response.
"""
# EHLO/HELO is required:
await self.ehlo_or_helo_if_needed()
errors = [] # To store SMTPAuthenticationErrors
code = message = None
# Try to authenticate using all mechanisms supported by both
# server and client (and only these):
for auth, meth in self.__class__._supported_auth_mechanisms.items():
if auth in self.auth_mechanisms:
auth_func = getattr(self, meth)
try:
code, message = await auth_func(username, password)
except SMTPAuthenticationError as e:
errors.append(e)
else:
break
else:
if not errors:
err = "Could not find any suitable authentication mechanism."
errors.append(SMTPAuthenticationError(-1, err))
raise SMTPLoginError(errors)
return code, message
async def starttls(self, context=None):
"""
Upgrades the connection to the SMTP server into TLS mode.
If there has been no previous EHLO or HELO command this session, this
method tries ESMTP EHLO first.
If the server supports SSL/TLS, this will encrypt the rest of the SMTP
session.
Raises:
SMTPCommandNotSupportedError: If the server does not support STARTTLS.
SMTPCommandFailedError: If the STARTTLS command fails
BadImplementationError: If the connection does not use aioopenssl.
Args:
context (:obj:`OpenSSL.SSL.Context`): SSL context
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
if not self.use_aioopenssl:
raise BadImplementationError("This connection does not use aioopenssl")
import aioopenssl
import OpenSSL
await self.ehlo_or_helo_if_needed()
if "starttls" not in self.esmtp_extensions:
raise SMTPCommandNotSupportedError("STARTTLS")
code, message = await self.do_cmd("STARTTLS", success=(220,))
# Don't check for code, do_cmd did it
if context is None:
context = OpenSSL.SSL.Context(OpenSSL.SSL.TLSv1_2_METHOD)
await self.transport.starttls(ssl_context=context)
# RFC 3207:
# The client MUST discard any knowledge obtained from
# the server, such as the list of SMTP service extensions,
# which was not obtained from the TLS negotiation itself.
# FIXME: wouldn't it be better to use reset_state here?
# And reset self.reader, self.writer and self.transport just after
# Maybe also self.ssl_context ?
self.last_ehlo_response = (None, None)
self.last_helo_response = (None, None)
self.supports_esmtp = False
self.esmtp_extensions = {}
self.auth_mechanisms = []
return (code, message)
async def send_mail(
self, sender, recipients, message, mail_options=None, rcpt_options=None
):
"""
Alias for :meth:`SMTP.sendmail`.
"""
return await self.sendmail(
sender, recipients, message, mail_options, rcpt_options
)
async def ehlo_or_helo_if_needed(self):
"""
Calls :meth:`SMTP.ehlo` and/or :meth:`SMTP.helo` if needed.
If there hasn't been any previous *EHLO* or *HELO* command this
session, tries to initiate the session. *EHLO* is tried first.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
"""
no_helo = self.last_helo_response == (None, None)
no_ehlo = self.last_ehlo_response == (None, None)
if no_helo and no_ehlo:
try:
# First we try EHLO:
await self.ehlo()
except SMTPCommandFailedError:
# EHLO failed, let's try HELO:
await self.helo()
async def close(self):
"""
Cleans up after the connection to the SMTP server has been closed
(voluntarily or not).
"""
if self.writer is not None:
# Close the transport:
try:
self.writer.close()
except OSError as exc:
if exc.errno != errno.ENOTCONN:
raise
self.reset_state()
async def _auth_cram_md5(self, username, password):
"""
Performs an authentication attempt using the CRAM-MD5 mechanism.
Protocol:
1. Send 'AUTH CRAM-MD5' to server ;
2. If the server replies with a 334 return code, we can go on:
1) The challenge (sent by the server) is base64-decoded ;
2) The decoded challenge is hashed using HMAC-MD5 and the user
password as key (shared secret) ;
3) The hashed challenge is converted to a string of lowercase
hexadecimal digits ;
4) The username and a space character are prepended to the hex
digits ;
5) The concatenation is base64-encoded and sent to the server.
6) If the server replies with a return code of 235, user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "CRAM-MD5"
code, message = await self.do_cmd("AUTH", mechanism, success=(334,))
decoded_challenge = base64.b64decode(message)
challenge_hash = hmac.new(
key=password.encode("utf-8"), msg=decoded_challenge, digestmod="md5"
)
hex_hash = challenge_hash.hexdigest()
response = "{} {}".format(username, hex_hash)
encoded_response = SMTP.b64enc(response)
try:
code, message = await self.do_cmd(encoded_response, success=(235, 503))
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
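The challenge/response computation in the steps above can be reproduced standalone. The sketch below assumes the credentials and challenge from the RFC 2195 example session (user ``tim``, password ``tanstaaftanstaaf``); it is an illustration, not this class's code path:

```python
import base64
import hmac

def cram_md5_response(username, password, b64_challenge):
    # 1) base64-decode the server challenge;
    # 2) HMAC-MD5 it, keyed with the password (shared secret);
    # 3) hex-encode the digest and prepend "username ";
    # 4) base64-encode the result for transmission.
    challenge = base64.b64decode(b64_challenge)
    digest = hmac.new(password.encode("utf-8"), challenge, "md5").hexdigest()
    return base64.b64encode(
        "{} {}".format(username, digest).encode("utf-8")
    ).decode("ascii")

# Challenge taken from the RFC 2195 example session:
challenge = "PDE4OTYuNjk3MTcwOTUyQHBvc3RvZmZpY2UucmVzdG9uLm1jaS5uZXQ+"
print(cram_md5_response("tim", "tanstaaftanstaaf", challenge))
```

The printed token is what the client sends back after the server's 334 reply.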
async def _auth_login(self, username, password):
"""
Performs an authentication attempt using the LOGIN mechanism.
Protocol:
1. The username is base64-encoded ;
2. The string 'AUTH LOGIN' and a space character are prepended to
the base64-encoded username and sent to the server ;
3. If the server replies with a 334 return code, we can go on:
1) The password is base64-encoded and sent to the server ;
2) If the server replies with a 235 return code, the user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "LOGIN"
code, message = await self.do_cmd(
"AUTH", mechanism, SMTP.b64enc(username), success=(334,)
)
try:
code, message = await self.do_cmd(SMTP.b64enc(password), success=(235, 503))
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
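The two base64 exchanges listed above amount to very little code. A hypothetical transcript (server prompts vary between implementations, so the 334/235 lines below are illustrative only):

```python
import base64

# Hypothetical LOGIN exchange:
#   C: AUTH LOGIN dXNlcm5hbWU=        <- base64("username")
#   S: 334 UGFzc3dvcmQ6               <- base64("Password:")
#   C: cGFzc3dvcmQ=                   <- base64("password")
#   S: 235 Authentication succeeded
print(base64.b64encode(b"username").decode("ascii"))
print(base64.b64decode("UGFzc3dvcmQ6").decode("ascii"))
```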
async def _auth_plain(self, username, password):
"""
Performs an authentication attempt using the PLAIN mechanism.
Protocol:
1. Format the username and password in a suitable way ;
2. The formatted string is base64-encoded ;
3. The string 'AUTH PLAIN' and a space character are prepended to
the base64-encoded username and password and sent to the
server ;
4. If the server replies with a 235 return code, user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "PLAIN"
credentials = "\0{}\0{}".format(username, password)
encoded_credentials = SMTP.b64enc(credentials)
try:
code, message = await self.do_cmd(
"AUTH", mechanism, encoded_credentials, success=(235, 503)
)
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
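The credential string built in step 1 follows RFC 4616: an authorization identity (left empty here, as in the method above), the username and the password, NUL-separated, then base64-encoded. A minimal sketch with made-up credentials:

```python
import base64

def plain_credentials(username, password, authzid=""):
    # RFC 4616 message: authzid NUL authcid NUL password
    raw = "{}\0{}\0{}".format(authzid, username, password)
    return base64.b64encode(raw.encode("utf-8")).decode("ascii")

token = plain_credentials("test", "testpass")
print("AUTH PLAIN " + token)
```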
@staticmethod
def parse_esmtp_extensions(message):
"""
Parses the response given by an ESMTP server after a *EHLO* command.
The response is parsed to build:
- A dict of supported ESMTP extensions (with parameters, if any).
- A list of supported authentication methods.
Returns:
(dict, list): A (extensions, auth_mechanisms) 2-tuple containing
the supported extensions and authentication methods.
"""
extns = {}
auths = []
oldstyle_auth_regex = re.compile(r"auth=(?P<auth>.*)", re.IGNORECASE)
extension_regex = re.compile(
r"(?P<feature>[a-z0-9][a-z0-9\-]*) ?", re.IGNORECASE
)
lines = message.splitlines()
for line in lines[1:]:
# To be able to communicate with as many SMTP servers as possible,
# we have to take the old-style auth advertisement into account.
match = oldstyle_auth_regex.match(line)
if match:
auth = match.group("auth")
auth = auth.lower().strip()
if auth not in auths:
auths.append(auth)
# RFC 1869 requires a space between EHLO keyword and parameters.
# It's actually stricter, in that only spaces are allowed between
# parameters, but we're not going to check for that here.
# Note that the space isn't present if there are no parameters.
match = extension_regex.match(line)
if match:
feature = match.group("feature").lower()
params = match.string[match.end("feature") :].strip()
extns[feature] = params
if feature == "auth":
auths.extend([param.strip().lower() for param in params.split()])
return extns, auths
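A standalone rendering of that parsing loop, fed a made-up EHLO reply (the server name, extension list and AUTH mechanisms are hypothetical):

```python
import re

oldstyle_auth_regex = re.compile(r"auth=(?P<auth>.*)", re.IGNORECASE)
extension_regex = re.compile(r"(?P<feature>[a-z0-9][a-z0-9\-]*) ?", re.IGNORECASE)

# Hypothetical multi-line EHLO reply (continuation prefixes already stripped):
reply = ("smtp.example.org greets you\n"
         "SIZE 35882577\n"
         "8BITMIME\n"
         "AUTH PLAIN LOGIN CRAM-MD5\n"
         "STARTTLS")

extns, auths = {}, []
for line in reply.splitlines()[1:]:  # skip the greeting line
    match = oldstyle_auth_regex.match(line)
    if match:  # old-style "AUTH=..." advertisement
        auth = match.group("auth").lower().strip()
        if auth not in auths:
            auths.append(auth)
    match = extension_regex.match(line)
    if match:  # "KEYWORD [params]" advertisement
        feature = match.group("feature").lower()
        params = match.string[match.end("feature"):].strip()
        extns[feature] = params
        if feature == "auth":
            auths.extend(p.strip().lower() for p in params.split())

print(extns)  # {'size': '35882577', '8bitmime': '', 'auth': 'PLAIN LOGIN CRAM-MD5', 'starttls': ''}
print(auths)  # ['plain', 'login', 'cram-md5']
```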
@staticmethod
def prepare_message(message):
"""
Returns the given message encoded in ascii with a format suitable for
SMTP transmission:
- Makes sure the message is ASCII encoded ;
- Normalizes line endings to '\r\n' ;
- Adds a (second) period at the beginning of lines that start
with a period ;
- Makes sure the message ends with '\r\n.\r\n'.
For further details, please check out RFC 5321 `§ 4.1.1.4`_
and `§ 4.5.2`_.
.. _`§ 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4
.. _`§ 4.5.2`: https://tools.ietf.org/html/rfc5321#section-4.5.2
"""
if isinstance(message, bytes):
bytes_message = message
else:
bytes_message = message.encode("ascii")
# The original algorithm uses regexes to do this stuff.
# This one is -IMHO- more pythonic and it is slightly faster.
#
# Another version is even faster, but I chose to keep something
# more pythonic and readable.
# FYI, the fastest way to do all this stuff seems to be
# (according to my benchmarks):
#
# bytes_message.replace(b"\r\n", b"\n") \
# .replace(b"\r", b"\n") \
# .replace(b"\n", b"\r\n")
#
# DOT_LINE_REGEX = re.compile(rb"^\.", re.MULTILINE)
# bytes_message = DOT_LINE_REGEX.sub(b"..", bytes_message)
#
# if not bytes_message.endswith(b"\r\n"):
# bytes_message += b"\r\n"
#
# bytes_message += b"\r\n.\r\n"
lines = []
for line in bytes_message.splitlines():
if line.startswith(b"."):
line = line.replace(b".", b"..", 1)
lines.append(line)
# Recompose the message with <CRLF> only:
bytes_message = b"\r\n".join(lines)
# Make sure message ends with <CRLF>.<CRLF>:
bytes_message += b"\r\n.\r\n"
return bytes_message
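The normalization performed here is easiest to see on a tiny message. The helper below is a minimal standalone rewrite of the same algorithm, not the method itself:

```python
def prepare(message):
    # Normalize to ASCII bytes, dot-stuff lines starting with '.', and
    # terminate with <CRLF>.<CRLF> as required by RFC 5321 § 4.5.2.
    data = message if isinstance(message, bytes) else message.encode("ascii")
    lines = []
    for line in data.splitlines():
        if line.startswith(b"."):
            line = b"." + line  # '.' -> '..' (transparency)
        lines.append(line)
    return b"\r\n".join(lines) + b"\r\n.\r\n"

print(prepare("Hi!\n.hidden dot\nBye"))
# b'Hi!\r\n..hidden dot\r\nBye\r\n.\r\n'
```

Note that a bytes input goes through the same line-ending normalization and dot-stuffing; only the ASCII-encoding step is skipped.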
@staticmethod
def b64enc(s):
"""
Base64-encodes the given string and returns it as a :obj:`str`.
This is a simple helper function that takes a str, base64-encodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
s (str): String to be converted to base64.
Returns:
str: A base64-encoded string.
"""
return base64.b64encode(s.encode("utf-8")).decode("utf-8")
@staticmethod
def b64dec(b):
"""
Base64-decodes the given :obj:`bytes` and converts it to a :obj:`str`.
This is a simple helper function that takes a bytes, base64-decodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
b (bytes): A base64-encoded bytes.
Returns:
str: A base64-decoded string.
"""
return base64.b64decode(b).decode("utf-8")
|
hwmrocker/smtplibaio | smtplibaio/smtp.py | SMTP.send_mail | python | async def send_mail(
self, sender, recipients, message, mail_options=None, rcpt_options=None
):
return await self.sendmail(
sender, recipients, message, mail_options, rcpt_options
) | Alias for :meth:`SMTP.sendmail`. | train | https://github.com/hwmrocker/smtplibaio/blob/84ce8e45b7e706476739d0efcb416c18ecabbbb6/smtplibaio/smtp.py#L812-L820 | [
"async def sendmail(\n self, sender, recipients, message, mail_options=None, rcpt_options=None\n):\n \"\"\"\n Performs an entire e-mail transaction.\n\n Example:\n\n >>> try:\n >>> with SMTP() as client:\n >>> try:\n >>> r = client.sendmail(sender, rec... | class SMTP:
"""
SMTP or ESMTP client.
This should follow RFC 5321 (SMTP), RFC 1869 (ESMTP), RFC 2554 (SMTP
Authentication) and RFC 2487 (Secure SMTP over TLS).
Attributes:
hostname (str): Hostname of the SMTP server we are connecting to.
port (int): Port on which the SMTP server listens for connections.
timeout (int): Not used.
last_helo_response ((int or None, str or None)): A (code, message)
2-tuple containing the last *HELO* response.
last_ehlo_response ((int or None, str or None)): A (code, message)
2-tuple containing the last *EHLO* response.
supports_esmtp (bool): True if the server supports ESMTP (set after a
*EHLO* command, False otherwise.
esmtp_extensions (dict): ESMTP extensions and parameters supported by
the SMTP server (set after a *EHLO* command).
auth_mechanisms (list of str): Authentication mechanisms supported by
the SMTP server.
ssl_context (bool): Always False. (Used in SMTP_SSL subclass)
reader (:class:`streams.SMTPStreamReader`): SMTP stream reader, used
to read server responses.
writer (:class:`streams.SMTPStreamWriter`): SMTP stream writer, used
to send commands to the server.
transport (:class:`asyncio.BaseTransport`): Communication channel
abstraction between client and server.
loop (:class:`asyncio.BaseEventLoop`): Event loop to use.
use_aioopenssl (bool): If True, the connection is made using the
aioopenssl module. Defaults to False.
_fqdn (str): Client FQDN. Used to identify the client to the
server.
Class Attributes:
_default_port (int): Default port to use. Defaults to 25.
_supported_auth_mechanisms (dict): Dict containing the information
about supported authentication mechanisms, ordered by preference
of use. The entries consist in :
- The authentication mechanism name, in lowercase, as given by
SMTP servers.
- The name of the method to call to authenticate using the
mechanism.
"""
_default_port = 25
_supported_auth_mechanisms = {
"cram-md5": "_auth_cram_md5",
"plain": "_auth_plain",
"login": "_auth_login",
}
def __init__(
self,
hostname="localhost",
port=_default_port,
fqdn=None,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
loop=None,
use_aioopenssl=False,
):
"""
Initializes a new :class:`SMTP` instance.
Args:
hostname (str): Hostname of the SMTP server to connect to.
port (int): Port to use to connect to the SMTP server.
fqdn (str or None): Client Fully Qualified Domain Name. This is
used to identify the client to the server.
timeout (int): Not used.
loop (:class:`asyncio.BaseEventLoop`): Event loop to use.
use_aioopenssl (bool): Use the aioopenssl module to open
the connection. This is mandatory if you plan on using
STARTTLS.
"""
self.hostname = hostname
try:
self.port = int(port)
except ValueError:
self.port = self.__class__._default_port
self.timeout = timeout
self._fqdn = fqdn
self.loop = loop or asyncio.get_event_loop()
self.use_aioopenssl = use_aioopenssl
self.reset_state()
@property
def fqdn(self):
"""
Returns the string used to identify the client when initiating a SMTP
session.
RFC 5321 `§ 4.1.1.1`_ and `§ 4.1.3`_ tell us what to do:
- Use the client FQDN ;
- If it isn't available, we SHOULD fall back to an address literal.
Returns:
str: The value that should be used as the client FQDN.
.. _`§ 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
.. _`§ 4.1.3`: https://tools.ietf.org/html/rfc5321#section-4.1.3
"""
if self._fqdn is None:
# Let's try to retrieve it:
self._fqdn = socket.getfqdn()
if "." not in self._fqdn:
try:
info = socket.getaddrinfo(
host="localhost", port=None, proto=socket.IPPROTO_TCP
)
except socket.gaierror:
addr = "127.0.0.1"
else:
# We only consider the first returned result and we're
# only interested in getting the IP(v4 or v6) address:
addr = info[0][4][0]
self._fqdn = "[{}]".format(addr)
return self._fqdn
def reset_state(self):
"""
Resets some attributes to their default values.
This is especially useful when initializing a newly created
:class:`SMTP` instance and when closing an existing SMTP session.
It allows us to use the same SMTP instance and connect several times.
"""
self.last_helo_response = (None, None)
self.last_ehlo_response = (None, None)
self.supports_esmtp = False
self.esmtp_extensions = {}
self.auth_mechanisms = []
self.ssl_context = False
self.reader = None
self.writer = None
self.transport = None
async def __aenter__(self):
"""
Enters the asynchronous context manager.
Also tries to connect to the server.
Raises:
SMTPConnectionRefusedError: If the connection between client and
SMTP server can not be established.
.. seealso:: :meth:`SMTP.connect`
"""
await self.connect()
return self
async def __aexit__(self, *args):
"""
Exits the asynchronous context manager.
Closes the connection and resets instance attributes.
.. seealso:: :meth:`SMTP.quit`
"""
await self.quit()
async def connect(self):
"""
Connects to the server.
.. note:: This method is automatically invoked by
:meth:`SMTP.__aenter__`. The code is mostly borrowed from the
:func:`asyncio.streams.open_connection` source code.
Raises:
ConnectionError subclass: If the connection between client and
SMTP server can not be established.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
# First build the reader:
self.reader = SMTPStreamReader(loop=self.loop)
# Then build the protocol:
protocol = asyncio.StreamReaderProtocol(self.reader, loop=self.loop)
# With the just-built reader and protocol, create the connection and
# get the transport stream:
conn = {
"protocol_factory": lambda: protocol,
"host": self.hostname,
"port": self.port,
}
if self.use_aioopenssl:
conn.update(
{
"use_starttls": not self.ssl_context,
"ssl_context_factory": lambda transport: self.ssl_context,
"server_hostname": self.hostname, # For SSL
}
)
import aioopenssl
# This may raise a ConnectionError exception, which we let bubble up.
self.transport, _ = await aioopenssl.create_starttls_connection(
self.loop, **conn
)
# HACK: aioopenssl transports don't implement is_closing, and thus drain() fails...
self.transport.is_closing = lambda: False
else:
conn["ssl"] = self.ssl_context
# This may raise a ConnectionError exception, which we let bubble up.
self.transport, _ = await self.loop.create_connection(**conn)
# If the connection has been established, build the writer:
self.writer = SMTPStreamWriter(self.transport, protocol, self.reader, self.loop)
code, message = await self.reader.read_reply()
if code != 220:
raise ConnectionRefusedError(code, message)
return code, message
async def do_cmd(self, *args, success=None):
"""
Sends the given command to the server.
Args:
*args: Command and arguments to be sent to the server.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
if success is None:
success = (250,)
cmd = " ".join(args)
await self.writer.send_command(cmd)
code, message = await self.reader.read_reply()
if code not in success:
raise SMTPCommandFailedError(code, message, cmd)
return code, message
async def helo(self, from_host=None):
"""
Sends a SMTP 'HELO' command. - Identifies the client and starts the
session.
If given ``from_host`` is None, defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our HELO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("HELO", from_host)
self.last_helo_response = (code, message)
return code, message
async def ehlo(self, from_host=None):
"""
Sends a SMTP 'EHLO' command. - Identifies the client and starts the
session.
If given ``from_host`` is None, defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("EHLO", from_host)
self.last_ehlo_response = (code, message)
extns, auths = SMTP.parse_esmtp_extensions(message)
self.esmtp_extensions = extns
self.auth_mechanisms = auths
self.supports_esmtp = True
return code, message
async def help(self, command_name=None):
"""
Sends a SMTP 'HELP' command.
For further details please check out `RFC 5321 § 4.1.1.8`_.
Args:
command_name (str or None, optional): Name of a command for which
you want help. For example, if you want to get help about the
'*RSET*' command, you'd call ``help('RSET')``.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the HELP command fails.
Returns:
Help text as given by the server.
.. _`RFC 5321 § 4.1.1.8`: https://tools.ietf.org/html/rfc5321#section-4.1.1.8
"""
if command_name is None:
command_name = ""
code, message = await self.do_cmd("HELP", command_name)
return message
async def rset(self):
"""
Sends a SMTP 'RSET' command. - Resets the session.
For further details, please check out `RFC 5321 § 4.1.1.5`_.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the RSET command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.5`: https://tools.ietf.org/html/rfc5321#section-4.1.1.5
"""
return await self.do_cmd("RSET")
async def noop(self):
"""
Sends a SMTP 'NOOP' command. - Doesn't do anything.
For further details, please check out `RFC 5321 § 4.1.1.9`_.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the NOOP command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.9`: https://tools.ietf.org/html/rfc5321#section-4.1.1.9
"""
return await self.do_cmd("NOOP")
async def vrfy(self, address):
"""
Sends a SMTP 'VRFY' command. - Tests the validity of the given address.
For further details, please check out `RFC 5321 § 4.1.1.6`_.
Args:
address (str): E-mail address to be checked.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the VRFY command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.6`: https://tools.ietf.org/html/rfc5321#section-4.1.1.6
"""
return await self.do_cmd("VRFY", address)
async def expn(self, address):
"""
Sends a SMTP 'EXPN' command. - Expands a mailing-list.
For further details, please check out `RFC 5321 § 4.1.1.7`_.
Args:
address (str): E-mail address to expand.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the EXPN command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.7`: https://tools.ietf.org/html/rfc5321#section-4.1.1.7
"""
return await self.do_cmd("EXPN", address)
async def mail(self, sender, options=None):
"""
Sends a SMTP 'MAIL' command. - Starts the mail transfer session.
For further details, please check out `RFC 5321 § 4.1.1.2`_ and
`§ 3.3`_.
Args:
sender (str): Sender mailbox (used as reverse-path).
options (list of str or None, optional): Additional options to send
along with the *MAIL* command.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the MAIL command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.2`: https://tools.ietf.org/html/rfc5321#section-4.1.1.2
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
from_addr = "FROM:{}".format(quoteaddr(sender))
code, message = await self.do_cmd("MAIL", from_addr, *options)
return code, message
async def rcpt(self, recipient, options=None):
"""
Sends a SMTP 'RCPT' command. - Indicates a recipient for the e-mail.
For further details, please check out `RFC 5321 § 4.1.1.3`_ and
`§ 3.3`_.
Args:
recipient (str): E-mail address of one recipient.
options (list of str or None, optional): Additional options to send
along with the *RCPT* command.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the RCPT command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.3`: https://tools.ietf.org/html/rfc5321#section-4.1.1.3
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
to_addr = "TO:{}".format(quoteaddr(recipient))
code, message = await self.do_cmd("RCPT", to_addr, *options)
return code, message
async def quit(self):
"""
Sends a SMTP 'QUIT' command. - Ends the session.
For further details, please check out `RFC 5321 § 4.1.1.10`_.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response. If the connection is already closed when calling this
method, returns (-1, None).
.. _`RFC 5321 § 4.1.1.10`: https://tools.ietf.org/html/rfc5321#section-4.1.1.10
"""
code = -1
message = None
try:
code, message = await self.do_cmd("QUIT")
except ConnectionError:
# We voluntarily ignore this kind of exceptions since... the
# connection seems already closed.
pass
except SMTPCommandFailedError:
pass
await self.close()
return code, message
async def data(self, email_message):
"""
Sends a SMTP 'DATA' command. - Transmits the message to the server.
If ``email_message`` is a bytes object, sends it as it is. Else,
makes all the required changes so it can be safely transmitted to the
SMTP server.
For further details, please check out `RFC 5321 § 4.1.1.4`_.
Args:
email_message (str or bytes): Message to be sent.
Raises:
ConnectionError subclass: If the connection to the server is
unexpectedly lost.
SMTPCommandFailedError: If the DATA command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server last
response (the one the server sent after all data were sent by
the client).
.. seealso: :meth:`SMTP.prepare_message`
.. _`RFC 5321 § 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4
"""
code, message = await self.do_cmd("DATA", success=(354,))
email_message = SMTP.prepare_message(email_message)
self.writer.write(email_message) # write is non-blocking.
await self.writer.drain() # don't forget to drain.
code, message = await self.reader.read_reply()
return code, message
async def auth(self, username, password):
"""
Tries to authenticate user against the SMTP server.
Args:
username (str): Username to authenticate with.
password (str): Password to use along with the given ``username``.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
SMTPLoginError: If the authentication failed (either because all
attempts failed or because there was no suitable authentication
mechanism).
Returns:
(int, str): A (code, message) 2-tuple containing the last server
response.
"""
# EHLO/HELO is required:
await self.ehlo_or_helo_if_needed()
errors = [] # To store SMTPAuthenticationErrors
code = message = None
# Try to authenticate using all mechanisms supported by both
# server and client (and only these):
for auth, meth in self.__class__._supported_auth_mechanisms.items():
if auth in self.auth_mechanisms:
auth_func = getattr(self, meth)
try:
code, message = await auth_func(username, password)
except SMTPAuthenticationError as e:
errors.append(e)
else:
break
else:
if not errors:
err = "Could not find any suitable authentication mechanism."
errors.append(SMTPAuthenticationError(-1, err))
raise SMTPLoginError(errors)
return code, message
async def starttls(self, context=None):
"""
Upgrades the connection to the SMTP server into TLS mode.
If there has been no previous EHLO or HELO command this session, this
method tries ESMTP EHLO first.
If the server supports SSL/TLS, this will encrypt the rest of the SMTP
session.
Raises:
SMTPCommandNotSupportedError: If the server does not support STARTTLS.
SMTPCommandFailedError: If the STARTTLS command fails.
BadImplementationError: If the connection does not use aioopenssl.
Args:
context (:obj:`OpenSSL.SSL.Context`): SSL context
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
if not self.use_aioopenssl:
raise BadImplementationError("This connection does not use aioopenssl")
import aioopenssl
import OpenSSL
await self.ehlo_or_helo_if_needed()
if "starttls" not in self.esmtp_extensions:
raise SMTPCommandNotSupportedError("STARTTLS")
code, message = await self.do_cmd("STARTTLS", success=(220,))
# Don't check for code, do_cmd did it
if context is None:
context = OpenSSL.SSL.Context(OpenSSL.SSL.TLSv1_2_METHOD)
await self.transport.starttls(ssl_context=context)
# RFC 3207:
# The client MUST discard any knowledge obtained from
# the server, such as the list of SMTP service extensions,
# which was not obtained from the TLS negotiation itself.
# FIXME: wouldn't it be better to use reset_state here?
# And reset self.reader, self.writer and self.transport just after.
# Maybe also self.ssl_context?
self.last_ehlo_response = (None, None)
self.last_helo_response = (None, None)
self.supports_esmtp = False
self.esmtp_extensions = {}
self.auth_mechanisms = []
return (code, message)
async def sendmail(
self, sender, recipients, message, mail_options=None, rcpt_options=None
):
"""
Performs an entire e-mail transaction.
Example:
>>> try:
>>> async with SMTP() as client:
>>> try:
>>> r = await client.sendmail(sender, recipients, message)
>>> except SMTPException:
>>> print("Error while sending message.")
>>> else:
>>> print("Result: {}.".format(r))
>>> except ConnectionError as e:
>>> print(e)
Result: {}.
Args:
sender (str): E-mail address of the sender.
recipients (list of str or str): E-mail(s) address(es) of the
recipient(s).
message (str or bytes): Message body.
mail_options (list of str): ESMTP options (such as *8BITMIME*) to
send along the *MAIL* command.
rcpt_options (list of str): ESMTP options (such as *DSN*) to
send along all the *RCPT* commands.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
SMTPCommandFailedError: If the server refuses our MAIL command.
SMTPCommandFailedError: If the server refuses our DATA command.
SMTPNoRecipientError: If the server refuses all given
recipients.
Returns:
dict: A dict containing an entry for each recipient that was
refused. Each entry is associated with a (code, message)
2-tuple containing the error code and message, as returned by
the server.
When everything runs smoothly, the returned dict is empty.
.. note:: The connection remains open after. It's your responsibility
to close it. A good practice is to use the asynchronous context
manager instead. See :meth:`SMTP.__aenter__` for further details.
"""
# Make sure `recipients` is a list:
if isinstance(recipients, str):
recipients = [recipients]
# Set some defaults values:
if mail_options is None:
mail_options = []
if rcpt_options is None:
rcpt_options = []
# EHLO or HELO is required:
await self.ehlo_or_helo_if_needed()
if self.supports_esmtp:
if "size" in self.esmtp_extensions:
mail_options.append("size={}".format(len(message)))
await self.mail(sender, mail_options)
errors = []
for recipient in recipients:
try:
await self.rcpt(recipient, rcpt_options)
except SMTPCommandFailedError as e:
errors.append(e)
if len(recipients) == len(errors):
# The server refused all our recipients:
raise SMTPNoRecipientError(errors)
await self.data(message)
# If we got here then somebody got our mail:
return errors
async def ehlo_or_helo_if_needed(self):
"""
Calls :meth:`SMTP.ehlo` and/or :meth:`SMTP.helo` if needed.
If there hasn't been any previous *EHLO* or *HELO* command this
session, tries to initiate the session. *EHLO* is tried first.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
"""
no_helo = self.last_helo_response == (None, None)
no_ehlo = self.last_ehlo_response == (None, None)
if no_helo and no_ehlo:
try:
# First we try EHLO:
await self.ehlo()
except SMTPCommandFailedError:
# EHLO failed, let's try HELO:
await self.helo()
async def close(self):
"""
Cleans up after the connection to the SMTP server has been closed
(voluntarily or not).
"""
if self.writer is not None:
# Close the transport:
try:
self.writer.close()
except OSError as exc:
if exc.errno != errno.ENOTCONN:
raise
self.reset_state()
async def _auth_cram_md5(self, username, password):
"""
Performs an authentication attempt using the CRAM-MD5 mechanism.
Protocol:
1. Send 'AUTH CRAM-MD5' to server ;
2. If the server replies with a 334 return code, we can go on:
1) The challenge (sent by the server) is base64-decoded ;
2) The decoded challenge is hashed using HMAC-MD5 and the user
password as key (shared secret) ;
3) The hashed challenge is converted to a string of lowercase
hexadecimal digits ;
4) The username and a space character are prepended to the hex
digits ;
5) The concatenation is base64-encoded and sent to the server.
6) If the server replies with a return code of 235, user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "CRAM-MD5"
code, message = await self.do_cmd("AUTH", mechanism, success=(334,))
decoded_challenge = base64.b64decode(message)
challenge_hash = hmac.new(
key=password.encode("utf-8"), msg=decoded_challenge, digestmod="md5"
)
hex_hash = challenge_hash.hexdigest()
response = "{} {}".format(username, hex_hash)
encoded_response = SMTP.b64enc(response)
try:
code, message = await self.do_cmd(encoded_response, success=(235, 503))
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
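The challenge/response computation described above can be reproduced offline. Below is a minimal standalone sketch; the challenge string and credentials are made up for illustration:

```python
import base64
import hmac

def cram_md5_response(username, password, encoded_challenge):
    """Compute the base64 payload a client sends back for AUTH CRAM-MD5."""
    # 1) base64-decode the server challenge:
    challenge = base64.b64decode(encoded_challenge)
    # 2) HMAC-MD5 the challenge, keyed with the shared secret (the password):
    hex_hash = hmac.new(password.encode("utf-8"), challenge, "md5").hexdigest()
    # 3) prepend "<username> " and base64-encode the whole thing:
    return base64.b64encode(
        "{} {}".format(username, hex_hash).encode("utf-8")
    ).decode("utf-8")

# Hypothetical challenge, as a server could send it after 'AUTH CRAM-MD5':
challenge = base64.b64encode(b"<1896.697170952@postoffice.example.net>").decode("utf-8")
print(cram_md5_response("alice", "secret", challenge))
```

Decoding the result gives back the ``"username hexdigest"`` string that step 4 of the protocol describes.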
async def _auth_login(self, username, password):
"""
Performs an authentication attempt using the LOGIN mechanism.
Protocol:
1. The username is base64-encoded ;
2. The string 'AUTH LOGIN' and a space character are prepended to
the base64-encoded username and sent to the server ;
3. If the server replies with a 334 return code, we can go on:
1) The password is base64-encoded and sent to the server ;
2) If the server replies with a 235 return code, the user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "LOGIN"
code, message = await self.do_cmd(
"AUTH", mechanism, SMTP.b64enc(username), success=(334,)
)
try:
code, message = await self.do_cmd(SMTP.b64enc(password), success=(235, 503))
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
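For reference, the two base64 strings exchanged by the LOGIN mechanism can be computed without a server. A small sketch, with made-up credentials:

```python
import base64

def login_steps(username, password):
    """Return the two base64 payloads sent during AUTH LOGIN."""
    b64 = lambda s: base64.b64encode(s.encode("utf-8")).decode("utf-8")
    # The first payload is appended to "AUTH LOGIN "; the second is sent
    # on its own line after the server replies with a 334 code:
    return b64(username), b64(password)

user_b64, pass_b64 = login_steps("alice", "secret")
print("AUTH LOGIN " + user_b64)
print(pass_b64)
```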
async def _auth_plain(self, username, password):
"""
Performs an authentication attempt using the PLAIN mechanism.
Protocol:
1. Format the username and password in a suitable way ;
2. The formatted string is base64-encoded ;
3. The string 'AUTH PLAIN' and a space character are prepended to
the base64-encoded username and password and sent to the
server ;
4. If the server replies with a 235 return code, user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "PLAIN"
credentials = "\0{}\0{}".format(username, password)
encoded_credentials = SMTP.b64enc(credentials)
try:
code, message = await self.do_cmd(
"AUTH", mechanism, encoded_credentials, success=(235, 503)
)
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
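The PLAIN payload built above follows the ``authzid NUL authcid NUL passwd`` layout (the method uses an empty authorization identity). A standalone sketch with made-up credentials:

```python
import base64

def plain_credentials(username, password, authzid=""):
    """Build the base64 SASL PLAIN payload: authzid NUL authcid NUL passwd."""
    raw = "{}\0{}\0{}".format(authzid, username, password)
    return base64.b64encode(raw.encode("utf-8")).decode("utf-8")

# Sent in one shot as: AUTH PLAIN <payload>
print("AUTH PLAIN " + plain_credentials("alice", "secret"))
```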
@staticmethod
def parse_esmtp_extensions(message):
"""
Parses the response given by an ESMTP server after a *EHLO* command.
The response is parsed to build:
- A dict of supported ESMTP extensions (with parameters, if any).
- A list of supported authentication methods.
Returns:
(dict, list): A (extensions, auth_mechanisms) 2-tuple containing
the supported extensions and authentication methods.
"""
extns = {}
auths = []
oldstyle_auth_regex = re.compile(r"auth=(?P<auth>.*)", re.IGNORECASE)
extension_regex = re.compile(
r"(?P<feature>[a-z0-9][a-z0-9\-]*) ?", re.IGNORECASE
)
lines = message.splitlines()
for line in lines[1:]:
# To be able to communicate with as many SMTP servers as possible,
# we have to take the old-style auth advertisement into account.
match = oldstyle_auth_regex.match(line)
if match:
# An old-style "AUTH=" line may advertise several space-separated
# mechanisms, so keep each of them:
for auth in match.group("auth").split():
auth = auth.lower().strip()
if auth not in auths:
auths.append(auth)
# RFC 1869 requires a space between EHLO keyword and parameters.
# It's actually stricter, in that only spaces are allowed between
# parameters, but we're not going to check for that here.
# Note that the space isn't present if there are no parameters.
match = extension_regex.match(line)
if match:
feature = match.group("feature").lower()
params = match.string[match.end("feature") :].strip()
extns[feature] = params
if feature == "auth":
auths.extend([param.strip().lower() for param in params.split()])
return extns, auths
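A simplified standalone sketch of this parsing (the old-style ``AUTH=`` handling is omitted for brevity), run against a made-up EHLO reply:

```python
import re

def parse_ehlo(message):
    """Parse a multi-line EHLO reply into (extensions dict, auth list)."""
    extns, auths = {}, []
    extension_regex = re.compile(r"(?P<feature>[a-z0-9][a-z0-9\-]*) ?", re.IGNORECASE)
    for line in message.splitlines()[1:]:  # first line is the greeting
        match = extension_regex.match(line)
        if match:
            feature = match.group("feature").lower()
            # Everything after the keyword (and a space) is the parameters:
            params = match.string[match.end("feature"):].strip()
            extns[feature] = params
            if feature == "auth":
                auths.extend(p.strip().lower() for p in params.split())
    return extns, auths

reply = "mail.example.net\nSIZE 35882577\n8BITMIME\nAUTH LOGIN PLAIN\nSTARTTLS"
print(parse_ehlo(reply))
```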
@staticmethod
def prepare_message(message):
"""
Returns the given message encoded in ascii with a format suitable for
SMTP transmission:
- Makes sure the message is ASCII encoded ;
- Normalizes line endings to '\r\n' ;
- Adds a (second) period at the beginning of lines that start
with a period ;
- Makes sure the message ends with '\r\n.\r\n'.
For further details, please check out RFC 5321 `§ 4.1.1.4`_
and `§ 4.5.2`_.
.. _`§ 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4
.. _`§ 4.5.2`: https://tools.ietf.org/html/rfc5321#section-4.5.2
"""
if isinstance(message, bytes):
bytes_message = message
else:
bytes_message = message.encode("ascii")
# The original algorithm uses regexes to do this stuff.
# This one is -IMHO- more pythonic and it is slightly faster.
#
# Another version is even faster, but I chose to keep something
# more pythonic and readable.
# FYI, the fastest way to do all this stuff seems to be
# (according to my benchmarks):
#
# bytes_message.replace(b"\r\n", b"\n") \
# .replace(b"\r", b"\n") \
# .replace(b"\n", b"\r\n")
#
# DOT_LINE_REGEX = re.compile(rb"^\.", re.MULTILINE)
# bytes_message = DOT_LINE_REGEX.sub(b"..", bytes_message)
#
# if not bytes_message.endswith(b"\r\n"):
# bytes_message += b"\r\n"
#
# bytes_message += b"\r\n.\r\n"
lines = []
for line in bytes_message.splitlines():
if line.startswith(b"."):
line = line.replace(b".", b"..", 1)
lines.append(line)
# Recompose the message with <CRLF> only:
bytes_message = b"\r\n".join(lines)
# Make sure message ends with <CRLF>.<CRLF>:
bytes_message += b"\r\n.\r\n"
return bytes_message
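The same normalization can be sketched as a standalone function (same behaviour as above, minus the class plumbing):

```python
def prepare(message):
    """Normalize line endings to CRLF, dot-stuff, terminate with CRLF.CRLF."""
    data = message if isinstance(message, bytes) else message.encode("ascii")
    lines = []
    for line in data.splitlines():
        # A leading '.' must be doubled so it can't end the DATA phase early:
        if line.startswith(b"."):
            line = b"." + line
        lines.append(line)
    return b"\r\n".join(lines) + b"\r\n.\r\n"

print(prepare("Hello\n.hidden dot\nBye"))
```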
@staticmethod
def b64enc(s):
"""
Base64-encodes the given string and returns it as a :obj:`str`.
This is a simple helper function that takes a str, base64-encodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
s (str): String to be converted to base64.
Returns:
str: A base64-encoded string.
"""
return base64.b64encode(s.encode("utf-8")).decode("utf-8")
@staticmethod
def b64dec(b):
"""
Base64-decodes the given :obj:`bytes` and converts it to a :obj:`str`.
This is a simple helper function that takes a bytes, base64-decodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
b (bytes): A base64-encoded bytes.
Returns:
str: A base64-decoded string.
"""
return base64.b64decode(b).decode("utf-8")
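These two helpers are plain UTF-8/base64 round-trips; a quick sketch:

```python
import base64

def b64enc(s):
    """str -> base64-encoded str."""
    return base64.b64encode(s.encode("utf-8")).decode("utf-8")

def b64dec(b):
    """base64-encoded bytes (or str) -> str."""
    return base64.b64decode(b).decode("utf-8")

print(b64enc("hi"))
print(b64dec(b64enc("café")))  # non-ASCII survives the round trip
```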
"""
SMTP or ESMTP client.
This should follow RFC 5321 (SMTP), RFC 1869 (ESMTP), RFC 2554 (SMTP
Authentication) and RFC 2487 (Secure SMTP over TLS).
Attributes:
hostname (str): Hostname of the SMTP server we are connecting to.
port (int): Port on which the SMTP server listens for connections.
timeout (int): Not used.
last_helo_response ((int or None, str or None)): A (code, message)
2-tuple containing the last *HELO* response.
last_ehlo_response ((int or None, str or None)): A (code, message)
2-tuple containing the last *EHLO* response.
supports_esmtp (bool): True if the server supports ESMTP (set after a
*EHLO* command), False otherwise.
esmtp_extensions (dict): ESMTP extensions and parameters supported by
the SMTP server (set after a *EHLO* command).
auth_mechanisms (list of str): Authentication mechanisms supported by
the SMTP server.
ssl_context (bool): Always False. (Used in SMTP_SSL subclass)
reader (:class:`streams.SMTPStreamReader`): SMTP stream reader, used
to read server responses.
writer (:class:`streams.SMTPStreamWriter`): SMTP stream writer, used
to send commands to the server.
transport (:class:`asyncio.BaseTransport`): Communication channel
abstraction between client and server.
loop (:class:`asyncio.BaseEventLoop`): Event loop to use.
use_aioopenssl (bool): If True, the connection is made using the
aioopenssl module. Defaults to False.
_fqdn (str): Client FQDN. Used to identify the client to the
server.
Class Attributes:
_default_port (int): Default port to use. Defaults to 25.
_supported_auth_mechanisms (dict): Dict containing the information
about supported authentication mechanisms, ordered by preference
of use. The entries consist in :
- The authentication mechanism name, in lowercase, as given by
SMTP servers.
- The name of the method to call to authenticate using the
mechanism.
"""
_default_port = 25
_supported_auth_mechanisms = {
"cram-md5": "_auth_cram_md5",
"plain": "_auth_plain",
"login": "_auth_login",
}
def __init__(
self,
hostname="localhost",
port=_default_port,
fqdn=None,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
loop=None,
use_aioopenssl=False,
):
"""
Initializes a new :class:`SMTP` instance.
Args:
hostname (str): Hostname of the SMTP server to connect to.
port (int): Port to use to connect to the SMTP server.
fqdn (str or None): Client Fully Qualified Domain Name. This is
used to identify the client to the server.
timeout (int): Not used.
loop (:class:`asyncio.BaseEventLoop`): Event loop to use.
use_aioopenssl (bool): Use the aioopenssl module to open
the connection. This is mandatory if you plan on using
STARTTLS.
"""
self.hostname = hostname
try:
self.port = int(port)
except ValueError:
self.port = self.__class__._default_port
self.timeout = timeout
self._fqdn = fqdn
self.loop = loop or asyncio.get_event_loop()
self.use_aioopenssl = use_aioopenssl
self.reset_state()
@property
def fqdn(self):
"""
Returns the string used to identify the client when initiating a SMTP
session.
RFC 5321 `§ 4.1.1.1`_ and `§ 4.1.3`_ tell us what to do:
- Use the client FQDN ;
- If it isn't available, we SHOULD fall back to an address literal.
Returns:
str: The value that should be used as the client FQDN.
.. _`§ 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
.. _`§ 4.1.3`: https://tools.ietf.org/html/rfc5321#section-4.1.3
"""
if self._fqdn is None:
# Let's try to retrieve it:
self._fqdn = socket.getfqdn()
if "." not in self._fqdn:
try:
info = socket.getaddrinfo(
host="localhost", port=None, proto=socket.IPPROTO_TCP
)
except socket.gaierror:
addr = "127.0.0.1"
else:
# We only consider the first returned result and we're
# only interested in getting the IP(v4 or v6) address:
addr = info[0][4][0]
self._fqdn = "[{}]".format(addr)
return self._fqdn
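The fallback logic above can be sketched standalone (this assumes the local resolver knows ``localhost``; otherwise the sketch falls back to ``127.0.0.1``):

```python
import socket

def client_identity(fqdn=None):
    """Pick a HELO/EHLO identity: a dotted name, else an address literal."""
    name = fqdn or socket.getfqdn()
    if "." in name:
        return name
    # No usable FQDN: fall back to an address literal, per RFC 5321 § 4.1.3.
    try:
        info = socket.getaddrinfo(host="localhost", port=None, proto=socket.IPPROTO_TCP)
        addr = info[0][4][0]  # first result, IP address only
    except socket.gaierror:
        addr = "127.0.0.1"
    return "[{}]".format(addr)

print(client_identity("mail.example.org"))
print(client_identity("localhost"))
```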
def reset_state(self):
"""
Resets some attributes to their default values.
This is especially useful when initializing a newly created
:class:`SMTP` instance and when closing an existing SMTP session.
It allows us to use the same SMTP instance and connect several times.
"""
self.last_helo_response = (None, None)
self.last_ehlo_response = (None, None)
self.supports_esmtp = False
self.esmtp_extensions = {}
self.auth_mechanisms = []
self.ssl_context = False
self.reader = None
self.writer = None
self.transport = None
async def __aenter__(self):
"""
Enters the asynchronous context manager.
Also tries to connect to the server.
Raises:
SMTPConnectionRefusedError: If the connection between client and
SMTP server can not be established.
.. seealso:: :meth:`SMTP.connect`
"""
await self.connect()
return self
async def __aexit__(self, *args):
"""
Exits the asynchronous context manager.
Closes the connection and resets instance attributes.
.. seealso:: :meth:`SMTP.quit`
"""
await self.quit()
async def connect(self):
"""
Connects to the server.
.. note:: This method is automatically invoked by
:meth:`SMTP.__aenter__`. The code is mostly borrowed from the
:func:`asyncio.streams.open_connection` source code.
Raises:
ConnectionError subclass: If the connection between client and
SMTP server can not be established.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
# First build the reader:
self.reader = SMTPStreamReader(loop=self.loop)
# Then build the protocol:
protocol = asyncio.StreamReaderProtocol(self.reader, loop=self.loop)
# With the just-built reader and protocol, create the connection and
# get the transport stream:
conn = {
"protocol_factory": lambda: protocol,
"host": self.hostname,
"port": self.port,
}
if self.use_aioopenssl:
conn.update(
{
"use_starttls": not self.ssl_context,
"ssl_context_factory": lambda transport: self.ssl_context,
"server_hostname": self.hostname, # For SSL
}
)
import aioopenssl
# This may raise a ConnectionError exception, which we let bubble up.
self.transport, _ = await aioopenssl.create_starttls_connection(
self.loop, **conn
)
# HACK: aioopenssl transports don't implement is_closing, and thus drain() fails...
self.transport.is_closing = lambda: False
else:
conn["ssl"] = self.ssl_context
# This may raise a ConnectionError exception, which we let bubble up.
self.transport, _ = await self.loop.create_connection(**conn)
# If the connection has been established, build the writer:
self.writer = SMTPStreamWriter(self.transport, protocol, self.reader, self.loop)
code, message = await self.reader.read_reply()
if code != 220:
raise ConnectionRefusedError(code, message)
return code, message
async def do_cmd(self, *args, success=None):
"""
Sends the given command to the server.
Args:
*args: Command and arguments to be sent to the server.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
if success is None:
success = (250,)
cmd = " ".join(args)
await self.writer.send_command(cmd)
code, message = await self.reader.read_reply()
if code not in success:
raise SMTPCommandFailedError(code, message, cmd)
return code, message
async def helo(self, from_host=None):
"""
Sends a SMTP 'HELO' command. - Identifies the client and starts the
session.
If given ``from_host`` is None, defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedely lost.
SMTPCommandFailedError: If the server refuses our HELO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("HELO", from_host)
self.last_helo_response = (code, message)
return code, message
async def ehlo(self, from_host=None):
"""
Sends a SMTP 'EHLO' command. - Identifies the client and starts the
session.
If given ``from`_host`` is None, defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedely lost.
SMTPCommandFailedError: If the server refuses our EHLO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("EHLO", from_host)
self.last_ehlo_response = (code, message)
extns, auths = SMTP.parse_esmtp_extensions(message)
self.esmtp_extensions = extns
self.auth_mechanisms = auths
self.supports_esmtp = True
return code, message
async def help(self, command_name=None):
"""
Sends a SMTP 'HELP' command.
For further details please check out `RFC 5321 § 4.1.1.8`_.
Args:
command_name (str or None, optional): Name of a command for which
you want help. For example, if you want to get help about the
'*RSET*' command, you'd call ``help('RSET')``.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedely lost.
SMTPCommandFailedError: If the HELP command fails.
Returns:
Help text as given by the server.
.. _`RFC 5321 § 4.1.1.8`: https://tools.ietf.org/html/rfc5321#section-4.1.1.8
"""
if command_name is None:
command_name = ""
code, message = await self.do_cmd("HELP", command_name)
return message
async def rset(self):
"""
Sends a SMTP 'RSET' command. - Resets the session.
For further details, please check out `RFC 5321 § 4.1.1.5`_.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedely lost.
SMTPCommandFailedError: If the RSET command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.5`: https://tools.ietf.org/html/rfc5321#section-4.1.1.5
"""
return await self.do_cmd("RSET")
async def noop(self):
"""
Sends a SMTP 'NOOP' command. - Doesn't do anything.
For further details, please check out `RFC 5321 § 4.1.1.9`_.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedely lost.
SMTPCommandFailedError: If the NOOP command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.9`: https://tools.ietf.org/html/rfc5321#section-4.1.1.9
"""
return await self.do_cmd("NOOP")
async def vrfy(self, address):
"""
Sends a SMTP 'VRFY' command. - Tests the validity of the given address.
For further details, please check out `RFC 5321 § 4.1.1.6`_.
Args:
address (str): E-mail address to be checked.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedely lost.
SMTPCommandFailedError: If the VRFY command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.6`: https://tools.ietf.org/html/rfc5321#section-4.1.1.6
"""
return await self.do_cmd("VRFY", address)
async def expn(self, address):
"""
Sends a SMTP 'EXPN' command. - Expands a mailing-list.
For further details, please check out `RFC 5321 § 4.1.1.7`_.
Args:
address (str): E-mail address to expand.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedely lost.
SMTPCommandFailedError: If the EXPN command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.7`: https://tools.ietf.org/html/rfc5321#section-4.1.1.7
"""
return await self.do_cmd("EXPN", address)
async def mail(self, sender, options=None):
"""
Sends a SMTP 'MAIL' command. - Starts the mail transfer session.
For further details, please check out `RFC 5321 § 4.1.1.2`_ and
`§ 3.3`_.
Args:
sender (str): Sender mailbox (used as reverse-path).
options (list of str or None, optional): Additional options to send
along with the *MAIL* command.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the MAIL command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.2`: https://tools.ietf.org/html/rfc5321#section-4.1.1.2
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
from_addr = "FROM:{}".format(quoteaddr(sender))
code, message = await self.do_cmd("MAIL", from_addr, *options)
return code, message
async def rcpt(self, recipient, options=None):
"""
Sends a SMTP 'RCPT' command. - Indicates a recipient for the e-mail.
For further details, please check out `RFC 5321 § 4.1.1.3`_ and
`§ 3.3`_.
Args:
recipient (str): E-mail address of one recipient.
options (list of str or None, optional): Additional options to send
along with the *RCPT* command.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the RCPT command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.3`: https://tools.ietf.org/html/rfc5321#section-4.1.1.3
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
to_addr = "TO:{}".format(quoteaddr(recipient))
code, message = await self.do_cmd("RCPT", to_addr, *options)
return code, message
async def quit(self):
"""
Sends a SMTP 'QUIT' command. - Ends the session.
For further details, please check out `RFC 5321 § 4.1.1.10`_.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response. If the connection is already closed when calling this
method, returns (-1, None).
.. _`RFC 5321 § 4.1.1.10`: https://tools.ietf.org/html/rfc5321#section-4.1.1.10
"""
code = -1
message = None
try:
code, message = await self.do_cmd("QUIT")
except ConnectionError:
# We voluntarily ignore this kind of exceptions since... the
# connection seems already closed.
pass
except SMTPCommandFailedError:
pass
await self.close()
return code, message
async def data(self, email_message):
"""
Sends a SMTP 'DATA' command. - Transmits the message to the server.
If ``email_message`` is a bytes object, it is sent as-is. Otherwise,
all the required changes are made so it can be safely transmitted to
the SMTP server.
For further details, please check out `RFC 5321 § 4.1.1.4`_.
Args:
email_message (str or bytes): Message to be sent.
Raises:
ConnectionError subclass: If the connection to the server is
unexpectedly lost.
SMTPCommandFailedError: If the DATA command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server last
response (the one the server sent after all data were sent by
the client).
.. seealso:: :meth:`SMTP.prepare_message`
.. _`RFC 5321 § 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4
"""
code, message = await self.do_cmd("DATA", success=(354,))
email_message = SMTP.prepare_message(email_message)
self.writer.write(email_message) # write is non-blocking.
await self.writer.drain() # don't forget to drain.
code, message = await self.reader.read_reply()
return code, message
async def auth(self, username, password):
"""
Tries to authenticate user against the SMTP server.
Args:
username (str): Username to authenticate with.
password (str): Password to use along with the given ``username``.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
SMTPLoginError: If the authentication failed (either because all
attempts failed or because there was no suitable authentication
mechanism).
Returns:
(int, str): A (code, message) 2-tuple containing the last server
response.
"""
# EHLO/HELO is required:
await self.ehlo_or_helo_if_needed()
errors = [] # To store SMTPAuthenticationErrors
code = message = None
# Try to authenticate using all mechanisms supported by both
# server and client (and only these):
for auth, meth in self.__class__._supported_auth_mechanisms.items():
if auth in self.auth_mechanisms:
auth_func = getattr(self, meth)
try:
code, message = await auth_func(username, password)
except SMTPAuthenticationError as e:
errors.append(e)
else:
break
else:
if not errors:
err = "Could not find any suitable authentication mechanism."
errors.append(SMTPAuthenticationError(-1, err))
raise SMTPLoginError(errors)
return code, message
async def starttls(self, context=None):
"""
Upgrades the connection to the SMTP server into TLS mode.
If there has been no previous EHLO or HELO command this session, this
method tries ESMTP EHLO first.
If the server supports SSL/TLS, this will encrypt the rest of the SMTP
session.
Raises:
SMTPCommandNotSupportedError: If the server does not support STARTTLS.
SMTPCommandFailedError: If the STARTTLS command fails
BadImplementationError: If the connection does not use aioopenssl.
Args:
context (:obj:`OpenSSL.SSL.Context`): SSL context
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
if not self.use_aioopenssl:
raise BadImplementationError("This connection does not use aioopenssl")
import aioopenssl
import OpenSSL
await self.ehlo_or_helo_if_needed()
if "starttls" not in self.esmtp_extensions:
raise SMTPCommandNotSupportedError("STARTTLS")
code, message = await self.do_cmd("STARTTLS", success=(220,))
# Don't check for code, do_cmd did it
if context is None:
context = OpenSSL.SSL.Context(OpenSSL.SSL.TLSv1_2_METHOD)
await self.transport.starttls(ssl_context=context)
# RFC 3207:
# The client MUST discard any knowledge obtained from
# the server, such as the list of SMTP service extensions,
# which was not obtained from the TLS negotiation itself.
# FIXME: wouldn't it be better to use reset_state here?
# And reset self.reader, self.writer and self.transport just after
# Maybe also self.ssl_context ?
self.last_ehlo_response = (None, None)
self.last_helo_response = (None, None)
self.supports_esmtp = False
self.esmtp_extensions = {}
self.auth_mechanisms = []
return (code, message)
async def sendmail(
self, sender, recipients, message, mail_options=None, rcpt_options=None
):
"""
Performs an entire e-mail transaction.
Example:
>>> try:
>>>     async with SMTP() as client:
>>>         try:
>>>             r = await client.sendmail(sender, recipients, message)
>>>         except SMTPException:
>>>             print("Error while sending message.")
>>>         else:
>>>             print("Result: {}.".format(r))
>>> except ConnectionError as e:
>>>     print(e)
Result: {}.
Args:
sender (str): E-mail address of the sender.
recipients (list of str or str): E-mail(s) address(es) of the
recipient(s).
message (str or bytes): Message body.
mail_options (list of str): ESMTP options (such as *8BITMIME*) to
send along the *MAIL* command.
rcpt_options (list of str): ESMTP options (such as *DSN*) to
send along all the *RCPT* commands.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
SMTPCommandFailedError: If the server refuses our MAIL command.
SMTPCommandFailedError: If the server refuses our DATA command.
SMTPNoRecipientError: If the server refuses all given
recipients.
Returns:
dict: A dict containing an entry for each recipient that was
refused. Each entry is associated with a (code, message)
2-tuple containing the error code and message, as returned by
the server.
When everything runs smoothly, the returned dict is empty.
.. note:: The connection remains open afterwards. It's your responsibility
to close it. A good practice is to use the asynchronous context
manager instead. See :meth:`SMTP.__aenter__` for further details.
"""
# Make sure `recipients` is a list:
if isinstance(recipients, str):
recipients = [recipients]
# Set some defaults values:
if mail_options is None:
mail_options = []
if rcpt_options is None:
rcpt_options = []
# EHLO or HELO is required:
await self.ehlo_or_helo_if_needed()
if self.supports_esmtp:
if "size" in self.esmtp_extensions:
mail_options.append("size={}".format(len(message)))
await self.mail(sender, mail_options)
errors = []
for recipient in recipients:
try:
await self.rcpt(recipient, rcpt_options)
except SMTPCommandFailedError as e:
errors.append(e)
if len(recipients) == len(errors):
# The server refused all our recipients:
raise SMTPNoRecipientError(errors)
await self.data(message)
# If we got here then somebody got our mail:
return errors
async def send_mail(
self, sender, recipients, message, mail_options=None, rcpt_options=None
):
"""
Alias for :meth:`SMTP.sendmail`.
"""
return await self.sendmail(
sender, recipients, message, mail_options, rcpt_options
)
async def close(self):
"""
Cleans up after the connection to the SMTP server has been closed
(voluntarily or not).
"""
if self.writer is not None:
# Close the transport:
try:
self.writer.close()
except OSError as exc:
if exc.errno != errno.ENOTCONN:
raise
self.reset_state()
async def _auth_cram_md5(self, username, password):
"""
Performs an authentication attempt using the CRAM-MD5 mechanism.
Protocol:
1. Send 'AUTH CRAM-MD5' to server ;
2. If the server replies with a 334 return code, we can go on:
1) The challenge (sent by the server) is base64-decoded ;
2) The decoded challenge is hashed using HMAC-MD5 and the user
password as key (shared secret) ;
3) The hashed challenge is converted to a string of lowercase
hexadecimal digits ;
4) The username and a space character are prepended to the hex
digits ;
5) The concatenation is base64-encoded and sent to the server.
6) If the server replies with a return code of 235, user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "CRAM-MD5"
code, message = await self.do_cmd("AUTH", mechanism, success=(334,))
decoded_challenge = base64.b64decode(message)
challenge_hash = hmac.new(
key=password.encode("utf-8"), msg=decoded_challenge, digestmod="md5"
)
hex_hash = challenge_hash.hexdigest()
response = "{} {}".format(username, hex_hash)
encoded_response = SMTP.b64enc(response)
try:
code, message = await self.do_cmd(encoded_response, success=(235, 503))
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
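The challenge/response arithmetic described in the protocol steps above can be sketched as a standalone helper. This is a minimal illustration of RFC 2195; `cram_md5_response` is a hypothetical name, not part of this module:

```python
import base64
import hmac


def cram_md5_response(username: str, password: str, challenge_b64: str) -> str:
    """Build the base64-encoded client reply for a CRAM-MD5 challenge (RFC 2195)."""
    # Steps 1-2: decode the challenge and hash it with HMAC-MD5, keyed by the password.
    challenge = base64.b64decode(challenge_b64)
    digest = hmac.new(password.encode("utf-8"), challenge, "md5").hexdigest()
    # Steps 3-5: prepend "username " to the hex digest and base64-encode the result.
    return base64.b64encode(
        "{} {}".format(username, digest).encode("utf-8")
    ).decode("ascii")
```

Decoding the returned string gives back the ``username hexdigest`` pair that the server verifies against its own HMAC computation.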
async def _auth_login(self, username, password):
"""
Performs an authentication attempt using the LOGIN mechanism.
Protocol:
1. The username is base64-encoded ;
2. The string 'AUTH LOGIN' and a space character are prepended to
the base64-encoded username and sent to the server ;
3. If the server replies with a 334 return code, we can go on:
1) The password is base64-encoded and sent to the server ;
2) If the server replies with a 235 return code, the user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "LOGIN"
code, message = await self.do_cmd(
"AUTH", mechanism, SMTP.b64enc(username), success=(334,)
)
try:
code, message = await self.do_cmd(SMTP.b64enc(password), success=(235, 503))
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
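For illustration, the two client payloads of the LOGIN exchange (steps 1-2 and 3.1 above) can be computed independently of any connection. A sketch; `login_payloads` is a made-up helper, not part of the class:

```python
import base64


def login_payloads(username: str, password: str):
    """Return the two strings a client sends during AUTH LOGIN (cf. RFC 2554)."""
    def enc(s):
        return base64.b64encode(s.encode("utf-8")).decode("ascii")

    # First send the command plus the base64 username; after the server's
    # 334 reply, send the base64 password on its own line.
    return "AUTH LOGIN {}".format(enc(username)), enc(password)
```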
async def _auth_plain(self, username, password):
"""
Performs an authentication attempt using the PLAIN mechanism.
Protocol:
1. Format the username and password in a suitable way ;
2. The formatted string is base64-encoded ;
3. The string 'AUTH PLAIN' and a space character are prepended to
the base64-encoded username and password and sent to the
server ;
4. If the server replies with a 235 return code, user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "PLAIN"
credentials = "\0{}\0{}".format(username, password)
encoded_credentials = SMTP.b64enc(credentials)
try:
code, message = await self.do_cmd(
"AUTH", mechanism, encoded_credentials, success=(235, 503)
)
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
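The PLAIN credential formatting (steps 1-2 above, specified in RFC 4616) is easy to check in isolation. A sketch with the illustrative name `plain_credentials`:

```python
import base64


def plain_credentials(username: str, password: str) -> str:
    """Encode credentials for AUTH PLAIN: NUL authcid NUL passwd, base64-encoded."""
    raw = "\0{}\0{}".format(username, password).encode("utf-8")
    return base64.b64encode(raw).decode("ascii")
```

With the RFC 4616 example credentials ("tim" / "tanstaaftanstaaf") this yields "AHRpbQB0YW5zdGFhZnRhbnN0YWFm".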
@staticmethod
def parse_esmtp_extensions(message):
"""
Parses the response given by an ESMTP server after a *EHLO* command.
The response is parsed to build:
- A dict of supported ESMTP extensions (with parameters, if any).
- A list of supported authentication methods.
Returns:
(dict, list): A (extensions, auth_mechanisms) 2-tuple containing
the supported extensions and authentication methods.
"""
extns = {}
auths = []
oldstyle_auth_regex = re.compile(r"auth=(?P<auth>.*)", re.IGNORECASE)
extension_regex = re.compile(
r"(?P<feature>[a-z0-9][a-z0-9\-]*) ?", re.IGNORECASE
)
lines = message.splitlines()
for line in lines[1:]:
# To be able to communicate with as many SMTP servers as possible,
# we have to take the old-style auth advertisement into account.
match = oldstyle_auth_regex.match(line)
if match:
auth = match.group("auth")
auth = auth.lower().strip()
if auth not in auths:
auths.append(auth)
# RFC 1869 requires a space between EHLO keyword and parameters.
# It's actually stricter, in that only spaces are allowed between
# parameters, but we're not going to check for that here.
# Note that the space isn't present if there are no parameters.
match = extension_regex.match(line)
if match:
feature = match.group("feature").lower()
params = match.string[match.end("feature") :].strip()
extns[feature] = params
if feature == "auth":
auths.extend([param.strip().lower() for param in params.split()])
return extns, auths
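As a rough standalone illustration of the parsing above, here is how a typical multiline EHLO reply breaks down into extensions and AUTH mechanisms. Simplified sketch: the old-style `AUTH=` form handled above is omitted, and `parse_ehlo` is an illustrative name:

```python
import re


def parse_ehlo(message: str):
    """Split an EHLO reply body into ({extension: params}, [auth mechanisms])."""
    extns, auths = {}, []
    ext_re = re.compile(r"(?P<feature>[a-z0-9][a-z0-9\-]*) ?", re.IGNORECASE)
    for line in message.splitlines()[1:]:  # the first line is the server greeting
        match = ext_re.match(line)
        if match:
            feature = match.group("feature").lower()
            params = line[match.end("feature"):].strip()
            extns[feature] = params
            if feature == "auth":
                auths.extend(p.strip().lower() for p in params.split())
    return extns, auths
```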
@staticmethod
def prepare_message(message):
"""
Returns the given message encoded in ascii with a format suitable for
SMTP transmission:
- Makes sure the message is ASCII encoded ;
- Normalizes line endings to '\r\n' ;
- Adds a (second) period at the beginning of lines that start
with a period ;
- Makes sure the message ends with '\r\n.\r\n'.
For further details, please check out RFC 5321 `§ 4.1.1.4`_
and `§ 4.5.2`_.
.. _`§ 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4
.. _`§ 4.5.2`: https://tools.ietf.org/html/rfc5321#section-4.5.2
"""
if isinstance(message, bytes):
bytes_message = message
else:
bytes_message = message.encode("ascii")
# The original algorithm uses regexes to do this stuff.
# This one is -IMHO- more pythonic and it is slightly faster.
#
# Another version is even faster, but I chose to keep something
# more pythonic and readable.
# FYI, the fastest way to do all this stuff seems to be
# (according to my benchmarks):
#
# bytes_message.replace(b"\r\n", b"\n") \
# .replace(b"\r", b"\n") \
# .replace(b"\n", b"\r\n")
#
# DOT_LINE_REGEX = re.compile(rb"^\.", re.MULTILINE)
# bytes_message = DOT_LINE_REGEX.sub(b"..", bytes_message)
#
# if not bytes_message.endswith(b"\r\n"):
# bytes_message += b"\r\n"
#
# bytes_message += b"\r\n.\r\n"
lines = []
for line in bytes_message.splitlines():
if line.startswith(b"."):
line = line.replace(b".", b"..", 1)
lines.append(line)
# Recompose the message with <CRLF> only:
bytes_message = b"\r\n".join(lines)
# Make sure message ends with <CRLF>.<CRLF>:
bytes_message += b"\r\n.\r\n"
return bytes_message
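The normalization above boils down to a small pure function. This sketch (hypothetical name `dot_stuff`) reproduces the same transform for an ASCII payload:

```python
def dot_stuff(message: bytes) -> bytes:
    """CRLF-normalize, double leading dots, terminate with <CRLF>.<CRLF>.

    See RFC 5321 § 4.5.2 (transparency) and § 4.1.1.4 (end of data).
    """
    lines = []
    for line in message.splitlines():  # splits on \n, \r and \r\n alike
        if line.startswith(b"."):
            line = b"." + line  # transparency: a leading "." becomes ".."
        lines.append(line)
    return b"\r\n".join(lines) + b"\r\n.\r\n"
```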
@staticmethod
def b64enc(s):
"""
Base64-encodes the given string and returns it as a :obj:`str`.
This is a simple helper function that takes a str, base64-encodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
s (str): String to be converted to base64.
Returns:
str: A base64-encoded string.
"""
return base64.b64encode(s.encode("utf-8")).decode("utf-8")
@staticmethod
def b64dec(b):
"""
Base64-decodes the given :obj:`bytes` and converts it to a :obj:`str`.
This is a simple helper function that takes a bytes, base64-decodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
b (bytes): A base64-encoded bytes.
Returns:
str: A base64-decoded string.
"""
return base64.b64decode(b).decode("utf-8")
class SMTP:
"""
SMTP or ESMTP client.
This should follow RFC 5321 (SMTP), RFC 1869 (ESMTP), RFC 2554 (SMTP
Authentication) and RFC 2487 (Secure SMTP over TLS).
Attributes:
hostname (str): Hostname of the SMTP server we are connecting to.
port (int): Port on which the SMTP server listens for connections.
timeout (int): Not used.
last_helo_response ((int or None, str or None)): A (code, message)
2-tuple containing the last *HELO* response.
last_ehlo_response ((int or None, str or None)): A (code, message)
2-tuple containing the last *EHLO* response.
supports_esmtp (bool): True if the server supports ESMTP (set after
an *EHLO* command), False otherwise.
esmtp_extensions (dict): ESMTP extensions and parameters supported by
the SMTP server (set after a *EHLO* command).
auth_mechanisms (list of str): Authentication mechanisms supported by
the SMTP server.
ssl_context (bool): Always False. (Used in SMTP_SSL subclass)
reader (:class:`streams.SMTPStreamReader`): SMTP stream reader, used
to read server responses.
writer (:class:`streams.SMTPStreamWriter`): SMTP stream writer, used
to send commands to the server.
transport (:class:`asyncio.BaseTransport`): Communication channel
abstraction between client and server.
loop (:class:`asyncio.BaseEventLoop`): Event loop to use.
use_aioopenssl (bool): If True, the connection is made using the
aioopenssl module. Defaults to False.
_fqdn (str): Client FQDN. Used to identify the client to the
server.
Class Attributes:
_default_port (int): Default port to use. Defaults to 25.
_supported_auth_mechanisms (dict): Dict containing the information
about supported authentication mechanisms, ordered by preference
of use. The entries consist in :
- The authentication mechanism name, in lowercase, as given by
SMTP servers.
- The name of the method to call to authenticate using the
mechanism.
"""
_default_port = 25
_supported_auth_mechanisms = {
"cram-md5": "_auth_cram_md5",
"plain": "_auth_plain",
"login": "_auth_login",
}
def __init__(
self,
hostname="localhost",
port=_default_port,
fqdn=None,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
loop=None,
use_aioopenssl=False,
):
"""
Initializes a new :class:`SMTP` instance.
Args:
hostname (str): Hostname of the SMTP server to connect to.
port (int): Port to use to connect to the SMTP server.
fqdn (str or None): Client Fully Qualified Domain Name. This is
used to identify the client to the server.
timeout (int): Not used.
loop (:class:`asyncio.BaseEventLoop`): Event loop to use.
use_aioopenssl (bool): Use the aioopenssl module to open
the connection. This is mandatory if you plan on using
STARTTLS.
"""
self.hostname = hostname
try:
self.port = int(port)
except ValueError:
self.port = self.__class__._default_port
self.timeout = timeout
self._fqdn = fqdn
self.loop = loop or asyncio.get_event_loop()
self.use_aioopenssl = use_aioopenssl
self.reset_state()
@property
def fqdn(self):
"""
Returns the string used to identify the client when initiating a SMTP
session.
RFC 5321 `§ 4.1.1.1`_ and `§ 4.1.3`_ tell us what to do:
- Use the client FQDN ;
- If it isn't available, we SHOULD fall back to an address literal.
Returns:
str: The value that should be used as the client FQDN.
.. _`§ 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
.. _`§ 4.1.3`: https://tools.ietf.org/html/rfc5321#section-4.1.3
"""
if self._fqdn is None:
# Let's try to retrieve it:
self._fqdn = socket.getfqdn()
if "." not in self._fqdn:
try:
info = socket.getaddrinfo(
host="localhost", port=None, proto=socket.IPPROTO_TCP
)
except socket.gaierror:
addr = "127.0.0.1"
else:
# We only consider the first returned result and we're
# only interested in getting the IP(v4 or v6) address:
addr = info[0][4][0]
self._fqdn = "[{}]".format(addr)
return self._fqdn
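The fallback logic above can be exercised outside the class. A sketch of the same decision (`client_identity` is an illustrative name, not part of the module):

```python
import socket


def client_identity() -> str:
    """Pick a HELO/EHLO identity: the FQDN if we have one, else an address literal."""
    name = socket.getfqdn()
    if "." in name:
        return name
    try:
        info = socket.getaddrinfo(host="localhost", port=None,
                                  proto=socket.IPPROTO_TCP)
        addr = info[0][4][0]  # first result, IP address only
    except socket.gaierror:
        addr = "127.0.0.1"
    return "[{}]".format(addr)  # RFC 5321 § 4.1.3 address literal
```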
def reset_state(self):
"""
Resets some attributes to their default values.
This is especially useful when initializing a newly created
:class:`SMTP` instance and when closing an existing SMTP session.
It allows us to use the same SMTP instance and connect several times.
"""
self.last_helo_response = (None, None)
self.last_ehlo_response = (None, None)
self.supports_esmtp = False
self.esmtp_extensions = {}
self.auth_mechanisms = []
self.ssl_context = False
self.reader = None
self.writer = None
self.transport = None
async def __aenter__(self):
"""
Enters the asynchronous context manager.
Also tries to connect to the server.
Raises:
SMTPConnectionRefusedError: If the connection between client and
SMTP server can not be established.
.. seealso:: :meth:`SMTP.connect`
"""
await self.connect()
return self
async def __aexit__(self, *args):
"""
Exits the asynchronous context manager.
Closes the connection and resets instance attributes.
.. seealso:: :meth:`SMTP.quit`
"""
await self.quit()
async def connect(self):
"""
Connects to the server.
.. note:: This method is automatically invoked by
:meth:`SMTP.__aenter__`. The code is mostly borrowed from the
:func:`asyncio.streams.open_connection` source code.
Raises:
ConnectionError subclass: If the connection between client and
SMTP server can not be established.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
# First build the reader:
self.reader = SMTPStreamReader(loop=self.loop)
# Then build the protocol:
protocol = asyncio.StreamReaderProtocol(self.reader, loop=self.loop)
# With the just-built reader and protocol, create the connection and
# get the transport stream:
conn = {
"protocol_factory": lambda: protocol,
"host": self.hostname,
"port": self.port,
}
if self.use_aioopenssl:
conn.update(
{
"use_starttls": not self.ssl_context,
"ssl_context_factory": lambda transport: self.ssl_context,
"server_hostname": self.hostname, # For SSL
}
)
import aioopenssl
# This may raise a ConnectionError exception, which we let bubble up.
self.transport, _ = await aioopenssl.create_starttls_connection(
self.loop, **conn
)
# HACK: aioopenssl transports don't implement is_closing, and thus drain() fails...
self.transport.is_closing = lambda: False
else:
conn["ssl"] = self.ssl_context
# This may raise a ConnectionError exception, which we let bubble up.
self.transport, _ = await self.loop.create_connection(**conn)
# If the connection has been established, build the writer:
self.writer = SMTPStreamWriter(self.transport, protocol, self.reader, self.loop)
code, message = await self.reader.read_reply()
if code != 220:
raise ConnectionRefusedError(code, message)
return code, message
async def do_cmd(self, *args, success=None):
"""
Sends the given command to the server.
Args:
*args: Command and arguments to be sent to the server.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
if success is None:
success = (250,)
cmd = " ".join(args)
await self.writer.send_command(cmd)
code, message = await self.reader.read_reply()
if code not in success:
raise SMTPCommandFailedError(code, message, cmd)
return code, message
async def helo(self, from_host=None):
"""
Sends a SMTP 'HELO' command. - Identifies the client and starts the
session.
If given ``from_host`` is None, defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our HELO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("HELO", from_host)
self.last_helo_response = (code, message)
return code, message
async def ehlo(self, from_host=None):
"""
Sends a SMTP 'EHLO' command. - Identifies the client and starts the
session.
If the given ``from_host`` is None, defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("EHLO", from_host)
self.last_ehlo_response = (code, message)
extns, auths = SMTP.parse_esmtp_extensions(message)
self.esmtp_extensions = extns
self.auth_mechanisms = auths
self.supports_esmtp = True
return code, message
async def help(self, command_name=None):
"""
Sends a SMTP 'HELP' command.
For further details, please check out `RFC 5321 § 4.1.1.8`_.
Args:
command_name (str or None, optional): Name of a command for which
you want help. For example, if you want to get help about the
'*RSET*' command, you'd call ``help('RSET')``.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the HELP command fails.
Returns:
Help text as given by the server.
.. _`RFC 5321 § 4.1.1.8`: https://tools.ietf.org/html/rfc5321#section-4.1.1.8
"""
if command_name is None:
command_name = ""
code, message = await self.do_cmd("HELP", command_name)
return message
async def rset(self):
"""
Sends a SMTP 'RSET' command. - Resets the session.
For further details, please check out `RFC 5321 § 4.1.1.5`_.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedely lost.
SMTPCommandFailedError: If the RSET command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.5`: https://tools.ietf.org/html/rfc5321#section-4.1.1.5
"""
return await self.do_cmd("RSET")
async def noop(self):
"""
Sends a SMTP 'NOOP' command. - Doesn't do anything.
For further details, please check out `RFC 5321 § 4.1.1.9`_.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedely lost.
SMTPCommandFailedError: If the NOOP command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.9`: https://tools.ietf.org/html/rfc5321#section-4.1.1.9
"""
return await self.do_cmd("NOOP")
async def vrfy(self, address):
"""
Sends a SMTP 'VRFY' command. - Tests the validity of the given address.
For further details, please check out `RFC 5321 § 4.1.1.6`_.
Args:
address (str): E-mail address to be checked.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedely lost.
SMTPCommandFailedError: If the VRFY command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.6`: https://tools.ietf.org/html/rfc5321#section-4.1.1.6
"""
return await self.do_cmd("VRFY", address)
async def expn(self, address):
"""
Sends a SMTP 'EXPN' command. - Expands a mailing-list.
For further details, please check out `RFC 5321 § 4.1.1.7`_.
Args:
address (str): E-mail address to expand.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedely lost.
SMTPCommandFailedError: If the EXPN command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.7`: https://tools.ietf.org/html/rfc5321#section-4.1.1.7
"""
return await self.do_cmd("EXPN", address)
async def mail(self, sender, options=None):
"""
Sends a SMTP 'MAIL' command. - Starts the mail transfer session.
For further details, please check out `RFC 5321 § 4.1.1.2`_ and
`§ 3.3`_.
Args:
sender (str): Sender mailbox (used as reverse-path).
options (list of str or None, optional): Additional options to send
along with the *MAIL* command.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedely lost.
SMTPCommandFailedError: If the MAIL command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.2`: https://tools.ietf.org/html/rfc5321#section-4.1.1.2
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
from_addr = "FROM:{}".format(quoteaddr(sender))
code, message = await self.do_cmd("MAIL", from_addr, *options)
return code, message
async def rcpt(self, recipient, options=None):
"""
Sends a SMTP 'RCPT' command. - Indicates a recipient for the e-mail.
For further details, please check out `RFC 5321 § 4.1.1.3`_ and
`§ 3.3`_.
Args:
recipient (str): E-mail address of one recipient.
options (list of str or None, optional): Additional options to send
along with the *RCPT* command.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedely lost.
SMTPCommandFailedError: If the RCPT command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.3`: https://tools.ietf.org/html/rfc5321#section-4.1.1.3
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
to_addr = "TO:{}".format(quoteaddr(recipient))
code, message = await self.do_cmd("RCPT", to_addr, *options)
return code, message
async def quit(self):
"""
Sends a SMTP 'QUIT' command. - Ends the session.
For further details, please check out `RFC 5321 § 4.1.1.10`_.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response. If the connection is already closed when calling this
method, returns (-1, None).
.. _`RFC 5321 § 4.1.1.10`: https://tools.ietf.org/html/rfc5321#section-4.1.1.10
"""
code = -1
message = None
try:
code, message = await self.do_cmd("QUIT")
except ConnectionError:
# We voluntarily ignore this kind of exceptions since... the
# connection seems already closed.
pass
except SMTPCommandFailedError:
pass
await self.close()
return code, message
async def data(self, email_message):
"""
Sends a SMTP 'DATA' command. - Transmits the message to the server.
If ``email_message`` is a bytes object, sends it as it is. Else,
makes all the required changes so it can be safely trasmitted to the
SMTP server.`
For further details, please check out `RFC 5321 § 4.1.1.4`_.
Args:
email_message (str or bytes): Message to be sent.
Raises:
ConnectionError subclass: If the connection to the server is
unexpectedly lost.
SMTPCommandFailedError: If the DATA command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server last
response (the one the server sent after all data were sent by
the client).
.. seealso:: :meth:`SMTP.prepare_message`
.. _`RFC 5321 § 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4
"""
code, message = await self.do_cmd("DATA", success=(354,))
email_message = SMTP.prepare_message(email_message)
self.writer.write(email_message) # write is non-blocking.
await self.writer.drain() # don't forget to drain.
code, message = await self.reader.read_reply()
return code, message
async def auth(self, username, password):
"""
Tries to authenticate user against the SMTP server.
Args:
username (str): Username to authenticate with.
password (str): Password to use along with the given ``username``.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
SMTPLoginError: If the authentication failed (either because all
attempts failed or because there was no suitable authentication
mechanism).
Returns:
(int, str): A (code, message) 2-tuple containing the last server
response.
"""
# EHLO/HELO is required:
await self.ehlo_or_helo_if_needed()
errors = [] # To store SMTPAuthenticationErrors
code = message = None
# Try to authenticate using all mechanisms supported by both
# server and client (and only these):
for auth, meth in self.__class__._supported_auth_mechanisms.items():
if auth in self.auth_mechanisms:
auth_func = getattr(self, meth)
try:
code, message = await auth_func(username, password)
except SMTPAuthenticationError as e:
errors.append(e)
else:
break
else:
if not errors:
err = "Could not find any suitable authentication mechanism."
errors.append(SMTPAuthenticationError(-1, err))
raise SMTPLoginError(errors)
return code, message
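# The selection loop above walks the client's preference table in order and
# stops at the first mechanism the server also advertises. A minimal standalone
# sketch of that rule (the helper name is illustrative, not part of the class):

```python
# Pick the first client-supported mechanism that the server also advertises.
# Relies on dicts preserving insertion order (Python 3.7+), which is what
# makes _supported_auth_mechanisms a preference-ordered table.
def pick_auth_mechanism(supported, advertised):
    for mechanism in supported:
        if mechanism in advertised:
            return mechanism
    return None

# Preference order mirrors SMTP._supported_auth_mechanisms:
client_prefs = {
    "cram-md5": "_auth_cram_md5",
    "plain": "_auth_plain",
    "login": "_auth_login",
}
chosen = pick_auth_mechanism(client_prefs, ["login", "plain"])
```

# With the server advertising only LOGIN and PLAIN, the client's first
# preference (CRAM-MD5) is skipped and PLAIN is chosen.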
async def starttls(self, context=None):
"""
Upgrades the connection to the SMTP server into TLS mode.
If there has been no previous EHLO or HELO command this session, this
method tries ESMTP EHLO first.
If the server supports SSL/TLS, this will encrypt the rest of the SMTP
session.
Raises:
SMTPCommandNotSupportedError: If the server does not support STARTTLS.
SMTPCommandFailedError: If the STARTTLS command fails
BadImplementationError: If the connection does not use aioopenssl.
Args:
context (:obj:`OpenSSL.SSL.Context`): SSL context
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
if not self.use_aioopenssl:
raise BadImplementationError("This connection does not use aioopenssl")
import aioopenssl
import OpenSSL
await self.ehlo_or_helo_if_needed()
if "starttls" not in self.esmtp_extensions:
raise SMTPCommandNotSupportedError("STARTTLS")
code, message = await self.do_cmd("STARTTLS", success=(220,))
# Don't check for code, do_cmd did it
if context is None:
context = OpenSSL.SSL.Context(OpenSSL.SSL.TLSv1_2_METHOD)
await self.transport.starttls(ssl_context=context)
# RFC 3207:
# The client MUST discard any knowledge obtained from
# the server, such as the list of SMTP service extensions,
# which was not obtained from the TLS negotiation itself.
# FIXME: wouldn't it be better to use reset_state here ?
# And reset self.reader, self.writer and self.transport just after
# Maybe also self.ssl_context ?
self.last_ehlo_response = (None, None)
self.last_helo_response = (None, None)
self.supports_esmtp = False
self.esmtp_extensions = {}
self.auth_mechanisms = []
return (code, message)
async def sendmail(
self, sender, recipients, message, mail_options=None, rcpt_options=None
):
"""
Performs an entire e-mail transaction.
Example:
>>> try:
>>> async with SMTP() as client:
>>> try:
>>> r = await client.sendmail(sender, recipients, message)
>>> except SMTPException:
>>> print("Error while sending message.")
>>> else:
>>> print("Result: {}.".format(r))
>>> except ConnectionError as e:
>>> print(e)
Result: {}.
Args:
sender (str): E-mail address of the sender.
recipients (list of str or str): E-mail(s) address(es) of the
recipient(s).
message (str or bytes): Message body.
mail_options (list of str): ESMTP options (such as *8BITMIME*) to
send along the *MAIL* command.
rcpt_options (list of str): ESMTP options (such as *DSN*) to
send along all the *RCPT* commands.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
SMTPCommandFailedError: If the server refuses our MAIL command.
SMTPCommandFailedError: If the server refuses our DATA command.
SMTPNoRecipientError: If the server refuses all given
recipients.
Returns:
dict: A dict containing an entry for each recipient that was
refused. Each entry is associated with a (code, message)
2-tuple containing the error code and message, as returned by
the server.
When everything runs smoothly, the returned dict is empty.
.. note:: The connection remains open after. It's your responsibility
to close it. A good practice is to use the asynchronous context
manager instead. See :meth:`SMTP.__aenter__` for further details.
"""
# Make sure `recipients` is a list:
if isinstance(recipients, str):
recipients = [recipients]
# Set some defaults values:
if mail_options is None:
mail_options = []
if rcpt_options is None:
rcpt_options = []
# EHLO or HELO is required:
await self.ehlo_or_helo_if_needed()
if self.supports_esmtp:
if "size" in self.esmtp_extensions:
mail_options.append("size={}".format(len(message)))
await self.mail(sender, mail_options)
errors = []
for recipient in recipients:
try:
await self.rcpt(recipient, rcpt_options)
except SMTPCommandFailedError as e:
errors.append(e)
if len(recipients) == len(errors):
# The server refused all our recipients:
raise SMTPNoRecipientError(errors)
await self.data(message)
# If we got here then somebody got our mail:
return errors
async def send_mail(
self, sender, recipients, message, mail_options=None, rcpt_options=None
):
"""
Alias for :meth:`SMTP.sendmail`.
"""
return await self.sendmail(
sender, recipients, message, mail_options, rcpt_options
)
async def ehlo_or_helo_if_needed(self):
"""
Calls :meth:`SMTP.ehlo` and/or :meth:`SMTP.helo` if needed.
If there hasn't been any previous *EHLO* or *HELO* command this
session, tries to initiate the session. *EHLO* is tried first.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
"""
no_helo = self.last_helo_response == (None, None)
no_ehlo = self.last_ehlo_response == (None, None)
if no_helo and no_ehlo:
try:
# First we try EHLO:
await self.ehlo()
except SMTPCommandFailedError:
# EHLO failed, let's try HELO:
await self.helo()
async def _auth_cram_md5(self, username, password):
"""
Performs an authentication attempt using the CRAM-MD5 mechanism.
Protocol:
1. Send 'AUTH CRAM-MD5' to server ;
2. If the server replies with a 334 return code, we can go on:
1) The challenge (sent by the server) is base64-decoded ;
2) The decoded challenge is hashed using HMAC-MD5 and the user
password as key (shared secret) ;
3) The hashed challenge is converted to a string of lowercase
hexadecimal digits ;
4) The username and a space character are prepended to the hex
digits ;
5) The concatenation is base64-encoded and sent to the server.
6) If the server replies with a return code of 235, user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "CRAM-MD5"
code, message = await self.do_cmd("AUTH", mechanism, success=(334,))
decoded_challenge = base64.b64decode(message)
challenge_hash = hmac.new(
key=password.encode("utf-8"), msg=decoded_challenge, digestmod="md5"
)
hex_hash = challenge_hash.hexdigest()
response = "{} {}".format(username, hex_hash)
encoded_response = SMTP.b64enc(response)
try:
code, message = await self.do_cmd(encoded_response, success=(235, 503))
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
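# Steps 2.1-2.5 of the exchange documented above can be computed offline. This
# sketch builds the client's CRAM-MD5 response from a fabricated challenge
# (the challenge string below is a made-up example, not a real server reply):

```python
import base64
import hmac

def cram_md5_response(username, password, b64_challenge):
    # 1) base64-decode the challenge sent by the server:
    challenge = base64.b64decode(b64_challenge)
    # 2-3) HMAC-MD5 the challenge with the password as key, as lowercase hex:
    digest = hmac.new(password.encode("utf-8"), challenge, "md5").hexdigest()
    # 4-5) prepend "username ", then base64-encode the whole string:
    reply = "{} {}".format(username, digest)
    return base64.b64encode(reply.encode("utf-8")).decode("utf-8")

challenge = base64.b64encode(b"<1896.697170952@postoffice.example>").decode()
response = cram_md5_response("alice", "secret", challenge)
```

# Decoding `response` yields "alice " followed by 32 hex digits, which is
# exactly the payload `_auth_cram_md5` sends back to the server.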
async def _auth_login(self, username, password):
"""
Performs an authentication attempt using the LOGIN mechanism.
Protocol:
1. The username is base64-encoded ;
2. The string 'AUTH LOGIN' and a space character are prepended to
the base64-encoded username and sent to the server ;
3. If the server replies with a 334 return code, we can go on:
1) The password is base64-encoded and sent to the server ;
2) If the server replies with a 235 return code, the user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "LOGIN"
code, message = await self.do_cmd(
"AUTH", mechanism, SMTP.b64enc(username), success=(334,)
)
try:
code, message = await self.do_cmd(SMTP.b64enc(password), success=(235, 503))
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
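# The LOGIN mechanism described above boils down to two base64 payloads sent
# in sequence. A small sketch building both strings the way SMTP.b64enc would
# (the helper name is illustrative):

```python
import base64

def login_payloads(username, password):
    # Each credential is sent separately, base64-encoded, after the server's
    # intermediate 334 replies.
    def b64(s):
        return base64.b64encode(s.encode("utf-8")).decode("utf-8")
    return b64(username), b64(password)

user_b64, pass_b64 = login_payloads("alice", "secret")
```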
async def _auth_plain(self, username, password):
"""
Performs an authentication attempt using the PLAIN mechanism.
Protocol:
1. Format the username and password in a suitable way ;
2. The formatted string is base64-encoded ;
3. The string 'AUTH PLAIN' and a space character are prepended to
the base64-encoded username and password and sent to the
server ;
4. If the server replies with a 235 return code, user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "PLAIN"
credentials = "\0{}\0{}".format(username, password)
encoded_credentials = SMTP.b64enc(credentials)
try:
code, message = await self.do_cmd(
"AUTH", mechanism, encoded_credentials, success=(235, 503)
)
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
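# The PLAIN mechanism above packs both credentials into a single payload:
# "\0username\0password", base64-encoded and sent with one AUTH command. A
# standalone sketch of that formatting (helper name is illustrative):

```python
import base64

def plain_credentials(username, password):
    # NUL-separated authorization-identity (empty), username, password:
    raw = "\0{}\0{}".format(username, password)
    return base64.b64encode(raw.encode("utf-8")).decode("utf-8")

creds = plain_credentials("alice", "secret")
```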
@staticmethod
def parse_esmtp_extensions(message):
"""
Parses the response given by an ESMTP server after a *EHLO* command.
The response is parsed to build:
- A dict of supported ESMTP extensions (with parameters, if any).
- A list of supported authentication methods.
Returns:
(dict, list): A (extensions, auth_mechanisms) 2-tuple containing
the supported extensions and authentication methods.
"""
extns = {}
auths = []
oldstyle_auth_regex = re.compile(r"auth=(?P<auth>.*)", re.IGNORECASE)
extension_regex = re.compile(
r"(?P<feature>[a-z0-9][a-z0-9\-]*) ?", re.IGNORECASE
)
lines = message.splitlines()
for line in lines[1:]:
# To be able to communicate with as many SMTP servers as possible,
# we have to take the old-style auth advertisement into account.
match = oldstyle_auth_regex.match(line)
if match:
auth = match.group("auth")
auth = auth.lower().strip()
if auth not in auths:
auths.append(auth)
# RFC 1869 requires a space between EHLO keyword and parameters.
# It's actually stricter, in that only spaces are allowed between
# parameters, but we're not going to check for that here.
# Note that the space isn't present if there are no parameters.
match = extension_regex.match(line)
if match:
feature = match.group("feature").lower()
params = match.string[match.end("feature") :].strip()
extns[feature] = params
if feature == "auth":
auths.extend([param.strip().lower() for param in params.split()])
return extns, auths
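# A minimal re-implementation of the parsing above, run on a fabricated EHLO
# reply, shows the shape of the (extensions, auth_mechanisms) result. This is
# a sketch for illustration; it skips the old-style "AUTH=" handling:

```python
import re

def parse_ehlo(message):
    extns, auths = {}, []
    feature_re = re.compile(r"(?P<feature>[a-z0-9][a-z0-9\-]*) ?", re.IGNORECASE)
    # The first line is the server greeting, not an extension:
    for line in message.splitlines()[1:]:
        match = feature_re.match(line)
        if not match:
            continue
        feature = match.group("feature").lower()
        params = line[match.end("feature"):].strip()
        extns[feature] = params
        if feature == "auth":
            auths.extend(p.strip().lower() for p in params.split())
    return extns, auths

sample = ("smtp.example.com greets you\n"
          "SIZE 35882577\n"
          "8BITMIME\n"
          "AUTH PLAIN LOGIN CRAM-MD5\n"
          "STARTTLS")
extensions, mechanisms = parse_ehlo(sample)
```

# Extensions without parameters (8BITMIME, STARTTLS) map to empty strings,
# while AUTH both appears as an extension and feeds the mechanism list.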
@staticmethod
def prepare_message(message):
"""
Returns the given message encoded in ascii with a format suitable for
SMTP transmission:
- Makes sure the message is ASCII encoded ;
- Normalizes line endings to '\r\n' ;
- Adds a (second) period at the beginning of lines that start
with a period ;
- Makes sure the message ends with '\r\n.\r\n'.
For further details, please check out RFC 5321 `§ 4.1.1.4`_
and `§ 4.5.2`_.
.. _`§ 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4
.. _`§ 4.5.2`: https://tools.ietf.org/html/rfc5321#section-4.5.2
"""
if isinstance(message, bytes):
bytes_message = message
else:
bytes_message = message.encode("ascii")
# The original algorithm uses regexes to do this stuff.
# This one is -IMHO- more pythonic and it is slightly faster.
#
# Another version is even faster, but I chose to keep something
# more pythonic and readable.
# FYI, the fastest way to do all this stuff seems to be
# (according to my benchmarks):
#
# bytes_message.replace(b"\r\n", b"\n") \
# .replace(b"\r", b"\n") \
# .replace(b"\n", b"\r\n")
#
# DOT_LINE_REGEX = re.compile(rb"^\.", re.MULTILINE)
# bytes_message = DOT_LINE_REGEX.sub(b"..", bytes_message)
#
# if not bytes_message.endswith(b"\r\n"):
# bytes_message += b"\r\n"
#
# bytes_message += b"\r\n.\r\n"
lines = []
for line in bytes_message.splitlines():
if line.startswith(b"."):
line = line.replace(b".", b"..", 1)
lines.append(line)
# Recompose the message with <CRLF> only:
bytes_message = b"\r\n".join(lines)
# Make sure message ends with <CRLF>.<CRLF>:
bytes_message += b"\r\n.\r\n"
return bytes_message
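# A standalone sketch of the transformation performed above: normalize line
# endings to CRLF, dot-stuff lines that start with a period (RFC 5321
# § 4.5.2 transparency), and terminate with the <CRLF>.<CRLF> marker:

```python
def prepare(message):
    data = message if isinstance(message, bytes) else message.encode("ascii")
    lines = []
    for line in data.splitlines():
        # Double a leading period so the server doesn't mistake the line
        # for the end-of-data marker:
        if line.startswith(b"."):
            line = b"." + line
        lines.append(line)
    # Rejoin with <CRLF> only, then append the end-of-data marker:
    return b"\r\n".join(lines) + b"\r\n.\r\n"

wire = prepare("Subject: hi\n.hidden dot\nbye")
```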
@staticmethod
def b64enc(s):
"""
Base64-encodes the given string and returns it as a :obj:`str`.
This is a simple helper function that takes a str, base64-encodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
s (str): String to be converted to base64.
Returns:
str: A base64-encoded string.
"""
return base64.b64encode(s.encode("utf-8")).decode("utf-8")
@staticmethod
def b64dec(b):
"""
Base64-decodes the given :obj:`bytes` and converts it to a :obj:`str`.
This is a simple helper function that takes a bytes, base64-decodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
b (bytes): A base64-encoded bytes.
Returns:
str: A base64-decoded string.
"""
return base64.b64decode(b).decode("utf-8")
|
hwmrocker/smtplibaio | smtplibaio/smtp.py | SMTP._auth_cram_md5 | python | async def _auth_cram_md5(self, username, password):
mechanism = "CRAM-MD5"
code, message = await self.do_cmd("AUTH", mechanism, success=(334,))
decoded_challenge = base64.b64decode(message)
challenge_hash = hmac.new(
key=password.encode("utf-8"), msg=decoded_challenge, digestmod="md5"
)
hex_hash = challenge_hash.hexdigest()
response = "{} {}".format(username, hex_hash)
encoded_response = SMTP.b64enc(response)
try:
code, message = await self.do_cmd(encoded_response, success=(235, 503))
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message | Performs an authentication attempt using the CRAM-MD5 mechanism.
Protocol:
1. Send 'AUTH CRAM-MD5' to server ;
2. If the server replies with a 334 return code, we can go on:
1) The challenge (sent by the server) is base64-decoded ;
2) The decoded challenge is hashed using HMAC-MD5 and the user
password as key (shared secret) ;
3) The hashed challenge is converted to a string of lowercase
hexadecimal digits ;
4) The username and a space character are prepended to the hex
digits ;
5) The concatenation is base64-encoded and sent to the server.
6) If the server replies with a return code of 235, user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response. | train | https://github.com/hwmrocker/smtplibaio/blob/84ce8e45b7e706476739d0efcb416c18ecabbbb6/smtplibaio/smtp.py#L861-L913 | null | class SMTP:
"""
SMTP or ESMTP client.
This should follow RFC 5321 (SMTP), RFC 1869 (ESMTP), RFC 2554 (SMTP
Authentication) and RFC 2487 (Secure SMTP over TLS).
Attributes:
hostname (str): Hostname of the SMTP server we are connecting to.
port (int): Port on which the SMTP server listens for connections.
timeout (int): Not used.
last_helo_response ((int or None, str or None)): A (code, message)
2-tuple containing the last *HELO* response.
last_ehlo_response ((int or None, str or None)): A (code, message)
2-tuple containing the last *EHLO* response.
supports_esmtp (bool): True if the server supports ESMTP (set after a
*EHLO* command, False otherwise.
esmtp_extensions (dict): ESMTP extensions and parameters supported by
the SMTP server (set after a *EHLO* command).
auth_mechanisms (list of str): Authentication mechanisms supported by
the SMTP server.
ssl_context (bool): Always False. (Used in SMTP_SSL subclass)
reader (:class:`streams.SMTPStreamReader`): SMTP stream reader, used
to read server responses.
writer (:class:`streams.SMTPStreamWriter`): SMTP stream writer, used
to send commands to the server.
transport (:class:`asyncio.BaseTransport`): Communication channel
abstraction between client and server.
loop (:class:`asyncio.BaseEventLoop`): Event loop to use.
use_aioopenssl (bool): If True, the connection is made using the
aioopenssl module. Defaults to False.
_fqdn (str): Client FQDN. Used to identify the client to the
server.
Class Attributes:
_default_port (int): Default port to use. Defaults to 25.
_supported_auth_mechanisms (dict): Dict containing the information
about supported authentication mechanisms, ordered by preference
of use. The entries consist in :
- The authentication mechanism name, in lowercase, as given by
SMTP servers.
- The name of the method to call to authenticate using the
mechanism.
"""
_default_port = 25
_supported_auth_mechanisms = {
"cram-md5": "_auth_cram_md5",
"plain": "_auth_plain",
"login": "_auth_login",
}
def __init__(
self,
hostname="localhost",
port=_default_port,
fqdn=None,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
loop=None,
use_aioopenssl=False,
):
"""
Initializes a new :class:`SMTP` instance.
Args:
hostname (str): Hostname of the SMTP server to connect to.
port (int): Port to use to connect to the SMTP server.
fqdn (str or None): Client Fully Qualified Domain Name. This is
used to identify the client to the server.
timeout (int): Not used.
loop (:class:`asyncio.BaseEventLoop`): Event loop to use.
use_aioopenssl (bool): Use the aioopenssl module to open
the connection. This is mandatory if you plan on using
STARTTLS.
"""
self.hostname = hostname
try:
self.port = int(port)
except ValueError:
self.port = self.__class__._default_port
self.timeout = timeout
self._fqdn = fqdn
self.loop = loop or asyncio.get_event_loop()
self.use_aioopenssl = use_aioopenssl
self.reset_state()
@property
def fqdn(self):
"""
Returns the string used to identify the client when initiating a SMTP
session.
RFC 5321 `§ 4.1.1.1`_ and `§ 4.1.3`_ tell us what to do:
- Use the client FQDN ;
- If it isn't available, we SHOULD fall back to an address literal.
Returns:
str: The value that should be used as the client FQDN.
.. _`§ 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
.. _`§ 4.1.3`: https://tools.ietf.org/html/rfc5321#section-4.1.3
"""
if self._fqdn is None:
# Let's try to retrieve it:
self._fqdn = socket.getfqdn()
if "." not in self._fqdn:
try:
info = socket.getaddrinfo(
host="localhost", port=None, proto=socket.IPPROTO_TCP
)
except socket.gaierror:
addr = "127.0.0.1"
else:
# We only consider the first returned result and we're
# only interested in getting the IP(v4 or v6) address:
addr = info[0][4][0]
self._fqdn = "[{}]".format(addr)
return self._fqdn
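# The fallback rule implemented above (RFC 5321 § 4.1.3) can be sketched in
# isolation: a name without a dot is not a usable FQDN, so the client
# identifies itself with a bracketed address literal instead. The helper name
# and the fallback address argument are illustrative:

```python
import socket

def client_identity(fqdn, fallback_addr="127.0.0.1"):
    # Use the FQDN when it looks fully qualified, else an address literal:
    return fqdn if "." in fqdn else "[{}]".format(fallback_addr)

ident = client_identity(socket.getfqdn())
```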
def reset_state(self):
"""
Resets some attributes to their default values.
This is especially useful when initializing a newly created
:class:`SMTP` instance and when closing an existing SMTP session.
It allows us to use the same SMTP instance and connect several times.
"""
self.last_helo_response = (None, None)
self.last_ehlo_response = (None, None)
self.supports_esmtp = False
self.esmtp_extensions = {}
self.auth_mechanisms = []
self.ssl_context = False
self.reader = None
self.writer = None
self.transport = None
async def __aenter__(self):
"""
Enters the asynchronous context manager.
Also tries to connect to the server.
Raises:
SMTPConnectionRefusedError: If the connection between client and
SMTP server can not be established.
.. seealso:: :meth:`SMTP.connect`
"""
await self.connect()
return self
async def __aexit__(self, *args):
"""
Exits the asynchronous context manager.
Closes the connection and resets instance attributes.
.. seealso:: :meth:`SMTP.quit`
"""
await self.quit()
async def connect(self):
"""
Connects to the server.
.. note:: This method is automatically invoked by
:meth:`SMTP.__aenter__`. The code is mostly borrowed from the
:func:`asyncio.streams.open_connection` source code.
Raises:
ConnectionError subclass: If the connection between client and
SMTP server can not be established.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
# First build the reader:
self.reader = SMTPStreamReader(loop=self.loop)
# Then build the protocol:
protocol = asyncio.StreamReaderProtocol(self.reader, loop=self.loop)
# With the just-built reader and protocol, create the connection and
# get the transport stream:
conn = {
"protocol_factory": lambda: protocol,
"host": self.hostname,
"port": self.port,
}
if self.use_aioopenssl:
conn.update(
{
"use_starttls": not self.ssl_context,
"ssl_context_factory": lambda transport: self.ssl_context,
"server_hostname": self.hostname, # For SSL
}
)
import aioopenssl
# This may raise a ConnectionError exception, which we let bubble up.
self.transport, _ = await aioopenssl.create_starttls_connection(
self.loop, **conn
)
# HACK: aioopenssl transports don't implement is_closing, and thus drain() fails...
self.transport.is_closing = lambda: False
else:
conn["ssl"] = self.ssl_context
# This may raise a ConnectionError exception, which we let bubble up.
self.transport, _ = await self.loop.create_connection(**conn)
# If the connection has been established, build the writer:
self.writer = SMTPStreamWriter(self.transport, protocol, self.reader, self.loop)
code, message = await self.reader.read_reply()
if code != 220:
raise ConnectionRefusedError(code, message)
return code, message
async def do_cmd(self, *args, success=None):
"""
Sends the given command to the server.
Args:
*args: Command and arguments to be sent to the server.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
if success is None:
success = (250,)
cmd = " ".join(args)
await self.writer.send_command(cmd)
code, message = await self.reader.read_reply()
if code not in success:
raise SMTPCommandFailedError(code, message, cmd)
return code, message
async def helo(self, from_host=None):
"""
Sends a SMTP 'HELO' command. - Identifies the client and starts the
session.
If given ``from_host`` is None, defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our HELO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("HELO", from_host)
self.last_helo_response = (code, message)
return code, message
async def ehlo(self, from_host=None):
"""
Sends a SMTP 'EHLO' command. - Identifies the client and starts the
session.
If given ``from_host`` is None, defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("EHLO", from_host)
self.last_ehlo_response = (code, message)
extns, auths = SMTP.parse_esmtp_extensions(message)
self.esmtp_extensions = extns
self.auth_mechanisms = auths
self.supports_esmtp = True
return code, message
async def help(self, command_name=None):
"""
Sends a SMTP 'HELP' command.
For further details please check out `RFC 5321 § 4.1.1.8`_.
Args:
command_name (str or None, optional): Name of a command for which
you want help. For example, if you want to get help about the
'*RSET*' command, you'd call ``help('RSET')``.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the HELP command fails.
Returns:
Help text as given by the server.
.. _`RFC 5321 § 4.1.1.8`: https://tools.ietf.org/html/rfc5321#section-4.1.1.8
"""
if command_name is None:
command_name = ""
code, message = await self.do_cmd("HELP", command_name)
return message
async def rset(self):
"""
Sends a SMTP 'RSET' command. - Resets the session.
For further details, please check out `RFC 5321 § 4.1.1.5`_.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the RSET command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.5`: https://tools.ietf.org/html/rfc5321#section-4.1.1.5
"""
return await self.do_cmd("RSET")
async def noop(self):
"""
Sends a SMTP 'NOOP' command. - Doesn't do anything.
For further details, please check out `RFC 5321 § 4.1.1.9`_.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the NOOP command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.9`: https://tools.ietf.org/html/rfc5321#section-4.1.1.9
"""
return await self.do_cmd("NOOP")
async def vrfy(self, address):
"""
Sends a SMTP 'VRFY' command. - Tests the validity of the given address.
For further details, please check out `RFC 5321 § 4.1.1.6`_.
Args:
address (str): E-mail address to be checked.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the VRFY command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.6`: https://tools.ietf.org/html/rfc5321#section-4.1.1.6
"""
return await self.do_cmd("VRFY", address)
async def expn(self, address):
"""
Sends a SMTP 'EXPN' command. - Expands a mailing-list.
For further details, please check out `RFC 5321 § 4.1.1.7`_.
Args:
address (str): E-mail address to expand.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the EXPN command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.7`: https://tools.ietf.org/html/rfc5321#section-4.1.1.7
"""
return await self.do_cmd("EXPN", address)
async def mail(self, sender, options=None):
"""
Sends a SMTP 'MAIL' command. - Starts the mail transfer session.
For further details, please check out `RFC 5321 § 4.1.1.2`_ and
`§ 3.3`_.
Args:
sender (str): Sender mailbox (used as reverse-path).
options (list of str or None, optional): Additional options to send
along with the *MAIL* command.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the MAIL command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.2`: https://tools.ietf.org/html/rfc5321#section-4.1.1.2
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
from_addr = "FROM:{}".format(quoteaddr(sender))
code, message = await self.do_cmd("MAIL", from_addr, *options)
return code, message
async def rcpt(self, recipient, options=None):
"""
Sends a SMTP 'RCPT' command. - Indicates a recipient for the e-mail.
For further details, please check out `RFC 5321 § 4.1.1.3`_ and
`§ 3.3`_.
Args:
recipient (str): E-mail address of one recipient.
options (list of str or None, optional): Additional options to send
along with the *RCPT* command.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the RCPT command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.3`: https://tools.ietf.org/html/rfc5321#section-4.1.1.3
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
to_addr = "TO:{}".format(quoteaddr(recipient))
code, message = await self.do_cmd("RCPT", to_addr, *options)
return code, message
async def quit(self):
"""
Sends a SMTP 'QUIT' command. - Ends the session.
For further details, please check out `RFC 5321 § 4.1.1.10`_.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response. If the connection is already closed when calling this
method, returns (-1, None).
.. _`RFC 5321 § 4.1.1.10`: https://tools.ietf.org/html/rfc5321#section-4.1.1.10
"""
code = -1
message = None
try:
code, message = await self.do_cmd("QUIT")
except ConnectionError:
# We voluntarily ignore this kind of exceptions since... the
# connection seems already closed.
pass
except SMTPCommandFailedError:
pass
await self.close()
return code, message
async def data(self, email_message):
"""
Sends a SMTP 'DATA' command. - Transmits the message to the server.
If ``email_message`` is a bytes object, sends it as it is. Else,
makes all the required changes so it can be safely transmitted to the
SMTP server.
For further details, please check out `RFC 5321 § 4.1.1.4`_.
Args:
email_message (str or bytes): Message to be sent.
Raises:
ConnectionError subclass: If the connection to the server is
unexpectedly lost.
SMTPCommandFailedError: If the DATA command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server last
response (the one the server sent after all data were sent by
the client).
.. seealso:: :meth:`SMTP.prepare_message`
.. _`RFC 5321 § 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4
"""
code, message = await self.do_cmd("DATA", success=(354,))
email_message = SMTP.prepare_message(email_message)
self.writer.write(email_message) # write is non-blocking.
await self.writer.drain() # don't forget to drain.
code, message = await self.reader.read_reply()
return code, message
async def auth(self, username, password):
"""
Tries to authenticate user against the SMTP server.
Args:
username (str): Username to authenticate with.
password (str): Password to use along with the given ``username``.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
SMTPLoginError: If the authentication failed (either because all
attempts failed or because there was no suitable authentication
mechanism).
Returns:
(int, str): A (code, message) 2-tuple containing the last server
response.
"""
# EHLO/HELO is required:
await self.ehlo_or_helo_if_needed()
errors = [] # To store SMTPAuthenticationErrors
code = message = None
# Try to authenticate using all mechanisms supported by both
# server and client (and only these):
for auth, meth in self.__class__._supported_auth_mechanisms.items():
if auth in self.auth_mechanisms:
auth_func = getattr(self, meth)
try:
code, message = await auth_func(username, password)
except SMTPAuthenticationError as e:
errors.append(e)
else:
break
else:
if not errors:
err = "Could not find any suitable authentication mechanism."
errors.append(SMTPAuthenticationError(-1, err))
raise SMTPLoginError(errors)
return code, message
async def starttls(self, context=None):
"""
Upgrades the connection to the SMTP server into TLS mode.
If there has been no previous EHLO or HELO command this session, this
method tries ESMTP EHLO first.
If the server supports SSL/TLS, this will encrypt the rest of the SMTP
session.
Raises:
SMTPCommandNotSupportedError: If the server does not support STARTTLS.
SMTPCommandFailedError: If the STARTTLS command fails
BadImplementationError: If the connection does not use aioopenssl.
Args:
context (:obj:`OpenSSL.SSL.Context`): SSL context
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
if not self.use_aioopenssl:
raise BadImplementationError("This connection does not use aioopenssl")
import aioopenssl
import OpenSSL
await self.ehlo_or_helo_if_needed()
if "starttls" not in self.esmtp_extensions:
raise SMTPCommandNotSupportedError("STARTTLS")
code, message = await self.do_cmd("STARTTLS", success=(220,))
# Don't check for code, do_cmd did it
if context is None:
context = OpenSSL.SSL.Context(OpenSSL.SSL.TLSv1_2_METHOD)
await self.transport.starttls(ssl_context=context)
# RFC 3207:
# The client MUST discard any knowledge obtained from
# the server, such as the list of SMTP service extensions,
# which was not obtained from the TLS negotiation itself.
# FIXME: wouldn't it be better to use reset_state here ?
# And reset self.reader, self.writer and self.transport just after
# Maybe also self.ssl_context ?
self.last_ehlo_response = (None, None)
self.last_helo_response = (None, None)
self.supports_esmtp = False
self.esmtp_extensions = {}
self.auth_mechanisms = []
return (code, message)
async def sendmail(
self, sender, recipients, message, mail_options=None, rcpt_options=None
):
"""
Performs an entire e-mail transaction.
Example:
>>> try:
>>>     async with SMTP() as client:
>>>         try:
>>>             r = await client.sendmail(sender, recipients, message)
>>>         except SMTPException:
>>>             print("Error while sending message.")
>>>         else:
>>>             print("Result: {}.".format(r))
>>> except ConnectionError as e:
>>>     print(e)
Result: {}.
Args:
sender (str): E-mail address of the sender.
recipients (list of str or str): E-mail(s) address(es) of the
recipient(s).
message (str or bytes): Message body.
mail_options (list of str): ESMTP options (such as *8BITMIME*) to
send along the *MAIL* command.
rcpt_options (list of str): ESMTP options (such as *DSN*) to
send along all the *RCPT* commands.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
SMTPCommandFailedError: If the server refuses our MAIL command.
SMTPCommandFailedError: If the server refuses our DATA command.
SMTPNoRecipientError: If the server refuses all given
recipients.
Returns:
dict: A dict containing an entry for each recipient that was
refused. Each entry is associated with a (code, message)
2-tuple containing the error code and message, as returned by
the server.
When everything runs smoothly, the returned dict is empty.
.. note:: The connection remains open afterwards. It's your responsibility
to close it. A good practice is to use the asynchronous context
manager instead. See :meth:`SMTP.__aenter__` for further details.
"""
# Make sure `recipients` is a list:
if isinstance(recipients, str):
recipients = [recipients]
# Set some defaults values:
if mail_options is None:
mail_options = []
if rcpt_options is None:
rcpt_options = []
# EHLO or HELO is required:
await self.ehlo_or_helo_if_needed()
if self.supports_esmtp:
if "size" in self.esmtp_extensions:
mail_options.append("size={}".format(len(message)))
await self.mail(sender, mail_options)
errors = []
for recipient in recipients:
try:
await self.rcpt(recipient, rcpt_options)
except SMTPCommandFailedError as e:
errors.append(e)
if len(recipients) == len(errors):
# The server refused all our recipients:
raise SMTPNoRecipientError(errors)
await self.data(message)
# If we got here then somebody got our mail:
return errors
async def send_mail(
self, sender, recipients, message, mail_options=None, rcpt_options=None
):
"""
Alias for :meth:`SMTP.sendmail`.
"""
return await self.sendmail(
sender, recipients, message, mail_options, rcpt_options
)
async def ehlo_or_helo_if_needed(self):
"""
Calls :meth:`SMTP.ehlo` and/or :meth:`SMTP.helo` if needed.
If there hasn't been any previous *EHLO* or *HELO* command this
session, tries to initiate the session. *EHLO* is tried first.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
"""
no_helo = self.last_helo_response == (None, None)
no_ehlo = self.last_ehlo_response == (None, None)
if no_helo and no_ehlo:
try:
# First we try EHLO:
await self.ehlo()
except SMTPCommandFailedError:
# EHLO failed, let's try HELO:
await self.helo()
async def close(self):
"""
Cleans up after the connection to the SMTP server has been closed
(voluntarily or not).
"""
if self.writer is not None:
# Close the transport:
try:
self.writer.close()
except OSError as exc:
if exc.errno != errno.ENOTCONN:
raise
self.reset_state()
async def _auth_login(self, username, password):
"""
Performs an authentication attempt using the LOGIN mechanism.
Protocol:
1. The username is base64-encoded ;
2. The string 'AUTH LOGIN' and a space character are prepended to
the base64-encoded username and sent to the server ;
3. If the server replies with a 334 return code, we can go on:
1) The password is base64-encoded and sent to the server ;
2) If the server replies with a 235 return code, the user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "LOGIN"
code, message = await self.do_cmd(
"AUTH", mechanism, SMTP.b64enc(username), success=(334,)
)
try:
code, message = await self.do_cmd(SMTP.b64enc(password), success=(235, 503))
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
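The LOGIN exchange described above can be sketched offline, since only the base64 encoding of the two credentials is involved. This is a minimal sketch, not the client itself; the username `alice` and password `s3cret` are hypothetical, and `b64` mirrors the shape of `SMTP.b64enc`.

```python
import base64

def b64(s):
    # str in, base64 str out -- same shape as SMTP.b64enc.
    return base64.b64encode(s.encode("utf-8")).decode("utf-8")

# Steps 1-2: the username is base64-encoded and appended to "AUTH LOGIN".
first_command = "AUTH LOGIN " + b64("alice")
# Step 3: after a 334 reply, the base64-encoded password is sent on its own line.
password_line = b64("s3cret")

print(first_command)   # AUTH LOGIN YWxpY2U=
print(password_line)   # czNjcmV0
```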
async def _auth_plain(self, username, password):
"""
Performs an authentication attempt using the PLAIN mechanism.
Protocol:
1. Format the username and password in a suitable way ;
2. The formatted string is base64-encoded ;
3. The string 'AUTH PLAIN' and a space character are prepended to
the base64-encoded username and password and sent to the
server ;
4. If the server replies with a 235 return code, user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "PLAIN"
credentials = "\0{}\0{}".format(username, password)
encoded_credentials = SMTP.b64enc(credentials)
try:
code, message = await self.do_cmd(
"AUTH", mechanism, encoded_credentials, success=(235, 503)
)
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
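The PLAIN credential formatting used above can also be checked offline. A minimal sketch, assuming hypothetical credentials; as in `_auth_plain`, the authorization-id before the first NUL byte is left empty.

```python
import base64

username, password = "alice", "s3cret"  # hypothetical credentials
# PLAIN payload: authorization-id NUL authentication-id NUL password.
credentials = "\0{}\0{}".format(username, password)
token = base64.b64encode(credentials.encode("utf-8")).decode("utf-8")

print("AUTH PLAIN " + token)  # AUTH PLAIN AGFsaWNlAHMzY3JldA==
```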
@staticmethod
def parse_esmtp_extensions(message):
"""
Parses the response given by an ESMTP server after a *EHLO* command.
The response is parsed to build:
- A dict of supported ESMTP extensions (with parameters, if any).
- A list of supported authentication methods.
Returns:
(dict, list): A (extensions, auth_mechanisms) 2-tuple containing
the supported extensions and authentication methods.
"""
extns = {}
auths = []
oldstyle_auth_regex = re.compile(r"auth=(?P<auth>.*)", re.IGNORECASE)
extension_regex = re.compile(
r"(?P<feature>[a-z0-9][a-z0-9\-]*) ?", re.IGNORECASE
)
lines = message.splitlines()
for line in lines[1:]:
# To be able to communicate with as many SMTP servers as possible,
# we have to take the old-style auth advertisement into account.
match = oldstyle_auth_regex.match(line)
if match:
auth = match.group("auth")
auth = auth.lower().strip()
if auth not in auths:
auths.append(auth)
# RFC 1869 requires a space between EHLO keyword and parameters.
# It's actually stricter, in that only spaces are allowed between
# parameters, but we're not going to check for that here.
# Note that the space isn't present if there are no parameters.
match = extension_regex.match(line)
if match:
feature = match.group("feature").lower()
params = match.string[match.end("feature") :].strip()
extns[feature] = params
if feature == "auth":
auths.extend([param.strip().lower() for param in params.split()])
return extns, auths
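The parsing described above can be exercised standalone. This is a sketch that mirrors the regex logic of `parse_esmtp_extensions` (the helper name and the sample EHLO reply are illustrative; reply-code prefixes are assumed to have been stripped already, as the feature regex expects).

```python
import re

def parse_extensions(message):
    # Mirror of SMTP.parse_esmtp_extensions: the first reply line is the
    # greeting and is skipped; each following line advertises one extension.
    extension_regex = re.compile(r"(?P<feature>[a-z0-9][a-z0-9\-]*) ?", re.IGNORECASE)
    extns, auths = {}, []
    for line in message.splitlines()[1:]:
        match = extension_regex.match(line)
        if match:
            feature = match.group("feature").lower()
            params = match.string[match.end("feature"):].strip()
            extns[feature] = params
            if feature == "auth":
                auths.extend(p.strip().lower() for p in params.split())
    return extns, auths

# Hypothetical multi-line EHLO reply:
reply = "smtp.example.com greets you\nSIZE 35882577\n8BITMIME\nAUTH PLAIN LOGIN\nSTARTTLS"
extns, auths = parse_extensions(reply)
print(extns)   # {'size': '35882577', '8bitmime': '', 'auth': 'PLAIN LOGIN', 'starttls': ''}
print(auths)   # ['plain', 'login']
```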
@staticmethod
def prepare_message(message):
"""
Returns the given message encoded in ascii with a format suitable for
SMTP transmission:
- Makes sure the message is ASCII encoded ;
- Normalizes line endings to '\r\n' ;
- Adds a (second) period at the beginning of lines that start
with a period ;
- Makes sure the message ends with '\r\n.\r\n'.
For further details, please check out RFC 5321 `§ 4.1.1.4`_
and `§ 4.5.2`_.
.. _`§ 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4
.. _`§ 4.5.2`: https://tools.ietf.org/html/rfc5321#section-4.5.2
"""
if isinstance(message, bytes):
bytes_message = message
else:
bytes_message = message.encode("ascii")
# The original algorithm uses regexes to do this stuff.
# This one is -IMHO- more pythonic and it is slightly faster.
#
# Another version is even faster, but I chose to keep something
# more pythonic and readable.
# FYI, the fastest way to do all this stuff seems to be
# (according to my benchmarks):
#
# bytes_message.replace(b"\r\n", b"\n") \
# .replace(b"\r", b"\n") \
# .replace(b"\n", b"\r\n")
#
# DOT_LINE_REGEX = re.compile(rb"^\.", re.MULTILINE)
# bytes_message = DOT_LINE_REGEX.sub(b"..", bytes_message)
#
# if not bytes_message.endswith(b"\r\n"):
# bytes_message += b"\r\n"
#
# bytes_message += b"\r\n.\r\n"
lines = []
for line in bytes_message.splitlines():
if line.startswith(b"."):
line = line.replace(b".", b"..", 1)
lines.append(line)
# Recompose the message with <CRLF> only:
bytes_message = b"\r\n".join(lines)
# Make sure message ends with <CRLF>.<CRLF>:
bytes_message += b"\r\n.\r\n"
return bytes_message
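As a quick check of the transformation above, here is a standalone mirror of the splitlines/dot-stuffing logic (the free function is illustrative, not part of the class):

```python
def prepare_message(message):
    # Mirror of SMTP.prepare_message: ASCII bytes, CRLF line endings,
    # dot-stuffing, and the final <CRLF>.<CRLF> terminator.
    data = message if isinstance(message, bytes) else message.encode("ascii")
    lines = []
    for line in data.splitlines():
        if line.startswith(b"."):
            line = line.replace(b".", b"..", 1)  # RFC 5321 § 4.5.2 dot-stuffing
        lines.append(line)
    return b"\r\n".join(lines) + b"\r\n.\r\n"

print(prepare_message("Hello\n.hidden dot\nBye"))
# b'Hello\r\n..hidden dot\r\nBye\r\n.\r\n'
```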
@staticmethod
def b64enc(s):
"""
Base64-encodes the given string and returns it as a :obj:`str`.
This is a simple helper function that takes a str, base64-encodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
s (str): String to be converted to base64.
Returns:
str: A base64-encoded string.
"""
return base64.b64encode(s.encode("utf-8")).decode("utf-8")
@staticmethod
def b64dec(b):
"""
Base64-decodes the given :obj:`bytes` and converts it to a :obj:`str`.
This is a simple helper function that takes a bytes, base64-decodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
b (bytes): A base64-encoded bytes.
Returns:
str: A base64-decoded string.
"""
return base64.b64decode(b).decode("utf-8")
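The two helpers above form a simple round trip; a minimal sketch of the same str-to-str conversions:

```python
import base64

def b64enc(s):
    # str -> base64 str, as in SMTP.b64enc.
    return base64.b64encode(s.encode("utf-8")).decode("utf-8")

def b64dec(b):
    # base64 str/bytes -> str, as in SMTP.b64dec.
    return base64.b64decode(b).decode("utf-8")

token = b64enc("username")
print(token)          # dXNlcm5hbWU=
print(b64dec(token))  # username
```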
class SMTP:
"""
SMTP or ESMTP client.
This should follow RFC 5321 (SMTP), RFC 1869 (ESMTP), RFC 2554 (SMTP
Authentication) and RFC 2487 (Secure SMTP over TLS).
Attributes:
hostname (str): Hostname of the SMTP server we are connecting to.
port (int): Port on which the SMTP server listens for connections.
timeout (int): Not used.
last_helo_response ((int or None, str or None)): A (code, message)
2-tuple containing the last *HELO* response.
last_ehlo_response ((int or None, str or None)): A (code, message)
2-tuple containing the last *EHLO* response.
supports_esmtp (bool): True if the server supports ESMTP (set after a
*EHLO* command), False otherwise.
esmtp_extensions (dict): ESMTP extensions and parameters supported by
the SMTP server (set after a *EHLO* command).
auth_mechanisms (list of str): Authentication mechanisms supported by
the SMTP server.
ssl_context (bool): Always False. (Used in SMTP_SSL subclass)
reader (:class:`streams.SMTPStreamReader`): SMTP stream reader, used
to read server responses.
writer (:class:`streams.SMTPStreamWriter`): SMTP stream writer, used
to send commands to the server.
transport (:class:`asyncio.BaseTransport`): Communication channel
abstraction between client and server.
loop (:class:`asyncio.BaseEventLoop`): Event loop to use.
use_aioopenssl (bool): If True, the connection is made using the
aioopenssl module. Defaults to False.
_fqdn (str): Client FQDN. Used to identify the client to the
server.
Class Attributes:
_default_port (int): Default port to use. Defaults to 25.
_supported_auth_mechanisms (dict): Dict containing the information
about supported authentication mechanisms, ordered by preference
of use. The entries consist in :
- The authentication mechanism name, in lowercase, as given by
SMTP servers.
- The name of the method to call to authenticate using the
mechanism.
"""
_default_port = 25
_supported_auth_mechanisms = {
"cram-md5": "_auth_cram_md5",
"plain": "_auth_plain",
"login": "_auth_login",
}
def __init__(
self,
hostname="localhost",
port=_default_port,
fqdn=None,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
loop=None,
use_aioopenssl=False,
):
"""
Initializes a new :class:`SMTP` instance.
Args:
hostname (str): Hostname of the SMTP server to connect to.
port (int): Port to use to connect to the SMTP server.
fqdn (str or None): Client Fully Qualified Domain Name. This is
used to identify the client to the server.
timeout (int): Not used.
loop (:class:`asyncio.BaseEventLoop`): Event loop to use.
use_aioopenssl (bool): Use the aioopenssl module to open
the connection. This is mandatory if you plan on using
STARTTLS.
"""
self.hostname = hostname
try:
self.port = int(port)
except ValueError:
self.port = self.__class__._default_port
self.timeout = timeout
self._fqdn = fqdn
self.loop = loop or asyncio.get_event_loop()
self.use_aioopenssl = use_aioopenssl
self.reset_state()
@property
def fqdn(self):
"""
Returns the string used to identify the client when initiating a SMTP
session.
RFC 5321 `§ 4.1.1.1`_ and `§ 4.1.3`_ tell us what to do:
- Use the client FQDN ;
- If it isn't available, we SHOULD fall back to an address literal.
Returns:
str: The value that should be used as the client FQDN.
.. _`§ 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
.. _`§ 4.1.3`: https://tools.ietf.org/html/rfc5321#section-4.1.3
"""
if self._fqdn is None:
# Let's try to retrieve it:
self._fqdn = socket.getfqdn()
if "." not in self._fqdn:
try:
info = socket.getaddrinfo(
host="localhost", port=None, proto=socket.IPPROTO_TCP
)
except socket.gaierror:
addr = "127.0.0.1"
else:
# We only consider the first returned result and we're
# only interested in getting the IP(v4 or v6) address:
addr = info[0][4][0]
self._fqdn = "[{}]".format(addr)
return self._fqdn
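The fallback described in the docstring can be sketched on its own: on a host whose name contains no dot it produces an RFC 5321 address literal such as `[127.0.0.1]` instead of a bare hostname.

```python
import socket

fqdn = socket.getfqdn()
if "." not in fqdn:
    # No dotted FQDN available: fall back to an address literal.
    try:
        info = socket.getaddrinfo(host="localhost", port=None, proto=socket.IPPROTO_TCP)
    except socket.gaierror:
        addr = "127.0.0.1"
    else:
        addr = info[0][4][0]  # first result, IP address only
    fqdn = "[{}]".format(addr)

print(fqdn)
```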
def reset_state(self):
"""
Resets some attributes to their default values.
This is especially useful when initializing a newly created
:class:`SMTP` instance and when closing an existing SMTP session.
It allows us to use the same SMTP instance and connect several times.
"""
self.last_helo_response = (None, None)
self.last_ehlo_response = (None, None)
self.supports_esmtp = False
self.esmtp_extensions = {}
self.auth_mechanisms = []
self.ssl_context = False
self.reader = None
self.writer = None
self.transport = None
async def __aenter__(self):
"""
Enters the asynchronous context manager.
Also tries to connect to the server.
Raises:
SMTPConnectionRefusedError: If the connection between client and
SMTP server can not be established.
.. seealso:: :meth:`SMTP.connect`
"""
await self.connect()
return self
async def __aexit__(self, *args):
"""
Exits the asynchronous context manager.
Closes the connection and resets instance attributes.
.. seealso:: :meth:`SMTP.quit`
"""
await self.quit()
async def connect(self):
"""
Connects to the server.
.. note:: This method is automatically invoked by
:meth:`SMTP.__aenter__`. The code is mostly borrowed from the
:func:`asyncio.streams.open_connection` source code.
Raises:
ConnectionError subclass: If the connection between client and
SMTP server can not be established.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
# First build the reader:
self.reader = SMTPStreamReader(loop=self.loop)
# Then build the protocol:
protocol = asyncio.StreamReaderProtocol(self.reader, loop=self.loop)
# With the just-built reader and protocol, create the connection and
# get the transport stream:
conn = {
"protocol_factory": lambda: protocol,
"host": self.hostname,
"port": self.port,
}
if self.use_aioopenssl:
conn.update(
{
"use_starttls": not self.ssl_context,
"ssl_context_factory": lambda transport: self.ssl_context,
"server_hostname": self.hostname, # For SSL
}
)
import aioopenssl
# This may raise a ConnectionError exception, which we let bubble up.
self.transport, _ = await aioopenssl.create_starttls_connection(
self.loop, **conn
)
# HACK: aioopenssl transports don't implement is_closing, and thus drain() fails...
self.transport.is_closing = lambda: False
else:
conn["ssl"] = self.ssl_context
# This may raise a ConnectionError exception, which we let bubble up.
self.transport, _ = await self.loop.create_connection(**conn)
# If the connection has been established, build the writer:
self.writer = SMTPStreamWriter(self.transport, protocol, self.reader, self.loop)
code, message = await self.reader.read_reply()
if code != 220:
raise ConnectionRefusedError(code, message)
return code, message
async def do_cmd(self, *args, success=None):
"""
Sends the given command to the server.
Args:
*args: Command and arguments to be sent to the server.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
if success is None:
success = (250,)
cmd = " ".join(args)
await self.writer.send_command(cmd)
code, message = await self.reader.read_reply()
if code not in success:
raise SMTPCommandFailedError(code, message, cmd)
return code, message
async def helo(self, from_host=None):
"""
Sends a SMTP 'HELO' command. - Identifies the client and starts the
session.
If given ``from_host`` is None, defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our HELO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("HELO", from_host)
self.last_helo_response = (code, message)
return code, message
async def ehlo(self, from_host=None):
"""
Sends a SMTP 'EHLO' command. - Identifies the client and starts the
session.
If given ``from_host`` is None, defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("EHLO", from_host)
self.last_ehlo_response = (code, message)
extns, auths = SMTP.parse_esmtp_extensions(message)
self.esmtp_extensions = extns
self.auth_mechanisms = auths
self.supports_esmtp = True
return code, message
async def help(self, command_name=None):
"""
Sends a SMTP 'HELP' command.
For further details please check out `RFC 5321 § 4.1.1.8`_.
Args:
command_name (str or None, optional): Name of a command for which
you want help. For example, if you want to get help about the
'*RSET*' command, you'd call ``help('RSET')``.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the HELP command fails.
Returns:
Help text as given by the server.
.. _`RFC 5321 § 4.1.1.8`: https://tools.ietf.org/html/rfc5321#section-4.1.1.8
"""
if command_name is None:
command_name = ""
code, message = await self.do_cmd("HELP", command_name)
return message
async def rset(self):
"""
Sends a SMTP 'RSET' command. - Resets the session.
For further details, please check out `RFC 5321 § 4.1.1.5`_.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the RSET command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.5`: https://tools.ietf.org/html/rfc5321#section-4.1.1.5
"""
return await self.do_cmd("RSET")
async def noop(self):
"""
Sends a SMTP 'NOOP' command. - Doesn't do anything.
For further details, please check out `RFC 5321 § 4.1.1.9`_.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the NOOP command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.9`: https://tools.ietf.org/html/rfc5321#section-4.1.1.9
"""
return await self.do_cmd("NOOP")
async def vrfy(self, address):
"""
Sends a SMTP 'VRFY' command. - Tests the validity of the given address.
For further details, please check out `RFC 5321 § 4.1.1.6`_.
Args:
address (str): E-mail address to be checked.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the VRFY command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.6`: https://tools.ietf.org/html/rfc5321#section-4.1.1.6
"""
return await self.do_cmd("VRFY", address)
async def expn(self, address):
"""
Sends a SMTP 'EXPN' command. - Expands a mailing-list.
For further details, please check out `RFC 5321 § 4.1.1.7`_.
Args:
address (str): E-mail address to expand.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the EXPN command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.7`: https://tools.ietf.org/html/rfc5321#section-4.1.1.7
"""
return await self.do_cmd("EXPN", address)
async def mail(self, sender, options=None):
"""
Sends a SMTP 'MAIL' command. - Starts the mail transfer session.
For further details, please check out `RFC 5321 § 4.1.1.2`_ and
`§ 3.3`_.
Args:
sender (str): Sender mailbox (used as reverse-path).
options (list of str or None, optional): Additional options to send
along with the *MAIL* command.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the MAIL command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.2`: https://tools.ietf.org/html/rfc5321#section-4.1.1.2
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
from_addr = "FROM:{}".format(quoteaddr(sender))
code, message = await self.do_cmd("MAIL", from_addr, *options)
return code, message
async def rcpt(self, recipient, options=None):
"""
Sends a SMTP 'RCPT' command. - Indicates a recipient for the e-mail.
For further details, please check out `RFC 5321 § 4.1.1.3`_ and
`§ 3.3`_.
Args:
recipient (str): E-mail address of one recipient.
options (list of str or None, optional): Additional options to send
along with the *RCPT* command.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the RCPT command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.3`: https://tools.ietf.org/html/rfc5321#section-4.1.1.3
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
to_addr = "TO:{}".format(quoteaddr(recipient))
code, message = await self.do_cmd("RCPT", to_addr, *options)
return code, message
async def quit(self):
"""
Sends a SMTP 'QUIT' command. - Ends the session.
For further details, please check out `RFC 5321 § 4.1.1.10`_.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response. If the connection is already closed when calling this
method, returns (-1, None).
.. _`RFC 5321 § 4.1.1.10`: https://tools.ietf.org/html/rfc5321#section-4.1.1.10
"""
code = -1
message = None
try:
code, message = await self.do_cmd("QUIT")
except ConnectionError:
# We voluntarily ignore this kind of exception since... the
# connection seems already closed.
pass
except SMTPCommandFailedError:
pass
await self.close()
return code, message
async def data(self, email_message):
"""
Sends a SMTP 'DATA' command. - Transmits the message to the server.
If ``email_message`` is a bytes object, sends it as it is. Else,
makes all the required changes so it can be safely transmitted to the
SMTP server.
For further details, please check out `RFC 5321 § 4.1.1.4`_.
Args:
email_message (str or bytes): Message to be sent.
Raises:
ConnectionError subclass: If the connection to the server is
unexpectedly lost.
SMTPCommandFailedError: If the DATA command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server's last
response (the one the server sent after all data were sent by
the client).
.. seealso:: :meth:`SMTP.prepare_message`
.. _`RFC 5321 § 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4
"""
code, message = await self.do_cmd("DATA", success=(354,))
email_message = SMTP.prepare_message(email_message)
self.writer.write(email_message) # write is non-blocking.
await self.writer.drain() # don't forget to drain.
code, message = await self.reader.read_reply()
return code, message
async def auth(self, username, password):
"""
Tries to authenticate user against the SMTP server.
Args:
username (str): Username to authenticate with.
password (str): Password to use along with the given ``username``.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
SMTPLoginError: If the authentication failed (either because all
attempts failed or because there was no suitable authentication
mechanism).
Returns:
(int, str): A (code, message) 2-tuple containing the last server
response.
"""
# EHLO/HELO is required:
await self.ehlo_or_helo_if_needed()
errors = [] # To store SMTPAuthenticationErrors
code = message = None
# Try to authenticate using all mechanisms supported by both
# server and client (and only these):
for auth, meth in self.__class__._supported_auth_mechanisms.items():
if auth in self.auth_mechanisms:
auth_func = getattr(self, meth)
try:
code, message = await auth_func(username, password)
except SMTPAuthenticationError as e:
errors.append(e)
else:
break
else:
if not errors:
err = "Could not find any suitable authentication mechanism."
errors.append(SMTPAuthenticationError(-1, err))
raise SMTPLoginError(errors)
return code, message
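The selection loop above pairs the client's preference-ordered `_supported_auth_mechanisms` dict with whatever the server advertised. Its effect can be sketched without a connection; the advertised list below is hypothetical.

```python
# Client preference order, as in SMTP._supported_auth_mechanisms:
supported = {"cram-md5": "_auth_cram_md5", "plain": "_auth_plain", "login": "_auth_login"}
server_advertised = ["login", "plain"]  # hypothetical EHLO AUTH line

# Mechanisms are tried in client order, skipping what the server lacks:
tried_in_order = [m for m in supported if m in server_advertised]
print(tried_in_order)  # ['plain', 'login']
```

Because dicts preserve insertion order, the client's preference ranking (CRAM-MD5 before PLAIN before LOGIN) is what decides which mechanism is attempted first.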
async def starttls(self, context=None):
"""
Upgrades the connection to the SMTP server into TLS mode.
If there has been no previous EHLO or HELO command this session, this
method tries ESMTP EHLO first.
If the server supports SSL/TLS, this will encrypt the rest of the SMTP
session.
Raises:
SMTPCommandNotSupportedError: If the server does not support STARTTLS.
SMTPCommandFailedError: If the STARTTLS command fails
BadImplementationError: If the connection does not use aioopenssl.
Args:
context (:obj:`OpenSSL.SSL.Context`): SSL context
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
if not self.use_aioopenssl:
raise BadImplementationError("This connection does not use aioopenssl")
import aioopenssl
import OpenSSL
await self.ehlo_or_helo_if_needed()
if "starttls" not in self.esmtp_extensions:
raise SMTPCommandNotSupportedError("STARTTLS")
code, message = await self.do_cmd("STARTTLS", success=(220,))
# Don't check for code, do_cmd did it
if context is None:
context = OpenSSL.SSL.Context(OpenSSL.SSL.TLSv1_2_METHOD)
await self.transport.starttls(ssl_context=context)
# RFC 3207:
# The client MUST discard any knowledge obtained from
# the server, such as the list of SMTP service extensions,
# which was not obtained from the TLS negotiation itself.
# FIXME: wouldn't it be better to use reset_state here ?
# And reset self.reader, self.writer and self.transport just after
# Maybe also self.ssl_context ?
self.last_ehlo_response = (None, None)
self.last_helo_response = (None, None)
self.supports_esmtp = False
self.esmtp_extensions = {}
self.auth_mechanisms = []
return (code, message)
async def sendmail(
self, sender, recipients, message, mail_options=None, rcpt_options=None
):
"""
Performs an entire e-mail transaction.
Example:
>>> try:
>>>     async with SMTP() as client:
>>>         try:
>>>             r = await client.sendmail(sender, recipients, message)
>>>         except SMTPException:
>>>             print("Error while sending message.")
>>>         else:
>>>             print("Result: {}.".format(r))
>>> except ConnectionError as e:
>>>     print(e)
Result: {}.
Args:
sender (str): E-mail address of the sender.
recipients (list of str or str): E-mail(s) address(es) of the
recipient(s).
message (str or bytes): Message body.
mail_options (list of str): ESMTP options (such as *8BITMIME*) to
send along the *MAIL* command.
rcpt_options (list of str): ESMTP options (such as *DSN*) to
send along all the *RCPT* commands.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
SMTPCommandFailedError: If the server refuses our MAIL command.
SMTPCommandFailedError: If the server refuses our DATA command.
SMTPNoRecipientError: If the server refuses all given
recipients.
Returns:
list: A list containing an :class:`SMTPCommandFailedError` for
each recipient that was refused by the server; each error
carries the code and message returned for that recipient.
When everything runs smoothly, the returned list is empty.
.. note:: The connection remains open afterwards. It's your responsibility
to close it. A good practice is to use the asynchronous context
manager instead. See :meth:`SMTP.__aenter__` for further details.
"""
# Make sure `recipients` is a list:
if isinstance(recipients, str):
recipients = [recipients]
# Set some default values:
if mail_options is None:
mail_options = []
if rcpt_options is None:
rcpt_options = []
# EHLO or HELO is required:
await self.ehlo_or_helo_if_needed()
if self.supports_esmtp:
if "size" in self.esmtp_extensions:
mail_options.append("size={}".format(len(message)))
await self.mail(sender, mail_options)
errors = []
for recipient in recipients:
try:
await self.rcpt(recipient, rcpt_options)
except SMTPCommandFailedError as e:
errors.append(e)
if len(recipients) == len(errors):
# The server refused all our recipients:
raise SMTPNoRecipientError(errors)
await self.data(message)
# If we got here then somebody got our mail:
return errors
async def send_mail(
self, sender, recipients, message, mail_options=None, rcpt_options=None
):
"""
Alias for :meth:`SMTP.sendmail`.
"""
return await self.sendmail(
sender, recipients, message, mail_options, rcpt_options
)
async def ehlo_or_helo_if_needed(self):
"""
Calls :meth:`SMTP.ehlo` and/or :meth:`SMTP.helo` if needed.
If there hasn't been any previous *EHLO* or *HELO* command this
session, tries to initiate the session. *EHLO* is tried first.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
"""
no_helo = self.last_helo_response == (None, None)
no_ehlo = self.last_ehlo_response == (None, None)
if no_helo and no_ehlo:
try:
# First we try EHLO:
await self.ehlo()
except SMTPCommandFailedError:
# EHLO failed, let's try HELO:
await self.helo()
async def close(self):
"""
Cleans up after the connection to the SMTP server has been closed
(voluntarily or not).
"""
if self.writer is not None:
# Close the transport:
try:
self.writer.close()
except OSError as exc:
if exc.errno != errno.ENOTCONN:
raise
self.reset_state()
async def _auth_cram_md5(self, username, password):
"""
Performs an authentication attempt using the CRAM-MD5 mechanism.
Protocol:
1. Send 'AUTH CRAM-MD5' to server ;
2. If the server replies with a 334 return code, we can go on:
1) The challenge (sent by the server) is base64-decoded ;
2) The decoded challenge is hashed using HMAC-MD5 and the user
password as key (shared secret) ;
3) The hashed challenge is converted to a string of lowercase
hexadecimal digits ;
4) The username and a space character are prepended to the hex
digits ;
5) The concatenation is base64-encoded and sent to the server.
6) If the server replies with a return code of 235, user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "CRAM-MD5"
code, message = await self.do_cmd("AUTH", mechanism, success=(334,))
decoded_challenge = base64.b64decode(message)
challenge_hash = hmac.new(
key=password.encode("utf-8"), msg=decoded_challenge, digestmod="md5"
)
hex_hash = challenge_hash.hexdigest()
response = "{} {}".format(username, hex_hash)
encoded_response = SMTP.b64enc(response)
try:
code, message = await self.do_cmd(encoded_response, success=(235, 503))
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
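The challenge/response arithmetic in steps 1) through 5) above can be checked offline. A minimal sketch of those steps, without the network I/O (the helper name is illustrative, not part of the library):

```python
import base64
import hmac


def cram_md5_response(username: str, password: str, b64_challenge: str) -> str:
    """Build the base64 reply to a CRAM-MD5 challenge."""
    # 1) Base64-decode the server challenge:
    challenge = base64.b64decode(b64_challenge)
    # 2-3) HMAC-MD5 of the challenge, keyed with the shared secret,
    # as lowercase hex digits:
    digest = hmac.new(password.encode("utf-8"), challenge, "md5").hexdigest()
    # 4-5) Prepend "<username> " and base64-encode the result:
    return base64.b64encode(
        "{} {}".format(username, digest).encode("utf-8")
    ).decode("ascii")
```

Fed the RFC 2195 example challenge with user ``tim`` and secret ``tanstaaftanstaaf``, the decoded reply is ``tim b913a602c7eda7a495b4e6e7334d3890``.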
async def _auth_plain(self, username, password):
"""
Performs an authentication attempt using the PLAIN mechanism.
Protocol:
1. Format the username and password in a suitable way ;
2. The formatted string is base64-encoded ;
3. The string 'AUTH PLAIN' and a space character are prepended to
the base64-encoded username and password and sent to the
server ;
4. If the server replies with a 235 return code, user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "PLAIN"
credentials = "\0{}\0{}".format(username, password)
encoded_credentials = SMTP.b64enc(credentials)
try:
code, message = await self.do_cmd(
"AUTH", mechanism, encoded_credentials, success=(235, 503)
)
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
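The credential formatting done in steps 1 and 2 above (RFC 4616) is easy to reproduce standalone; a sketch with an illustrative helper name:

```python
import base64


def plain_credentials(username: str, password: str) -> str:
    """AUTH PLAIN initial response: base64("\\0<user>\\0<password>")."""
    # NUL-separated authorization identity (empty), username, password:
    raw = "\0{}\0{}".format(username, password)
    return base64.b64encode(raw.encode("utf-8")).decode("ascii")
```

Decoding the token and splitting on NUL bytes recovers the three fields.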
@staticmethod
def parse_esmtp_extensions(message):
"""
Parses the response given by an ESMTP server after a *EHLO* command.
The response is parsed to build:
- A dict of supported ESMTP extensions (with parameters, if any).
- A list of supported authentication methods.
Returns:
(dict, list): A (extensions, auth_mechanisms) 2-tuple containing
the supported extensions and authentication methods.
"""
extns = {}
auths = []
oldstyle_auth_regex = re.compile(r"auth=(?P<auth>.*)", re.IGNORECASE)
extension_regex = re.compile(
r"(?P<feature>[a-z0-9][a-z0-9\-]*) ?", re.IGNORECASE
)
lines = message.splitlines()
for line in lines[1:]:
# To be able to communicate with as many SMTP servers as possible,
# we have to take the old-style auth advertisement into account.
match = oldstyle_auth_regex.match(line)
if match:
for auth in match.group("auth").split():
auth = auth.lower()
if auth not in auths:
auths.append(auth)
# RFC 1869 requires a space between EHLO keyword and parameters.
# It's actually stricter, in that only spaces are allowed between
# parameters, but we're not going to check for that here.
# Note that the space isn't present if there are no parameters.
match = extension_regex.match(line)
if match:
feature = match.group("feature").lower()
params = match.string[match.end("feature") :].strip()
extns[feature] = params
if feature == "auth":
auths.extend([param.strip().lower() for param in params.split()])
return extns, auths
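The parsing above can be exercised without a live server. The following self-contained sketch applies the same two regexes to a typical EHLO reply (function name and sample reply are illustrative):

```python
import re


def parse_ehlo(message: str):
    """Split a multiline EHLO reply into (extensions, auth mechanisms)."""
    extns, auths = {}, []
    oldstyle = re.compile(r"auth=(?P<auth>.*)", re.IGNORECASE)
    extension = re.compile(r"(?P<feature>[a-z0-9][a-z0-9\-]*) ?", re.IGNORECASE)
    # The first line is the server greeting; skip it:
    for line in message.splitlines()[1:]:
        match = oldstyle.match(line)
        if match:  # old-style "AUTH=..." advertisement
            for auth in match.group("auth").split():
                if auth.lower() not in auths:
                    auths.append(auth.lower())
        match = extension.match(line)
        if match:
            feature = match.group("feature").lower()
            params = match.string[match.end("feature"):].strip()
            extns[feature] = params
            if feature == "auth":
                auths.extend(p.strip().lower() for p in params.split())
    return extns, auths
```

For a reply advertising ``SIZE``, ``8BITMIME``, ``AUTH PLAIN LOGIN`` and ``STARTTLS``, this yields the extension dict plus ``["plain", "login"]``.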
@staticmethod
def prepare_message(message):
"""
Returns the given message encoded in ascii with a format suitable for
SMTP transmission:
- Makes sure the message is ASCII encoded ;
- Normalizes line endings to '\r\n' ;
- Adds a (second) period at the beginning of lines that start
with a period ;
- Makes sure the message ends with '\r\n.\r\n'.
For further details, please check out RFC 5321 `§ 4.1.1.4`_
and `§ 4.5.2`_.
.. _`§ 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4
.. _`§ 4.5.2`: https://tools.ietf.org/html/rfc5321#section-4.5.2
"""
if isinstance(message, bytes):
bytes_message = message
else:
bytes_message = message.encode("ascii")
# The original algorithm uses regexes to do this stuff.
# This one is -IMHO- more pythonic and it is slightly faster.
#
# Another version is even faster, but I chose to keep something
# more pythonic and readable.
# FYI, the fastest way to do all this stuff seems to be
# (according to my benchmarks):
#
# bytes_message.replace(b"\r\n", b"\n") \
# .replace(b"\r", b"\n") \
# .replace(b"\n", b"\r\n")
#
# DOT_LINE_REGEX = re.compile(rb"^\.", re.MULTILINE)
# bytes_message = DOT_LINE_REGEX.sub(b"..", bytes_message)
#
# if not bytes_message.endswith(b"\r\n"):
# bytes_message += b"\r\n"
#
# bytes_message += b"\r\n.\r\n"
lines = []
for line in bytes_message.splitlines():
if line.startswith(b"."):
line = line.replace(b".", b"..", 1)
lines.append(line)
# Recompose the message with <CRLF> only:
bytes_message = b"\r\n".join(lines)
# Make sure message ends with <CRLF>.<CRLF>:
bytes_message += b"\r\n.\r\n"
return bytes_message
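The whole transformation (ASCII, CRLF normalization, dot-stuffing, terminator) fits in a few lines; a standalone sketch with an illustrative name:

```python
def smtp_frame(message: str) -> bytes:
    """ASCII-encode, normalize line endings to CRLF, dot-stuff, and
    append the <CRLF>.<CRLF> end-of-data marker (RFC 5321 § 4.5.2)."""
    lines = []
    for line in message.encode("ascii").splitlines():
        if line.startswith(b"."):
            line = b"." + line  # double a leading period
        lines.append(line)
    return b"\r\n".join(lines) + b"\r\n.\r\n"
```

A line starting with a period, such as ``.hidden``, comes out as ``..hidden`` so the server does not mistake it for the end-of-data marker.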
@staticmethod
def b64enc(s):
"""
Base64-encodes the given string and returns it as a :obj:`str`.
This is a simple helper function that takes a str, base64-encodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
s (str): String to be converted to base64.
Returns:
str: A base64-encoded string.
"""
return base64.b64encode(s.encode("utf-8")).decode("utf-8")
@staticmethod
def b64dec(b):
"""
Base64-decodes the given :obj:`bytes` and converts it to a :obj:`str`.
This is a simple helper function that takes a bytes, base64-decodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
b (bytes): A base64-encoded bytes.
Returns:
str: A base64-decoded string.
"""
return base64.b64decode(b).decode("utf-8")
class SMTP:
"""
SMTP or ESMTP client.
This should follow RFC 5321 (SMTP), RFC 1869 (ESMTP), RFC 2554 (SMTP
Authentication) and RFC 2487 (Secure SMTP over TLS).
Attributes:
hostname (str): Hostname of the SMTP server we are connecting to.
port (int): Port on which the SMTP server listens for connections.
timeout (int): Not used.
last_helo_response ((int or None, str or None)): A (code, message)
2-tuple containing the last *HELO* response.
last_ehlo_response ((int or None, str or None)): A (code, message)
2-tuple containing the last *EHLO* response.
supports_esmtp (bool): True if the server supports ESMTP (set after
an *EHLO* command), False otherwise.
esmtp_extensions (dict): ESMTP extensions and parameters supported by
the SMTP server (set after an *EHLO* command).
auth_mechanisms (list of str): Authentication mechanisms supported by
the SMTP server.
ssl_context (bool): Always False. (Used in SMTP_SSL subclass)
reader (:class:`streams.SMTPStreamReader`): SMTP stream reader, used
to read server responses.
writer (:class:`streams.SMTPStreamWriter`): SMTP stream writer, used
to send commands to the server.
transport (:class:`asyncio.BaseTransport`): Communication channel
abstraction between client and server.
loop (:class:`asyncio.BaseEventLoop`): Event loop to use.
use_aioopenssl (bool): If True, the connection is made using the
aioopenssl module. Defaults to False.
_fqdn (str): Client FQDN. Used to identify the client to the
server.
Class Attributes:
_default_port (int): Default port to use. Defaults to 25.
_supported_auth_mechanisms (dict): Dict containing the information
about supported authentication mechanisms, ordered by preference
of use. The entries consist in :
- The authentication mechanism name, in lowercase, as given by
SMTP servers.
- The name of the method to call to authenticate using the
mechanism.
"""
_default_port = 25
_supported_auth_mechanisms = {
"cram-md5": "_auth_cram_md5",
"plain": "_auth_plain",
"login": "_auth_login",
}
def __init__(
self,
hostname="localhost",
port=_default_port,
fqdn=None,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
loop=None,
use_aioopenssl=False,
):
"""
Initializes a new :class:`SMTP` instance.
Args:
hostname (str): Hostname of the SMTP server to connect to.
port (int): Port to use to connect to the SMTP server.
fqdn (str or None): Client Fully Qualified Domain Name. This is
used to identify the client to the server.
timeout (int): Not used.
loop (:class:`asyncio.BaseEventLoop`): Event loop to use.
use_aioopenssl (bool): Use the aioopenssl module to open
the connection. This is mandatory if you plan on using
STARTTLS.
"""
self.hostname = hostname
try:
self.port = int(port)
except ValueError:
self.port = self.__class__._default_port
self.timeout = timeout
self._fqdn = fqdn
self.loop = loop or asyncio.get_event_loop()
self.use_aioopenssl = use_aioopenssl
self.reset_state()
@property
def fqdn(self):
"""
Returns the string used to identify the client when initiating a SMTP
session.
RFC 5321 `§ 4.1.1.1`_ and `§ 4.1.3`_ tell us what to do:
- Use the client FQDN ;
- If it isn't available, we SHOULD fall back to an address literal.
Returns:
str: The value that should be used as the client FQDN.
.. _`§ 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
.. _`§ 4.1.3`: https://tools.ietf.org/html/rfc5321#section-4.1.3
"""
if self._fqdn is None:
# Let's try to retrieve it:
self._fqdn = socket.getfqdn()
if "." not in self._fqdn:
try:
info = socket.getaddrinfo(
host="localhost", port=None, proto=socket.IPPROTO_TCP
)
except socket.gaierror:
addr = "127.0.0.1"
else:
# We only consider the first returned result and we're
# only interested in getting the IP(v4 or v6) address:
addr = info[0][4][0]
self._fqdn = "[{}]".format(addr)
return self._fqdn
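The fallback logic of this property can be sketched as a plain function (the name is illustrative): prefer a real FQDN, otherwise identify as an address literal such as ``[127.0.0.1]``, per RFC 5321 § 4.1.3.

```python
import socket


def ehlo_identity() -> str:
    """Pick the client-identification string for EHLO/HELO."""
    fqdn = socket.getfqdn()
    if "." in fqdn:
        return fqdn  # looks like a real FQDN
    # No FQDN available: fall back to an address literal.
    try:
        info = socket.getaddrinfo(
            host="localhost", port=None, proto=socket.IPPROTO_TCP
        )
        addr = info[0][4][0]  # first result's IP address
    except socket.gaierror:
        addr = "127.0.0.1"
    return "[{}]".format(addr)
```

On a machine without a proper domain name this typically yields ``[127.0.0.1]`` or ``[::1]``.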
def reset_state(self):
"""
Resets some attributes to their default values.
This is especially useful when initializing a newly created
:class:`SMTP` instance and when closing an existing SMTP session.
It allows us to use the same SMTP instance and connect several times.
"""
self.last_helo_response = (None, None)
self.last_ehlo_response = (None, None)
self.supports_esmtp = False
self.esmtp_extensions = {}
self.auth_mechanisms = []
self.ssl_context = False
self.reader = None
self.writer = None
self.transport = None
async def __aenter__(self):
"""
Enters the asynchronous context manager.
Also tries to connect to the server.
Raises:
SMTPConnectionRefusedError: If the connection between client and
SMTP server can not be established.
.. seealso:: :meth:`SMTP.connect`
"""
await self.connect()
return self
async def __aexit__(self, *args):
"""
Exits the asynchronous context manager.
Closes the connection and resets instance attributes.
.. seealso:: :meth:`SMTP.quit`
"""
await self.quit()
async def connect(self):
"""
Connects to the server.
.. note:: This method is automatically invoked by
:meth:`SMTP.__aenter__`. The code is mostly borrowed from the
:func:`asyncio.streams.open_connection` source code.
Raises:
ConnectionError subclass: If the connection between client and
SMTP server can not be established.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
# First build the reader:
self.reader = SMTPStreamReader(loop=self.loop)
# Then build the protocol:
protocol = asyncio.StreamReaderProtocol(self.reader, loop=self.loop)
# With the just-built reader and protocol, create the connection and
# get the transport stream:
conn = {
"protocol_factory": lambda: protocol,
"host": self.hostname,
"port": self.port,
}
if self.use_aioopenssl:
conn.update(
{
"use_starttls": not self.ssl_context,
"ssl_context_factory": lambda transport: self.ssl_context,
"server_hostname": self.hostname, # For SSL
}
)
import aioopenssl
# This may raise a ConnectionError exception, which we let bubble up.
self.transport, _ = await aioopenssl.create_starttls_connection(
self.loop, **conn
)
# HACK: aioopenssl transports don't implement is_closing, and thus drain() fails...
self.transport.is_closing = lambda: False
else:
conn["ssl"] = self.ssl_context
# This may raise a ConnectionError exception, which we let bubble up.
self.transport, _ = await self.loop.create_connection(**conn)
# If the connection has been established, build the writer:
self.writer = SMTPStreamWriter(self.transport, protocol, self.reader, self.loop)
code, message = await self.reader.read_reply()
if code != 220:
raise ConnectionRefusedError(code, message)
return code, message
async def do_cmd(self, *args, success=None):
"""
Sends the given command to the server.
Args:
*args: Command and arguments to be sent to the server.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
if success is None:
success = (250,)
cmd = " ".join(args)
await self.writer.send_command(cmd)
code, message = await self.reader.read_reply()
if code not in success:
raise SMTPCommandFailedError(code, message, cmd)
return code, message
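The ``(code, message)`` pairs returned here come from parsing (possibly multiline) SMTP replies; the real parsing lives in ``SMTPStreamReader.read_reply``. A rough sketch of the reply format, assuming a well-formed reply:

```python
def parse_reply(raw: bytes):
    """Map a raw SMTP reply to (code, message).

    Continuation lines look like "250-text"; the final line uses a
    space instead of a dash: "250 text".
    """
    lines = raw.decode("ascii", errors="replace").splitlines()
    code = int(lines[-1][:3])  # the last line carries the final code
    message = "\n".join(line[4:] for line in lines)  # drop "250-" / "250 "
    return code, message
```

For example, a three-line EHLO reply collapses into one code and a newline-joined message.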
async def helo(self, from_host=None):
"""
Sends a SMTP 'HELO' command. - Identifies the client and starts the
session.
If the given ``from_host`` is None, it defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our HELO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("HELO", from_host)
self.last_helo_response = (code, message)
return code, message
async def ehlo(self, from_host=None):
"""
Sends a SMTP 'EHLO' command. - Identifies the client and starts the
session.
If the given ``from_host`` is None, it defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("EHLO", from_host)
self.last_ehlo_response = (code, message)
extns, auths = SMTP.parse_esmtp_extensions(message)
self.esmtp_extensions = extns
self.auth_mechanisms = auths
self.supports_esmtp = True
return code, message
async def help(self, command_name=None):
"""
Sends a SMTP 'HELP' command.
For further details please check out `RFC 5321 § 4.1.1.8`_.
Args:
command_name (str or None, optional): Name of a command for which
you want help. For example, if you want to get help about the
'*RSET*' command, you'd call ``help('RSET')``.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the HELP command fails.
Returns:
Help text as given by the server.
.. _`RFC 5321 § 4.1.1.8`: https://tools.ietf.org/html/rfc5321#section-4.1.1.8
"""
if command_name is None:
command_name = ""
code, message = await self.do_cmd("HELP", command_name)
return message
async def rset(self):
"""
Sends a SMTP 'RSET' command. - Resets the session.
For further details, please check out `RFC 5321 § 4.1.1.5`_.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the RSET command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.5`: https://tools.ietf.org/html/rfc5321#section-4.1.1.5
"""
return await self.do_cmd("RSET")
async def noop(self):
"""
Sends a SMTP 'NOOP' command. - Doesn't do anything.
For further details, please check out `RFC 5321 § 4.1.1.9`_.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the NOOP command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.9`: https://tools.ietf.org/html/rfc5321#section-4.1.1.9
"""
return await self.do_cmd("NOOP")
async def vrfy(self, address):
"""
Sends a SMTP 'VRFY' command. - Tests the validity of the given address.
For further details, please check out `RFC 5321 § 4.1.1.6`_.
Args:
address (str): E-mail address to be checked.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the VRFY command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.6`: https://tools.ietf.org/html/rfc5321#section-4.1.1.6
"""
return await self.do_cmd("VRFY", address)
async def expn(self, address):
"""
Sends a SMTP 'EXPN' command. - Expands a mailing-list.
For further details, please check out `RFC 5321 § 4.1.1.7`_.
Args:
address (str): E-mail address to expand.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the EXPN command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.7`: https://tools.ietf.org/html/rfc5321#section-4.1.1.7
"""
return await self.do_cmd("EXPN", address)
async def mail(self, sender, options=None):
"""
Sends a SMTP 'MAIL' command. - Starts the mail transfer session.
For further details, please check out `RFC 5321 § 4.1.1.2`_ and
`§ 3.3`_.
Args:
sender (str): Sender mailbox (used as reverse-path).
options (list of str or None, optional): Additional options to send
along with the *MAIL* command.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the MAIL command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.2`: https://tools.ietf.org/html/rfc5321#section-4.1.1.2
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
from_addr = "FROM:{}".format(quoteaddr(sender))
code, message = await self.do_cmd("MAIL", from_addr, *options)
return code, message
async def rcpt(self, recipient, options=None):
"""
Sends a SMTP 'RCPT' command. - Indicates a recipient for the e-mail.
For further details, please check out `RFC 5321 § 4.1.1.3`_ and
`§ 3.3`_.
Args:
recipient (str): E-mail address of one recipient.
options (list of str or None, optional): Additional options to send
along with the *RCPT* command.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the RCPT command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.3`: https://tools.ietf.org/html/rfc5321#section-4.1.1.3
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
to_addr = "TO:{}".format(quoteaddr(recipient))
code, message = await self.do_cmd("RCPT", to_addr, *options)
return code, message
async def quit(self):
"""
Sends a SMTP 'QUIT' command. - Ends the session.
For further details, please check out `RFC 5321 § 4.1.1.10`_.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response. If the connection is already closed when calling this
method, returns (-1, None).
.. _`RFC 5321 § 4.1.1.10`: https://tools.ietf.org/html/rfc5321#section-4.1.1.10
"""
code = -1
message = None
try:
code, message = await self.do_cmd("QUIT")
except ConnectionError:
# We voluntarily ignore this kind of exceptions since... the
# connection seems already closed.
pass
except SMTPCommandFailedError:
pass
await self.close()
return code, message
async def data(self, email_message):
"""
Sends a SMTP 'DATA' command. - Transmits the message to the server.
If ``email_message`` is a bytes object, sends it as it is. Else,
makes all the required changes so it can be safely transmitted to the
SMTP server.
For further details, please check out `RFC 5321 § 4.1.1.4`_.
Args:
email_message (str or bytes): Message to be sent.
Raises:
ConnectionError subclass: If the connection to the server is
unexpectedly lost.
SMTPCommandFailedError: If the DATA command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server last
response (the one the server sent after all data were sent by
the client).
.. seealso: :meth:`SMTP.prepare_message`
.. _`RFC 5321 § 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4
"""
code, message = await self.do_cmd("DATA", success=(354,))
email_message = SMTP.prepare_message(email_message)
self.writer.write(email_message) # write is non-blocking.
await self.writer.drain() # don't forget to drain.
code, message = await self.reader.read_reply()
return code, message
async def auth(self, username, password):
"""
Tries to authenticate user against the SMTP server.
Args:
username (str): Username to authenticate with.
password (str): Password to use along with the given ``username``.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
SMTPLoginError: If the authentication failed (either because all
attempts failed or because there was no suitable authentication
mechanism).
Returns:
(int, str): A (code, message) 2-tuple containing the last server
response.
"""
# EHLO/HELO is required:
await self.ehlo_or_helo_if_needed()
errors = [] # To store SMTPAuthenticationErrors
code = message = None
# Try to authenticate using all mechanisms supported by both
# server and client (and only these):
for auth, meth in self.__class__._supported_auth_mechanisms.items():
if auth in self.auth_mechanisms:
auth_func = getattr(self, meth)
try:
code, message = await auth_func(username, password)
except SMTPAuthenticationError as e:
errors.append(e)
else:
break
else:
if not errors:
err = "Could not find any suitable authentication mechanism."
errors.append(SMTPAuthenticationError(-1, err))
raise SMTPLoginError(errors)
return code, message
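Because ``_supported_auth_mechanisms`` is an insertion-ordered dict, the loop above tries CRAM-MD5 first, then PLAIN, then LOGIN, restricted to what the server advertised. A small sketch of that selection (helper name is illustrative):

```python
def mechanisms_to_try(server_mechanisms):
    """Mechanisms auth() will attempt, in client-preference order."""
    preference = ("cram-md5", "plain", "login")
    return [m for m in preference if m in server_mechanisms]
```

If the server advertises none of the three, the list is empty and ``auth()`` raises ``SMTPLoginError``.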
async def starttls(self, context=None):
"""
Upgrades the connection to the SMTP server into TLS mode.
If there has been no previous EHLO or HELO command this session, this
method tries ESMTP EHLO first.
If the server supports SSL/TLS, this will encrypt the rest of the SMTP
session.
Raises:
SMTPCommandNotSupportedError: If the server does not support STARTTLS.
SMTPCommandFailedError: If the STARTTLS command fails
BadImplementationError: If the connection does not use aioopenssl.
Args:
context (:obj:`OpenSSL.SSL.Context`): SSL context
Returns:
(int, message): A (code, message) 2-tuple containing the server
response.
"""
if not self.use_aioopenssl:
raise BadImplementationError("This connection does not use aioopenssl")
import aioopenssl
import OpenSSL
await self.ehlo_or_helo_if_needed()
if "starttls" not in self.esmtp_extensions:
raise SMTPCommandNotSupportedError("STARTTLS")
code, message = await self.do_cmd("STARTTLS", success=(220,))
# Don't check for code, do_cmd did it
if context is None:
context = OpenSSL.SSL.Context(OpenSSL.SSL.TLSv1_2_METHOD)
await self.transport.starttls(ssl_context=context)
# RFC 3207:
# The client MUST discard any knowledge obtained from
# the server, such as the list of SMTP service extensions,
# which was not obtained from the TLS negotiation itself.
# FIXME: wouldn't it be better to use reset_state here ?
# And reset self.reader, self.writer and self.transport just after
# Maybe also self.ssl_context ?
self.last_ehlo_response = (None, None)
self.last_helo_response = (None, None)
self.supports_esmtp = False
self.esmtp_extensions = {}
self.auth_mechanisms = []
return (code, message)
async def sendmail(
self, sender, recipients, message, mail_options=None, rcpt_options=None
):
"""
Performs an entire e-mail transaction.
Example:
>>> try:
>>> async with SMTP() as client:
>>> try:
>>> r = await client.sendmail(sender, recipients, message)
>>> except SMTPException:
>>> print("Error while sending message.")
>>> else:
>>> print("Result: {}.".format(r))
>>> except ConnectionError as e:
>>> print(e)
Result: {}.
Args:
sender (str): E-mail address of the sender.
recipients (list of str or str): E-mail(s) address(es) of the
recipient(s).
message (str or bytes): Message body.
mail_options (list of str): ESMTP options (such as *8BITMIME*) to
send along the *MAIL* command.
rcpt_options (list of str): ESMTP options (such as *DSN*) to
send along all the *RCPT* commands.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
SMTPCommandFailedError: If the server refuses our MAIL command.
SMTPCommandFailedError: If the server refuses our DATA command.
SMTPNoRecipientError: If the server refuses all given
recipients.
Returns:
list: A list containing an :class:`SMTPCommandFailedError` for
each recipient that was refused by the server; each error
carries the code and message returned for that recipient.
When everything runs smoothly, the returned list is empty.
.. note:: The connection remains open afterwards. It's your responsibility
to close it. A good practice is to use the asynchronous context
manager instead. See :meth:`SMTP.__aenter__` for further details.
"""
# Make sure `recipients` is a list:
if isinstance(recipients, str):
recipients = [recipients]
# Set some default values:
if mail_options is None:
mail_options = []
if rcpt_options is None:
rcpt_options = []
# EHLO or HELO is required:
await self.ehlo_or_helo_if_needed()
if self.supports_esmtp:
if "size" in self.esmtp_extensions:
mail_options.append("size={}".format(len(message)))
await self.mail(sender, mail_options)
errors = []
for recipient in recipients:
try:
await self.rcpt(recipient, rcpt_options)
except SMTPCommandFailedError as e:
errors.append(e)
if len(recipients) == len(errors):
# The server refused all our recipients:
raise SMTPNoRecipientError(errors)
await self.data(message)
# If we got here then somebody got our mail:
return errors
async def send_mail(
self, sender, recipients, message, mail_options=None, rcpt_options=None
):
"""
Alias for :meth:`SMTP.sendmail`.
"""
return await self.sendmail(
sender, recipients, message, mail_options, rcpt_options
)
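A complete transaction built on ``sendmail()`` can be wrapped in a small coroutine. This is a hypothetical helper: it assumes the ``smtplibaio`` package is installed and exposes ``SMTP`` and ``SMTPException`` at the top level (adjust the import to your installation).

```python
import asyncio


async def send_report(hostname, sender, recipients, body):
    """Connect, deliver one message, and report refused recipients."""
    # Deferred import so this sketch can be read/loaded without the package:
    from smtplibaio import SMTP, SMTPException

    try:
        async with SMTP(hostname=hostname) as client:
            errors = await client.sendmail(sender, recipients, body)
    except (ConnectionError, SMTPException) as exc:
        print("Delivery failed: {}".format(exc))
        return None
    # One SMTPCommandFailedError per refused recipient; empty on full success:
    return errors
```

Run it with ``asyncio.run(send_report("smtp.example", "me@example", ["you@example"], "Hi"))``.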
async def ehlo_or_helo_if_needed(self):
"""
Calls :meth:`SMTP.ehlo` and/or :meth:`SMTP.helo` if needed.
If there hasn't been any previous *EHLO* or *HELO* command this
session, tries to initiate the session. *EHLO* is tried first.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
"""
no_helo = self.last_helo_response == (None, None)
no_ehlo = self.last_ehlo_response == (None, None)
if no_helo and no_ehlo:
try:
# First we try EHLO:
await self.ehlo()
except SMTPCommandFailedError:
# EHLO failed, let's try HELO:
await self.helo()
async def close(self):
"""
Cleans up after the connection to the SMTP server has been closed
(voluntarily or not).
"""
if self.writer is not None:
# Close the transport:
try:
self.writer.close()
except OSError as exc:
if exc.errno != errno.ENOTCONN:
raise
self.reset_state()
async def _auth_cram_md5(self, username, password):
"""
Performs an authentication attempt using the CRAM-MD5 mechanism.
Protocol:
1. Send 'AUTH CRAM-MD5' to server ;
2. If the server replies with a 334 return code, we can go on:
1) The challenge (sent by the server) is base64-decoded ;
2) The decoded challenge is hashed using HMAC-MD5 and the user
password as key (shared secret) ;
3) The hashed challenge is converted to a string of lowercase
hexadecimal digits ;
4) The username and a space character are prepended to the hex
digits ;
5) The concatenation is base64-encoded and sent to the server.
6) If the server replies with a return code of 235, user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "CRAM-MD5"
code, message = await self.do_cmd("AUTH", mechanism, success=(334,))
decoded_challenge = base64.b64decode(message)
challenge_hash = hmac.new(
key=password.encode("utf-8"), msg=decoded_challenge, digestmod="md5"
)
hex_hash = challenge_hash.hexdigest()
response = "{} {}".format(username, hex_hash)
encoded_response = SMTP.b64enc(response)
try:
code, message = await self.do_cmd(encoded_response, success=(235, 503))
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
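The challenge/response computation described above can be checked offline. A minimal sketch, assuming a made-up server challenge and placeholder credentials (`alice`/`secret` are invented for illustration, not values from this library):

```python
import base64
import hmac

# Hypothetical values, for illustration only:
challenge_b64 = base64.b64encode(b"<1896.697170952@postoffice.example>")
username = "alice"
password = "secret"

# 1) base64-decode the server challenge:
decoded = base64.b64decode(challenge_b64)

# 2-3) HMAC-MD5 with the password as shared secret, as lowercase hex digits:
hex_hash = hmac.new(password.encode("utf-8"), decoded, "md5").hexdigest()

# 4-5) prepend "username SP", base64-encode, and that is what gets sent:
response = base64.b64encode("{} {}".format(username, hex_hash).encode("utf-8"))
```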
async def _auth_login(self, username, password):
"""
Performs an authentication attempt using the LOGIN mechanism.
Protocol:
1. The username is base64-encoded ;
2. The string 'AUTH LOGIN' and a space character are prepended to
the base64-encoded username and sent to the server ;
3. If the server replies with a 334 return code, we can go on:
1) The password is base64-encoded and sent to the server ;
2) If the server replies with a 235 return code, the user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "LOGIN"
code, message = await self.do_cmd(
"AUTH", mechanism, SMTP.b64enc(username), success=(334,)
)
try:
code, message = await self.do_cmd(SMTP.b64enc(password), success=(235, 503))
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
@staticmethod
def parse_esmtp_extensions(message):
"""
Parses the response given by an ESMTP server after a *EHLO* command.
The response is parsed to build:
- A dict of supported ESMTP extensions (with parameters, if any).
- A list of supported authentication methods.
Returns:
(dict, list): A (extensions, auth_mechanisms) 2-tuple containing
the supported extensions and authentication methods.
"""
extns = {}
auths = []
oldstyle_auth_regex = re.compile(r"auth=(?P<auth>.*)", re.IGNORECASE)
extension_regex = re.compile(
r"(?P<feature>[a-z0-9][a-z0-9\-]*) ?", re.IGNORECASE
)
lines = message.splitlines()
for line in lines[1:]:
# To be able to communicate with as many SMTP servers as possible,
# we have to take the old-style auth advertisement into account.
match = oldstyle_auth_regex.match(line)
if match:
# group("auth") is already the full value; taking [0] would keep
# only its first character.
auth = match.group("auth").lower().strip()
if auth not in auths:
auths.append(auth)
# RFC 1869 requires a space between EHLO keyword and parameters.
# It's actually stricter, in that only spaces are allowed between
parameters, but we're not going to check for that here.
# Note that the space isn't present if there are no parameters.
match = extension_regex.match(line)
if match:
feature = match.group("feature").lower()
params = match.string[match.end("feature") :].strip()
extns[feature] = params
if feature == "auth":
auths.extend([param.strip().lower() for param in params.split()])
return extns, auths
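To illustrate what the extension regex extracts, here is the same parsing logic applied standalone to a hand-written EHLO reply (the banner and extension list are invented):

```python
import re

extension_regex = re.compile(r"(?P<feature>[a-z0-9][a-z0-9\-]*) ?", re.IGNORECASE)

# Hypothetical multi-line EHLO reply, as the reader would deliver it:
reply = "\n".join([
    "smtp.example.com at your service",
    "SIZE 35882577",
    "AUTH LOGIN PLAIN CRAM-MD5",
    "STARTTLS",
])

extns, auths = {}, []
for line in reply.splitlines()[1:]:  # skip the banner line
    match = extension_regex.match(line)
    if match:
        feature = match.group("feature").lower()
        params = match.string[match.end("feature"):].strip()
        extns[feature] = params
        if feature == "auth":
            auths.extend(p.strip().lower() for p in params.split())

print(extns)   # {'size': '35882577', 'auth': 'LOGIN PLAIN CRAM-MD5', 'starttls': ''}
print(auths)   # ['login', 'plain', 'cram-md5']
```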
@staticmethod
def prepare_message(message):
"""
Returns the given message encoded in ascii with a format suitable for
SMTP transmission:
- Makes sure the message is ASCII encoded ;
- Normalizes line endings to '\r\n' ;
- Adds a (second) period at the beginning of lines that start
with a period ;
- Makes sure the message ends with '\r\n.\r\n'.
For further details, please check out RFC 5321 `§ 4.1.1.4`_
and `§ 4.5.2`_.
.. _`§ 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4
.. _`§ 4.5.2`: https://tools.ietf.org/html/rfc5321#section-4.5.2
"""
if isinstance(message, bytes):
bytes_message = message
else:
bytes_message = message.encode("ascii")
# The original algorithm uses regexes to do this stuff.
# This one is -IMHO- more pythonic and it is slightly faster.
#
# Another version is even faster, but I chose to keep something
# more pythonic and readable.
# FYI, the fastest way to do all this stuff seems to be
# (according to my benchmarks):
#
# bytes_message.replace(b"\r\n", b"\n") \
# .replace(b"\r", b"\n") \
# .replace(b"\n", b"\r\n")
#
# DOT_LINE_REGEX = re.compile(rb"^\.", re.MULTILINE)
# bytes_message = DOT_LINE_REGEX.sub(b"..", bytes_message)
#
# if not bytes_message.endswith(b"\r\n"):
# bytes_message += b"\r\n"
#
# bytes_message += b"\r\n.\r\n"
lines = []
for line in bytes_message.splitlines():
if line.startswith(b"."):
line = line.replace(b".", b"..", 1)
lines.append(line)
# Recompose the message with <CRLF> only:
bytes_message = b"\r\n".join(lines)
# Make sure message ends with <CRLF>.<CRLF>:
bytes_message += b"\r\n.\r\n"
return bytes_message
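The transparency procedure above (RFC 5321 § 4.5.2 dot-stuffing plus <CRLF> normalization and the final terminator) can be demonstrated standalone on a small made-up message:

```python
# A made-up three-line message; the second line starts with a period:
message = "First line.\n.starts with a period\nlast line"

lines = []
for line in message.encode("ascii").splitlines():
    if line.startswith(b"."):
        line = b"." + line  # double the leading period (dot-stuffing)
    lines.append(line)

# Rejoin with <CRLF> and append the <CRLF>.<CRLF> end-of-data marker:
wire = b"\r\n".join(lines) + b"\r\n.\r\n"
```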
@staticmethod
def b64enc(s):
"""
Base64-encodes the given string and returns it as a :obj:`str`.
This is a simple helper function that takes a str, base64-encodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
s (str): String to be converted to base64.
Returns:
str: A base64-encoded string.
"""
return base64.b64encode(s.encode("utf-8")).decode("utf-8")
@staticmethod
def b64dec(b):
"""
Base64-decodes the given :obj:`bytes` and converts it to a :obj:`str`.
This is a simple helper function that takes a bytes, base64-decodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
b (bytes): A base64-encoded bytes.
Returns:
str: A base64-decoded string.
"""
return base64.b64decode(b).decode("utf-8")
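The two helpers above are inverses of each other; a quick round-trip sketch using the same :mod:`base64` calls:

```python
import base64

def b64enc(s):
    # str -> base64 str, mirroring SMTP.b64enc
    return base64.b64encode(s.encode("utf-8")).decode("utf-8")

def b64dec(b):
    # base64 str or bytes -> str, mirroring SMTP.b64dec
    return base64.b64decode(b).decode("utf-8")

token = b64enc("AUTH LOGIN")
assert token == "QVVUSCBMT0dJTg=="
assert b64dec(token) == "AUTH LOGIN"
```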
|
hwmrocker/smtplibaio | smtplibaio/smtp.py | SMTP.parse_esmtp_extensions | python | def parse_esmtp_extensions(message):
extns = {}
auths = []
oldstyle_auth_regex = re.compile(r"auth=(?P<auth>.*)", re.IGNORECASE)
extension_regex = re.compile(
r"(?P<feature>[a-z0-9][a-z0-9\-]*) ?", re.IGNORECASE
)
lines = message.splitlines()
for line in lines[1:]:
# To be able to communicate with as many SMTP servers as possible,
# we have to take the old-style auth advertisement into account.
match = oldstyle_auth_regex.match(line)
if match:
# group("auth") is already the full value; taking [0] would keep
# only its first character.
auth = match.group("auth").lower().strip()
if auth not in auths:
auths.append(auth)
# RFC 1869 requires a space between EHLO keyword and parameters.
# It's actually stricter, in that only spaces are allowed between
parameters, but we're not going to check for that here.
# Note that the space isn't present if there are no parameters.
match = extension_regex.match(line)
if match:
feature = match.group("feature").lower()
params = match.string[match.end("feature") :].strip()
extns[feature] = params
if feature == "auth":
auths.extend([param.strip().lower() for param in params.split()])
return extns, auths | Parses the response given by an ESMTP server after a *EHLO* command.
The response is parsed to build:
- A dict of supported ESMTP extensions (with parameters, if any).
- A list of supported authentication methods.
Returns:
(dict, list): A (extensions, auth_mechanisms) 2-tuple containing
the supported extensions and authentication methods. | train | https://github.com/hwmrocker/smtplibaio/blob/84ce8e45b7e706476739d0efcb416c18ecabbbb6/smtplibaio/smtp.py#L998-L1049 | null | class SMTP:
"""
SMTP or ESMTP client.
This should follow RFC 5321 (SMTP), RFC 1869 (ESMTP), RFC 2554 (SMTP
Authentication) and RFC 2487 (Secure SMTP over TLS).
Attributes:
hostname (str): Hostname of the SMTP server we are connecting to.
port (int): Port on which the SMTP server listens for connections.
timeout (int): Not used.
last_helo_response ((int or None, str or None)): A (code, message)
2-tuple containing the last *HELO* response.
last_ehlo_response ((int or None, str or None)): A (code, message)
2-tuple containing the last *EHLO* response.
supports_esmtp (bool): True if the server supports ESMTP (set after a
*EHLO* command), False otherwise.
esmtp_extensions (dict): ESMTP extensions and parameters supported by
the SMTP server (set after a *EHLO* command).
auth_mechanisms (list of str): Authentication mechanisms supported by
the SMTP server.
ssl_context (bool): Always False. (Used in SMTP_SSL subclass)
reader (:class:`streams.SMTPStreamReader`): SMTP stream reader, used
to read server responses.
writer (:class:`streams.SMTPStreamWriter`): SMTP stream writer, used
to send commands to the server.
transport (:class:`asyncio.BaseTransport`): Communication channel
abstraction between client and server.
loop (:class:`asyncio.BaseEventLoop`): Event loop to use.
use_aioopenssl (bool): If True, the connection is made using the
aioopenssl module. Defaults to False.
_fqdn (str): Client FQDN. Used to identify the client to the
server.
Class Attributes:
_default_port (int): Default port to use. Defaults to 25.
_supported_auth_mechanisms (dict): Dict containing the information
about supported authentication mechanisms, ordered by preference
of use. The entries consist in :
- The authentication mechanism name, in lowercase, as given by
SMTP servers.
- The name of the method to call to authenticate using the
mechanism.
"""
_default_port = 25
_supported_auth_mechanisms = {
"cram-md5": "_auth_cram_md5",
"plain": "_auth_plain",
"login": "_auth_login",
}
def __init__(
self,
hostname="localhost",
port=_default_port,
fqdn=None,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
loop=None,
use_aioopenssl=False,
):
"""
Initializes a new :class:`SMTP` instance.
Args:
hostname (str): Hostname of the SMTP server to connect to.
port (int): Port to use to connect to the SMTP server.
fqdn (str or None): Client Fully Qualified Domain Name. This is
used to identify the client to the server.
timeout (int): Not used.
loop (:class:`asyncio.BaseEventLoop`): Event loop to use.
use_aioopenssl (bool): Use the aioopenssl module to open
the connection. This is mandatory if you plan on using
STARTTLS.
"""
self.hostname = hostname
try:
self.port = int(port)
except ValueError:
self.port = self.__class__._default_port
self.timeout = timeout
self._fqdn = fqdn
self.loop = loop or asyncio.get_event_loop()
self.use_aioopenssl = use_aioopenssl
self.reset_state()
@property
def fqdn(self):
"""
Returns the string used to identify the client when initiating a SMTP
session.
RFC 5321 `§ 4.1.1.1`_ and `§ 4.1.3`_ tell us what to do:
- Use the client FQDN ;
- If it isn't available, we SHOULD fall back to an address literal.
Returns:
str: The value that should be used as the client FQDN.
.. _`§ 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
.. _`§ 4.1.3`: https://tools.ietf.org/html/rfc5321#section-4.1.3
"""
if self._fqdn is None:
# Let's try to retrieve it:
self._fqdn = socket.getfqdn()
if "." not in self._fqdn:
try:
info = socket.getaddrinfo(
host="localhost", port=None, proto=socket.IPPROTO_TCP
)
except socket.gaierror:
addr = "127.0.0.1"
else:
# We only consider the first returned result and we're
# only interested in getting the IP(v4 or v6) address:
addr = info[0][4][0]
self._fqdn = "[{}]".format(addr)
return self._fqdn
def reset_state(self):
"""
Resets some attributes to their default values.
This is especially useful when initializing a newly created
:class:`SMTP` instance and when closing an existing SMTP session.
It allows us to use the same SMTP instance and connect several times.
"""
self.last_helo_response = (None, None)
self.last_ehlo_response = (None, None)
self.supports_esmtp = False
self.esmtp_extensions = {}
self.auth_mechanisms = []
self.ssl_context = False
self.reader = None
self.writer = None
self.transport = None
async def __aenter__(self):
"""
Enters the asynchronous context manager.
Also tries to connect to the server.
Raises:
SMTPConnectionRefusedError: If the connection between client and
SMTP server can not be established.
.. seealso:: :meth:`SMTP.connect`
"""
await self.connect()
return self
async def __aexit__(self, *args):
"""
Exits the asynchronous context manager.
Closes the connection and resets instance attributes.
.. seealso:: :meth:`SMTP.quit`
"""
await self.quit()
async def connect(self):
"""
Connects to the server.
.. note:: This method is automatically invoked by
:meth:`SMTP.__aenter__`. The code is mostly borrowed from the
:func:`asyncio.streams.open_connection` source code.
Raises:
ConnectionError subclass: If the connection between client and
SMTP server can not be established.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
# First build the reader:
self.reader = SMTPStreamReader(loop=self.loop)
# Then build the protocol:
protocol = asyncio.StreamReaderProtocol(self.reader, loop=self.loop)
# With the just-built reader and protocol, create the connection and
# get the transport stream:
conn = {
"protocol_factory": lambda: protocol,
"host": self.hostname,
"port": self.port,
}
if self.use_aioopenssl:
conn.update(
{
"use_starttls": not self.ssl_context,
"ssl_context_factory": lambda transport: self.ssl_context,
"server_hostname": self.hostname, # For SSL
}
)
import aioopenssl
# This may raise a ConnectionError exception, which we let bubble up.
self.transport, _ = await aioopenssl.create_starttls_connection(
self.loop, **conn
)
# HACK: aioopenssl transports don't implement is_closing, and thus drain() fails...
self.transport.is_closing = lambda: False
else:
conn["ssl"] = self.ssl_context
# This may raise a ConnectionError exception, which we let bubble up.
self.transport, _ = await self.loop.create_connection(**conn)
# If the connection has been established, build the writer:
self.writer = SMTPStreamWriter(self.transport, protocol, self.reader, self.loop)
code, message = await self.reader.read_reply()
if code != 220:
raise ConnectionRefusedError(code, message)
return code, message
async def do_cmd(self, *args, success=None):
"""
Sends the given command to the server.
Args:
*args: Command and arguments to be sent to the server.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
if success is None:
success = (250,)
cmd = " ".join(args)
await self.writer.send_command(cmd)
code, message = await self.reader.read_reply()
if code not in success:
raise SMTPCommandFailedError(code, message, cmd)
return code, message
async def helo(self, from_host=None):
"""
Sends a SMTP 'HELO' command. - Identifies the client and starts the
session.
If given ``from_host`` is None, defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our HELO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("HELO", from_host)
self.last_helo_response = (code, message)
return code, message
async def ehlo(self, from_host=None):
"""
Sends a SMTP 'EHLO' command. - Identifies the client and starts the
session.
If given ``from_host`` is None, defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("EHLO", from_host)
self.last_ehlo_response = (code, message)
extns, auths = SMTP.parse_esmtp_extensions(message)
self.esmtp_extensions = extns
self.auth_mechanisms = auths
self.supports_esmtp = True
return code, message
async def help(self, command_name=None):
"""
Sends a SMTP 'HELP' command.
For further details please check out `RFC 5321 § 4.1.1.8`_.
Args:
command_name (str or None, optional): Name of a command for which
you want help. For example, if you want to get help about the
'*RSET*' command, you'd call ``help('RSET')``.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the HELP command fails.
Returns:
Help text as given by the server.
.. _`RFC 5321 § 4.1.1.8`: https://tools.ietf.org/html/rfc5321#section-4.1.1.8
"""
if command_name is None:
command_name = ""
code, message = await self.do_cmd("HELP", command_name)
return message
async def rset(self):
"""
Sends a SMTP 'RSET' command. - Resets the session.
For further details, please check out `RFC 5321 § 4.1.1.5`_.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the RSET command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.5`: https://tools.ietf.org/html/rfc5321#section-4.1.1.5
"""
return await self.do_cmd("RSET")
async def noop(self):
"""
Sends a SMTP 'NOOP' command. - Doesn't do anything.
For further details, please check out `RFC 5321 § 4.1.1.9`_.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the NOOP command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.9`: https://tools.ietf.org/html/rfc5321#section-4.1.1.9
"""
return await self.do_cmd("NOOP")
async def vrfy(self, address):
"""
Sends a SMTP 'VRFY' command. - Tests the validity of the given address.
For further details, please check out `RFC 5321 § 4.1.1.6`_.
Args:
address (str): E-mail address to be checked.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the VRFY command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.6`: https://tools.ietf.org/html/rfc5321#section-4.1.1.6
"""
return await self.do_cmd("VRFY", address)
async def expn(self, address):
"""
Sends a SMTP 'EXPN' command. - Expands a mailing-list.
For further details, please check out `RFC 5321 § 4.1.1.7`_.
Args:
address (str): E-mail address to expand.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the EXPN command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.7`: https://tools.ietf.org/html/rfc5321#section-4.1.1.7
"""
return await self.do_cmd("EXPN", address)
async def mail(self, sender, options=None):
"""
Sends a SMTP 'MAIL' command. - Starts the mail transfer session.
For further details, please check out `RFC 5321 § 4.1.1.2`_ and
`§ 3.3`_.
Args:
sender (str): Sender mailbox (used as reverse-path).
options (list of str or None, optional): Additional options to send
along with the *MAIL* command.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the MAIL command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.2`: https://tools.ietf.org/html/rfc5321#section-4.1.1.2
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
from_addr = "FROM:{}".format(quoteaddr(sender))
code, message = await self.do_cmd("MAIL", from_addr, *options)
return code, message
async def rcpt(self, recipient, options=None):
"""
Sends a SMTP 'RCPT' command. - Indicates a recipient for the e-mail.
For further details, please check out `RFC 5321 § 4.1.1.3`_ and
`§ 3.3`_.
Args:
recipient (str): E-mail address of one recipient.
options (list of str or None, optional): Additional options to send
along with the *RCPT* command.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the RCPT command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.3`: https://tools.ietf.org/html/rfc5321#section-4.1.1.3
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
to_addr = "TO:{}".format(quoteaddr(recipient))
code, message = await self.do_cmd("RCPT", to_addr, *options)
return code, message
async def quit(self):
"""
Sends a SMTP 'QUIT' command. - Ends the session.
For further details, please check out `RFC 5321 § 4.1.1.10`_.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response. If the connection is already closed when calling this
method, returns (-1, None).
.. _`RFC 5321 § 4.1.1.10`: https://tools.ietf.org/html/rfc5321#section-4.1.1.10
"""
code = -1
message = None
try:
code, message = await self.do_cmd("QUIT")
except ConnectionError:
# We voluntarily ignore this kind of exceptions since... the
# connection seems already closed.
pass
except SMTPCommandFailedError:
pass
await self.close()
return code, message
async def data(self, email_message):
"""
Sends a SMTP 'DATA' command. - Transmits the message to the server.
If ``email_message`` is a bytes object, sends it as it is. Else,
makes all the required changes so it can be safely transmitted to the
SMTP server.
For further details, please check out `RFC 5321 § 4.1.1.4`_.
Args:
email_message (str or bytes): Message to be sent.
Raises:
ConnectionError subclass: If the connection to the server is
unexpectedly lost.
SMTPCommandFailedError: If the DATA command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server last
response (the one the server sent after all data were sent by
the client).
.. seealso: :meth:`SMTP.prepare_message`
.. _`RFC 5321 § 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4
"""
code, message = await self.do_cmd("DATA", success=(354,))
email_message = SMTP.prepare_message(email_message)
self.writer.write(email_message) # write is non-blocking.
await self.writer.drain() # don't forget to drain.
code, message = await self.reader.read_reply()
return code, message
async def auth(self, username, password):
"""
Tries to authenticate user against the SMTP server.
Args:
username (str): Username to authenticate with.
password (str): Password to use along with the given ``username``.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
SMTPLoginError: If the authentication failed (either because all
attempts failed or because there was no suitable authentication
mechanism).
Returns:
(int, str): A (code, message) 2-tuple containing the last server
response.
"""
# EHLO/HELO is required:
await self.ehlo_or_helo_if_needed()
errors = [] # To store SMTPAuthenticationErrors
code = message = None
# Try to authenticate using all mechanisms supported by both
# server and client (and only these):
for auth, meth in self.__class__._supported_auth_mechanisms.items():
if auth in self.auth_mechanisms:
auth_func = getattr(self, meth)
try:
code, message = await auth_func(username, password)
except SMTPAuthenticationError as e:
errors.append(e)
else:
break
else:
if not errors:
err = "Could not find any suitable authentication mechanism."
errors.append(SMTPAuthenticationError(-1, err))
raise SMTPLoginError(errors)
return code, message
async def starttls(self, context=None):
"""
Upgrades the connection to the SMTP server into TLS mode.
If there has been no previous EHLO or HELO command this session, this
method tries ESMTP EHLO first.
If the server supports SSL/TLS, this will encrypt the rest of the SMTP
session.
Raises:
SMTPCommandNotSupportedError: If the server does not support STARTTLS.
SMTPCommandFailedError: If the STARTTLS command fails
BadImplementationError: If the connection does not use aioopenssl.
Args:
context (:obj:`OpenSSL.SSL.Context`): SSL context
Returns:
(int, message): A (code, message) 2-tuple containing the server
response.
"""
if not self.use_aioopenssl:
raise BadImplementationError("This connection does not use aioopenssl")
import aioopenssl
import OpenSSL
await self.ehlo_or_helo_if_needed()
if "starttls" not in self.esmtp_extensions:
raise SMTPCommandNotSupportedError("STARTTLS")
code, message = await self.do_cmd("STARTTLS", success=(220,))
# Don't check for code, do_cmd did it
if context is None:
context = OpenSSL.SSL.Context(OpenSSL.SSL.TLSv1_2_METHOD)
await self.transport.starttls(ssl_context=context)
# RFC 3207:
# The client MUST discard any knowledge obtained from
# the server, such as the list of SMTP service extensions,
# which was not obtained from the TLS negotiation itself.
# FIXME: wouldn't it be better to use reset_state here ?
# And reset self.reader, self.writer and self.transport just after
# Maybe also self.ssl_context ?
self.last_ehlo_response = (None, None)
self.last_helo_response = (None, None)
self.supports_esmtp = False
self.esmtp_extensions = {}
self.auth_mechanisms = []
return (code, message)
async def sendmail(
self, sender, recipients, message, mail_options=None, rcpt_options=None
):
"""
Performs an entire e-mail transaction.
Example:
>>> try:
>>> async with SMTP() as client:
>>> try:
>>> r = await client.sendmail(sender, recipients, message)
>>> except SMTPException:
>>> print("Error while sending message.")
>>> else:
>>> print("Result: {}.".format(r))
>>> except ConnectionError as e:
>>> print(e)
Result: {}.
Args:
sender (str): E-mail address of the sender.
recipients (list of str or str): E-mail(s) address(es) of the
recipient(s).
message (str or bytes): Message body.
mail_options (list of str): ESMTP options (such as *8BITMIME*) to
send along the *MAIL* command.
rcpt_options (list of str): ESMTP options (such as *DSN*) to
send along all the *RCPT* commands.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
SMTPCommandFailedError: If the server refuses our MAIL command.
SMTPCommandFailedError: If the server refuses our DATA command.
SMTPNoRecipientError: If the server refuses all given
recipients.
Returns:
dict: A dict containing an entry for each recipient that was
refused. Each entry is associated with a (code, message)
2-tuple containing the error code and message, as returned by
the server.
When everything runs smoothly, the returned dict is empty.
.. note:: The connection remains open after. It's your responsibility
to close it. A good practice is to use the asynchronous context
manager instead. See :meth:`SMTP.__aenter__` for further details.
"""
# Make sure `recipients` is a list:
if isinstance(recipients, str):
recipients = [recipients]
# Set some defaults values:
if mail_options is None:
mail_options = []
if rcpt_options is None:
rcpt_options = []
# EHLO or HELO is required:
await self.ehlo_or_helo_if_needed()
if self.supports_esmtp:
if "size" in self.esmtp_extensions:
mail_options.append("size={}".format(len(message)))
await self.mail(sender, mail_options)
errors = []
for recipient in recipients:
try:
await self.rcpt(recipient, rcpt_options)
except SMTPCommandFailedError as e:
errors.append(e)
if len(recipients) == len(errors):
# The server refused all our recipients:
raise SMTPNoRecipientError(errors)
await self.data(message)
# If we got here then somebody got our mail:
return errors
async def send_mail(
self, sender, recipients, message, mail_options=None, rcpt_options=None
):
"""
Alias for :meth:`SMTP.sendmail`.
"""
return await self.sendmail(
sender, recipients, message, mail_options, rcpt_options
)
async def ehlo_or_helo_if_needed(self):
"""
Calls :meth:`SMTP.ehlo` and/or :meth:`SMTP.helo` if needed.
If there hasn't been any previous *EHLO* or *HELO* command this
session, tries to initiate the session. *EHLO* is tried first.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
"""
no_helo = self.last_helo_response == (None, None)
no_ehlo = self.last_ehlo_response == (None, None)
if no_helo and no_ehlo:
try:
# First we try EHLO:
await self.ehlo()
except SMTPCommandFailedError:
# EHLO failed, let's try HELO:
await self.helo()
async def close(self):
"""
Cleans up after the connection to the SMTP server has been closed
(voluntarily or not).
"""
if self.writer is not None:
# Close the transport:
try:
self.writer.close()
except OSError as exc:
if exc.errno != errno.ENOTCONN:
raise
self.reset_state()
async def _auth_cram_md5(self, username, password):
"""
Performs an authentication attempt using the CRAM-MD5 mechanism.
Protocol:
1. Send 'AUTH CRAM-MD5' to server ;
2. If the server replies with a 334 return code, we can go on:
1) The challenge (sent by the server) is base64-decoded ;
2) The decoded challenge is hashed using HMAC-MD5 and the user
password as key (shared secret) ;
3) The hashed challenge is converted to a string of lowercase
hexadecimal digits ;
4) The username and a space character are prepended to the hex
digits ;
5) The concatenation is base64-encoded and sent to the server.
6) If the server replies with a return code of 235, user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "CRAM-MD5"
code, message = await self.do_cmd("AUTH", mechanism, success=(334,))
decoded_challenge = base64.b64decode(message)
challenge_hash = hmac.new(
key=password.encode("utf-8"), msg=decoded_challenge, digestmod="md5"
)
hex_hash = challenge_hash.hexdigest()
response = "{} {}".format(username, hex_hash)
encoded_response = SMTP.b64enc(response)
try:
code, message = await self.do_cmd(encoded_response, success=(235, 503))
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
async def _auth_login(self, username, password):
"""
Performs an authentication attempt using the LOGIN mechanism.
Protocol:
1. The username is base64-encoded ;
2. The string 'AUTH LOGIN' and a space character are prepended to
the base64-encoded username and sent to the server ;
3. If the server replies with a 334 return code, we can go on:
1) The password is base64-encoded and sent to the server ;
2) If the server replies with a 235 return code, the user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "LOGIN"
code, message = await self.do_cmd(
"AUTH", mechanism, SMTP.b64enc(username), success=(334,)
)
try:
code, message = await self.do_cmd(SMTP.b64enc(password), success=(235, 503))
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
async def _auth_plain(self, username, password):
"""
Performs an authentication attempt using the PLAIN mechanism.
Protocol:
1. Format the username and password in a suitable way ;
2. The formatted string is base64-encoded ;
3. The string 'AUTH PLAIN' and a space character are prepended to
the base64-encoded username and password and sent to the
server ;
4. If the server replies with a 235 return code, user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "PLAIN"
credentials = "\0{}\0{}".format(username, password)
encoded_credentials = SMTP.b64enc(credentials)
try:
code, message = await self.do_cmd(
"AUTH", mechanism, encoded_credentials, success=(235, 503)
)
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
@staticmethod
def prepare_message(message):
"""
Returns the given message encoded in ascii with a format suitable for
SMTP transmission:
- Makes sure the message is ASCII encoded ;
- Normalizes line endings to '\r\n' ;
- Adds a (second) period at the beginning of lines that start
with a period ;
- Makes sure the message ends with '\r\n.\r\n'.
For further details, please check out RFC 5321 `§ 4.1.1.4`_
and `§ 4.5.2`_.
.. _`§ 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4
.. _`§ 4.5.2`: https://tools.ietf.org/html/rfc5321#section-4.5.2
"""
if isinstance(message, bytes):
bytes_message = message
else:
bytes_message = message.encode("ascii")
# The original algorithm uses regexes to do this stuff.
# This one is -IMHO- more pythonic and it is slightly faster.
#
# Another version is even faster, but I chose to keep something
# more pythonic and readable.
# FYI, the fastest way to do all this stuff seems to be
# (according to my benchmarks):
#
# bytes_message.replace(b"\r\n", b"\n") \
# .replace(b"\r", b"\n") \
# .replace(b"\n", b"\r\n")
#
# DOT_LINE_REGEX = re.compile(rb"^\.", re.MULTILINE)
# bytes_message = DOT_LINE_REGEX.sub(b"..", bytes_message)
#
# if not bytes_message.endswith(b"\r\n"):
# bytes_message += b"\r\n"
#
# bytes_message += b"\r\n.\r\n"
lines = []
for line in bytes_message.splitlines():
if line.startswith(b"."):
line = line.replace(b".", b"..", 1)
lines.append(line)
# Recompose the message with <CRLF> only:
bytes_message = b"\r\n".join(lines)
# Make sure message ends with <CRLF>.<CRLF>:
bytes_message += b"\r\n.\r\n"
return bytes_message
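The dot-stuffing and CRLF normalization above can be sketched as a standalone function (a minimal sketch assuming ASCII `str` input; the real method also accepts `bytes`):

```python
def dot_stuff(message: str) -> bytes:
    """Normalize line endings to CRLF, double leading dots, and
    terminate with <CRLF>.<CRLF>, per RFC 5321 section 4.5.2."""
    lines = []
    for line in message.encode("ascii").splitlines():
        if line.startswith(b"."):
            line = b"." + line  # dot-stuffing: a leading '.' becomes '..'
        lines.append(line)
    return b"\r\n".join(lines) + b"\r\n.\r\n"

print(dot_stuff("Hello\n.hidden dot\nBye"))
# b'Hello\r\n..hidden dot\r\nBye\r\n.\r\n'
```

Note that `splitlines()` handles `\n`, `\r` and `\r\n` uniformly, which is what makes the single-pass normalization work.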
@staticmethod
def b64enc(s):
"""
Base64-encodes the given string and returns it as a :obj:`str`.
This is a simple helper function that takes a str, base64-encodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
s (str): String to be converted to base64.
Returns:
str: A base64-encoded string.
"""
return base64.b64encode(s.encode("utf-8")).decode("utf-8")
@staticmethod
def b64dec(b):
"""
Base64-decodes the given :obj:`bytes` and converts it to a :obj:`str`.
This is a simple helper function that takes a bytes, base64-decodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
b (bytes): A base64-encoded bytes.
Returns:
str: A base64-decoded string.
"""
return base64.b64decode(b).decode("utf-8")
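These two helpers are thin str/bytes wrappers around :mod:`base64`; reimplemented here for illustration, with a quick round trip:

```python
import base64

def b64enc(s: str) -> str:
    """Encode a str to base64, returning a str."""
    return base64.b64encode(s.encode("utf-8")).decode("utf-8")

def b64dec(b: bytes) -> str:
    """Decode base64 bytes, returning a str."""
    return base64.b64decode(b).decode("utf-8")

token = b64enc("user@example.org")
print(token)                          # dXNlckBleGFtcGxlLm9yZw==
print(b64dec(token.encode("ascii")))  # user@example.org
```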
|
hwmrocker/smtplibaio | smtplibaio/smtp.py | SMTP.prepare_message | python | def prepare_message(message):
if isinstance(message, bytes):
bytes_message = message
else:
bytes_message = message.encode("ascii")
# The original algorithm uses regexes to do this stuff.
# This one is -IMHO- more pythonic and it is slightly faster.
#
# Another version is even faster, but I chose to keep something
# more pythonic and readable.
# FYI, the fastest way to do all this stuff seems to be
# (according to my benchmarks):
#
# bytes_message.replace(b"\r\n", b"\n") \
# .replace(b"\r", b"\n") \
# .replace(b"\n", b"\r\n")
#
# DOT_LINE_REGEX = re.compile(rb"^\.", re.MULTILINE)
# bytes_message = DOT_LINE_REGEX.sub(b"..", bytes_message)
#
# if not bytes_message.endswith(b"\r\n"):
# bytes_message += b"\r\n"
#
# bytes_message += b"\r\n.\r\n"
lines = []
for line in bytes_message.splitlines():
if line.startswith(b"."):
line = line.replace(b".", b"..", 1)
lines.append(line)
# Recompose the message with <CRLF> only:
bytes_message = b"\r\n".join(lines)
# Make sure message ends with <CRLF>.<CRLF>:
bytes_message += b"\r\n.\r\n"
return bytes_message | Returns the given message encoded in ascii with a format suitable for
SMTP transmission:
- Makes sure the message is ASCII encoded ;
- Normalizes line endings to '\r\n' ;
- Adds a (second) period at the beginning of lines that start
with a period ;
- Makes sure the message ends with '\r\n.\r\n'.
For further details, please check out RFC 5321 `§ 4.1.1.4`_
and `§ 4.5.2`_.
.. _`§ 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4
.. _`§ 4.5.2`: https://tools.ietf.org/html/rfc5321#section-4.5.2 | train | https://github.com/hwmrocker/smtplibaio/blob/84ce8e45b7e706476739d0efcb416c18ecabbbb6/smtplibaio/smtp.py#L1052-L1108 | null | class SMTP:
"""
SMTP or ESMTP client.
This should follow RFC 5321 (SMTP), RFC 1869 (ESMTP), RFC 2554 (SMTP
Authentication) and RFC 2487 (Secure SMTP over TLS).
Attributes:
hostname (str): Hostname of the SMTP server we are connecting to.
port (int): Port on which the SMTP server listens for connections.
timeout (int): Not used.
last_helo_response ((int or None, str or None)): A (code, message)
2-tuple containing the last *HELO* response.
last_ehlo_response ((int or None, str or None)): A (code, message)
2-tuple containing the last *EHLO* response.
supports_esmtp (bool): True if the server supports ESMTP (set after a
*EHLO* command), False otherwise.
esmtp_extensions (dict): ESMTP extensions and parameters supported by
the SMTP server (set after a *EHLO* command).
auth_mechanisms (list of str): Authentication mechanisms supported by
the SMTP server.
ssl_context (bool): Always False. (Used in SMTP_SSL subclass)
reader (:class:`streams.SMTPStreamReader`): SMTP stream reader, used
to read server responses.
writer (:class:`streams.SMTPStreamWriter`): SMTP stream writer, used
to send commands to the server.
transport (:class:`asyncio.BaseTransport`): Communication channel
abstraction between client and server.
loop (:class:`asyncio.BaseEventLoop`): Event loop to use.
use_aioopenssl (bool): If True, the connection is made using the
aioopenssl module. Defaults to False.
_fqdn (str): Client FQDN. Used to identify the client to the
server.
Class Attributes:
_default_port (int): Default port to use. Defaults to 25.
_supported_auth_mechanisms (dict): Dict containing the information
about supported authentication mechanisms, ordered by preference
of use. The entries consist in :
- The authentication mechanism name, in lowercase, as given by
SMTP servers.
- The name of the method to call to authenticate using the
mechanism.
"""
_default_port = 25
_supported_auth_mechanisms = {
"cram-md5": "_auth_cram_md5",
"plain": "_auth_plain",
"login": "_auth_login",
}
def __init__(
self,
hostname="localhost",
port=_default_port,
fqdn=None,
timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
loop=None,
use_aioopenssl=False,
):
"""
Initializes a new :class:`SMTP` instance.
Args:
hostname (str): Hostname of the SMTP server to connect to.
port (int): Port to use to connect to the SMTP server.
fqdn (str or None): Client Fully Qualified Domain Name. This is
used to identify the client to the server.
timeout (int): Not used.
loop (:class:`asyncio.BaseEventLoop`): Event loop to use.
use_aioopenssl (bool): Use the aioopenssl module to open
the connection. This is mandatory if you plan on using
STARTTLS.
"""
self.hostname = hostname
try:
self.port = int(port)
except ValueError:
self.port = self.__class__._default_port
self.timeout = timeout
self._fqdn = fqdn
self.loop = loop or asyncio.get_event_loop()
self.use_aioopenssl = use_aioopenssl
self.reset_state()
@property
def fqdn(self):
"""
Returns the string used to identify the client when initiating a SMTP
session.
RFC 5321 `§ 4.1.1.1`_ and `§ 4.1.3`_ tell us what to do:
- Use the client FQDN ;
- If it isn't available, we SHOULD fall back to an address literal.
Returns:
str: The value that should be used as the client FQDN.
.. _`§ 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
.. _`§ 4.1.3`: https://tools.ietf.org/html/rfc5321#section-4.1.3
"""
if self._fqdn is None:
# Let's try to retrieve it:
self._fqdn = socket.getfqdn()
if "." not in self._fqdn:
try:
info = socket.getaddrinfo(
host="localhost", port=None, proto=socket.IPPROTO_TCP
)
except socket.gaierror:
addr = "127.0.0.1"
else:
# We only consider the first returned result and we're
# only interested in getting the IP(v4 or v6) address:
addr = info[0][4][0]
self._fqdn = "[{}]".format(addr)
return self._fqdn
def reset_state(self):
"""
Resets some attributes to their default values.
This is especially useful when initializing a newly created
:class:`SMTP` instance and when closing an existing SMTP session.
It allows us to use the same SMTP instance and connect several times.
"""
self.last_helo_response = (None, None)
self.last_ehlo_response = (None, None)
self.supports_esmtp = False
self.esmtp_extensions = {}
self.auth_mechanisms = []
self.ssl_context = False
self.reader = None
self.writer = None
self.transport = None
async def __aenter__(self):
"""
Enters the asynchronous context manager.
Also tries to connect to the server.
Raises:
SMTPConnectionRefusedError: If the connection between client and
SMTP server can not be established.
.. seealso:: :meth:`SMTP.connect`
"""
await self.connect()
return self
async def __aexit__(self, *args):
"""
Exits the asynchronous context manager.
Closes the connection and resets instance attributes.
.. seealso:: :meth:`SMTP.quit`
"""
await self.quit()
async def connect(self):
"""
Connects to the server.
.. note:: This method is automatically invoked by
:meth:`SMTP.__aenter__`. The code is mostly borrowed from the
:func:`asyncio.streams.open_connection` source code.
Raises:
ConnectionError subclass: If the connection between client and
SMTP server can not be established.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
# First build the reader:
self.reader = SMTPStreamReader(loop=self.loop)
# Then build the protocol:
protocol = asyncio.StreamReaderProtocol(self.reader, loop=self.loop)
# With the just-built reader and protocol, create the connection and
# get the transport stream:
conn = {
"protocol_factory": lambda: protocol,
"host": self.hostname,
"port": self.port,
}
if self.use_aioopenssl:
conn.update(
{
"use_starttls": not self.ssl_context,
"ssl_context_factory": lambda transport: self.ssl_context,
"server_hostname": self.hostname, # For SSL
}
)
import aioopenssl
# This may raise a ConnectionError exception, which we let bubble up.
self.transport, _ = await aioopenssl.create_starttls_connection(
self.loop, **conn
)
# HACK: aioopenssl transports don't implement is_closing, and thus drain() fails...
self.transport.is_closing = lambda: False
else:
conn["ssl"] = self.ssl_context
# This may raise a ConnectionError exception, which we let bubble up.
self.transport, _ = await self.loop.create_connection(**conn)
# If the connection has been established, build the writer:
self.writer = SMTPStreamWriter(self.transport, protocol, self.reader, self.loop)
code, message = await self.reader.read_reply()
if code != 220:
raise ConnectionRefusedError(code, message)
return code, message
async def do_cmd(self, *args, success=None):
"""
Sends the given command to the server.
Args:
*args: Command and arguments to be sent to the server.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
if success is None:
success = (250,)
cmd = " ".join(args)
await self.writer.send_command(cmd)
code, message = await self.reader.read_reply()
if code not in success:
raise SMTPCommandFailedError(code, message, cmd)
return code, message
async def helo(self, from_host=None):
"""
Sends a SMTP 'HELO' command. - Identifies the client and starts the
session.
If the given ``from_host`` is None, it defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our HELO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("HELO", from_host)
self.last_helo_response = (code, message)
return code, message
async def ehlo(self, from_host=None):
"""
Sends a SMTP 'EHLO' command. - Identifies the client and starts the
session.
If the given ``from_host`` is None, it defaults to the client FQDN.
For further details, please check out `RFC 5321 § 4.1.1.1`_.
Args:
from_host (str or None): Name to use to identify the client.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO greeting.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.1`: https://tools.ietf.org/html/rfc5321#section-4.1.1.1
"""
if from_host is None:
from_host = self.fqdn
code, message = await self.do_cmd("EHLO", from_host)
self.last_ehlo_response = (code, message)
extns, auths = SMTP.parse_esmtp_extensions(message)
self.esmtp_extensions = extns
self.auth_mechanisms = auths
self.supports_esmtp = True
return code, message
async def help(self, command_name=None):
"""
Sends a SMTP 'HELP' command.
For further details please check out `RFC 5321 § 4.1.1.8`_.
Args:
command_name (str or None, optional): Name of a command for which
you want help. For example, if you want to get help about the
'*RSET*' command, you'd call ``help('RSET')``.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the HELP command fails.
Returns:
Help text as given by the server.
.. _`RFC 5321 § 4.1.1.8`: https://tools.ietf.org/html/rfc5321#section-4.1.1.8
"""
if command_name is None:
command_name = ""
code, message = await self.do_cmd("HELP", command_name)
return message
async def rset(self):
"""
Sends a SMTP 'RSET' command. - Resets the session.
For further details, please check out `RFC 5321 § 4.1.1.5`_.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the RSET command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.5`: https://tools.ietf.org/html/rfc5321#section-4.1.1.5
"""
return await self.do_cmd("RSET")
async def noop(self):
"""
Sends a SMTP 'NOOP' command. - Doesn't do anything.
For further details, please check out `RFC 5321 § 4.1.1.9`_.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the NOOP command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.9`: https://tools.ietf.org/html/rfc5321#section-4.1.1.9
"""
return await self.do_cmd("NOOP")
async def vrfy(self, address):
"""
Sends a SMTP 'VRFY' command. - Tests the validity of the given address.
For further details, please check out `RFC 5321 § 4.1.1.6`_.
Args:
address (str): E-mail address to be checked.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the VRFY command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.6`: https://tools.ietf.org/html/rfc5321#section-4.1.1.6
"""
return await self.do_cmd("VRFY", address)
async def expn(self, address):
"""
Sends a SMTP 'EXPN' command. - Expands a mailing-list.
For further details, please check out `RFC 5321 § 4.1.1.7`_.
Args:
address (str): E-mail address to expand.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the EXPN command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.7`: https://tools.ietf.org/html/rfc5321#section-4.1.1.7
"""
return await self.do_cmd("EXPN", address)
async def mail(self, sender, options=None):
"""
Sends a SMTP 'MAIL' command. - Starts the mail transfer session.
For further details, please check out `RFC 5321 § 4.1.1.2`_ and
`§ 3.3`_.
Args:
sender (str): Sender mailbox (used as reverse-path).
options (list of str or None, optional): Additional options to send
along with the *MAIL* command.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the MAIL command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.2`: https://tools.ietf.org/html/rfc5321#section-4.1.1.2
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
from_addr = "FROM:{}".format(quoteaddr(sender))
code, message = await self.do_cmd("MAIL", from_addr, *options)
return code, message
async def rcpt(self, recipient, options=None):
"""
Sends a SMTP 'RCPT' command. - Indicates a recipient for the e-mail.
For further details, please check out `RFC 5321 § 4.1.1.3`_ and
`§ 3.3`_.
Args:
recipient (str): E-mail address of one recipient.
options (list of str or None, optional): Additional options to send
along with the *RCPT* command.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the RCPT command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
.. _`RFC 5321 § 4.1.1.3`: https://tools.ietf.org/html/rfc5321#section-4.1.1.3
.. _`§ 3.3`: https://tools.ietf.org/html/rfc5321#section-3.3
"""
if options is None:
options = []
to_addr = "TO:{}".format(quoteaddr(recipient))
code, message = await self.do_cmd("RCPT", to_addr, *options)
return code, message
async def quit(self):
"""
Sends a SMTP 'QUIT' command. - Ends the session.
For further details, please check out `RFC 5321 § 4.1.1.10`_.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response. If the connection is already closed when calling this
method, returns (-1, None).
.. _`RFC 5321 § 4.1.1.10`: https://tools.ietf.org/html/rfc5321#section-4.1.1.10
"""
code = -1
message = None
try:
code, message = await self.do_cmd("QUIT")
except ConnectionError:
# We voluntarily ignore this kind of exceptions since... the
# connection seems already closed.
pass
except SMTPCommandFailedError:
pass
await self.close()
return code, message
async def data(self, email_message):
"""
Sends a SMTP 'DATA' command. - Transmits the message to the server.
If ``email_message`` is a bytes object, sends it as it is. Else,
makes all the required changes so it can be safely transmitted to the
SMTP server.
For further details, please check out `RFC 5321 § 4.1.1.4`_.
Args:
email_message (str or bytes): Message to be sent.
Raises:
ConnectionError subclass: If the connection to the server is
unexpectedly lost.
SMTPCommandFailedError: If the DATA command fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server last
response (the one the server sent after all data were sent by
the client).
.. seealso: :meth:`SMTP.prepare_message`
.. _`RFC 5321 § 4.1.1.4`: https://tools.ietf.org/html/rfc5321#section-4.1.1.4
"""
code, message = await self.do_cmd("DATA", success=(354,))
email_message = SMTP.prepare_message(email_message)
self.writer.write(email_message) # write is non-blocking.
await self.writer.drain() # don't forget to drain.
code, message = await self.reader.read_reply()
return code, message
async def auth(self, username, password):
"""
Tries to authenticate user against the SMTP server.
Args:
username (str): Username to authenticate with.
password (str): Password to use along with the given ``username``.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
SMTPLoginError: If the authentication failed (either because all
attempts failed or because there was no suitable authentication
mechanism).
Returns:
(int, str): A (code, message) 2-tuple containing the last server
response.
"""
# EHLO/HELO is required:
await self.ehlo_or_helo_if_needed()
errors = [] # To store SMTPAuthenticationErrors
code = message = None
# Try to authenticate using all mechanisms supported by both
# server and client (and only these):
for auth, meth in self.__class__._supported_auth_mechanisms.items():
if auth in self.auth_mechanisms:
auth_func = getattr(self, meth)
try:
code, message = await auth_func(username, password)
except SMTPAuthenticationError as e:
errors.append(e)
else:
break
else:
if not errors:
err = "Could not find any suitable authentication mechanism."
errors.append(SMTPAuthenticationError(-1, err))
raise SMTPLoginError(errors)
return code, message
async def starttls(self, context=None):
"""
Upgrades the connection to the SMTP server into TLS mode.
If there has been no previous EHLO or HELO command this session, this
method tries ESMTP EHLO first.
If the server supports SSL/TLS, this will encrypt the rest of the SMTP
session.
Raises:
SMTPCommandNotSupportedError: If the server does not support STARTTLS.
SMTPCommandFailedError: If the STARTTLS command fails
BadImplementationError: If the connection does not use aioopenssl.
Args:
context (:obj:`OpenSSL.SSL.Context`): SSL context
Returns:
(int, message): A (code, message) 2-tuple containing the server
response.
"""
if not self.use_aioopenssl:
raise BadImplementationError("This connection does not use aioopenssl")
import aioopenssl
import OpenSSL
await self.ehlo_or_helo_if_needed()
if "starttls" not in self.esmtp_extensions:
raise SMTPCommandNotSupportedError("STARTTLS")
code, message = await self.do_cmd("STARTTLS", success=(220,))
# Don't check for code, do_cmd did it
if context is None:
context = OpenSSL.SSL.Context(OpenSSL.SSL.TLSv1_2_METHOD)
await self.transport.starttls(ssl_context=context)
# RFC 3207:
# The client MUST discard any knowledge obtained from
# the server, such as the list of SMTP service extensions,
# which was not obtained from the TLS negotiation itself.
# FIXME: wouldn't it be better to use reset_state here ?
# And reset self.reader, self.writer and self.transport just after
# Maybe also self.ssl_context ?
self.last_ehlo_response = (None, None)
self.last_helo_response = (None, None)
self.supports_esmtp = False
self.esmtp_extensions = {}
self.auth_mechanisms = []
return (code, message)
async def sendmail(
self, sender, recipients, message, mail_options=None, rcpt_options=None
):
"""
Performs an entire e-mail transaction.
Example:
>>> try:
>>> with SMTP() as client:
>>> try:
>>> r = client.sendmail(sender, recipients, message)
>>> except SMTPException:
>>> print("Error while sending message.")
>>> else:
>>> print("Result: {}.".format(r))
>>> except ConnectionError as e:
>>> print(e)
Result: {}.
Args:
sender (str): E-mail address of the sender.
recipients (list of str or str): E-mail(s) address(es) of the
recipient(s).
message (str or bytes): Message body.
mail_options (list of str): ESMTP options (such as *8BITMIME*) to
send along the *MAIL* command.
rcpt_options (list of str): ESMTP options (such as *DSN*) to
send along all the *RCPT* commands.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
SMTPCommandFailedError: If the server refuses our MAIL command.
SMTPCommandFailedError: If the server refuses our DATA command.
SMTPNoRecipientError: If the server refuses all given
recipients.
Returns:
dict: A dict containing an entry for each recipient that was
refused. Each entry is associated with a (code, message)
2-tuple containing the error code and message, as returned by
the server.
When everything runs smoothly, the returned dict is empty.
.. note:: The connection remains open after. It's your responsibility
to close it. A good practice is to use the asynchronous context
manager instead. See :meth:`SMTP.__aenter__` for further details.
"""
# Make sure `recipients` is a list:
if isinstance(recipients, str):
recipients = [recipients]
# Set some defaults values:
if mail_options is None:
mail_options = []
if rcpt_options is None:
rcpt_options = []
# EHLO or HELO is required:
await self.ehlo_or_helo_if_needed()
if self.supports_esmtp:
if "size" in self.esmtp_extensions:
mail_options.append("size={}".format(len(message)))
await self.mail(sender, mail_options)
errors = []
for recipient in recipients:
try:
await self.rcpt(recipient, rcpt_options)
except SMTPCommandFailedError as e:
errors.append(e)
if len(recipients) == len(errors):
# The server refused all our recipients:
raise SMTPNoRecipientError(errors)
await self.data(message)
# If we got here then somebody got our mail:
return errors
async def send_mail(
self, sender, recipients, message, mail_options=None, rcpt_options=None
):
"""
Alias for :meth:`SMTP.sendmail`.
"""
return await self.sendmail(
sender, recipients, message, mail_options, rcpt_options
)
async def ehlo_or_helo_if_needed(self):
"""
Calls :meth:`SMTP.ehlo` and/or :meth:`SMTP.helo` if needed.
If there hasn't been any previous *EHLO* or *HELO* command this
session, tries to initiate the session. *EHLO* is tried first.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPCommandFailedError: If the server refuses our EHLO/HELO
greeting.
"""
no_helo = self.last_helo_response == (None, None)
no_ehlo = self.last_ehlo_response == (None, None)
if no_helo and no_ehlo:
try:
# First we try EHLO:
await self.ehlo()
except SMTPCommandFailedError:
# EHLO failed, let's try HELO:
await self.helo()
async def close(self):
"""
Cleans up after the connection to the SMTP server has been closed
(voluntarily or not).
"""
if self.writer is not None:
# Close the transport:
try:
self.writer.close()
except OSError as exc:
if exc.errno != errno.ENOTCONN:
raise
self.reset_state()
async def _auth_cram_md5(self, username, password):
"""
Performs an authentication attempt using the CRAM-MD5 mechanism.
Protocol:
1. Send 'AUTH CRAM-MD5' to server ;
2. If the server replies with a 334 return code, we can go on:
1) The challenge (sent by the server) is base64-decoded ;
2) The decoded challenge is hashed using HMAC-MD5 and the user
password as key (shared secret) ;
3) The hashed challenge is converted to a string of lowercase
hexadecimal digits ;
4) The username and a space character are prepended to the hex
digits ;
5) The concatenation is base64-encoded and sent to the server.
6) If the server replies with a return code of 235, user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "CRAM-MD5"
code, message = await self.do_cmd("AUTH", mechanism, success=(334,))
decoded_challenge = base64.b64decode(message)
challenge_hash = hmac.new(
key=password.encode("utf-8"), msg=decoded_challenge, digestmod="md5"
)
hex_hash = challenge_hash.hexdigest()
response = "{} {}".format(username, hex_hash)
encoded_response = SMTP.b64enc(response)
try:
code, message = await self.do_cmd(encoded_response, success=(235, 503))
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
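The challenge/response computation in steps 2.1 to 2.5 can be reproduced offline; the challenge and credentials below come from the example in RFC 2195 and are only an illustration:

```python
import base64
import hmac

def cram_md5_response(username: str, password: str, b64_challenge: str) -> str:
    """Compute the base64-encoded CRAM-MD5 client response."""
    challenge = base64.b64decode(b64_challenge)
    # HMAC-MD5 over the decoded challenge, keyed with the password:
    hex_hash = hmac.new(password.encode("utf-8"), challenge, "md5").hexdigest()
    return base64.b64encode(
        "{} {}".format(username, hex_hash).encode("utf-8")
    ).decode("utf-8")

# Challenge and credentials from the RFC 2195 example:
challenge = "PDE4OTYuNjk3MTcwOTUyQHBvc3RvZmZpY2UucmVzdG9uLm1jaS5uZXQ+"
response = cram_md5_response("tim", "tanstaaftanstaaf", challenge)
print(base64.b64decode(response).decode("ascii"))
# tim b913a602c7eda7a495b4e6e7334d3890
```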
async def _auth_login(self, username, password):
"""
Performs an authentication attempt using the LOGIN mechanism.
Protocol:
1. The username is base64-encoded ;
2. The string 'AUTH LOGIN' and a space character are prepended to
the base64-encoded username and sent to the server ;
3. If the server replies with a 334 return code, we can go on:
1) The password is base64-encoded and sent to the server ;
2) If the server replies with a 235 return code, the user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "LOGIN"
code, message = await self.do_cmd(
"AUTH", mechanism, SMTP.b64enc(username), success=(334,)
)
try:
code, message = await self.do_cmd(SMTP.b64enc(password), success=(235, 503))
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
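The two client payloads of the LOGIN exchange (steps 1 to 3) can be generated without a server; the credentials below are made up for illustration:

```python
import base64

def login_payloads(username: str, password: str):
    """Return the two client payloads of a LOGIN exchange, in order."""
    first = "AUTH LOGIN " + base64.b64encode(username.encode("utf-8")).decode("ascii")
    # ...the server replies 334, then the client sends:
    second = base64.b64encode(password.encode("utf-8")).decode("ascii")
    return first, second

first, second = login_payloads("alice", "s3cret")
print(first)   # AUTH LOGIN YWxpY2U=
print(second)  # czNjcmV0
```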
async def _auth_plain(self, username, password):
"""
Performs an authentication attempt using the PLAIN mechanism.
Protocol:
1. Format the username and password in a suitable way ;
2. The formatted string is base64-encoded ;
3. The string 'AUTH PLAIN' and a space character are prepended to
the base64-encoded username and password and sent to the
server ;
4. If the server replies with a 235 return code, user is
authenticated.
Args:
username (str): Identifier of the user trying to authenticate.
password (str): Password for the user.
Raises:
ConnectionResetError: If the connection with the server is
unexpectedly lost.
SMTPAuthenticationError: If the authentication attempt fails.
Returns:
(int, str): A (code, message) 2-tuple containing the server
response.
"""
mechanism = "PLAIN"
credentials = "\0{}\0{}".format(username, password)
encoded_credentials = SMTP.b64enc(credentials)
try:
code, message = await self.do_cmd(
"AUTH", mechanism, encoded_credentials, success=(235, 503)
)
except SMTPCommandFailedError as e:
raise SMTPAuthenticationError(e.code, e.message, mechanism)
return code, message
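The PLAIN initial response is a single base64 blob built from a NUL-delimited string (RFC 4616: ``[authzid] NUL authcid NUL passwd``); the credentials below are made up for illustration:

```python
import base64

def plain_credentials(username: str, password: str) -> str:
    """Base64-encode the NUL-delimited PLAIN credentials."""
    return base64.b64encode(
        "\0{}\0{}".format(username, password).encode("utf-8")
    ).decode("utf-8")

payload = plain_credentials("alice", "s3cret")
print(base64.b64decode(payload))  # b'\x00alice\x00s3cret'
```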
@staticmethod
def parse_esmtp_extensions(message):
"""
Parses the response given by an ESMTP server after a *EHLO* command.
The response is parsed to build:
- A dict of supported ESMTP extensions (with parameters, if any).
- A list of supported authentication methods.
Returns:
(dict, list): A (extensions, auth_mechanisms) 2-tuple containing
the supported extensions and authentication methods.
"""
extns = {}
auths = []
oldstyle_auth_regex = re.compile(r"auth=(?P<auth>.*)", re.IGNORECASE)
extension_regex = re.compile(
r"(?P<feature>[a-z0-9][a-z0-9\-]*) ?", re.IGNORECASE
)
lines = message.splitlines()
for line in lines[1:]:
# To be able to communicate with as many SMTP servers as possible,
# we have to take the old-style auth advertisement into account.
match = oldstyle_auth_regex.match(line)
if match:
# group("auth") is the full mechanism list (e.g. "PLAIN LOGIN");
# indexing it with [0] would only keep the first character.
for auth in match.group("auth").split():
auth = auth.lower().strip()
if auth not in auths:
auths.append(auth)
# RFC 1869 requires a space between EHLO keyword and parameters.
# It's actually stricter, in that only spaces are allowed between
parameters, but we're not going to check for that here.
# Note that the space isn't present if there are no parameters.
match = extension_regex.match(line)
if match:
feature = match.group("feature").lower()
params = match.string[match.end("feature") :].strip()
extns[feature] = params
if feature == "auth":
auths.extend([param.strip().lower() for param in params.split()])
return extns, auths
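The same extension-parsing logic can be exercised on a hand-written EHLO reply (the hostname and extension values below are invented):

```python
import re

ehlo_reply = (
    "smtp.example.org at your service\n"
    "SIZE 35882577\n"
    "8BITMIME\n"
    "AUTH PLAIN LOGIN CRAM-MD5\n"
    "STARTTLS"
)

extension_regex = re.compile(r"(?P<feature>[a-z0-9][a-z0-9\-]*) ?", re.IGNORECASE)
extns, auths = {}, []
# Skip the first line (the server greeting), as the method does:
for line in ehlo_reply.splitlines()[1:]:
    match = extension_regex.match(line)
    if match:
        feature = match.group("feature").lower()
        params = line[match.end("feature"):].strip()
        extns[feature] = params
        if feature == "auth":
            auths.extend(p.strip().lower() for p in params.split())

print(extns)   # e.g. {'size': '35882577', '8bitmime': '', 'auth': ..., 'starttls': ''}
print(auths)   # ['plain', 'login', 'cram-md5']
```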
@staticmethod
def b64enc(s):
"""
Base64-encodes the given string and returns it as a :obj:`str`.
This is a simple helper function that takes a str, base64-encodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
s (str): String to be converted to base64.
Returns:
str: A base64-encoded string.
"""
return base64.b64encode(s.encode("utf-8")).decode("utf-8")
@staticmethod
def b64dec(b):
"""
Base64-decodes the given :obj:`bytes` and converts it to a :obj:`str`.
This is a simple helper function that takes a bytes, base64-decodes it
and returns it as str.
:mod:`base64` functions are working with :obj:`bytes`, hence this func.
Args:
b (bytes): A base64-encoded bytes.
Returns:
str: A base64-decoded string.
"""
return base64.b64decode(b).decode("utf-8")
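The two helpers above are inverses of each other. A quick stdlib round-trip check (the credential-style sample string is made up for illustration):

```python
import base64

def b64enc(s):
    # str -> base64-encoded str (mirrors the helper above)
    return base64.b64encode(s.encode("utf-8")).decode("utf-8")

def b64dec(b):
    # base64-encoded bytes (or ASCII str) -> decoded str (mirrors the helper above)
    return base64.b64decode(b).decode("utf-8")

# Round-trip with a made-up AUTH PLAIN-shaped payload (NUL-separated fields):
secret = "user\x00user\x00secret"
assert b64dec(b64enc(secret)) == secret
```

Keeping the encode/decode pair as `str -> str` helpers avoids sprinkling `bytes` conversions through the SMTP command code.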
|
networks-lab/tidyextractors | tidyextractors/base_extractor.py | BaseExtractor._col_type_set | python | def _col_type_set(self, col, df):
type_set = set()
if df[col].dtype == np.dtype(object):
unindexed_col = list(df[col])
for i in range(0, len(df[col])):
if pd.isna(unindexed_col[i]):  # NaN != NaN, so an `== np.nan` check would never match
continue
else:
type_set.add(type(unindexed_col[i]))
return type_set
else:
type_set.add(df[col].dtype)
return type_set | Determines the set of types present in a DataFrame column.
:param str col: A column name.
:param pandas.DataFrame df: The dataset. Usually ``self._data``.
:return: A set of Types. | train | https://github.com/networks-lab/tidyextractors/blob/658448ed533beecf32adcc188fc64d1068d15ca6/tidyextractors/base_extractor.py#L84-L103 | null | class BaseExtractor(object):
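One subtlety in `_col_type_set`: NaN compares unequal to everything, itself included, so an `==`-style NaN guard can never fire; `math.isnan` (or `pandas.isna`) is the reliable test. A stdlib-only sketch of the pitfall and the fix:

```python
import math

nan = float("nan")

# NaN is not equal to anything, itself included, so an `x == nan`
# guard silently never fires:
assert (nan == nan) is False

# A reliable missing-value test uses math.isnan (or pandas.isna):
values = [1.0, nan, "a"]
types = {type(v) for v in values
         if not (isinstance(v, float) and math.isnan(v))}
assert types == {float, str}
```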
"""
BaseExtractor defines a basic interface, initialization routine, and data
manipulation tools for extractor subclasses.
"""
# _data stores the main collection of extracted data
_data = None
def __init__(self, source, auto_extract=True, *args, **kwargs):
"""
Extractor initialization. Should not be overridden by extractor subclasses.
:param source: Specifies data source. Differs by subclass.
:param args: Arbitrary arguments permitted for extensibility.
:param bool auto_extract: Extract data from source upon initialization?
:param kwargs: Arbitrary keyword arguments permitted for extensibility.
"""
# Extract data unless otherwise specified
if auto_extract:
self._extract(source, *args, **kwargs)
# Do subclass initialization
self.__sub_init__(source, *args, **kwargs)
def __sub_init__(self, source, *args, **kwargs):
"""
Subclass initialization routine. Used so that subclasses do not need to override ``BaseExtractor.__init__``.
:param source: Specifies data source. Differs by subclass.
:param args: Arbitrary arguments permitted for extensibility.
:param kwargs: Arbitrary keyword arguments permitted for extensibility.
:return: None
"""
pass
def __len__(self):
"""
Length of self._data.
:return: Integer
"""
return len(self._data)
def _extract(self, source, *args, **kwargs):
"""
This method handles data extraction, and should be overridden by extractor subclasses.
Default behaviour initializes an empty ``pandas.DataFrame``.
:param source: Specifies data source. Differs by subclass.
:param args: Arbitrary arguments permitted for extensibility.
:param kwargs: Arbitrary keyword arguments permitted for extensibility.
:return: None
"""
self._data = pd.DataFrame()
def _drop_collections(self, df):
"""
Drops columns containing collections (i.e. sets, dicts, lists) from a DataFrame.
:param pandas.DataFrame df: Usually self._data.
:return: pandas.DataFrame
"""
all_cols = df.columns
keep_cols = []
# Check whether each column contains collections.
for c in all_cols:
if len(self._col_type_set(c, df).intersection([set, dict, list])) == 0:
keep_cols.append(c)
return df[keep_cols]
def raw(self, drop_collections = False):
"""
Produces the extractor object's data as it is stored internally.
:param bool drop_collections: Defaults to False. Indicates whether columns with lists/dicts/sets will be dropped.
:return: pandas.DataFrame
"""
base_df = self._data
if drop_collections is True:
out_df = self._drop_collections(base_df)
else:
out_df = base_df
return out_df
def expand_on(self, col1, col2, rename1 = None, rename2 = None, drop = [], drop_collections = False):
"""
Returns a reshaped version of extractor's data, where unique combinations of values from col1 and col2
are given individual rows.
Example function call from ``tidymbox``:
.. code-block:: python
self.expand_on('From', 'To', ['MessageID', 'Recipient'], rename1='From', rename2='Recipient')
Columns to be expanded upon should be either atomic values or dictionaries of dictionaries. For example:
Input Data:
+-----------------+-------------------------------------------------------------------+
| col1 (Atomic) | col2 (Dict of Dict) |
+=================+===================================================================+
| value1          | {valueA : {attr1: X1, attr2: Y1}, valueB: {attr1: X2, attr2: Y2}} |
+-----------------+-------------------------------------------------------------------+
| value2          | {valueC : {attr1: X3, attr2: Y3}, valueD: {attr1: X4, attr2: Y4}} |
+-----------------+-------------------------------------------------------------------+
Output Data:
+---------------+---------------+-------+-------+
| col1_extended | col2_extended | attr1 | attr2 |
+===============+===============+=======+=======+
| value1 | valueA | X1 | Y1 |
+---------------+---------------+-------+-------+
| value1 | valueB | X2 | Y2 |
+---------------+---------------+-------+-------+
| value2        | valueC        | X3    | Y3    |
+---------------+---------------+-------+-------+
| value2        | valueD        | X4    | Y4    |
+---------------+---------------+-------+-------+
:param str col1: The first column to expand on. May be an atomic value, or a dict of dict.
:param str col2: The second column to expand on. May be an atomic value, or a dict of dict.
:param str rename1: The name for col1 after expansion. Defaults to col1_extended.
:param str rename2: The name for col2 after expansion. Defaults to col2_extended.
:param list drop: Column names to be dropped from output.
:param bool drop_collections: Should columns with compound values be dropped?
:return: pandas.DataFrame
"""
# Assumption 1: Expanded columns are either atomic or built-in collections
# Assumption 2: New data columns are added to rows from dicts in columns of collections.
# How many rows expected in the output?
count = len(self._data)
# How often should the progress bar be updated?
update_interval = max(min(count//100, 100), 5)
# What are the column names?
column_list = list(self._data.columns)
# Determine column index (for itertuples)
try:
col1_index = column_list.index(col1)
except ValueError:
warnings.warn('Could not find "{}" in columns.'.format(col1))
raise
try:
col2_index = column_list.index(col2)
except ValueError:
warnings.warn('Could not find "{}" in columns.'.format(col2))
raise
# Standardize the order of the specified columns
first_index = min(col1_index, col2_index)
second_index = max(col1_index, col2_index)
first_name = column_list[first_index]
second_name = column_list[second_index]
first_rename = rename1 if first_index == col1_index else rename2
second_rename = rename2 if first_index == col1_index else rename1
# New column names:
new_column_list = column_list[:first_index] + \
[first_name+'_extended' if first_rename is None else first_rename] + \
column_list[first_index+1:second_index] + \
[second_name+'_extended' if second_rename is None else second_rename] + \
column_list[second_index+1:]
# Assert that there are no duplicates!
if len(set(new_column_list)) != len(new_column_list):
raise Exception('Duplicate column names found. Note that you cannot rename a column with a name '
'that is already taken by another column.')
# List of tuples. Rows in new data frame.
old_attr_df_tuples = []
new_attr_df_dicts = []
# MultiIndex tuples
index_tuples = []
def iter_product(item1,item2):
"""
Enumerates possible combinations of items from item1 and item2. Allows atomic values.
:param item1: Any
:param item2: Any
:return: A list of tuples.
"""
if hasattr(item1, '__iter__') and type(item1) != str:
iter1 = item1
else:
iter1 = [item1]
if hasattr(item2, '__iter__') and type(item2) != str:
iter2 = item2
else:
iter2 = [item2]
return it.product(iter1,iter2)
# Create data for output.
with tqdm.tqdm(total=count) as pbar:
for row in self._data.itertuples(index=False):
# Enumerate commit/file pairs
for index in iter_product(row[first_index],row[second_index]):
new_row = row[:first_index] + \
(index[0],) + \
row[first_index+1:second_index] + \
(index[1],) + \
row[second_index+1:]
# Add new row to list of row tuples
old_attr_df_tuples.append(new_row)
# Add key tuple to list of indices
index_tuples.append((index[0],index[1]))
# If there's data in either of the columns, add it to the new attr data frame.
temp_attrs = {}
# Get a copy of the first cell value for this index.
# If it's a dict, get the appropriate entry.
temp_first = row[first_index]
if type(temp_first) == dict:
temp_first = temp_first[index[0]]
temp_second = row[second_index]
if type(temp_second) == dict:
temp_second = temp_second[index[1]]
# Get nested data for this index.
if type(temp_first) == dict:
for k in temp_first:
temp_attrs[first_name + '/' + k] = temp_first[k]
if type(temp_second) == dict:
for k in temp_second:
temp_attrs[second_name + '/' + k] = temp_second[k]
# Add to the "new data" records.
new_attr_df_dicts.append(temp_attrs)
# Update progress bar
pbar.update(update_interval)
# An expanded data frame with only the columns of the original data frame
df_1 = pd.DataFrame.from_records(old_attr_df_tuples,
columns=new_column_list)
# An expanded data frame containing any data held in key:value collections in the expanded cols
df_2 = pd.DataFrame.from_records(new_attr_df_dicts)
# The final expanded data set
df_out = pd.concat([df_1, df_2], axis=1)
# Set new index
# index_cols has been deprecated
# df_out = df_out.set_index(index_cols)
# Drop unwanted columns
for col in drop:
if col in df_out.columns:
df_out = df_out.drop(columns=col)
if drop_collections is True:
df_out = self._drop_collections(df_out)
return df_out
|
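The `iter_product` helper inside `expand_on` exists so that atomic cell values and collections can be paired uniformly. A self-contained sketch of the same idea (sample values mirror the docstring's example):

```python
import itertools as it

def iter_product(item1, item2):
    # Wrap atomic (or string) arguments in a one-element list so that
    # itertools.product always receives two iterables.
    iter1 = item1 if hasattr(item1, "__iter__") and not isinstance(item1, str) else [item1]
    iter2 = item2 if hasattr(item2, "__iter__") and not isinstance(item2, str) else [item2]
    return it.product(iter1, iter2)

# One atomic value paired with a dict of dicts yields one pairing per key,
# matching the row expansion shown in the docstring table:
pairs = list(iter_product("value1", {"valueA": {}, "valueB": {}}))
assert pairs == [("value1", "valueA"), ("value1", "valueB")]
```

Treating strings as atomic is deliberate: without the `isinstance(..., str)` guard, a string cell would be expanded character by character.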
networks-lab/tidyextractors | tidyextractors/base_extractor.py | BaseExtractor._drop_collections | python | def _drop_collections(self, df):
all_cols = df.columns
keep_cols = []
# Check whether each column contains collections.
for c in all_cols:
if len(self._col_type_set(c, df).intersection([set, dict, list])) == 0:
keep_cols.append(c)
return df[keep_cols] | Drops columns containing collections (i.e. sets, dicts, lists) from a DataFrame.
:param pandas.DataFrame df: Usually self._data.
:return: pandas.DataFrame | train | https://github.com/networks-lab/tidyextractors/blob/658448ed533beecf32adcc188fc64d1068d15ca6/tidyextractors/base_extractor.py#L105-L119 | [
"def _col_type_set(self, col, df):\n \"\"\"\n Determines the set of types present in a DataFrame column.\n\n :param str col: A column name.\n :param pandas.DataFrame df: The dataset. Usually ``self._data``.\n :return: A set of Types.\n \"\"\"\n type_set = set()\n if df[col].dtype == np.dtype... | class BaseExtractor(object):
"""
BaseExtractor defines a basic interface, initialization routine, and data
manipulation tools for extractor subclasses.
"""
# _data stores the main collection of extracted data
_data = None
def __init__(self, source, auto_extract=True, *args, **kwargs):
"""
Extractor initialization. Should not be overridden by extractor subclasses.
:param source: Specifies data source. Differs by subclass.
:param args: Arbitrary arguments permitted for extensibility.
:param bool auto_extract: Extract data from source upon initialization?
:param kwargs: Arbitrary keyword arguments permitted for extensibility.
"""
# Extract data unless otherwise specified
if auto_extract:
self._extract(source, *args, **kwargs)
# Do subclass initialization
self.__sub_init__(source, *args, **kwargs)
def __sub_init__(self, source, *args, **kwargs):
"""
Subclass initialization routine. Used so that subclasses do not need to override ``BaseExtractor.__init__``.
:param source: Specifies data source. Differs by subclass.
:param args: Arbitrary arguments permitted for extensibility.
:param kwargs: Arbitrary keyword arguments permitted for extensibility.
:return: None
"""
pass
def __len__(self):
"""
Length of self._data.
:return: Integer
"""
return len(self._data)
def _extract(self, source, *args, **kwargs):
"""
This method handles data extraction, and should be overridden by extractor subclasses.
Default behaviour initializes an empty ``pandas.DataFrame``.
:param source: Specifies data source. Differs by subclass.
:param args: Arbitrary arguments permitted for extensibility.
:param kwargs: Arbitrary keyword arguments permitted for extensibility.
:return: None
"""
self._data = pd.DataFrame()
def _col_type_set(self, col, df):
"""
Determines the set of types present in a DataFrame column.
:param str col: A column name.
:param pandas.DataFrame df: The dataset. Usually ``self._data``.
:return: A set of Types.
"""
type_set = set()
if df[col].dtype == np.dtype(object):
unindexed_col = list(df[col])
for i in range(0, len(df[col])):
if pd.isna(unindexed_col[i]):  # NaN != NaN, so an `== np.nan` check would never match
continue
else:
type_set.add(type(unindexed_col[i]))
return type_set
else:
type_set.add(df[col].dtype)
return type_set
def raw(self, drop_collections = False):
"""
Produces the extractor object's data as it is stored internally.
:param bool drop_collections: Defaults to False. Indicates whether columns with lists/dicts/sets will be dropped.
:return: pandas.DataFrame
"""
base_df = self._data
if drop_collections is True:
out_df = self._drop_collections(base_df)
else:
out_df = base_df
return out_df
def expand_on(self, col1, col2, rename1 = None, rename2 = None, drop = [], drop_collections = False):
"""
Returns a reshaped version of extractor's data, where unique combinations of values from col1 and col2
are given individual rows.
Example function call from ``tidymbox``:
.. code-block:: python
self.expand_on('From', 'To', ['MessageID', 'Recipient'], rename1='From', rename2='Recipient')
Columns to be expanded upon should be either atomic values or dictionaries of dictionaries. For example:
Input Data:
+-----------------+-------------------------------------------------------------------+
| col1 (Atomic) | col2 (Dict of Dict) |
+=================+===================================================================+
| value1          | {valueA : {attr1: X1, attr2: Y1}, valueB: {attr1: X2, attr2: Y2}} |
+-----------------+-------------------------------------------------------------------+
| value2          | {valueC : {attr1: X3, attr2: Y3}, valueD: {attr1: X4, attr2: Y4}} |
+-----------------+-------------------------------------------------------------------+
Output Data:
+---------------+---------------+-------+-------+
| col1_extended | col2_extended | attr1 | attr2 |
+===============+===============+=======+=======+
| value1 | valueA | X1 | Y1 |
+---------------+---------------+-------+-------+
| value1 | valueB | X2 | Y2 |
+---------------+---------------+-------+-------+
| value2        | valueC        | X3    | Y3    |
+---------------+---------------+-------+-------+
| value2        | valueD        | X4    | Y4    |
+---------------+---------------+-------+-------+
:param str col1: The first column to expand on. May be an atomic value, or a dict of dict.
:param str col2: The second column to expand on. May be an atomic value, or a dict of dict.
:param str rename1: The name for col1 after expansion. Defaults to col1_extended.
:param str rename2: The name for col2 after expansion. Defaults to col2_extended.
:param list drop: Column names to be dropped from output.
:param bool drop_collections: Should columns with compound values be dropped?
:return: pandas.DataFrame
"""
# Assumption 1: Expanded columns are either atomic or built-in collections
# Assumption 2: New data columns are added to rows from dicts in columns of collections.
# How many rows expected in the output?
count = len(self._data)
# How often should the progress bar be updated?
update_interval = max(min(count//100, 100), 5)
# What are the column names?
column_list = list(self._data.columns)
# Determine column index (for itertuples)
try:
col1_index = column_list.index(col1)
except ValueError:
warnings.warn('Could not find "{}" in columns.'.format(col1))
raise
try:
col2_index = column_list.index(col2)
except ValueError:
warnings.warn('Could not find "{}" in columns.'.format(col2))
raise
# Standardize the order of the specified columns
first_index = min(col1_index, col2_index)
second_index = max(col1_index, col2_index)
first_name = column_list[first_index]
second_name = column_list[second_index]
first_rename = rename1 if first_index == col1_index else rename2
second_rename = rename2 if first_index == col1_index else rename1
# New column names:
new_column_list = column_list[:first_index] + \
[first_name+'_extended' if first_rename is None else first_rename] + \
column_list[first_index+1:second_index] + \
[second_name+'_extended' if second_rename is None else second_rename] + \
column_list[second_index+1:]
# Assert that there are no duplicates!
if len(set(new_column_list)) != len(new_column_list):
raise Exception('Duplicate column names found. Note that you cannot rename a column with a name '
'that is already taken by another column.')
# List of tuples. Rows in new data frame.
old_attr_df_tuples = []
new_attr_df_dicts = []
# MultiIndex tuples
index_tuples = []
def iter_product(item1,item2):
"""
Enumerates possible combinations of items from item1 and item2. Allows atomic values.
:param item1: Any
:param item2: Any
:return: A list of tuples.
"""
if hasattr(item1, '__iter__') and type(item1) != str:
iter1 = item1
else:
iter1 = [item1]
if hasattr(item2, '__iter__') and type(item2) != str:
iter2 = item2
else:
iter2 = [item2]
return it.product(iter1,iter2)
# Create data for output.
with tqdm.tqdm(total=count) as pbar:
for row in self._data.itertuples(index=False):
# Enumerate commit/file pairs
for index in iter_product(row[first_index],row[second_index]):
new_row = row[:first_index] + \
(index[0],) + \
row[first_index+1:second_index] + \
(index[1],) + \
row[second_index+1:]
# Add new row to list of row tuples
old_attr_df_tuples.append(new_row)
# Add key tuple to list of indices
index_tuples.append((index[0],index[1]))
# If there's data in either of the columns, add it to the new attr data frame.
temp_attrs = {}
# Get a copy of the first cell value for this index.
# If it's a dict, get the appropriate entry.
temp_first = row[first_index]
if type(temp_first) == dict:
temp_first = temp_first[index[0]]
temp_second = row[second_index]
if type(temp_second) == dict:
temp_second = temp_second[index[1]]
# Get nested data for this index.
if type(temp_first) == dict:
for k in temp_first:
temp_attrs[first_name + '/' + k] = temp_first[k]
if type(temp_second) == dict:
for k in temp_second:
temp_attrs[second_name + '/' + k] = temp_second[k]
# Add to the "new data" records.
new_attr_df_dicts.append(temp_attrs)
# Update progress bar
pbar.update(update_interval)
# An expanded data frame with only the columns of the original data frame
df_1 = pd.DataFrame.from_records(old_attr_df_tuples,
columns=new_column_list)
# An expanded data frame containing any data held in key:value collections in the expanded cols
df_2 = pd.DataFrame.from_records(new_attr_df_dicts)
# The final expanded data set
df_out = pd.concat([df_1, df_2], axis=1)
# Set new index
# index_cols has been deprecated
# df_out = df_out.set_index(index_cols)
# Drop unwanted columns
for col in drop:
if col in df_out.columns:
df_out = df_out.drop(columns=col)
if drop_collections is True:
df_out = self._drop_collections(df_out)
return df_out
|
networks-lab/tidyextractors | tidyextractors/base_extractor.py | BaseExtractor.raw | python | def raw(self, drop_collections = False):
base_df = self._data
if drop_collections is True:
out_df = self._drop_collections(base_df)
else:
out_df = base_df
return out_df | Produces the extractor object's data as it is stored internally.
:param bool drop_collections: Defaults to False. Indicates whether columns with lists/dicts/sets will be dropped.
:return: pandas.DataFrame | train | https://github.com/networks-lab/tidyextractors/blob/658448ed533beecf32adcc188fc64d1068d15ca6/tidyextractors/base_extractor.py#L121-L134 | [
"def _drop_collections(self, df):\n \"\"\"\n Drops columns containing collections (i.e. sets, dicts, lists) from a DataFrame.\n\n :param pandas.DataFrame df: Usually self._data.\n :return: pandas.DataFrame\n \"\"\"\n all_cols = df.columns\n keep_cols = []\n\n # Check whether each column cont... | class BaseExtractor(object):
"""
BaseExtractor defines a basic interface, initialization routine, and data
manipulation tools for extractor subclasses.
"""
# _data stores the main collection of extracted data
_data = None
def __init__(self, source, auto_extract=True, *args, **kwargs):
"""
Extractor initialization. Should not be overridden by extractor subclasses.
:param source: Specifies data source. Differs by subclass.
:param args: Arbitrary arguments permitted for extensibility.
:param bool auto_extract: Extract data from source upon initialization?
:param kwargs: Arbitrary keyword arguments permitted for extensibility.
"""
# Extract data unless otherwise specified
if auto_extract:
self._extract(source, *args, **kwargs)
# Do subclass initialization
self.__sub_init__(source, *args, **kwargs)
def __sub_init__(self, source, *args, **kwargs):
"""
Subclass initialization routine. Used so that subclasses do not need to override ``BaseExtractor.__init__``.
:param source: Specifies data source. Differs by subclass.
:param args: Arbitrary arguments permitted for extensibility.
:param kwargs: Arbitrary keyword arguments permitted for extensibility.
:return: None
"""
pass
def __len__(self):
"""
Length of self._data.
:return: Integer
"""
return len(self._data)
def _extract(self, source, *args, **kwargs):
"""
This method handles data extraction, and should be overridden by extractor subclasses.
Default behaviour initializes an empty ``pandas.DataFrame``.
:param source: Specifies data source. Differs by subclass.
:param args: Arbitrary arguments permitted for extensibility.
:param kwargs: Arbitrary keyword arguments permitted for extensibility.
:return: None
"""
self._data = pd.DataFrame()
def _col_type_set(self, col, df):
"""
Determines the set of types present in a DataFrame column.
:param str col: A column name.
:param pandas.DataFrame df: The dataset. Usually ``self._data``.
:return: A set of Types.
"""
type_set = set()
if df[col].dtype == np.dtype(object):
unindexed_col = list(df[col])
for i in range(0, len(df[col])):
if pd.isna(unindexed_col[i]):  # NaN != NaN, so an `== np.nan` check would never match
continue
else:
type_set.add(type(unindexed_col[i]))
return type_set
else:
type_set.add(df[col].dtype)
return type_set
def _drop_collections(self, df):
"""
Drops columns containing collections (i.e. sets, dicts, lists) from a DataFrame.
:param pandas.DataFrame df: Usually self._data.
:return: pandas.DataFrame
"""
all_cols = df.columns
keep_cols = []
# Check whether each column contains collections.
for c in all_cols:
if len(self._col_type_set(c, df).intersection([set, dict, list])) == 0:
keep_cols.append(c)
return df[keep_cols]
def expand_on(self, col1, col2, rename1 = None, rename2 = None, drop = [], drop_collections = False):
"""
Returns a reshaped version of extractor's data, where unique combinations of values from col1 and col2
are given individual rows.
Example function call from ``tidymbox``:
.. code-block:: python
self.expand_on('From', 'To', ['MessageID', 'Recipient'], rename1='From', rename2='Recipient')
Columns to be expanded upon should be either atomic values or dictionaries of dictionaries. For example:
Input Data:
+-----------------+-------------------------------------------------------------------+
| col1 (Atomic) | col2 (Dict of Dict) |
+=================+===================================================================+
| value1          | {valueA : {attr1: X1, attr2: Y1}, valueB: {attr1: X2, attr2: Y2}} |
+-----------------+-------------------------------------------------------------------+
| value2          | {valueC : {attr1: X3, attr2: Y3}, valueD: {attr1: X4, attr2: Y4}} |
+-----------------+-------------------------------------------------------------------+
Output Data:
+---------------+---------------+-------+-------+
| col1_extended | col2_extended | attr1 | attr2 |
+===============+===============+=======+=======+
| value1 | valueA | X1 | Y1 |
+---------------+---------------+-------+-------+
| value1 | valueB | X2 | Y2 |
+---------------+---------------+-------+-------+
| value2        | valueC        | X3    | Y3    |
+---------------+---------------+-------+-------+
| value2        | valueD        | X4    | Y4    |
+---------------+---------------+-------+-------+
:param str col1: The first column to expand on. May be an atomic value, or a dict of dict.
:param str col2: The second column to expand on. May be an atomic value, or a dict of dict.
:param str rename1: The name for col1 after expansion. Defaults to col1_extended.
:param str rename2: The name for col2 after expansion. Defaults to col2_extended.
:param list drop: Column names to be dropped from output.
:param bool drop_collections: Should columns with compound values be dropped?
:return: pandas.DataFrame
"""
# Assumption 1: Expanded columns are either atomic or built-in collections
# Assumption 2: New data columns are added to rows from dicts in columns of collections.
# How many rows expected in the output?
count = len(self._data)
# How often should the progress bar be updated?
update_interval = max(min(count//100, 100), 5)
# What are the column names?
column_list = list(self._data.columns)
# Determine column index (for itertuples)
try:
col1_index = column_list.index(col1)
except ValueError:
warnings.warn('Could not find "{}" in columns.'.format(col1))
raise
try:
col2_index = column_list.index(col2)
except ValueError:
warnings.warn('Could not find "{}" in columns.'.format(col2))
raise
# Standardize the order of the specified columns
first_index = min(col1_index, col2_index)
second_index = max(col1_index, col2_index)
first_name = column_list[first_index]
second_name = column_list[second_index]
first_rename = rename1 if first_index == col1_index else rename2
second_rename = rename2 if first_index == col1_index else rename1
# New column names:
new_column_list = column_list[:first_index] + \
[first_name+'_extended' if first_rename is None else first_rename] + \
column_list[first_index+1:second_index] + \
[second_name+'_extended' if second_rename is None else second_rename] + \
column_list[second_index+1:]
# Assert that there are no duplicates!
if len(set(new_column_list)) != len(new_column_list):
raise Exception('Duplicate column names found. Note that you cannot rename a column with a name '
'that is already taken by another column.')
# List of tuples. Rows in new data frame.
old_attr_df_tuples = []
new_attr_df_dicts = []
# MultiIndex tuples
index_tuples = []
def iter_product(item1,item2):
"""
Enumerates possible combinations of items from item1 and item2. Allows atomic values.
:param item1: Any
:param item2: Any
:return: A list of tuples.
"""
if hasattr(item1, '__iter__') and type(item1) != str:
iter1 = item1
else:
iter1 = [item1]
if hasattr(item2, '__iter__') and type(item2) != str:
iter2 = item2
else:
iter2 = [item2]
return it.product(iter1,iter2)
# Create data for output.
with tqdm.tqdm(total=count) as pbar:
for row in self._data.itertuples(index=False):
# Enumerate commit/file pairs
for index in iter_product(row[first_index],row[second_index]):
new_row = row[:first_index] + \
(index[0],) + \
row[first_index+1:second_index] + \
(index[1],) + \
row[second_index+1:]
# Add new row to list of row tuples
old_attr_df_tuples.append(new_row)
# Add key tuple to list of indices
index_tuples.append((index[0],index[1]))
# If there's data in either of the columns, add it to the new attr data frame.
temp_attrs = {}
# Get a copy of the first cell value for this index.
# If it's a dict, get the appropriate entry.
temp_first = row[first_index]
if type(temp_first) == dict:
temp_first = temp_first[index[0]]
temp_second = row[second_index]
if type(temp_second) == dict:
temp_second = temp_second[index[1]]
# Get nested data for this index.
if type(temp_first) == dict:
for k in temp_first:
temp_attrs[first_name + '/' + k] = temp_first[k]
if type(temp_second) == dict:
for k in temp_second:
temp_attrs[second_name + '/' + k] = temp_second[k]
# Add to the "new data" records.
new_attr_df_dicts.append(temp_attrs)
# Update progress bar
pbar.update(update_interval)
# An expanded data frame with only the columns of the original data frame
df_1 = pd.DataFrame.from_records(old_attr_df_tuples,
columns=new_column_list)
# An expanded data frame containing any data held in key:value collections in the expanded cols
df_2 = pd.DataFrame.from_records(new_attr_df_dicts)
# The final expanded data set
df_out = pd.concat([df_1, df_2], axis=1)
# Set new index
# index_cols has been deprecated
# df_out = df_out.set_index(index_cols)
# Drop unwanted columns
for col in drop:
if col in df_out.columns:
df_out = df_out.drop(columns=col)
if drop_collections is True:
df_out = self._drop_collections(df_out)
return df_out
|
networks-lab/tidyextractors | tidyextractors/base_extractor.py | BaseExtractor.expand_on | python | def expand_on(self, col1, col2, rename1 = None, rename2 = None, drop = [], drop_collections = False):
# Assumption 1: Expanded columns are either atomic or built-in collections
# Assumption 2: New data columns are added to rows from dicts in columns of collections.
# How many rows expected in the output?
count = len(self._data)
# How often should the progress bar be updated?
update_interval = max(min(count//100, 100), 5)
# What are the column names?
column_list = list(self._data.columns)
# Determine column index (for itertuples)
try:
col1_index = column_list.index(col1)
except ValueError:
warnings.warn('Could not find "{}" in columns.'.format(col1))
raise
try:
col2_index = column_list.index(col2)
except ValueError:
warnings.warn('Could not find "{}" in columns.'.format(col2))
raise
# Standardize the order of the specified columns
first_index = min(col1_index, col2_index)
second_index = max(col1_index, col2_index)
first_name = column_list[first_index]
second_name = column_list[second_index]
first_rename = rename1 if first_index == col1_index else rename2
second_rename = rename2 if first_index == col1_index else rename1
# New column names:
new_column_list = column_list[:first_index] + \
[first_name+'_extended' if first_rename is None else first_rename] + \
column_list[first_index+1:second_index] + \
[second_name+'_extended' if second_rename is None else second_rename] + \
column_list[second_index+1:]
# Assert that there are no duplicates!
if len(set(new_column_list)) != len(new_column_list):
raise Exception('Duplicate column names found. Note that you cannot rename a column with a name '
'that is already taken by another column.')
# List of tuples. Rows in new data frame.
old_attr_df_tuples = []
new_attr_df_dicts = []
# MultiIndex tuples
index_tuples = []
def iter_product(item1,item2):
"""
Enumerates possible combinations of items from item1 and item2. Allows atomic values.
:param item1: Any
:param item2: Any
:return: A list of tuples.
"""
if hasattr(item1, '__iter__') and type(item1) != str:
iter1 = item1
else:
iter1 = [item1]
if hasattr(item2, '__iter__') and type(item2) != str:
iter2 = item2
else:
iter2 = [item2]
return it.product(iter1,iter2)
# Create data for output.
with tqdm.tqdm(total=count) as pbar:
for row in self._data.itertuples(index=False):
# Enumerate commit/file pairs
for index in iter_product(row[first_index],row[second_index]):
new_row = row[:first_index] + \
(index[0],) + \
row[first_index+1:second_index] + \
(index[1],) + \
row[second_index+1:]
# Add new row to list of row tuples
old_attr_df_tuples.append(new_row)
# Add key tuple to list of indices
index_tuples.append((index[0],index[1]))
# If there's data in either of the columns, add it to the new attr data frame.
temp_attrs = {}
# Get a copy of the first cell value for this index.
# If it's a dict, get the appropriate entry.
temp_first = row[first_index]
if type(temp_first) == dict:
temp_first = temp_first[index[0]]
temp_second = row[second_index]
if type(temp_second) == dict:
temp_second = temp_second[index[1]]
# Get nested data for this index.
if type(temp_first) == dict:
for k in temp_first:
temp_attrs[first_name + '/' + k] = temp_first[k]
if type(temp_second) == dict:
for k in temp_second:
temp_attrs[second_name + '/' + k] = temp_second[k]
# Add to the "new data" records.
new_attr_df_dicts.append(temp_attrs)
# Update progress bar
pbar.update(update_interval)
# An expanded data frame with only the columns of the original data frame
df_1 = pd.DataFrame.from_records(old_attr_df_tuples,
columns=new_column_list)
# An expanded data frame containing any data held in key:value collections in the expanded cols
df_2 = pd.DataFrame.from_records(new_attr_df_dicts)
# The final expanded data set
df_out = pd.concat([df_1, df_2], axis=1)
# Set new index
# index_cols has been deprecated
# df_out = df_out.set_index(index_cols)
# Drop unwanted columns
for col in drop:
if col in df_out.columns:
df_out = df_out.drop(columns=col)
if drop_collections is True:
df_out = self._drop_collections(df_out)
return df_out | Returns a reshaped version of the extractor's data, where unique combinations of values from col1 and col2
are given individual rows.
Example function call from ``tidymbox``:
.. code-block:: python
self.expand_on('From', 'To', ['MessageID', 'Recipient'], rename1='From', rename2='Recipient')
Columns to be expanded upon should be either atomic values or dictionaries of dictionaries. For example:
Input Data:
+-----------------+-------------------------------------------------------------------+
| col1 (Atomic) | col2 (Dict of Dict) |
+=================+===================================================================+
| value1          | {valueA : {attr1: X1, attr2: Y1}, valueB: {attr1: X2, attr2: Y2}} |
+-----------------+-------------------------------------------------------------------+
| value2          | {valueC : {attr1: X3, attr2: Y3}, valueD: {attr1: X4, attr2: Y4}} |
+-----------------+-------------------------------------------------------------------+
Output Data:
+---------------+---------------+-------+-------+
| col1_extended | col2_extended | attr1 | attr2 |
+===============+===============+=======+=======+
| value1 | valueA | X1 | Y1 |
+---------------+---------------+-------+-------+
| value1 | valueB | X2 | Y2 |
+---------------+---------------+-------+-------+
| value2        | valueC        | X3    | Y3    |
+---------------+---------------+-------+-------+
| value2        | valueD        | X4    | Y4    |
+---------------+---------------+-------+-------+
:param str col1: The first column to expand on. May be an atomic value, or a dict of dict.
:param str col2: The second column to expand on. May be an atomic value, or a dict of dict.
:param str rename1: The name for col1 after expansion. Defaults to col1_extended.
:param str rename2: The name for col2 after expansion. Defaults to col2_extended.
:param list drop: Column names to be dropped from output.
:param bool drop_collections: Should columns with compound values be dropped?
:return: pandas.DataFrame | train | https://github.com/networks-lab/tidyextractors/blob/658448ed533beecf32adcc188fc64d1068d15ca6/tidyextractors/base_extractor.py#L136-L317 | [
"def _drop_collections(self, df):\n \"\"\"\n Drops columns containing collections (i.e. sets, dicts, lists) from a DataFrame.\n\n :param pandas.DataFrame df: Usually self._data.\n :return: pandas.DataFrame\n \"\"\"\n all_cols = df.columns\n keep_cols = []\n\n # Check whether each column cont... | class BaseExtractor(object):
"""
BaseExtractor defines a basic interface, initialization routine, and data
manipulation tools for extractor subclasses.
"""
# _data stores the main collection of extracted data
_data = None
def __init__(self, source, auto_extract=True, *args, **kwargs):
"""
Extractor initialization. Should not be overridden by extractor subclasses.
:param source: Specifies data source. Differs by subclass.
:param args: Arbitrary arguments permitted for extensibility.
:param bool auto_extract: Extract data from source upon initialization?
:param kwargs: Arbitrary keyword arguments permitted for extensibility.
"""
# Extract data unless otherwise specified
if auto_extract:
self._extract(source, *args, **kwargs)
# Do subclass initialization
self.__sub_init__(source, *args, **kwargs)
def __sub_init__(self, source, *args, **kwargs):
"""
Subclass initialization routine. Used so that subclasses do not need to override ``BaseExtractor.__init__``.
:param source: Specifies data source. Differs by subclass.
:param args: Arbitrary arguments permitted for extensibility.
:param kwargs: Arbitrary keyword arguments permitted for extensibility.
:return: None
"""
pass
def __len__(self):
"""
Length of self._data.
:return: Integer
"""
return len(self._data)
def _extract(self, source, *args, **kwargs):
"""
This method handles data extraction, and should be overridden by extractor subclasses.
Default behaviour initializes an empty ``pandas.DataFrame``.
:param source: Specifies data source. Differs by subclass.
:param args: Arbitrary arguments permitted for extensibility.
:param kwargs: Arbitrary keyword arguments permitted for extensibility.
:return: None
"""
self._data = pd.DataFrame()
def _col_type_set(self, col, df):
"""
Determines the set of types present in a DataFrame column.
:param str col: A column name.
:param pandas.DataFrame df: The dataset. Usually ``self._data``.
:return: A set of Types.
"""
type_set = set()
if df[col].dtype == np.dtype(object):
unindexed_col = list(df[col])
for i in range(0, len(df[col])):
# NOTE: "== np.nan" is always False; use np.isnan for a real NaN check.
if isinstance(unindexed_col[i], float) and np.isnan(unindexed_col[i]):
continue
else:
type_set.add(type(unindexed_col[i]))
return type_set
else:
type_set.add(df[col].dtype)
return type_set
def _drop_collections(self, df):
"""
Drops columns containing collections (i.e. sets, dicts, lists) from a DataFrame.
:param pandas.DataFrame df: Usually self._data.
:return: pandas.DataFrame
"""
all_cols = df.columns
keep_cols = []
# Check whether each column contains collections.
for c in all_cols:
if len(self._col_type_set(c, df).intersection([set, dict, list])) == 0:
keep_cols.append(c)
return df[keep_cols]
def raw(self, drop_collections = False):
"""
Produces the extractor object's data as it is stored internally.
:param bool drop_collections: Defaults to False. Indicates whether columns with lists/dicts/sets will be dropped.
:return: pandas.DataFrame
"""
base_df = self._data
if drop_collections is True:
out_df = self._drop_collections(base_df)
else:
out_df = base_df
return out_df
|
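The `expand_on` method above leans on its nested `iter_product` helper, which treats atomic values (including strings) as one-element iterables before taking the Cartesian product. A minimal standalone sketch of that helper, pure Python and mirroring the logic in the source:

```python
import itertools as it

def iter_product(item1, item2):
    """Enumerate combinations of item1 and item2, allowing atomic values."""
    # Wrap non-iterable arguments (and strings) in a single-element list.
    iter1 = item1 if hasattr(item1, '__iter__') and not isinstance(item1, str) else [item1]
    iter2 = item2 if hasattr(item2, '__iter__') and not isinstance(item2, str) else [item2]
    return it.product(iter1, iter2)

# An atomic value paired with a list yields one tuple per list element.
pairs = list(iter_product('value1', ['valueA', 'valueB']))
```

The string check matters because `str` is iterable: without it, `'value1'` would expand into its individual characters instead of being treated as one value.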
networks-lab/tidyextractors | tidyextractors/tidytwitter/twitter_extractor.py | TwitterExtractor._extract | python | def _extract(self, source, extract_tweets=True, *args, **kwargs):
# Check that the proper API keywords were provided.
for cred in ['access_token', 'access_secret', 'consumer_key', 'consumer_secret']:
if cred not in kwargs:
raise ValueError('API credentials missing from keyword arguments: {}'.format(cred))
# Set up API access
self._auth = OAuthHandler(kwargs['consumer_key'], kwargs['consumer_secret'])
self._auth.set_access_token(kwargs['access_token'],kwargs['access_secret'])
self._api = tweepy.API(self._auth)
# Make row dictionaries and count tweets
rows = []
num_tweets = 0
pbar1 = tqdm.tqdm(range(0,len(source)))
pbar1.set_description('Extracting user data...')
for u in source:
r = self._make_user_dict(u)
num_tweets = num_tweets + min(r['statuses_count'], 3200)
rows.append(r)
pbar1.update(1)
if extract_tweets is True:
# Extract tweets
pbar2 = tqdm.tqdm(range(0,num_tweets))
for r in rows:
if r['statuses_count'] > 0:
r['tweets'] = self._get_user_tweets(r['screen_name'])
else:
r['tweets'] = []
pbar2.set_description('Extracted tweets by {}'.format(r['screen_name']))
pbar2.update(r['statuses_count'])
self._data = pd.DataFrame.from_records(rows) | Extracts user data using the Twitter API. Mutates _data.
NOTE: TwitterExtractor requires a complete set of Twitter API credentials
to initialize: 'access_token', 'access_secret', 'consumer_key', and 'consumer_secret'.
:param list source: A list of user screen name strings.
:param args: Arbitrary arguments for extensibility.
:param kwargs: Arbitrary keyword arguments for extensibility.
:return: None | train | https://github.com/networks-lab/tidyextractors/blob/658448ed533beecf32adcc188fc64d1068d15ca6/tidyextractors/tidytwitter/twitter_extractor.py#L51-L94 | null | class TwitterExtractor(BaseExtractor):
"""
The ``TwitterExtractor`` class is for extracting user data from Twitter. This class
has methods for outputting data into the ``users`` and ``tweets`` tidy formats, and a
raw untidy format.
:param list source: A list of user screen name strings.
:param bool auto_extract: Defaults to True. If True, data is extracted automatically.
Otherwise, extraction must be initiated through the internal interface.
:param str access_token: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
:param str access_secret: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
:param str consumer_key: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
:param str consumer_secret: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
"""
def users(self, drop_collections = True):
"""
Returns a table of Twitter user data, with "users" as rows/observations.
:param bool drop_collections: Defaults to True. Indicates whether columns with lists/dicts/sets will be dropped.
:return: pandas.DataFrame
"""
base_df = self._data
if drop_collections is True:
out_df = self._drop_collections(base_df)
else:
out_df = base_df
return out_df
def tweets(self):
"""
Returns a table of Twitter user data, with "tweets" as rows/observations.
.. note::
drop_collections is not available for this method, since there are no meaningful collections to keep.
:return: pandas.DataFrame
"""
# I've hard coded these. Seemed like a good idea at the time...
# TODO: Fix this.
all_columns = ['contributors_enabled', 'created_at', 'default_profile',
'default_profile_image', 'description', 'entities', 'favourites_count',
'follow_request_sent', 'followers_count', 'following', 'friends_count',
'geo_enabled', 'has_extended_profile', 'id_str',
'is_translation_enabled', 'is_translator', 'lang', 'listed_count',
'location', 'name', 'needs_phone_verification', 'notifications',
'profile_background_color', 'profile_background_image_url',
'profile_background_image_url_https', 'profile_background_tile',
'profile_banner_url', 'profile_image_url', 'profile_image_url_https',
'profile_link_color', 'profile_location',
'profile_sidebar_border_color', 'profile_sidebar_fill_color',
'profile_text_color', 'profile_use_background_image', 'protected',
'screen_name', 'statuses_count', 'suspended', 'time_zone',
'translator_type', 'url', 'utc_offset', 'verified', 'tweets/created',
'tweets/retweet', 'tweets/rt author', 'tweets/text']
keep_columns = ['id_str', 'lang', 'location', 'name',
'protected', 'screen_name','time_zone',
'utc_offset', 'tweets/created', 'tweets/retweet',
'tweets/rt author', 'tweets/text']
drop_columns = list(set(all_columns).difference(set(keep_columns)))
base_df = self.expand_on('id', 'tweets', rename1='id', rename2='tweet_id', drop=drop_columns)
return self._drop_collections(base_df)
def _handle_object(self, name, obj):
"""
Process an object using twitter_object_handlers_lookup.
Doesn't currently do anything (as of 2017-06-16).
:param str name: Object name
:param obj: An object to be processed
:return: A dictionary of attributes
"""
if type(obj) in twitter_object_handlers_lookup:
return twitter_object_handlers_lookup[type(obj)](name, obj)
else:
return {name: obj}
def _make_object_dict(self, obj):
"""
Processes an object, exporting its data as a nested dictionary.
:param obj: An object
:return: A nested dictionary of object data
"""
data = {}
for attr in dir(obj):
if attr[0] != '_' and attr != 'status':
datum = getattr(obj, attr)
if not isinstance(datum, types.MethodType):
data.update(self._handle_object(attr,datum))
return data
def _make_user_dict(self, username):
"""
Processes a Twitter User object, exporting as a nested dictionary.
Complex values (i.e. objects that aren't int, bool, float, str, or
a collection of such) are converted to strings (i.e. using __str__
or __repr__). To access user data only, use make_user_dict(username)['_json'].
:param username: A Twitter username string.
:return: A nested dictionary of user data.
"""
user = self._api.get_user(username)
return self._make_object_dict(user)
def _get_user_tweets(self, screen_name):
# TODO: Implement tweet limit
# Twitter only allows access to a user's most recent 3200 tweets with this method
# initialize a list to hold all the tweepy Tweets
alltweets = []
# make initial request for most recent tweets (200 is the maximum allowed count)
new_tweets = self._api.user_timeline(screen_name = screen_name,count=200)
# save most recent tweets
alltweets.extend(new_tweets)
# save the id of the oldest tweet less one
oldest = alltweets[-1].id - 1
# keep grabbing tweets until there are no tweets left to grab
while len(new_tweets) > 0:
# all subsequent requests use the max_id param to prevent duplicates
new_tweets = self._api.user_timeline(screen_name = screen_name,count=200,max_id=oldest)
# save most recent tweets
alltweets.extend(new_tweets)
# update the id of the oldest tweet less one
oldest = alltweets[-1].id - 1
# transform the tweepy tweets into a 2D array that will populate the csv
outtweets = {tweet.id_str: {'created':tweet.created_at,'text':tweet.text} for tweet in alltweets}
# Twitter-aware tokenizer
tknzr = TweetTokenizer()
# Extend data with linguistic processing
for tweet_id in outtweets:
# Get tweet data from dictionary
tweet = outtweets[tweet_id]
# Tokenize the tweet text (TweetTokenizer preserves case by default)
tweet_tokens = tknzr.tokenize(tweet['text'])
# Parts-of-speech tags for tokenized text
tweet_pos = nltk.pos_tag(tweet_tokens)
# Is the tweet a retweet?
tweet['retweet'] = tweet_pos[0][0] == 'RT'
# If retweeted, who was the original author?
if tweet['retweet'] is True:
tweet['rt_author'] = tweet_pos[1][0]
else:
tweet['rt_author'] = ''
return outtweets
|
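`_get_user_tweets` above pages backwards through a timeline by passing `max_id` (the oldest id seen, less one) on each request until a page comes back empty. The cursor pattern can be sketched against a stand-in page function; `fetch_page` below is purely illustrative (it is not a tweepy API), simulating a newest-first endpoint over integer tweet ids:

```python
def fetch_page(all_ids, count, max_id=None):
    """Stand-in for a timeline endpoint: newest-first page of ids <= max_id."""
    eligible = [i for i in all_ids if max_id is None or i <= max_id]
    return sorted(eligible, reverse=True)[:count]

def get_all_tweets(all_ids, count=4):
    # Initial request: most recent `count` tweets.
    collected = fetch_page(all_ids, count)
    new_page = list(collected)
    # Keep grabbing pages until there are no tweets left to grab.
    while new_page:
        oldest = collected[-1] - 1                     # oldest id seen, less one
        new_page = fetch_page(all_ids, count, max_id=oldest)
        collected.extend(new_page)
    return collected

ids = get_all_tweets(list(range(1, 11)))               # ten fake tweet ids, pages of four
```

Decrementing `max_id` by one before the next request is what prevents the last tweet of one page from reappearing as the first tweet of the next.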
networks-lab/tidyextractors | tidyextractors/tidytwitter/twitter_extractor.py | TwitterExtractor.users | python | def users(self, drop_collections = True):
base_df = self._data
if drop_collections is True:
out_df = self._drop_collections(base_df)
else:
out_df = base_df
return out_df | Returns a table of Twitter user data, with "users" as rows/observations.
:param bool drop_collections: Defaults to True. Indicates whether columns with lists/dicts/sets will be dropped.
:return: pandas.DataFrame | train | https://github.com/networks-lab/tidyextractors/blob/658448ed533beecf32adcc188fc64d1068d15ca6/tidyextractors/tidytwitter/twitter_extractor.py#L96-L109 | [
"def _drop_collections(self, df):\n \"\"\"\n Drops columns containing collections (i.e. sets, dicts, lists) from a DataFrame.\n\n :param pandas.DataFrame df: Usually self._data.\n :return: pandas.DataFrame\n \"\"\"\n all_cols = df.columns\n keep_cols = []\n\n # Check whether each column cont... | class TwitterExtractor(BaseExtractor):
"""
The ``TwitterExtractor`` class is for extracting user data from Twitter. This class
has methods for outputting data into the ``users`` and ``tweets`` tidy formats, and a
raw untidy format.
:param list source: A list of user screen name strings.
:param bool auto_extract: Defaults to True. If True, data is extracted automatically.
Otherwise, extraction must be initiated through the internal interface.
:param str access_token: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
:param str access_secret: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
:param str consumer_key: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
:param str consumer_secret: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
"""
def _extract(self, source, extract_tweets=True, *args, **kwargs):
"""
Extracts user data using the Twitter API. Mutates _data.
NOTE: TwitterExtractor requires a complete set of Twitter API credentials
to initialize: 'access_token', 'access_secret', 'consumer_key', and 'consumer_secret'.
:param list source: A list of user screen name strings.
:param args: Arbitrary arguments for extensibility.
:param kwargs: Arbitrary keyword arguments for extensibility.
:return: None
"""
# Check that the proper API keywords were provided.
for cred in ['access_token', 'access_secret', 'consumer_key', 'consumer_secret']:
if cred not in kwargs:
raise ValueError('API credentials missing from keyword arguments: {}'.format(cred))
# Set up API access
self._auth = OAuthHandler(kwargs['consumer_key'], kwargs['consumer_secret'])
self._auth.set_access_token(kwargs['access_token'],kwargs['access_secret'])
self._api = tweepy.API(self._auth)
# Make row dictionaries and count tweets
rows = []
num_tweets = 0
pbar1 = tqdm.tqdm(range(0,len(source)))
pbar1.set_description('Extracting user data...')
for u in source:
r = self._make_user_dict(u)
num_tweets = num_tweets + min(r['statuses_count'], 3200)
rows.append(r)
pbar1.update(1)
if extract_tweets is True:
# Extract tweets
pbar2 = tqdm.tqdm(range(0,num_tweets))
for r in rows:
if r['statuses_count'] > 0:
r['tweets'] = self._get_user_tweets(r['screen_name'])
else:
r['tweets'] = []
pbar2.set_description('Extracted tweets by {}'.format(r['screen_name']))
pbar2.update(r['statuses_count'])
self._data = pd.DataFrame.from_records(rows)
def tweets(self):
"""
Returns a table of Twitter user data, with "tweets" as rows/observations.
.. note::
drop_collections is not available for this method, since there are no meaningful collections to keep.
:return: pandas.DataFrame
"""
# I've hard coded these. Seemed like a good idea at the time...
# TODO: Fix this.
all_columns = ['contributors_enabled', 'created_at', 'default_profile',
'default_profile_image', 'description', 'entities', 'favourites_count',
'follow_request_sent', 'followers_count', 'following', 'friends_count',
'geo_enabled', 'has_extended_profile', 'id_str',
'is_translation_enabled', 'is_translator', 'lang', 'listed_count',
'location', 'name', 'needs_phone_verification', 'notifications',
'profile_background_color', 'profile_background_image_url',
'profile_background_image_url_https', 'profile_background_tile',
'profile_banner_url', 'profile_image_url', 'profile_image_url_https',
'profile_link_color', 'profile_location',
'profile_sidebar_border_color', 'profile_sidebar_fill_color',
'profile_text_color', 'profile_use_background_image', 'protected',
'screen_name', 'statuses_count', 'suspended', 'time_zone',
'translator_type', 'url', 'utc_offset', 'verified', 'tweets/created',
'tweets/retweet', 'tweets/rt author', 'tweets/text']
keep_columns = ['id_str', 'lang', 'location', 'name',
'protected', 'screen_name','time_zone',
'utc_offset', 'tweets/created', 'tweets/retweet',
'tweets/rt author', 'tweets/text']
drop_columns = list(set(all_columns).difference(set(keep_columns)))
base_df = self.expand_on('id', 'tweets', rename1='id', rename2='tweet_id', drop=drop_columns)
return self._drop_collections(base_df)
def _handle_object(self, name, obj):
"""
Process an object using twitter_object_handlers_lookup.
Doesn't currently do anything (as of 2017-06-16).
:param str name: Object name
:param obj: An object to be processed
:return: A dictionary of attributes
"""
if type(obj) in twitter_object_handlers_lookup:
return twitter_object_handlers_lookup[type(obj)](name, obj)
else:
return {name: obj}
def _make_object_dict(self, obj):
"""
Processes an object, exporting its data as a nested dictionary.
:param obj: An object
:return: A nested dictionary of object data
"""
data = {}
for attr in dir(obj):
if attr[0] != '_' and attr != 'status':
datum = getattr(obj, attr)
if not isinstance(datum, types.MethodType):
data.update(self._handle_object(attr,datum))
return data
def _make_user_dict(self, username):
"""
Processes a Twitter User object, exporting as a nested dictionary.
Complex values (i.e. objects that aren't int, bool, float, str, or
a collection of such) are converted to strings (i.e. using __str__
or __repr__). To access user data only, use make_user_dict(username)['_json'].
:param username: A Twitter username string.
:return: A nested dictionary of user data.
"""
user = self._api.get_user(username)
return self._make_object_dict(user)
def _get_user_tweets(self, screen_name):
# TODO: Implement tweet limit
# Twitter only allows access to a user's most recent 3200 tweets with this method
# initialize a list to hold all the tweepy Tweets
alltweets = []
# make initial request for most recent tweets (200 is the maximum allowed count)
new_tweets = self._api.user_timeline(screen_name = screen_name,count=200)
# save most recent tweets
alltweets.extend(new_tweets)
# save the id of the oldest tweet less one
oldest = alltweets[-1].id - 1
# keep grabbing tweets until there are no tweets left to grab
while len(new_tweets) > 0:
# all subsequent requests use the max_id param to prevent duplicates
new_tweets = self._api.user_timeline(screen_name = screen_name,count=200,max_id=oldest)
# save most recent tweets
alltweets.extend(new_tweets)
# update the id of the oldest tweet less one
oldest = alltweets[-1].id - 1
# transform the tweepy tweets into a 2D array that will populate the csv
outtweets = {tweet.id_str: {'created':tweet.created_at,'text':tweet.text} for tweet in alltweets}
# Twitter-aware tokenizer
tknzr = TweetTokenizer()
# Extend data with linguistic processing
for tweet_id in outtweets:
# Get tweet data from dictionary
tweet = outtweets[tweet_id]
# Tokenize the tweet text (TweetTokenizer preserves case by default)
tweet_tokens = tknzr.tokenize(tweet['text'])
# Parts-of-speech tags for tokenized text
tweet_pos = nltk.pos_tag(tweet_tokens)
# Is the tweet a retweet?
tweet['retweet'] = tweet_pos[0][0] == 'RT'
# If retweeted, who was the original author?
if tweet['retweet'] is True:
tweet['rt_author'] = tweet_pos[1][0]
else:
tweet['rt_author'] = ''
return outtweets
|
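The docstring tables for `expand_on` (one row per col1/col2 combination, with nested attribute dicts flattened into columns) describe a reshaping that can be reproduced on plain dictionaries. A hedged sketch without pandas; `expand_rows` and its column-name suffix are illustrative, not part of tidyextractors:

```python
def expand_rows(records, col1, col2):
    """Give each (col1 value, col2 key) pair its own flat row,
    merging the nested attribute dict into the row."""
    out = []
    for rec in records:
        for key, attrs in rec[col2].items():
            row = {col1 + '_extended': rec[col1], col2 + '_extended': key}
            row.update(attrs)          # flatten nested attributes into columns
            out.append(row)
    return out

data = [{'col1': 'value1',
         'col2': {'valueA': {'attr1': 'X1', 'attr2': 'Y1'},
                  'valueB': {'attr1': 'X2', 'attr2': 'Y2'}}}]
rows = expand_rows(data, 'col1', 'col2')
```

One input record with a two-entry dict-of-dicts expands into two flat rows, matching the "Output Data" table in the docstring.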
networks-lab/tidyextractors | tidyextractors/tidytwitter/twitter_extractor.py | TwitterExtractor.tweets | python | def tweets(self):
# I've hard coded these. Seemed like a good idea at the time...
# TODO: Fix this.
all_columns = ['contributors_enabled', 'created_at', 'default_profile',
'default_profile_image', 'description', 'entities', 'favourites_count',
'follow_request_sent', 'followers_count', 'following', 'friends_count',
'geo_enabled', 'has_extended_profile', 'id_str',
'is_translation_enabled', 'is_translator', 'lang', 'listed_count',
'location', 'name', 'needs_phone_verification', 'notifications',
'profile_background_color', 'profile_background_image_url',
'profile_background_image_url_https', 'profile_background_tile',
'profile_banner_url', 'profile_image_url', 'profile_image_url_https',
'profile_link_color', 'profile_location',
'profile_sidebar_border_color', 'profile_sidebar_fill_color',
'profile_text_color', 'profile_use_background_image', 'protected',
'screen_name', 'statuses_count', 'suspended', 'time_zone',
'translator_type', 'url', 'utc_offset', 'verified', 'tweets/created',
'tweets/retweet', 'tweets/rt author', 'tweets/text']
keep_columns = ['id_str', 'lang', 'location', 'name',
'protected', 'screen_name','time_zone',
'utc_offset', 'tweets/created', 'tweets/retweet',
'tweets/rt author', 'tweets/text']
drop_columns = list(set(all_columns).difference(set(keep_columns)))
base_df = self.expand_on('id', 'tweets', rename1='id', rename2='tweet_id', drop=drop_columns)
return self._drop_collections(base_df) | Returns a table of Twitter user data, with "tweets" as rows/observations.
.. note::
drop_collections is not available for this method, since there are no meaningful collections to keep.
:return: pandas.DataFrame | train | https://github.com/networks-lab/tidyextractors/blob/658448ed533beecf32adcc188fc64d1068d15ca6/tidyextractors/tidytwitter/twitter_extractor.py#L111-L149 | [
"def _drop_collections(self, df):\n \"\"\"\n Drops columns containing collections (i.e. sets, dicts, lists) from a DataFrame.\n\n :param pandas.DataFrame df: Usually self._data.\n :return: pandas.DataFrame\n \"\"\"\n all_cols = df.columns\n keep_cols = []\n\n # Check whether each column cont... | class TwitterExtractor(BaseExtractor):
"""
The ``TwitterExtractor`` class is for extracting user data from Twitter. This class
has methods for outputting data into the ``users`` and ``tweets`` tidy formats, and a
raw untidy format.
:param list source: A list of user screen name strings.
:param bool auto_extract: Defaults to True. If True, data is extracted automatically.
Otherwise, extraction must be initiated through the internal interface.
:param str access_token: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
:param str access_secret: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
:param str consumer_key: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
:param str consumer_secret: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
"""
def _extract(self, source, extract_tweets=True, *args, **kwargs):
"""
Extracts user data using the Twitter API. Mutates _data.
NOTE: TwitterExtractor requires a complete set of Twitter API credentials
to initialize: 'access_token', 'access_secret', 'consumer_key', and 'consumer_secret'.
:param list source: A list of user screen name strings.
:param args: Arbitrary arguments for extensibility.
:param kwargs: Arbitrary keyword arguments for extensibility.
:return: None
"""
# Check that the proper API keywords were provided.
for cred in ['access_token', 'access_secret', 'consumer_key', 'consumer_secret']:
if cred not in kwargs:
raise ValueError('API credentials missing from keyword arguments: {}'.format(cred))
# Set up API access
self._auth = OAuthHandler(kwargs['consumer_key'], kwargs['consumer_secret'])
self._auth.set_access_token(kwargs['access_token'],kwargs['access_secret'])
self._api = tweepy.API(self._auth)
# Make row dictionaries and count tweets
rows = []
num_tweets = 0
pbar1 = tqdm.tqdm(range(0,len(source)))
pbar1.set_description('Extracting user data...')
for u in source:
r = self._make_user_dict(u)
num_tweets = num_tweets + min(r['statuses_count'], 3200)
rows.append(r)
pbar1.update(1)
if extract_tweets is True:
# Extract tweets
pbar2 = tqdm.tqdm(range(0,num_tweets))
for r in rows:
if r['statuses_count'] > 0:
r['tweets'] = self._get_user_tweets(r['screen_name'])
else:
r['tweets'] = []
pbar2.set_description('Extracted tweets by {}'.format(r['screen_name']))
pbar2.update(r['statuses_count'])
self._data = pd.DataFrame.from_records(rows)
def users(self, drop_collections = True):
"""
Returns a table of Twitter user data, with "users" as rows/observations.
:param bool drop_collections: Defaults to True. Indicates whether columns with lists/dicts/sets will be dropped.
:return: pandas.DataFrame
"""
base_df = self._data
if drop_collections is True:
out_df = self._drop_collections(base_df)
else:
out_df = base_df
return out_df
def _handle_object(self, name, obj):
"""
Process an object using twitter_object_handlers_lookup.
Doesn't currently do anything (as of 2017-06-16).
:param str name: Object name
:param obj: An object to be processed
:return: A dictionary of attributes
"""
if type(obj) in twitter_object_handlers_lookup:
return twitter_object_handlers_lookup[type(obj)](name, obj)
else:
return {name: obj}
def _make_object_dict(self, obj):
"""
Processes an object, exporting its data as a nested dictionary.
:param obj: An object
:return: A nested dictionary of object data
"""
data = {}
for attr in dir(obj):
if attr[0] != '_' and attr != 'status':
datum = getattr(obj, attr)
if not isinstance(datum, types.MethodType):
data.update(self._handle_object(attr,datum))
return data
def _make_user_dict(self, username):
"""
Processes a Twitter User object, exporting as a nested dictionary.
Complex values (i.e. objects that aren't int, bool, float, str, or
a collection of such) are converted to strings (i.e. using __str__
or __repr__). To access user data only, use make_user_dict(username)['_json'].
:param username: A Twitter username string.
:return: A nested dictionary of user data.
"""
user = self._api.get_user(username)
return self._make_object_dict(user)
def _get_user_tweets(self, screen_name):
# TODO: Implement tweet limit
# Twitter only allows access to a user's most recent 3200 tweets with this method
# initialize a list to hold all the tweepy Tweets
alltweets = []
# make initial request for most recent tweets (200 is the maximum allowed count)
new_tweets = self._api.user_timeline(screen_name = screen_name,count=200)
# save most recent tweets
alltweets.extend(new_tweets)
# save the id of the oldest tweet less one
oldest = alltweets[-1].id - 1
# keep grabbing tweets until there are no tweets left to grab
while len(new_tweets) > 0:
# all subsequent requests use the max_id param to prevent duplicates
new_tweets = self._api.user_timeline(screen_name = screen_name,count=200,max_id=oldest)
# save most recent tweets
alltweets.extend(new_tweets)
# update the id of the oldest tweet less one
oldest = alltweets[-1].id - 1
# transform the tweepy tweets into a 2D array that will populate the csv
outtweets = {tweet.id_str: {'created':tweet.created_at,'text':tweet.text} for tweet in alltweets}
# Twitter-aware tokenizer
tknzr = TweetTokenizer()
# Extend data with linguistic processing
for tweet_id in outtweets:
# Get tweet data from dictionary
tweet = outtweets[tweet_id]
# Tokenize the tweet text (TweetTokenizer preserves case by default)
tweet_tokens = tknzr.tokenize(tweet['text'])
# Parts-of-speech tags for tokenized text
tweet_pos = nltk.pos_tag(tweet_tokens)
# Is the tweet a retweet?
tweet['retweet'] = tweet_pos[0][0] == 'RT'
# If retweeted, who was the original author?
if tweet['retweet'] is True:
tweet['rt_author'] = tweet_pos[1][0]
else:
tweet['rt_author'] = ''
return outtweets
|
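`_drop_collections`, used throughout the class above, keeps only columns whose values are never sets, dicts, or lists. The same check can be sketched over plain row dictionaries; `collection_columns` below is an illustrative stand-in for the pandas-based implementation, not the library's API:

```python
def collection_columns(rows):
    """Names of columns whose values include a set, dict, or list."""
    cols = set()
    for row in rows:
        for name, value in row.items():
            if isinstance(value, (set, dict, list)):
                cols.add(name)
    return cols

rows = [{'screen_name': 'a', 'tweets': ['t1', 't2'], 'followers_count': 3}]
dropped = collection_columns(rows)
# Keep only the atomic-valued columns, as _drop_collections does.
keep = [{k: v for k, v in r.items() if k not in dropped} for r in rows]
```

Dropping collection-valued columns is what makes the `users()` output a flat, tidy table; the collections survive only in the raw output or after `expand_on`.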
networks-lab/tidyextractors | tidyextractors/tidytwitter/twitter_extractor.py | TwitterExtractor._handle_object | python | def _handle_object(self, name, obj):
if type(obj) in twitter_object_handlers_lookup:
return twitter_object_handlers_lookup[type(obj)](name, obj)
else:
return {name: obj} | Process an object using twitter_object_handlers_lookup.
Doesn't currently do anything (as of 2017-06-16).
:param str name: Object name
:param obj: An object to be processed
:return: A dictionary of attributes | train | https://github.com/networks-lab/tidyextractors/blob/658448ed533beecf32adcc188fc64d1068d15ca6/tidyextractors/tidytwitter/twitter_extractor.py#L151-L163 | null | class TwitterExtractor(BaseExtractor):
"""
The ``TwitterExtractor`` class is for extracting user data from Twitter. This class
has methods for outputting data into the ``users`` and ``tweets`` tidy formats, and a
raw untidy format.
:param list source: A list of user screen name strings.
:param bool auto_extract: Defaults to True. If True, data is extracted automatically.
Otherwise, extraction must be initiated through the internal interface.
:param str access_token: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
:param str access_secret: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
:param str consumer_key: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
:param str consumer_secret: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
"""
def _extract(self, source, extract_tweets=True, *args, **kwargs):
"""
Extracts user data using the Twitter API. Mutates _data.
NOTE: TwitterExtractor requires a complete set of Twitter API credentials
to initialize: 'access_token', 'access_secret', 'consumer_key', and 'consumer_secret'.
:param list source: A list of user screen name strings.
:param args: Arbitrary arguments for extensibility.
:param kwargs: Arbitrary keyword arguments for extensibility.
:return: None
"""
# Check that the proper API keywords were provided.
for cred in ['access_token', 'access_secret', 'consumer_key', 'consumer_secret']:
if cred not in kwargs:
raise ValueError('API credentials missing from keyword arguments: {}'.format(cred))
# Set up API access
self._auth = OAuthHandler(kwargs['consumer_key'], kwargs['consumer_secret'])
self._auth.set_access_token(kwargs['access_token'],kwargs['access_secret'])
self._api = tweepy.API(self._auth)
# Make row dictionaries and count tweets
rows = []
num_tweets = 0
pbar1 = tqdm.tqdm(range(0,len(source)))
pbar1.set_description('Extracting user data...')
for u in source:
r = self._make_user_dict(u)
num_tweets = num_tweets + min(r['statuses_count'], 3200)
rows.append(r)
pbar1.update(1)
if extract_tweets is True:
# Extract tweets
pbar2 = tqdm.tqdm(range(0,num_tweets))
for r in rows:
if r['statuses_count'] > 0:
r['tweets'] = self._get_user_tweets(r['screen_name'])
else:
r['tweets'] = []
pbar2.set_description('Extracted tweets by {}'.format(r['screen_name']))
pbar2.update(r['statuses_count'])
self._data = pd.DataFrame.from_records(rows)
def users(self, drop_collections = True):
"""
Returns a table of Twitter user data, with "users" as rows/observations.
:param bool drop_collections: Defaults to True. Indicates whether columns with lists/dicts/sets will be dropped.
:return: pandas.DataFrame
"""
base_df = self._data
if drop_collections is True:
out_df = self._drop_collections(base_df)
else:
out_df = base_df
return out_df
def tweets(self):
"""
Returns a table of Twitter user data, with "tweets" as rows/observations.
.. note::
drop_collections is not available for this method, since there are no meaningful collections to keep.
:return: pandas.DataFrame
"""
# I've hard coded these. Seemed like a good idea at the time...
# TODO: Fix this.
all_columns = ['contributors_enabled', 'created_at', 'default_profile',
'default_profile_image', 'description', 'entities', 'favourites_count',
'follow_request_sent', 'followers_count', 'following', 'friends_count',
'geo_enabled', 'has_extended_profile', 'id_str',
'is_translation_enabled', 'is_translator', 'lang', 'listed_count',
'location', 'name', 'needs_phone_verification', 'notifications',
'profile_background_color', 'profile_background_image_url',
'profile_background_image_url_https', 'profile_background_tile',
'profile_banner_url', 'profile_image_url', 'profile_image_url_https',
'profile_link_color', 'profile_location',
'profile_sidebar_border_color', 'profile_sidebar_fill_color',
'profile_text_color', 'profile_use_background_image', 'protected',
'screen_name', 'statuses_count', 'suspended', 'time_zone',
'translator_type', 'url', 'utc_offset', 'verified', 'tweets/created',
'tweets/retweet', 'tweets/rt author', 'tweets/text']
keep_columns = ['id_str', 'lang', 'location', 'name',
'protected', 'screen_name','time_zone',
'utc_offset', 'tweets/created', 'tweets/retweet',
'tweets/rt author', 'tweets/text']
drop_columns = list(set(all_columns).difference(set(keep_columns)))
base_df = self.expand_on('id', 'tweets', rename1='id', rename2='tweet_id', drop=drop_columns)
return self._drop_collections(base_df)
def _make_object_dict(self, obj):
"""
Processes an object, exporting its data as a nested dictionary.
:param obj: An object
:return: A nested dictionary of object data
"""
data = {}
for attr in dir(obj):
            if attr[0] != '_' and attr != 'status':
datum = getattr(obj, attr)
if not isinstance(datum, types.MethodType):
data.update(self._handle_object(attr,datum))
return data
def _make_user_dict(self, username):
"""
Processes a Twitter User object, exporting as a nested dictionary.
Complex values (i.e. objects that aren't int, bool, float, str, or
a collection of such) are converted to strings (i.e. using __str__
or __repr__). To access user data only, use make_user_dict(username)['_json'].
:param username: A Twitter username string.
:return: A nested dictionary of user data.
"""
user = self._api.get_user(username)
return self._make_object_dict(user)
def _get_user_tweets(self, screen_name):
# TODO: Implement tweet limit
        # Twitter only allows access to a user's most recent 3200 tweets with this method
# initialize a list to hold all the tweepy Tweets
alltweets = []
# make initial request for most recent tweets (200 is the maximum allowed count)
new_tweets = self._api.user_timeline(screen_name = screen_name,count=200)
# save most recent tweets
alltweets.extend(new_tweets)
# save the id of the oldest tweet less one
oldest = alltweets[-1].id - 1
# keep grabbing tweets until there are no tweets left to grab
while len(new_tweets) > 0:
# all subsequent requests use the max_id param to prevent duplicates
new_tweets = self._api.user_timeline(screen_name = screen_name,count=200,max_id=oldest)
# save most recent tweets
alltweets.extend(new_tweets)
# update the id of the oldest tweet less one
oldest = alltweets[-1].id - 1
# transform the tweepy tweets into a 2D array that will populate the csv
outtweets = {tweet.id_str: {'created':tweet.created_at,'text':tweet.text} for tweet in alltweets}
# Twitter-aware tokenizer
tknzr = TweetTokenizer()
# Extend data with linguistic processing
for tweet_id in outtweets:
# Get tweet data from dictionary
tweet = outtweets[tweet_id]
            # Tokenize the tweet text
tweet_tokens = tknzr.tokenize(tweet['text'])
# Parts-of-speech tags for tokenized text
tweet_pos = nltk.pos_tag(tweet_tokens)
            # Is the tweet a retweet?
tweet['retweet'] = tweet_pos[0][0] == 'RT'
# If retweeted, who was the original author?
if tweet['retweet'] is True:
tweet['rt_author'] = tweet_pos[1][0]
else:
tweet['rt_author'] = ''
return outtweets
|
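The `_handle_object` method in the class above dispatches on `type(obj)` through a lookup table of handler functions, falling back to a plain `{name: obj}` pair. A minimal, self-contained sketch of that pattern (the handler names and flattening rules below are illustrative, not from `tidyextractors`):

```python
# Dispatch-by-type: map a value's type to a function that flattens it.
# Types without a registered handler fall through to {name: value}.
def handle_list(name, obj):
    # Join list items into one string so the value fits a flat table cell.
    return {name: ", ".join(str(item) for item in obj)}

def handle_dict(name, obj):
    # Prefix nested keys with the parent name, mirroring "tweets/text"-style columns.
    return {"{}/{}".format(name, key): value for key, value in obj.items()}

handlers_lookup = {list: handle_list, dict: handle_dict}

def handle_object(name, obj):
    if type(obj) in handlers_lookup:
        return handlers_lookup[type(obj)](name, obj)
    return {name: obj}

print(handle_object("tags", ["a", "b"]))      # {'tags': 'a, b'}
print(handle_object("meta", {"lang": "en"}))  # {'meta/lang': 'en'}
print(handle_object("count", 3))              # {'count': 3}
```

Registering new handlers is then just adding a `type: function` entry, which is how both the Twitter and Git extractors specialize the same flattening machinery.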
networks-lab/tidyextractors | tidyextractors/tidytwitter/twitter_extractor.py | TwitterExtractor._make_object_dict | python | def _make_object_dict(self, obj):
data = {}
for attr in dir(obj):
        if attr[0] != '_' and attr != 'status':
datum = getattr(obj, attr)
if not isinstance(datum, types.MethodType):
data.update(self._handle_object(attr,datum))
return data | Processes an object, exporting its data as a nested dictionary.
:param obj: An object
:return: A nested dictionary of object data | train | https://github.com/networks-lab/tidyextractors/blob/658448ed533beecf32adcc188fc64d1068d15ca6/tidyextractors/tidytwitter/twitter_extractor.py#L165-L178 | null | class TwitterExtractor(BaseExtractor):
"""
The ``TwitterExtractor`` class is for extracting user data from Twitter. This class
has methods for outputting data into the ``users`` and ``tweets`` tidy formats, and a
raw untidy format.
:param list source: A list of user screen name strings.
:param bool auto_extract: Defaults to True. If True, data is extracted automatically.
Otherwise, extraction must be initiated through the internal interface.
:param str access_token: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
:param str access_secret: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
:param str consumer_key: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
:param str consumer_secret: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
"""
def _extract(self, source, extract_tweets=True, *args, **kwargs):
"""
        Extracts user data using the Twitter API. Mutates _data.
NOTE: TwitterExtractor requires a complete set of Twitter API credentials
to initialize: 'access_token', 'access_secret', 'consumer_key', and 'consumer_secret'.
:param list source: A list of user screen name strings.
:param args: Arbitrary arguments for extensibility.
:param kwargs: Arbitrary keyword arguments for extensibility.
:return: None
"""
# Check that the proper API keywords were provided.
for cred in ['access_token', 'access_secret', 'consumer_key', 'consumer_secret']:
if cred not in kwargs:
raise ValueError('API credentials missing from keyword arguments: {}'.format(cred))
# Set up API access
self._auth = OAuthHandler(kwargs['consumer_key'], kwargs['consumer_secret'])
self._auth.set_access_token(kwargs['access_token'],kwargs['access_secret'])
self._api = tweepy.API(self._auth)
# Make row dictionaries and count tweets
rows = []
num_tweets = 0
pbar1 = tqdm.tqdm(range(0,len(source)))
pbar1.set_description('Extracting user data...')
for u in source:
r = self._make_user_dict(u)
num_tweets = num_tweets + min(r['statuses_count'], 3200)
rows.append(r)
pbar1.update(1)
if extract_tweets is True:
# Extract tweets
pbar2 = tqdm.tqdm(range(0,num_tweets))
for r in rows:
if r['statuses_count'] > 0:
r['tweets'] = self._get_user_tweets(r['screen_name'])
else:
r['tweets'] = []
pbar2.set_description('Extracted tweets by {}'.format(r['screen_name']))
pbar2.update(r['statuses_count'])
self._data = pd.DataFrame.from_records(rows)
def users(self, drop_collections = True):
"""
Returns a table of Twitter user data, with "users" as rows/observations.
:param bool drop_collections: Defaults to True. Indicates whether columns with lists/dicts/sets will be dropped.
:return: pandas.DataFrame
"""
base_df = self._data
if drop_collections is True:
out_df = self._drop_collections(base_df)
else:
out_df = base_df
return out_df
def tweets(self):
"""
Returns a table of Twitter user data, with "tweets" as rows/observations.
.. note::
drop_collections is not available for this method, since there are no meaningful collections to keep.
:return: pandas.DataFrame
"""
# I've hard coded these. Seemed like a good idea at the time...
# TODO: Fix this.
all_columns = ['contributors_enabled', 'created_at', 'default_profile',
'default_profile_image', 'description', 'entities', 'favourites_count',
'follow_request_sent', 'followers_count', 'following', 'friends_count',
'geo_enabled', 'has_extended_profile', 'id_str',
'is_translation_enabled', 'is_translator', 'lang', 'listed_count',
'location', 'name', 'needs_phone_verification', 'notifications',
'profile_background_color', 'profile_background_image_url',
'profile_background_image_url_https', 'profile_background_tile',
'profile_banner_url', 'profile_image_url', 'profile_image_url_https',
'profile_link_color', 'profile_location',
'profile_sidebar_border_color', 'profile_sidebar_fill_color',
'profile_text_color', 'profile_use_background_image', 'protected',
'screen_name', 'statuses_count', 'suspended', 'time_zone',
'translator_type', 'url', 'utc_offset', 'verified', 'tweets/created',
'tweets/retweet', 'tweets/rt author', 'tweets/text']
keep_columns = ['id_str', 'lang', 'location', 'name',
'protected', 'screen_name','time_zone',
'utc_offset', 'tweets/created', 'tweets/retweet',
'tweets/rt author', 'tweets/text']
drop_columns = list(set(all_columns).difference(set(keep_columns)))
base_df = self.expand_on('id', 'tweets', rename1='id', rename2='tweet_id', drop=drop_columns)
return self._drop_collections(base_df)
def _handle_object(self, name, obj):
"""
Process an object using twitter_object_handlers_lookup.
Doesn't currently do anything (as of 2017-06-16).
:param str name: Object name
:param obj: An object to be processed
:return: A dictionary of attributes
"""
if type(obj) in twitter_object_handlers_lookup:
return twitter_object_handlers_lookup[type(obj)](name, obj)
else:
return {name: obj}
def _make_user_dict(self, username):
"""
Processes a Twitter User object, exporting as a nested dictionary.
Complex values (i.e. objects that aren't int, bool, float, str, or
a collection of such) are converted to strings (i.e. using __str__
or __repr__). To access user data only, use make_user_dict(username)['_json'].
:param username: A Twitter username string.
:return: A nested dictionary of user data.
"""
user = self._api.get_user(username)
return self._make_object_dict(user)
def _get_user_tweets(self, screen_name):
# TODO: Implement tweet limit
        # Twitter only allows access to a user's most recent 3200 tweets with this method
# initialize a list to hold all the tweepy Tweets
alltweets = []
# make initial request for most recent tweets (200 is the maximum allowed count)
new_tweets = self._api.user_timeline(screen_name = screen_name,count=200)
# save most recent tweets
alltweets.extend(new_tweets)
# save the id of the oldest tweet less one
oldest = alltweets[-1].id - 1
# keep grabbing tweets until there are no tweets left to grab
while len(new_tweets) > 0:
# all subsequent requests use the max_id param to prevent duplicates
new_tweets = self._api.user_timeline(screen_name = screen_name,count=200,max_id=oldest)
# save most recent tweets
alltweets.extend(new_tweets)
# update the id of the oldest tweet less one
oldest = alltweets[-1].id - 1
# transform the tweepy tweets into a 2D array that will populate the csv
outtweets = {tweet.id_str: {'created':tweet.created_at,'text':tweet.text} for tweet in alltweets}
# Twitter-aware tokenizer
tknzr = TweetTokenizer()
# Extend data with linguistic processing
for tweet_id in outtweets:
# Get tweet data from dictionary
tweet = outtweets[tweet_id]
            # Tokenize the tweet text
tweet_tokens = tknzr.tokenize(tweet['text'])
# Parts-of-speech tags for tokenized text
tweet_pos = nltk.pos_tag(tweet_tokens)
            # Is the tweet a retweet?
tweet['retweet'] = tweet_pos[0][0] == 'RT'
# If retweeted, who was the original author?
if tweet['retweet'] is True:
tweet['rt_author'] = tweet_pos[1][0]
else:
tweet['rt_author'] = ''
return outtweets
|
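`_make_object_dict` above walks `dir(obj)`, keeps public non-method attributes via `getattr`, and merges each one into a flat dictionary. A runnable sketch of that introspection filter, using a stand-in class rather than a real tweepy object:

```python
import types

class Sample:
    """Stand-in object; any API object with public attributes would do."""
    def __init__(self):
        self.name = "alice"
        self.count = 3
        self._secret = "hidden"   # underscore-prefixed: should be skipped
    def refresh(self):            # bound method: should be skipped
        return None

def object_to_dict(obj):
    # Walk public attributes via dir()/getattr(), dropping underscore-prefixed
    # names and bound methods -- the same filter _make_object_dict applies.
    data = {}
    for attr in dir(obj):
        if not attr.startswith("_") and attr != "status":
            value = getattr(obj, attr)
            if not isinstance(value, types.MethodType):
                data[attr] = value
    return data

print(object_to_dict(Sample()))  # {'count': 3, 'name': 'alice'}
```

Note the membership tests use `!=`/`startswith` rather than `is not`: identity comparison against string literals only works by accident of interning.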
networks-lab/tidyextractors | tidyextractors/tidytwitter/twitter_extractor.py | TwitterExtractor._make_user_dict | python | def _make_user_dict(self, username):
user = self._api.get_user(username)
return self._make_object_dict(user) | Processes a Twitter User object, exporting as a nested dictionary.
Complex values (i.e. objects that aren't int, bool, float, str, or
a collection of such) are converted to strings (i.e. using __str__
or __repr__). To access user data only, use make_user_dict(username)['_json'].
:param username: A Twitter username string.
:return: A nested dictionary of user data. | train | https://github.com/networks-lab/tidyextractors/blob/658448ed533beecf32adcc188fc64d1068d15ca6/tidyextractors/tidytwitter/twitter_extractor.py#L180-L191 | null | class TwitterExtractor(BaseExtractor):
"""
The ``TwitterExtractor`` class is for extracting user data from Twitter. This class
has methods for outputting data into the ``users`` and ``tweets`` tidy formats, and a
raw untidy format.
:param list source: A list of user screen name strings.
:param bool auto_extract: Defaults to True. If True, data is extracted automatically.
Otherwise, extraction must be initiated through the internal interface.
:param str access_token: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
:param str access_secret: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
:param str consumer_key: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
:param str consumer_secret: One of four required keyword arguments that make up a
complete set of Twitter API credentials.
"""
def _extract(self, source, extract_tweets=True, *args, **kwargs):
"""
        Extracts user data using the Twitter API. Mutates _data.
NOTE: TwitterExtractor requires a complete set of Twitter API credentials
to initialize: 'access_token', 'access_secret', 'consumer_key', and 'consumer_secret'.
:param list source: A list of user screen name strings.
:param args: Arbitrary arguments for extensibility.
:param kwargs: Arbitrary keyword arguments for extensibility.
:return: None
"""
# Check that the proper API keywords were provided.
for cred in ['access_token', 'access_secret', 'consumer_key', 'consumer_secret']:
if cred not in kwargs:
raise ValueError('API credentials missing from keyword arguments: {}'.format(cred))
# Set up API access
self._auth = OAuthHandler(kwargs['consumer_key'], kwargs['consumer_secret'])
self._auth.set_access_token(kwargs['access_token'],kwargs['access_secret'])
self._api = tweepy.API(self._auth)
# Make row dictionaries and count tweets
rows = []
num_tweets = 0
pbar1 = tqdm.tqdm(range(0,len(source)))
pbar1.set_description('Extracting user data...')
for u in source:
r = self._make_user_dict(u)
num_tweets = num_tweets + min(r['statuses_count'], 3200)
rows.append(r)
pbar1.update(1)
if extract_tweets is True:
# Extract tweets
pbar2 = tqdm.tqdm(range(0,num_tweets))
for r in rows:
if r['statuses_count'] > 0:
r['tweets'] = self._get_user_tweets(r['screen_name'])
else:
r['tweets'] = []
pbar2.set_description('Extracted tweets by {}'.format(r['screen_name']))
pbar2.update(r['statuses_count'])
self._data = pd.DataFrame.from_records(rows)
def users(self, drop_collections = True):
"""
Returns a table of Twitter user data, with "users" as rows/observations.
:param bool drop_collections: Defaults to True. Indicates whether columns with lists/dicts/sets will be dropped.
:return: pandas.DataFrame
"""
base_df = self._data
if drop_collections is True:
out_df = self._drop_collections(base_df)
else:
out_df = base_df
return out_df
def tweets(self):
"""
Returns a table of Twitter user data, with "tweets" as rows/observations.
.. note::
drop_collections is not available for this method, since there are no meaningful collections to keep.
:return: pandas.DataFrame
"""
# I've hard coded these. Seemed like a good idea at the time...
# TODO: Fix this.
all_columns = ['contributors_enabled', 'created_at', 'default_profile',
'default_profile_image', 'description', 'entities', 'favourites_count',
'follow_request_sent', 'followers_count', 'following', 'friends_count',
'geo_enabled', 'has_extended_profile', 'id_str',
'is_translation_enabled', 'is_translator', 'lang', 'listed_count',
'location', 'name', 'needs_phone_verification', 'notifications',
'profile_background_color', 'profile_background_image_url',
'profile_background_image_url_https', 'profile_background_tile',
'profile_banner_url', 'profile_image_url', 'profile_image_url_https',
'profile_link_color', 'profile_location',
'profile_sidebar_border_color', 'profile_sidebar_fill_color',
'profile_text_color', 'profile_use_background_image', 'protected',
'screen_name', 'statuses_count', 'suspended', 'time_zone',
'translator_type', 'url', 'utc_offset', 'verified', 'tweets/created',
'tweets/retweet', 'tweets/rt author', 'tweets/text']
keep_columns = ['id_str', 'lang', 'location', 'name',
'protected', 'screen_name','time_zone',
'utc_offset', 'tweets/created', 'tweets/retweet',
'tweets/rt author', 'tweets/text']
drop_columns = list(set(all_columns).difference(set(keep_columns)))
base_df = self.expand_on('id', 'tweets', rename1='id', rename2='tweet_id', drop=drop_columns)
return self._drop_collections(base_df)
def _handle_object(self, name, obj):
"""
Process an object using twitter_object_handlers_lookup.
Doesn't currently do anything (as of 2017-06-16).
:param str name: Object name
:param obj: An object to be processed
:return: A dictionary of attributes
"""
if type(obj) in twitter_object_handlers_lookup:
return twitter_object_handlers_lookup[type(obj)](name, obj)
else:
return {name: obj}
def _make_object_dict(self, obj):
"""
Processes an object, exporting its data as a nested dictionary.
:param obj: An object
:return: A nested dictionary of object data
"""
data = {}
for attr in dir(obj):
            if attr[0] != '_' and attr != 'status':
datum = getattr(obj, attr)
if not isinstance(datum, types.MethodType):
data.update(self._handle_object(attr,datum))
return data
def _get_user_tweets(self, screen_name):
# TODO: Implement tweet limit
        # Twitter only allows access to a user's most recent 3200 tweets with this method
# initialize a list to hold all the tweepy Tweets
alltweets = []
# make initial request for most recent tweets (200 is the maximum allowed count)
new_tweets = self._api.user_timeline(screen_name = screen_name,count=200)
# save most recent tweets
alltweets.extend(new_tweets)
# save the id of the oldest tweet less one
oldest = alltweets[-1].id - 1
# keep grabbing tweets until there are no tweets left to grab
while len(new_tweets) > 0:
# all subsequent requests use the max_id param to prevent duplicates
new_tweets = self._api.user_timeline(screen_name = screen_name,count=200,max_id=oldest)
# save most recent tweets
alltweets.extend(new_tweets)
# update the id of the oldest tweet less one
oldest = alltweets[-1].id - 1
# transform the tweepy tweets into a 2D array that will populate the csv
outtweets = {tweet.id_str: {'created':tweet.created_at,'text':tweet.text} for tweet in alltweets}
# Twitter-aware tokenizer
tknzr = TweetTokenizer()
# Extend data with linguistic processing
for tweet_id in outtweets:
# Get tweet data from dictionary
tweet = outtweets[tweet_id]
            # Tokenize the tweet text
tweet_tokens = tknzr.tokenize(tweet['text'])
# Parts-of-speech tags for tokenized text
tweet_pos = nltk.pos_tag(tweet_tokens)
            # Is the tweet a retweet?
tweet['retweet'] = tweet_pos[0][0] == 'RT'
# If retweeted, who was the original author?
if tweet['retweet'] is True:
tweet['rt_author'] = tweet_pos[1][0]
else:
tweet['rt_author'] = ''
return outtweets
|
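`_get_user_tweets` above pages backwards through a timeline by re-requesting with `max_id` set to one less than the oldest id seen so far. The loop below is a sketch of that pagination against a fake in-memory timeline, so it runs without tweepy or Twitter credentials; `fetch` is a stand-in for `api.user_timeline`, and the loop guard is reorganized slightly (`if not page: break`) to also cover an empty first page:

```python
# Newest-first tweet ids; a stand-in for a user's timeline.
TIMELINE = list(range(100, 0, -1))

def fetch(count, max_id=None):
    # Stand-in for api.user_timeline: one newest-first page of ids <= max_id.
    items = [t for t in TIMELINE if max_id is None or t <= max_id]
    return items[:count]

all_tweets = fetch(count=30)   # first page
while all_tweets:
    oldest = all_tweets[-1] - 1        # max_id is inclusive, so step past it
    page = fetch(count=30, max_id=oldest)
    if not page:                       # no tweets left to grab
        break
    all_tweets.extend(page)

print(len(all_tweets))  # 100
```

Decrementing `max_id` by one is what prevents the last tweet of each page from being returned again at the top of the next page.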
networks-lab/tidyextractors | tidyextractors/tidygit/git_object_handlers.py | handle_stats | python | def handle_stats(name, obj):
return {'total_deletions': obj.total['deletions'],
'total_insertions': obj.total['insertions'],
'total_lines': obj.total['lines'],
'total_files': obj.total['files'],
'changes': obj.files} | Stats object handler.
:param name: Unused String
:param obj: GitPython Stats
:return: Dictionary of attributes. | train | https://github.com/networks-lab/tidyextractors/blob/658448ed533beecf32adcc188fc64d1068d15ca6/tidyextractors/tidygit/git_object_handlers.py#L29-L40 | null | # *********************************************************************************************
# Copyright (C) 2017 Joel Becker, Jillian Anderson, Steve McColl and Dr. John McLevey
#
# This file is part of the tidyextractors package developed for Dr John McLevey's Networks Lab
# at the University of Waterloo. For more information, see
# http://tidyextractors.readthedocs.io/en/latest/
#
# tidyextractors is free software: you can redistribute it and/or modify it under the terms of
# the GNU General Public License as published by the Free Software Foundation, either version 3
# of the License, or (at your option) any later version.
#
# tidyextractors is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY;
# without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
# See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with tidyextractors.
# If not, see <http://www.gnu.org/licenses/>.
# *********************************************************************************************
import git
# All handlers have the following pattern:
# handler(name,object) => dictionary of attributes
# attributes in the dict will become values in rows
# of the output dataframe.
def handle_actor(name, obj):
"""
Actor object handler.
:param name: Unused String
:param obj: GitPython Actor
:return: Dictionary of attributes.
"""
return {'author_name': obj.name,
'author_email': obj.email}
# Handler functions to turn objects into usable attributes.
# Functions return a dictionary of attributes, which
# will appear in a row of the pandas dataframe.
git_object_handlers_lookup = {git.Stats: handle_stats,
git.Actor: handle_actor} |
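`handle_stats` flattens a GitPython `Stats` object into scalar totals plus one per-file mapping. The same flattening can be exercised without a repository by using a stand-in stats object (`FakeStats` below is illustrative; real `git.Stats` instances come from `commit.stats`):

```python
class FakeStats:
    # Mimics the two attributes of git.Stats that handle_stats reads.
    total = {"deletions": 2, "insertions": 10, "lines": 12, "files": 1}
    files = {"README.md": {"deletions": 2, "insertions": 10, "lines": 12}}

def stats_to_row(stats):
    # Same field mapping as handle_stats: scalars from .total, dict from .files.
    return {"total_deletions": stats.total["deletions"],
            "total_insertions": stats.total["insertions"],
            "total_lines": stats.total["lines"],
            "total_files": stats.total["files"],
            "changes": stats.files}

row = stats_to_row(FakeStats())
print(row["total_lines"], row["total_files"])  # 12 1
```

Keeping the per-file breakdown under a single `changes` key is what lets the scalar columns stay tidy while the nested detail survives for later expansion.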
networks-lab/tidyextractors | tidyextractors/tidygit/get_log.py | handle_object | python | def handle_object(name, obj):
if type(obj) in git_object_handlers_lookup:
return git_object_handlers_lookup[type(obj)](name, obj)
else:
        return {name:obj} | This helper function handles incoming data for make_object_dict.
If there is a special handler in git_object_handlers_lookup, this
is used. Otherwise, the given name:obj pair is returned
as-is.
:param name: String
:param obj: The object to be processed.
:return: A dictionary of attributes. | train | https://github.com/networks-lab/tidyextractors/blob/658448ed533beecf32adcc188fc64d1068d15ca6/tidyextractors/tidygit/get_log.py#L39-L52 | null | # *********************************************************************************************
# Copyright (C) 2017 Joel Becker, Jillian Anderson, Steve McColl and Dr. John McLevey
#
# This file is part of the tidyextractors package developed for Dr John McLevey's Networks Lab
# at the University of Waterloo. For more information, see
# http://tidyextractors.readthedocs.io/en/latest/
#
# tidyextractors is free software: you can redistribute it and/or modify it under the terms of
# the GNU General Public License as published by the Free Software Foundation, either version 3
# of the License, or (at your option) any later version.
#
# tidyextractors is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY;
# without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
# See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with tidyextractors.
# If not, see <http://www.gnu.org/licenses/>.
# *********************************************************************************************
import git
import tqdm
import pandas as pd
from tidyextractors.tidygit.git_object_handlers import git_object_handlers_lookup
# TODO: Increase get_log efficiency i.e. using gitnet implementation
# A default list of attributes to be extracted.
simple_attributes = ['author',
'author_tz_offset',
'authored_date',
'authored_datetime',
'encoding',
'hexsha',
'stats',
'summary',
'type'
]
def make_object_dict(obj, keep=[]):
"""
Processes an object, exporting its data as a nested dictionary.
Individual objects are handled using handle_object.
:param obj: The object to be processed.
:param keep: Object attributes to be kept. Defaults to all attributes.
:return: A dictionary of attributes.
"""
data = {}
if keep == []:
get_attrs = dir(obj)
else:
get_attrs = keep
for attr in get_attrs:
datum = getattr(obj,attr)
data.update(handle_object(attr,datum))
return data
def extract_log(rpath,extract=simple_attributes):
"""
    Extracts Git commit data from a local repository.
:param rpath: The path to a local Git repo.
:param extract: A list of attribute name strings.
    :return: A Pandas dataframe containing Git commit data.
"""
# Get repo
m_repo = git.Repo(rpath)
# Count commits
count = 0
m_commits = m_repo.iter_commits()
for commit in m_commits:
count += 1
# Initialize progress bar and index
with tqdm.tqdm(total=count) as pbar:
# Get commits again
m_commits = m_repo.iter_commits()
        # Set up data extraction
update_interval = max(min(count//100,100),5)
index = 0
buffer = []
        # Extract commit data
while True:
# Add the next commit to the buffer
try:
next_commit = next(m_commits)
buffer.append(make_object_dict(next_commit,extract))
index += 1
if index%update_interval == 0:
pbar.update(update_interval)
            # If there are no more commits, stop
except StopIteration:
break
# final_df = pd.concat(sub_df_list)
return pd.DataFrame(buffer)
|
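`extract_log` above batches its progress-bar updates with `update_interval = max(min(count//100,100),5)`: roughly one refresh per percent of the commit history, clamped to between 5 and 100 commits. The clamping behaves like this:

```python
def update_interval(count):
    # One bar refresh per ~1% of commits, never more often than every 5
    # commits and never less often than every 100.
    return max(min(count // 100, 100), 5)

print(update_interval(50))         # 5   (floor for small repos)
print(update_interval(5000))       # 50  (~1% granularity)
print(update_interval(1_000_000))  # 100 (ceiling for huge histories)
```

Batching the `pbar.update` calls this way keeps tqdm overhead negligible on large histories while still refreshing often enough to look live on small ones.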
networks-lab/tidyextractors | tidyextractors/tidygit/get_log.py | make_object_dict | python | def make_object_dict(obj, keep=[]):
data = {}
if keep == []:
get_attrs = dir(obj)
else:
get_attrs = keep
for attr in get_attrs:
datum = getattr(obj,attr)
data.update(handle_object(attr,datum))
return data | Processes an object, exporting its data as a nested dictionary.
Individual objects are handled using handle_object.
:param obj: The object to be processed.
:param keep: Object attributes to be kept. Defaults to all attributes.
:return: A dictionary of attributes. | train | https://github.com/networks-lab/tidyextractors/blob/658448ed533beecf32adcc188fc64d1068d15ca6/tidyextractors/tidygit/get_log.py#L55-L72 | [
"def handle_object(name, obj):\n \"\"\"\n This helper function handles incoming test_data for make_object_dict.\n If thereis a special handler in git_object_handlers_lookup, this\n is used. Otherwise, the given name:obj pair is returned\n as-is.\n :param name: String\n :param obj: The object to... | # *********************************************************************************************
# Copyright (C) 2017 Joel Becker, Jillian Anderson, Steve McColl and Dr. John McLevey
#
# This file is part of the tidyextractors package developed for Dr John McLevey's Networks Lab
# at the University of Waterloo. For more information, see
# http://tidyextractors.readthedocs.io/en/latest/
#
# tidyextractors is free software: you can redistribute it and/or modify it under the terms of
# the GNU General Public License as published by the Free Software Foundation, either version 3
# of the License, or (at your option) any later version.
#
# tidyextractors is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY;
# without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
# See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with tidyextractors.
# If not, see <http://www.gnu.org/licenses/>.
# *********************************************************************************************
import git
import tqdm
import pandas as pd
from tidyextractors.tidygit.git_object_handlers import git_object_handlers_lookup
# TODO: Increase get_log efficiency i.e. using gitnet implementation
# A default list of attributes to be extracted.
simple_attributes = ['author',
'author_tz_offset',
'authored_date',
'authored_datetime',
'encoding',
'hexsha',
'stats',
'summary',
'type'
]
def handle_object(name, obj):
"""
    This helper function handles incoming data for make_object_dict.
    If there is a special handler in git_object_handlers_lookup, this
is used. Otherwise, the given name:obj pair is returned
as-is.
:param name: String
:param obj: The object to be processed.
:return: A dictionary of attributes.
"""
if type(obj) in git_object_handlers_lookup:
return git_object_handlers_lookup[type(obj)](name, obj)
else:
return {name:obj}
def extract_log(rpath,extract=simple_attributes):
"""
    Extracts Git commit data from a local repository.
:param rpath: The path to a local Git repo.
:param extract: A list of attribute name strings.
    :return: A Pandas dataframe containing Git commit data.
"""
# Get repo
m_repo = git.Repo(rpath)
# Count commits
count = 0
m_commits = m_repo.iter_commits()
for commit in m_commits:
count += 1
# Initialize progress bar and index
with tqdm.tqdm(total=count) as pbar:
# Get commits again
m_commits = m_repo.iter_commits()
        # Set up data extraction
update_interval = max(min(count//100,100),5)
index = 0
buffer = []
        # Extract commit data
while True:
# Add the next commit to the buffer
try:
next_commit = next(m_commits)
buffer.append(make_object_dict(next_commit,extract))
index += 1
if index%update_interval == 0:
pbar.update(update_interval)
            # If there are no more commits, stop
except StopIteration:
break
# final_df = pd.concat(sub_df_list)
return pd.DataFrame(buffer)
|
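`make_object_dict` above optionally restricts extraction to a caller-supplied `keep` list, which is how `extract_log` limits commits to `simple_attributes`. A small sketch of that selective-attribute pattern (the `Commit` class is a stand-in, not a GitPython object):

```python
class Commit:
    # Stand-in with a few commit-like attributes.
    hexsha = "abc123"
    summary = "fix typo"
    encoding = "utf-8"

def to_dict(obj, keep=None):
    # With keep: read only the named attributes; without: all public ones.
    attrs = keep if keep else [a for a in dir(obj) if not a.startswith("_")]
    return {a: getattr(obj, a) for a in attrs}

print(to_dict(Commit(), keep=["hexsha", "summary"]))
# {'hexsha': 'abc123', 'summary': 'fix typo'}
```

Passing an explicit attribute list avoids touching lazy or expensive properties on the underlying object, which matters when thousands of commits are processed in one pass.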
networks-lab/tidyextractors | tidyextractors/tidygit/get_log.py | extract_log | python | def extract_log(rpath,extract=simple_attributes):
# Get repo
m_repo = git.Repo(rpath)
# Count commits
count = 0
m_commits = m_repo.iter_commits()
for commit in m_commits:
count += 1
# Initialize progress bar and index
with tqdm.tqdm(total=count) as pbar:
# Get commits again
m_commits = m_repo.iter_commits()
        # Set up data extraction
update_interval = max(min(count//100,100),5)
index = 0
buffer = []
        # Extract commit data
while True:
# Add the next commit to the buffer
try:
next_commit = next(m_commits)
buffer.append(make_object_dict(next_commit,extract))
index += 1
if index%update_interval == 0:
pbar.update(update_interval)
            # If there are no more commits, stop
except StopIteration:
break
# final_df = pd.concat(sub_df_list)
    return pd.DataFrame(buffer) | Extracts Git commit data from a local repository.
:param rpath: The path to a local Git repo.
:param extract: A list of attribute name strings.
    :return: A Pandas dataframe containing Git commit data. | train | https://github.com/networks-lab/tidyextractors/blob/658448ed533beecf32adcc188fc64d1068d15ca6/tidyextractors/tidygit/get_log.py#L75-L119 | [
"def make_object_dict(obj, keep=[]):\n \"\"\"\n Processes an object, exporting its data as a nested dictionary.\n Individual objects are handled using handle_object.\n :param obj: The object to be processed.\n :param keep: Object attributes to be kept. Defaults to all attributes.\n :return: A dict... | # *********************************************************************************************
# Copyright (C) 2017 Joel Becker, Jillian Anderson, Steve McColl and Dr. John McLevey
#
# This file is part of the tidyextractors package developed for Dr John McLevey's Networks Lab
# at the University of Waterloo. For more information, see
# http://tidyextractors.readthedocs.io/en/latest/
#
# tidyextractors is free software: you can redistribute it and/or modify it under the terms of
# the GNU General Public License as published by the Free Software Foundation, either version 3
# of the License, or (at your option) any later version.
#
# tidyextractors is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY;
# without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
# See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with tidyextractors.
# If not, see <http://www.gnu.org/licenses/>.
# *********************************************************************************************
import git
import tqdm
import pandas as pd
from tidyextractors.tidygit.git_object_handlers import git_object_handlers_lookup
# TODO: Increase get_log efficiency i.e. using gitnet implementation
# A default list of attributes to be extracted.
simple_attributes = ['author',
'author_tz_offset',
'authored_date',
'authored_datetime',
'encoding',
'hexsha',
'stats',
'summary',
'type'
]
def handle_object(name, obj):
"""
    This helper function handles incoming data for make_object_dict.
    If there is a special handler in git_object_handlers_lookup, this
is used. Otherwise, the given name:obj pair is returned
as-is.
:param name: String
:param obj: The object to be processed.
:return: A dictionary of attributes.
"""
if type(obj) in git_object_handlers_lookup:
return git_object_handlers_lookup[type(obj)](name, obj)
else:
return {name:obj}
def make_object_dict(obj, keep=[]):
"""
Processes an object, exporting its data as a nested dictionary.
Individual objects are handled using handle_object.
:param obj: The object to be processed.
:param keep: Object attributes to be kept. Defaults to all attributes.
:return: A dictionary of attributes.
"""
data = {}
if keep == []:
get_attrs = dir(obj)
else:
get_attrs = keep
for attr in get_attrs:
datum = getattr(obj,attr)
data.update(handle_object(attr,datum))
return data
|
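The `make_object_dict`/`handle_object` pattern in the enclosing scope flattens an object's attributes into one dict per commit. A minimal self-contained sketch of that step (`FakeCommit` is a stand-in for `git.Commit`, not part of tidyextractors):

```python
class FakeCommit:
    """Stand-in for git.Commit, used only for illustration."""
    def __init__(self, hexsha, summary):
        self.hexsha = hexsha
        self.summary = summary

def object_to_dict(obj, keep):
    # Mirrors make_object_dict: getattr each requested attribute into a flat dict.
    return {attr: getattr(obj, attr) for attr in keep}

row = object_to_dict(FakeCommit("a1b2c3d4e5", "initial commit"), ["hexsha", "summary"])
row["hexsha"] = row["hexsha"][:7]  # GitExtractor later shortens hashes the same way
```

Each such dict becomes one row of the DataFrame that `extract_log` builds from its buffer.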
networks-lab/tidyextractors | tidyextractors/tidymbox/mbox_to_pandas.py | clean_addresses | python | def clean_addresses(addresses):
if addresses is None:
return []
addresses = addresses.replace("\'", "")
address_list = re.split('[,;]', addresses)
clean_list = []
for address in address_list:
temp_clean_address = clean_address(address)
clean_list.append(temp_clean_address)
    return clean_list | Cleans email addresses.
:param addresses: List of strings (email addresses)
:return: List of strings (cleaned email addresses) | train | https://github.com/networks-lab/tidyextractors/blob/658448ed533beecf32adcc188fc64d1068d15ca6/tidyextractors/tidymbox/mbox_to_pandas.py#L31-L45 | [
"def clean_address(address):\n \"\"\"\n Cleans a single email address.\n :param address: String (email address)\n :return: String (clean email address)\n \"\"\"\n if isinstance(address, header.Header):\n return clean_address(address.encode('ascii'))\n\n elif isinstance(address, str):\n ... | # *********************************************************************************************
# Copyright (C) 2017 Joel Becker, Jillian Anderson, Steve McColl and Dr. John McLevey
#
# This file is part of the tidyextractors package developed for Dr John McLevey's Networks Lab
# at the University of Waterloo. For more information, see
# http://tidyextractors.readthedocs.io/en/latest/
#
# tidyextractors is free software: you can redistribute it and/or modify it under the terms of
# the GNU General Public License as published by the Free Software Foundation, either version 3
# of the License, or (at your option) any later version.
#
# tidyextractors is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY;
# without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
# See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with tidyextractors.
# If not, see <http://www.gnu.org/licenses/>.
# *********************************************************************************************
import os
import re
import tqdm
import mailbox
import warnings
import pandas as pd
import email.utils as email
import email.header as header
# Adapted from Phil Deutsch's "mbox-analysis" https://github.com/phildeutsch/mbox-analysis
def clean_address(address):
"""
Cleans a single email address.
:param address: String (email address)
:return: String (clean email address)
"""
if isinstance(address, header.Header):
return clean_address(address.encode('ascii'))
elif isinstance(address, str):
address = address.replace("<", "")
address = address.replace(">", "")
address = address.replace("\"", "")
address = address.replace("\n", " ")
address = address.replace("MAILER-DAEMON", "")
address = address.lower().strip()
email = None
for word in address.split(' '):
email_regex = re.compile(
                r"^[a-zA-Z0-9._%-]+@[a-zA-Z0-9._%-]+\.[a-zA-Z]{2,6}$"  # escaped TLD dot
)
email = re.match(email_regex, word)
if email is not None:
clean_email = email.group(0)
if email is None:
if address.split(' ')[-1].find('@') > -1:
clean_email = address.split(' ')[-1].strip()
elif address.split(' ')[-1].find('?') > -1:
clean_email = 'n/a'
else:
clean_email = address
return clean_email
elif address is None:
return None
else:
raise ValueError('An unexpected type was given to clean_address. Address was {}'.format(address))
return None
def get_body(message):
"""
Extracts body text from an mbox message.
:param message: Mbox message
:return: String
"""
try:
sm = str(message)
body_start = sm.find('iamunique', sm.find('iamunique')+1)
body_start = sm.find('Content-Transfer-Encoding', body_start+1)
body_start = sm.find('\n', body_start+1)+1
body_end = sm.find('From: ', body_start + 1)
if body_end == -1:
body_end = sm.find('iamunique', body_start + 1)
body_end = sm.find('\n', body_end - 25)
body = sm[body_start:body_end]
body = body.replace("=20\n", "")
body = body.replace("=FC", "ü")
body = body.replace("=F6", "ö")
body = body.replace("=84", "\"")
body = body.replace("=94", "\"")
body = body.replace("=96", "-")
body = body.replace("=92", "\'")
body = body.replace("=93", "\"")
body = body.replace("=E4", "ä")
body = body.replace("=DF", "ss")
body = body.replace("=", "")
body = body.replace("\"", "")
body = body.replace("\'", "")
except:
body = None
return body
def write_table(mboxfile, mailTable):
"""
Takes a list and extends it with lists of data, which is
extracted from mbox messages.
:param mboxfile: Mbox file name/path
:param mailTable: A list (of lists)
    :return: None; mailTable is extended in place.
"""
mail_box_contents = mailbox.mbox(mboxfile)
m_pbar = tqdm.tqdm(range(0,len(mail_box_contents)))
m_pbar.set_description('Extracting mbox messages...')
count = 0
update_interval = min(50,len(mail_box_contents))
for message in mail_box_contents:
count += 1
if count % update_interval == 0:
m_pbar.update(update_interval)
clean_from = clean_address(message['From'])
clean_to = clean_addresses(message['To'])
clean_cc = clean_addresses(message['Cc'])
try:
clean_date = email.parsedate_to_datetime(message['Date'])
except:
clean_date = None
mailTable.append([
clean_from,
clean_to,
clean_cc,
clean_date,
message['Subject'],
get_body(message)
])
def mbox_to_pandas(mbox_path):
"""
Extracts all mbox messages from mbox files in mbox_path.
:param mbox_path: Path to an mbox file OR a directory containing mbox files.
:return: A Pandas DataFrame with messages as rows/observations.
"""
if os.path.isfile(mbox_path):
mbox_files = [mbox_path]
else:
mbox_files = [os.path.join(dirpath, f) for dirpath, dirnames, files in os.walk(mbox_path) for f in files if f.endswith('mbox')]
mail_table = []
f_pbar = tqdm.tqdm(range(0,len(mbox_files)))
f_pbar.set_description('Extracting mbox files...')
for mbox_file in mbox_files:
write_table(mbox_file, mail_table)
f_pbar.update(1)
df_out = pd.DataFrame(mail_table)
df_out.columns = ['From', 'To', 'Cc', 'Date', 'Subject', 'Body']
df_out['NumTo'] = df_out['To'].map(lambda i: len(i))
df_out['NumCC'] = df_out['Cc'].map(lambda i: len(i))
return df_out |
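The splitting step in `clean_addresses` — drop single quotes, then split the header value on commas or semicolons — can be exercised on its own. A sketch (with a `strip()` per part added for readability, which in the original is done later by `clean_address`):

```python
import re

def split_addresses(addresses):
    # Mirrors clean_addresses' pre-processing: drop single quotes,
    # then split the header value on ',' or ';'.
    if addresses is None:
        return []
    return [part.strip() for part in re.split(r"[,;]", addresses.replace("'", ""))]

parts = split_addresses("alice@example.com; 'Bob' bob@example.com")
```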
networks-lab/tidyextractors | tidyextractors/tidymbox/mbox_to_pandas.py | clean_address | python | def clean_address(address):
if isinstance(address, header.Header):
return clean_address(address.encode('ascii'))
elif isinstance(address, str):
address = address.replace("<", "")
address = address.replace(">", "")
address = address.replace("\"", "")
address = address.replace("\n", " ")
address = address.replace("MAILER-DAEMON", "")
address = address.lower().strip()
email = None
for word in address.split(' '):
email_regex = re.compile(
                r"^[a-zA-Z0-9._%-]+@[a-zA-Z0-9._%-]+\.[a-zA-Z]{2,6}$"  # escaped TLD dot
)
email = re.match(email_regex, word)
if email is not None:
clean_email = email.group(0)
if email is None:
if address.split(' ')[-1].find('@') > -1:
clean_email = address.split(' ')[-1].strip()
elif address.split(' ')[-1].find('?') > -1:
clean_email = 'n/a'
else:
clean_email = address
return clean_email
elif address is None:
return None
else:
raise ValueError('An unexpected type was given to clean_address. Address was {}'.format(address))
return None | Cleans a single email address.
:param address: String (email address)
:return: String (clean email address) | train | https://github.com/networks-lab/tidyextractors/blob/658448ed533beecf32adcc188fc64d1068d15ca6/tidyextractors/tidymbox/mbox_to_pandas.py#L48-L88 | [
"def clean_address(address):\n \"\"\"\n Cleans a single email address.\n :param address: String (email address)\n :return: String (clean email address)\n \"\"\"\n if isinstance(address, header.Header):\n return clean_address(address.encode('ascii'))\n\n elif isinstance(address, str):\n ... | # *********************************************************************************************
# Copyright (C) 2017 Joel Becker, Jillian Anderson, Steve McColl and Dr. John McLevey
#
# This file is part of the tidyextractors package developed for Dr John McLevey's Networks Lab
# at the University of Waterloo. For more information, see
# http://tidyextractors.readthedocs.io/en/latest/
#
# tidyextractors is free software: you can redistribute it and/or modify it under the terms of
# the GNU General Public License as published by the Free Software Foundation, either version 3
# of the License, or (at your option) any later version.
#
# tidyextractors is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY;
# without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
# See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with tidyextractors.
# If not, see <http://www.gnu.org/licenses/>.
# *********************************************************************************************
import os
import re
import tqdm
import mailbox
import warnings
import pandas as pd
import email.utils as email
import email.header as header
# Adapted from Phil Deutsch's "mbox-analysis" https://github.com/phildeutsch/mbox-analysis
def clean_addresses(addresses):
"""
    Cleans email addresses.
:param addresses: List of strings (email addresses)
:return: List of strings (cleaned email addresses)
"""
if addresses is None:
return []
addresses = addresses.replace("\'", "")
address_list = re.split('[,;]', addresses)
clean_list = []
for address in address_list:
temp_clean_address = clean_address(address)
clean_list.append(temp_clean_address)
return clean_list
def get_body(message):
"""
Extracts body text from an mbox message.
:param message: Mbox message
:return: String
"""
try:
sm = str(message)
body_start = sm.find('iamunique', sm.find('iamunique')+1)
body_start = sm.find('Content-Transfer-Encoding', body_start+1)
body_start = sm.find('\n', body_start+1)+1
body_end = sm.find('From: ', body_start + 1)
if body_end == -1:
body_end = sm.find('iamunique', body_start + 1)
body_end = sm.find('\n', body_end - 25)
body = sm[body_start:body_end]
body = body.replace("=20\n", "")
body = body.replace("=FC", "ü")
body = body.replace("=F6", "ö")
body = body.replace("=84", "\"")
body = body.replace("=94", "\"")
body = body.replace("=96", "-")
body = body.replace("=92", "\'")
body = body.replace("=93", "\"")
body = body.replace("=E4", "ä")
body = body.replace("=DF", "ss")
body = body.replace("=", "")
body = body.replace("\"", "")
body = body.replace("\'", "")
except:
body = None
return body
def write_table(mboxfile, mailTable):
"""
Takes a list and extends it with lists of data, which is
extracted from mbox messages.
:param mboxfile: Mbox file name/path
:param mailTable: A list (of lists)
    :return: None; mailTable is extended in place.
"""
mail_box_contents = mailbox.mbox(mboxfile)
m_pbar = tqdm.tqdm(range(0,len(mail_box_contents)))
m_pbar.set_description('Extracting mbox messages...')
count = 0
update_interval = min(50,len(mail_box_contents))
for message in mail_box_contents:
count += 1
if count % update_interval == 0:
m_pbar.update(update_interval)
clean_from = clean_address(message['From'])
clean_to = clean_addresses(message['To'])
clean_cc = clean_addresses(message['Cc'])
try:
clean_date = email.parsedate_to_datetime(message['Date'])
except:
clean_date = None
mailTable.append([
clean_from,
clean_to,
clean_cc,
clean_date,
message['Subject'],
get_body(message)
])
def mbox_to_pandas(mbox_path):
"""
Extracts all mbox messages from mbox files in mbox_path.
:param mbox_path: Path to an mbox file OR a directory containing mbox files.
:return: A Pandas DataFrame with messages as rows/observations.
"""
if os.path.isfile(mbox_path):
mbox_files = [mbox_path]
else:
mbox_files = [os.path.join(dirpath, f) for dirpath, dirnames, files in os.walk(mbox_path) for f in files if f.endswith('mbox')]
mail_table = []
f_pbar = tqdm.tqdm(range(0,len(mbox_files)))
f_pbar.set_description('Extracting mbox files...')
for mbox_file in mbox_files:
write_table(mbox_file, mail_table)
f_pbar.update(1)
df_out = pd.DataFrame(mail_table)
df_out.columns = ['From', 'To', 'Cc', 'Date', 'Subject', 'Body']
df_out['NumTo'] = df_out['To'].map(lambda i: len(i))
df_out['NumCC'] = df_out['Cc'].map(lambda i: len(i))
return df_out |
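The word-by-word regex match inside `clean_address` can be isolated as below. Note the pattern here escapes the dot before the TLD — the original leaves it unescaped, so it matches any character; a literal dot is assumed to be the intent:

```python
import re

EMAIL_RE = re.compile(r"^[a-zA-Z0-9._%-]+@[a-zA-Z0-9._%-]+\.[a-zA-Z]{2,6}$")

def find_email(address):
    # Mirrors the loop in clean_address: return the first whitespace-separated
    # token that looks like a bare email address.
    for word in address.lower().strip().split(" "):
        match = EMAIL_RE.match(word)
        if match:
            return match.group(0)
    return None
```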
networks-lab/tidyextractors | tidyextractors/tidymbox/mbox_to_pandas.py | get_body | python | def get_body(message):
try:
sm = str(message)
body_start = sm.find('iamunique', sm.find('iamunique')+1)
body_start = sm.find('Content-Transfer-Encoding', body_start+1)
body_start = sm.find('\n', body_start+1)+1
body_end = sm.find('From: ', body_start + 1)
if body_end == -1:
body_end = sm.find('iamunique', body_start + 1)
body_end = sm.find('\n', body_end - 25)
body = sm[body_start:body_end]
body = body.replace("=20\n", "")
body = body.replace("=FC", "ü")
body = body.replace("=F6", "ö")
body = body.replace("=84", "\"")
body = body.replace("=94", "\"")
body = body.replace("=96", "-")
body = body.replace("=92", "\'")
body = body.replace("=93", "\"")
body = body.replace("=E4", "ä")
body = body.replace("=DF", "ss")
body = body.replace("=", "")
body = body.replace("\"", "")
body = body.replace("\'", "")
except:
body = None
return body | Extracts body text from an mbox message.
:param message: Mbox message
:return: String | train | https://github.com/networks-lab/tidyextractors/blob/658448ed533beecf32adcc188fc64d1068d15ca6/tidyextractors/tidymbox/mbox_to_pandas.py#L91-L125 | null | # *********************************************************************************************
# Copyright (C) 2017 Joel Becker, Jillian Anderson, Steve McColl and Dr. John McLevey
#
# This file is part of the tidyextractors package developed for Dr John McLevey's Networks Lab
# at the University of Waterloo. For more information, see
# http://tidyextractors.readthedocs.io/en/latest/
#
# tidyextractors is free software: you can redistribute it and/or modify it under the terms of
# the GNU General Public License as published by the Free Software Foundation, either version 3
# of the License, or (at your option) any later version.
#
# tidyextractors is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY;
# without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
# See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with tidyextractors.
# If not, see <http://www.gnu.org/licenses/>.
# *********************************************************************************************
import os
import re
import tqdm
import mailbox
import warnings
import pandas as pd
import email.utils as email
import email.header as header
# Adapted from Phil Deutsch's "mbox-analysis" https://github.com/phildeutsch/mbox-analysis
def clean_addresses(addresses):
"""
    Cleans email addresses.
:param addresses: List of strings (email addresses)
:return: List of strings (cleaned email addresses)
"""
if addresses is None:
return []
addresses = addresses.replace("\'", "")
address_list = re.split('[,;]', addresses)
clean_list = []
for address in address_list:
temp_clean_address = clean_address(address)
clean_list.append(temp_clean_address)
return clean_list
def clean_address(address):
"""
Cleans a single email address.
:param address: String (email address)
:return: String (clean email address)
"""
if isinstance(address, header.Header):
return clean_address(address.encode('ascii'))
elif isinstance(address, str):
address = address.replace("<", "")
address = address.replace(">", "")
address = address.replace("\"", "")
address = address.replace("\n", " ")
address = address.replace("MAILER-DAEMON", "")
address = address.lower().strip()
email = None
for word in address.split(' '):
email_regex = re.compile(
                r"^[a-zA-Z0-9._%-]+@[a-zA-Z0-9._%-]+\.[a-zA-Z]{2,6}$"  # escaped TLD dot
)
email = re.match(email_regex, word)
if email is not None:
clean_email = email.group(0)
if email is None:
if address.split(' ')[-1].find('@') > -1:
clean_email = address.split(' ')[-1].strip()
elif address.split(' ')[-1].find('?') > -1:
clean_email = 'n/a'
else:
clean_email = address
return clean_email
elif address is None:
return None
else:
raise ValueError('An unexpected type was given to clean_address. Address was {}'.format(address))
return None
def write_table(mboxfile, mailTable):
"""
Takes a list and extends it with lists of data, which is
extracted from mbox messages.
:param mboxfile: Mbox file name/path
:param mailTable: A list (of lists)
    :return: None; mailTable is extended in place.
"""
mail_box_contents = mailbox.mbox(mboxfile)
m_pbar = tqdm.tqdm(range(0,len(mail_box_contents)))
m_pbar.set_description('Extracting mbox messages...')
count = 0
update_interval = min(50,len(mail_box_contents))
for message in mail_box_contents:
count += 1
if count % update_interval == 0:
m_pbar.update(update_interval)
clean_from = clean_address(message['From'])
clean_to = clean_addresses(message['To'])
clean_cc = clean_addresses(message['Cc'])
try:
clean_date = email.parsedate_to_datetime(message['Date'])
except:
clean_date = None
mailTable.append([
clean_from,
clean_to,
clean_cc,
clean_date,
message['Subject'],
get_body(message)
])
def mbox_to_pandas(mbox_path):
"""
Extracts all mbox messages from mbox files in mbox_path.
:param mbox_path: Path to an mbox file OR a directory containing mbox files.
:return: A Pandas DataFrame with messages as rows/observations.
"""
if os.path.isfile(mbox_path):
mbox_files = [mbox_path]
else:
mbox_files = [os.path.join(dirpath, f) for dirpath, dirnames, files in os.walk(mbox_path) for f in files if f.endswith('mbox')]
mail_table = []
f_pbar = tqdm.tqdm(range(0,len(mbox_files)))
f_pbar.set_description('Extracting mbox files...')
for mbox_file in mbox_files:
write_table(mbox_file, mail_table)
f_pbar.update(1)
df_out = pd.DataFrame(mail_table)
df_out.columns = ['From', 'To', 'Cc', 'Date', 'Subject', 'Body']
df_out['NumTo'] = df_out['To'].map(lambda i: len(i))
df_out['NumCC'] = df_out['Cc'].map(lambda i: len(i))
return df_out |
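`get_body` undoes a handful of quoted-printable escapes (`=FC`, `=F6`, `=DF`, …) by hand; the standard library's `quopri` module decodes the full encoding. A sketch of the equivalent step:

```python
import quopri

# Decode quoted-printable bytes, then interpret them as Latin-1,
# which covers the =FC/=F6/=E4/=DF escapes handled manually in get_body.
raw = b"Gr=FC=DFe aus K=F6ln"
decoded = quopri.decodestring(raw).decode("latin-1")
```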
networks-lab/tidyextractors | tidyextractors/tidymbox/mbox_to_pandas.py | write_table | python | def write_table(mboxfile, mailTable):
mail_box_contents = mailbox.mbox(mboxfile)
m_pbar = tqdm.tqdm(range(0,len(mail_box_contents)))
m_pbar.set_description('Extracting mbox messages...')
count = 0
update_interval = min(50,len(mail_box_contents))
for message in mail_box_contents:
count += 1
if count % update_interval == 0:
m_pbar.update(update_interval)
clean_from = clean_address(message['From'])
clean_to = clean_addresses(message['To'])
clean_cc = clean_addresses(message['Cc'])
try:
clean_date = email.parsedate_to_datetime(message['Date'])
except:
clean_date = None
mailTable.append([
clean_from,
clean_to,
clean_cc,
clean_date,
message['Subject'],
get_body(message)
]) | Takes a list and extends it with lists of data, which is
extracted from mbox messages.
:param mboxfile: Mbox file name/path
:param mailTable: A list (of lists)
    :return: None; mailTable is extended in place.
"def clean_addresses(addresses):\n \"\"\"\n Cleans email address.\n :param addresses: List of strings (email addresses)\n :return: List of strings (cleaned email addresses)\n \"\"\"\n if addresses is None:\n return []\n addresses = addresses.replace(\"\\'\", \"\")\n address_list = re.... | # *********************************************************************************************
# Copyright (C) 2017 Joel Becker, Jillian Anderson, Steve McColl and Dr. John McLevey
#
# This file is part of the tidyextractors package developed for Dr John McLevey's Networks Lab
# at the University of Waterloo. For more information, see
# http://tidyextractors.readthedocs.io/en/latest/
#
# tidyextractors is free software: you can redistribute it and/or modify it under the terms of
# the GNU General Public License as published by the Free Software Foundation, either version 3
# of the License, or (at your option) any later version.
#
# tidyextractors is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY;
# without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
# See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with tidyextractors.
# If not, see <http://www.gnu.org/licenses/>.
# *********************************************************************************************
import os
import re
import tqdm
import mailbox
import warnings
import pandas as pd
import email.utils as email
import email.header as header
# Adapted from Phil Deutsch's "mbox-analysis" https://github.com/phildeutsch/mbox-analysis
def clean_addresses(addresses):
"""
    Cleans email addresses.
:param addresses: List of strings (email addresses)
:return: List of strings (cleaned email addresses)
"""
if addresses is None:
return []
addresses = addresses.replace("\'", "")
address_list = re.split('[,;]', addresses)
clean_list = []
for address in address_list:
temp_clean_address = clean_address(address)
clean_list.append(temp_clean_address)
return clean_list
def clean_address(address):
"""
Cleans a single email address.
:param address: String (email address)
:return: String (clean email address)
"""
if isinstance(address, header.Header):
return clean_address(address.encode('ascii'))
elif isinstance(address, str):
address = address.replace("<", "")
address = address.replace(">", "")
address = address.replace("\"", "")
address = address.replace("\n", " ")
address = address.replace("MAILER-DAEMON", "")
address = address.lower().strip()
email = None
for word in address.split(' '):
email_regex = re.compile(
                r"^[a-zA-Z0-9._%-]+@[a-zA-Z0-9._%-]+\.[a-zA-Z]{2,6}$"  # escaped TLD dot
)
email = re.match(email_regex, word)
if email is not None:
clean_email = email.group(0)
if email is None:
if address.split(' ')[-1].find('@') > -1:
clean_email = address.split(' ')[-1].strip()
elif address.split(' ')[-1].find('?') > -1:
clean_email = 'n/a'
else:
clean_email = address
return clean_email
elif address is None:
return None
else:
raise ValueError('An unexpected type was given to clean_address. Address was {}'.format(address))
return None
def get_body(message):
"""
Extracts body text from an mbox message.
:param message: Mbox message
:return: String
"""
try:
sm = str(message)
body_start = sm.find('iamunique', sm.find('iamunique')+1)
body_start = sm.find('Content-Transfer-Encoding', body_start+1)
body_start = sm.find('\n', body_start+1)+1
body_end = sm.find('From: ', body_start + 1)
if body_end == -1:
body_end = sm.find('iamunique', body_start + 1)
body_end = sm.find('\n', body_end - 25)
body = sm[body_start:body_end]
body = body.replace("=20\n", "")
body = body.replace("=FC", "ü")
body = body.replace("=F6", "ö")
body = body.replace("=84", "\"")
body = body.replace("=94", "\"")
body = body.replace("=96", "-")
body = body.replace("=92", "\'")
body = body.replace("=93", "\"")
body = body.replace("=E4", "ä")
body = body.replace("=DF", "ss")
body = body.replace("=", "")
body = body.replace("\"", "")
body = body.replace("\'", "")
except:
body = None
return body
def mbox_to_pandas(mbox_path):
"""
Extracts all mbox messages from mbox files in mbox_path.
:param mbox_path: Path to an mbox file OR a directory containing mbox files.
:return: A Pandas DataFrame with messages as rows/observations.
"""
if os.path.isfile(mbox_path):
mbox_files = [mbox_path]
else:
mbox_files = [os.path.join(dirpath, f) for dirpath, dirnames, files in os.walk(mbox_path) for f in files if f.endswith('mbox')]
mail_table = []
f_pbar = tqdm.tqdm(range(0,len(mbox_files)))
f_pbar.set_description('Extracting mbox files...')
for mbox_file in mbox_files:
write_table(mbox_file, mail_table)
f_pbar.update(1)
df_out = pd.DataFrame(mail_table)
df_out.columns = ['From', 'To', 'Cc', 'Date', 'Subject', 'Body']
df_out['NumTo'] = df_out['To'].map(lambda i: len(i))
df_out['NumCC'] = df_out['Cc'].map(lambda i: len(i))
return df_out |
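`write_table` relies on `email.utils.parsedate_to_datetime` to turn RFC 2822 `Date` headers into timezone-aware datetimes; a standalone check of that call:

```python
import email.utils

# Parse an RFC 2822 date header into a timezone-aware datetime,
# as write_table does for each message's 'Date' field.
dt = email.utils.parsedate_to_datetime("Mon, 20 Nov 1995 19:12:08 -0500")
```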
networks-lab/tidyextractors | tidyextractors/tidymbox/mbox_to_pandas.py | mbox_to_pandas | python | def mbox_to_pandas(mbox_path):
if os.path.isfile(mbox_path):
mbox_files = [mbox_path]
else:
mbox_files = [os.path.join(dirpath, f) for dirpath, dirnames, files in os.walk(mbox_path) for f in files if f.endswith('mbox')]
mail_table = []
f_pbar = tqdm.tqdm(range(0,len(mbox_files)))
f_pbar.set_description('Extracting mbox files...')
for mbox_file in mbox_files:
write_table(mbox_file, mail_table)
f_pbar.update(1)
df_out = pd.DataFrame(mail_table)
df_out.columns = ['From', 'To', 'Cc', 'Date', 'Subject', 'Body']
df_out['NumTo'] = df_out['To'].map(lambda i: len(i))
df_out['NumCC'] = df_out['Cc'].map(lambda i: len(i))
return df_out | Extracts all mbox messages from mbox files in mbox_path.
:param mbox_path: Path to an mbox file OR a directory containing mbox files.
:return: A Pandas DataFrame with messages as rows/observations. | train | https://github.com/networks-lab/tidyextractors/blob/658448ed533beecf32adcc188fc64d1068d15ca6/tidyextractors/tidymbox/mbox_to_pandas.py#L167-L191 | [
"def write_table(mboxfile, mailTable):\n \"\"\"\n Takes a list and extends it with lists of data, which is\n extracted from mbox messages.\n :param mboxfile: Mbox file name/path\n :param mailTable: A list (of lists)\n :return: An extended list of lists\n \"\"\"\n mail_box_contents = mailbox.... | # *********************************************************************************************
# Copyright (C) 2017 Joel Becker, Jillian Anderson, Steve McColl and Dr. John McLevey
#
# This file is part of the tidyextractors package developed for Dr John McLevey's Networks Lab
# at the University of Waterloo. For more information, see
# http://tidyextractors.readthedocs.io/en/latest/
#
# tidyextractors is free software: you can redistribute it and/or modify it under the terms of
# the GNU General Public License as published by the Free Software Foundation, either version 3
# of the License, or (at your option) any later version.
#
# tidyextractors is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY;
# without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
# See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with tidyextractors.
# If not, see <http://www.gnu.org/licenses/>.
# *********************************************************************************************
import os
import re
import tqdm
import mailbox
import warnings
import pandas as pd
import email.utils as email
import email.header as header
# Adapted from Phil Deutsch's "mbox-analysis" https://github.com/phildeutsch/mbox-analysis
def clean_addresses(addresses):
"""
    Cleans email addresses.
:param addresses: List of strings (email addresses)
:return: List of strings (cleaned email addresses)
"""
if addresses is None:
return []
addresses = addresses.replace("\'", "")
address_list = re.split('[,;]', addresses)
clean_list = []
for address in address_list:
temp_clean_address = clean_address(address)
clean_list.append(temp_clean_address)
return clean_list
def clean_address(address):
"""
Cleans a single email address.
:param address: String (email address)
:return: String (clean email address)
"""
if isinstance(address, header.Header):
return clean_address(address.encode('ascii'))
elif isinstance(address, str):
address = address.replace("<", "")
address = address.replace(">", "")
address = address.replace("\"", "")
address = address.replace("\n", " ")
address = address.replace("MAILER-DAEMON", "")
address = address.lower().strip()
email = None
for word in address.split(' '):
email_regex = re.compile(
                r"^[a-zA-Z0-9._%-]+@[a-zA-Z0-9._%-]+\.[a-zA-Z]{2,6}$"  # escaped TLD dot
)
email = re.match(email_regex, word)
if email is not None:
clean_email = email.group(0)
if email is None:
if address.split(' ')[-1].find('@') > -1:
clean_email = address.split(' ')[-1].strip()
elif address.split(' ')[-1].find('?') > -1:
clean_email = 'n/a'
else:
clean_email = address
return clean_email
elif address is None:
return None
else:
raise ValueError('An unexpected type was given to clean_address. Address was {}'.format(address))
return None
def get_body(message):
"""
Extracts body text from an mbox message.
:param message: Mbox message
:return: String
"""
try:
sm = str(message)
body_start = sm.find('iamunique', sm.find('iamunique')+1)
body_start = sm.find('Content-Transfer-Encoding', body_start+1)
body_start = sm.find('\n', body_start+1)+1
body_end = sm.find('From: ', body_start + 1)
if body_end == -1:
body_end = sm.find('iamunique', body_start + 1)
body_end = sm.find('\n', body_end - 25)
body = sm[body_start:body_end]
body = body.replace("=20\n", "")
body = body.replace("=FC", "ü")
body = body.replace("=F6", "ö")
body = body.replace("=84", "\"")
body = body.replace("=94", "\"")
body = body.replace("=96", "-")
body = body.replace("=92", "\'")
body = body.replace("=93", "\"")
body = body.replace("=E4", "ä")
body = body.replace("=DF", "ss")
body = body.replace("=", "")
body = body.replace("\"", "")
body = body.replace("\'", "")
except:
body = None
return body
def write_table(mboxfile, mailTable):
"""
Takes a list and extends it with lists of data, which is
extracted from mbox messages.
:param mboxfile: Mbox file name/path
:param mailTable: A list (of lists)
    :return: None; mailTable is extended in place.
"""
mail_box_contents = mailbox.mbox(mboxfile)
m_pbar = tqdm.tqdm(range(0,len(mail_box_contents)))
m_pbar.set_description('Extracting mbox messages...')
count = 0
update_interval = min(50,len(mail_box_contents))
for message in mail_box_contents:
count += 1
if count % update_interval == 0:
m_pbar.update(update_interval)
clean_from = clean_address(message['From'])
clean_to = clean_addresses(message['To'])
clean_cc = clean_addresses(message['Cc'])
try:
clean_date = email.utils.parsedate_to_datetime(message['Date'])
except Exception:
clean_date = None
mailTable.append([
clean_from,
clean_to,
clean_cc,
clean_date,
message['Subject'],
get_body(message)
])
|
networks-lab/tidyextractors | tidyextractors/tidygit/git_extractor.py | GitExtractor._extract | python | def _extract(self, source, *args, **kwargs):
# Extract git test_data
self._data = extract_log(source)
# Shorten hashes
self._data['hexsha'] = self._data['hexsha'].apply(lambda s: s[:7]) | Extracts data from a local git repository. Mutates _data.
:param str source: The path to a local git repository.
:param args: Arbitrary arguments for extensibility.
:param kwargs: Arbitrary keyword arguments for extensibility.
:return: None | train | https://github.com/networks-lab/tidyextractors/blob/658448ed533beecf32adcc188fc64d1068d15ca6/tidyextractors/tidygit/git_extractor.py#L34-L47 | null | class GitExtractor(BaseExtractor):
"""
The ``GitExtractor`` class is for extracting data from local git repositories. This class
has methods for outputting data into the ``changes`` and ``commits`` tidy formats, and a
raw untidy format.
:param str source: The path to a local git repository
:param bool auto_extract: Defaults to True. If True, data is extracted automatically.
Otherwise, extraction must be initiated through the internal interface.
"""
def commits(self, drop_collections=True):
"""
Returns a table of git log data, with "commits" as rows/observations.
:param bool drop_collections: Defaults to True. Indicates whether columns with lists/dicts/sets will be dropped.
:return: pandas.DataFrame
"""
base_df = self._data
if drop_collections is True:
out_df = self._drop_collections(base_df)
else:
out_df = base_df
return out_df
def changes(self):
"""
Returns a table of git log data, with "changes" as rows/observations.
.. note::
drop_collections is not available for this method, since there are no meaningful collections to keep.
:return: pandas.DataFrame
"""
return self.expand_on('hexsha', 'changes', rename1='hexsha', rename2='file') |
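The `changes()` method above relies on `expand_on` to tidy one-row-per-commit data into one-row-per-file-change. A rough stand-in without pandas, assuming each commit record carries a `changes` list (`expand_changes` is an illustrative helper, not the library's API):

```python
# One input record per commit, one output row per (commit, changed file),
# with the hash abbreviated to git's usual 7 hex digits.
def expand_changes(commits):
    rows = []
    for commit in commits:
        for path in commit["changes"]:
            rows.append({"hexsha": commit["hexsha"][:7], "file": path})
    return rows
```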
networks-lab/tidyextractors | tidyextractors/tidygit/git_extractor.py | GitExtractor.commits | python | def commits(self, drop_collections=True):
base_df = self._data
if drop_collections is True:
out_df = self._drop_collections(base_df)
else:
out_df = base_df
return out_df | Returns a table of git log data, with "commits" as rows/observations.
:param bool drop_collections: Defaults to True. Indicates whether columns with lists/dicts/sets will be dropped.
:return: pandas.DataFrame | train | https://github.com/networks-lab/tidyextractors/blob/658448ed533beecf32adcc188fc64d1068d15ca6/tidyextractors/tidygit/git_extractor.py#L49-L62 | [
"def _drop_collections(self, df):\n \"\"\"\n Drops columns containing collections (i.e. sets, dicts, lists) from a DataFrame.\n\n :param pandas.DataFrame df: Usually self._data.\n :return: pandas.DataFrame\n \"\"\"\n all_cols = df.columns\n keep_cols = []\n\n # Check whether each column cont... | class GitExtractor(BaseExtractor):
"""
The ``GitExtractor`` class is for extracting data from local git repositories. This class
has methods for outputting data into the ``changes`` and ``commits`` tidy formats, and a
raw untidy format.
:param str source: The path to a local git repository
:param bool auto_extract: Defaults to True. If True, data is extracted automatically.
Otherwise, extraction must be initiated through the internal interface.
"""
def _extract(self, source, *args, **kwargs):
"""
Extracts data from a local git repository. Mutates _data.
:param str source: The path to a local git repository.
:param args: Arbitrary arguments for extensibility.
:param kwargs: Arbitrary keyword arguments for extensibility.
:return: None
"""
# Extract git test_data
self._data = extract_log(source)
# Shorten hashes
self._data['hexsha'] = self._data['hexsha'].apply(lambda s: s[:7])
def changes(self):
"""
Returns a table of git log data, with "changes" as rows/observations.
.. note::
drop_collections is not available for this method, since there are no meaningful collections to keep.
:return: pandas.DataFrame
"""
return self.expand_on('hexsha', 'changes', rename1='hexsha', rename2='file') |
networks-lab/tidyextractors | tidyextractors/tidymbox/mbox_extractor.py | MboxExtractor._extract | python | def _extract(self, source, *args, **kwargs):
# Extract data
self._data = mbox_to_pandas(source)
self._data['MessageID'] = pd.Series(range(0,len(self._data))) | Extracts data from mbox files. Mutates _data.
:param str source: The path to one or more mbox files.
:param args: Arbitrary arguments for extensibility.
:param kwargs: Arbitrary keyword arguments for extensibility.
:return: None | train | https://github.com/networks-lab/tidyextractors/blob/658448ed533beecf32adcc188fc64d1068d15ca6/tidyextractors/tidymbox/mbox_extractor.py#L36-L47 | null | class MboxExtractor(BaseExtractor):
"""
The ``MboxExtractor`` class is for extracting data from local Mbox files. This class
has methods for outputting data into the ``emails`` and ``sends`` tidy formats, and a
raw untidy format.
:param str source: The path to either a single mbox file or a directory containing multiple mbox files.
:param bool auto_extract: Defaults to True. If True, data is extracted automatically.
Otherwise, extraction must be initiated through the internal interface.
"""
def emails(self, drop_collections = True):
"""
Returns a table of mbox message data, with "messages" as rows/observations.
:param bool drop_collections: Defaults to True. Indicates whether columns with lists/dicts/sets will be dropped.
:return: pandas.DataFrame
"""
base_df = self._data
if drop_collections is True:
out_df = self._drop_collections(base_df)
else:
out_df = base_df
return out_df
def sends(self):
"""
Returns a table of mbox message data, with "sender/recipient" pairs as rows/observations.
.. note::
Rows may have a recipient from either "TO" or "CC". SendType column specifies this for each row.
.. note::
drop_collections is not available for this method, since there are no meaningful collections to keep.
:return: pandas.DataFrame
"""
# Expand on each "to" field
on_to_df = self.expand_on('From', 'To', rename1='From', rename2='Recipient')
on_cc_df = self.expand_on('From', 'Cc', rename1='From', rename2='Recipient')
# Specify how it was sent
on_to_df['SendType'] = 'To'
on_cc_df['SendType'] = 'Cc'
# Combine dataframes
output_df = pd.concat([on_to_df, on_cc_df])
return self._drop_collections(output_df)
|
networks-lab/tidyextractors | tidyextractors/tidymbox/mbox_extractor.py | MboxExtractor.emails | python | def emails(self, drop_collections = True):
base_df = self._data
if drop_collections is True:
out_df = self._drop_collections(base_df)
else:
out_df = base_df
return out_df | Returns a table of mbox message data, with "messages" as rows/observations.
:param bool drop_collections: Defaults to True. Indicates whether columns with lists/dicts/sets will be dropped.
:return: pandas.DataFrame | train | https://github.com/networks-lab/tidyextractors/blob/658448ed533beecf32adcc188fc64d1068d15ca6/tidyextractors/tidymbox/mbox_extractor.py#L49-L62 | [
"def _drop_collections(self, df):\n \"\"\"\n Drops columns containing collections (i.e. sets, dicts, lists) from a DataFrame.\n\n :param pandas.DataFrame df: Usually self._data.\n :return: pandas.DataFrame\n \"\"\"\n all_cols = df.columns\n keep_cols = []\n\n # Check whether each column cont... | class MboxExtractor(BaseExtractor):
"""
The ``MboxExtractor`` class is for extracting data from local Mbox files. This class
has methods for outputting data into the ``emails`` and ``sends`` tidy formats, and a
raw untidy format.
:param str source: The path to either a single mbox file or a directory containing multiple mbox files.
:param bool auto_extract: Defaults to True. If True, data is extracted automatically.
Otherwise, extraction must be initiated through the internal interface.
"""
def _extract(self, source, *args, **kwargs):
"""
Extracts data from mbox files. Mutates _data.
:param str source: The path to one or more mbox files.
:param args: Arbitrary arguments for extensibility.
:param kwargs: Arbitrary keyword arguments for extensibility.
:return: None
"""
# Extract data
self._data = mbox_to_pandas(source)
self._data['MessageID'] = pd.Series(range(0,len(self._data)))
def sends(self):
"""
Returns a table of mbox message data, with "sender/recipient" pairs as rows/observations.
.. note::
Rows may have a recipient from either "TO" or "CC". SendType column specifies this for each row.
.. note::
drop_collections is not available for this method, since there are no meaningful collections to keep.
:return: pandas.DataFrame
"""
# Expand on each "to" field
on_to_df = self.expand_on('From', 'To', rename1='From', rename2='Recipient')
on_cc_df = self.expand_on('From', 'Cc', rename1='From', rename2='Recipient')
# Specify how it was sent
on_to_df['SendType'] = 'To'
on_cc_df['SendType'] = 'Cc'
# Combine dataframes
output_df = pd.concat([on_to_df, on_cc_df])
return self._drop_collections(output_df)
|
networks-lab/tidyextractors | tidyextractors/tidymbox/mbox_extractor.py | MboxExtractor.sends | python | def sends(self):
# Expand on each "to" field
on_to_df = self.expand_on('From', 'To', rename1='From', rename2='Recipient')
on_cc_df = self.expand_on('From', 'Cc', rename1='From', rename2='Recipient')
# Specify how it was sent
on_to_df['SendType'] = 'To'
on_cc_df['SendType'] = 'Cc'
# Combine dataframes
output_df = pd.concat([on_to_df, on_cc_df])
return self._drop_collections(output_df) | Returns a table of mbox message data, with "sender/recipient" pairs as rows/observations.
.. note::
Rows may have a recipient from either "TO" or "CC". SendType column specifies this for each row.
.. note::
drop_collections is not available for this method, since there are no meaningful collections to keep.
:return: pandas.DataFrame | train | https://github.com/networks-lab/tidyextractors/blob/658448ed533beecf32adcc188fc64d1068d15ca6/tidyextractors/tidymbox/mbox_extractor.py#L64-L89 | [
"def _drop_collections(self, df):\n \"\"\"\n Drops columns containing collections (i.e. sets, dicts, lists) from a DataFrame.\n\n :param pandas.DataFrame df: Usually self._data.\n :return: pandas.DataFrame\n \"\"\"\n all_cols = df.columns\n keep_cols = []\n\n # Check whether each column cont... | class MboxExtractor(BaseExtractor):
"""
The ``MboxExtractor`` class is for extracting data from local Mbox files. This class
has methods for outputting data into the ``emails`` and ``sends`` tidy formats, and a
raw untidy format.
:param str source: The path to either a single mbox file or a directory containing multiple mbox files.
:param bool auto_extract: Defaults to True. If True, data is extracted automatically.
Otherwise, extraction must be initiated through the internal interface.
"""
def _extract(self, source, *args, **kwargs):
"""
Extracts data from mbox files. Mutates _data.
:param str source: The path to one or more mbox files.
:param args: Arbitrary arguments for extensibility.
:param kwargs: Arbitrary keyword arguments for extensibility.
:return: None
"""
# Extract data
self._data = mbox_to_pandas(source)
self._data['MessageID'] = pd.Series(range(0,len(self._data)))
def emails(self, drop_collections = True):
"""
Returns a table of mbox message data, with "messages" as rows/observations.
:param bool drop_collections: Defaults to True. Indicates whether columns with lists/dicts/sets will be dropped.
:return: pandas.DataFrame
"""
base_df = self._data
if drop_collections is True:
out_df = self._drop_collections(base_df)
else:
out_df = base_df
return out_df
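The `sends()` tidying above can be sketched without pandas: one output row per (sender, recipient) pair, tagged with whether the recipient came from To or Cc. `sender_recipient_pairs` is a hypothetical stand-in, not the library's API:

```python
def sender_recipient_pairs(messages):
    """Flatten messages into (From, Recipient, SendType) rows."""
    rows = []
    for msg in messages:
        for field in ("To", "Cc"):
            for recipient in msg.get(field) or []:  # tolerate missing headers
                rows.append({"From": msg["From"],
                             "Recipient": recipient,
                             "SendType": field})
    return rows
```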
|
piotr-rusin/spam-lists | spam_lists/validation.py | is_valid_host | python | def is_valid_host(value):
host_validators = validators.ipv4, validators.ipv6, validators.domain
return any(f(value) for f in host_validators) | Check if given value is a valid host string.
:param value: a value to test
:returns: True if the value is valid | train | https://github.com/piotr-rusin/spam-lists/blob/fd616e8761b28f3eaa503fee5e45f7748e8f88f2/spam_lists/validation.py#L15-L22 | null | # -*- coding: utf-8 -*-
"""Function and method argument validators used by the library."""
from __future__ import unicode_literals
import functools
import re
from future.moves.urllib.parse import urlparse
import validators
from .exceptions import InvalidURLError, InvalidHostError
URL_REGEX = re.compile(r'^[a-z0-9\.\-\+]*://' # scheme
r'(?:\S+(?::\S*)?@)?' # authentication
r'(?:[^/:]+|\[[0-9a-f:\.]+\])' # host
r'(?::\d{2,5})?' # port
r'(?:[/?#][^\s]*)?' # path, query or fragment
r'$', re.IGNORECASE)
def is_valid_url(value):
"""Check if given value is a valid URL string.
:param value: a value to test
:returns: True if the value is valid
"""
match = URL_REGEX.match(value)
host_str = urlparse(value).hostname
return match and is_valid_host(host_str)
def accepts_valid_host(func):
"""Return a wrapper that runs given method only for valid hosts.
:param func: a method to be wrapped
:returns: a wrapper that adds argument validation
"""
@functools.wraps(func)
def wrapper(obj, value, *args, **kwargs):
"""Run the function and return a value for a valid host.
:param obj: an object in whose class the func is defined
:param value: a value expected to be a valid host string
:returns: a return value of the function func
:raises InvalidHostError: if the value is not valid
"""
if not is_valid_host(value):
raise InvalidHostError
return func(obj, value, *args, **kwargs)
return wrapper
def accepts_valid_urls(func):
"""Return a wrapper that runs given method only for valid URLs.
:param func: a method to be wrapped
:returns: a wrapper that adds argument validation
"""
@functools.wraps(func)
def wrapper(obj, urls, *args, **kwargs):
"""Run the function and return a value for valid URLs.
:param obj: an object in whose class f is defined
:param urls: an iterable containing URLs
:returns: a return value of the function f
:raises InvalidURLError: if the iterable contains invalid URLs
"""
invalid_urls = [u for u in urls if not is_valid_url(u)]
if invalid_urls:
msg_tpl = 'The values: {} are not valid URLs'
msg = msg_tpl.format(','.join(invalid_urls))
raise InvalidURLError(msg)
return func(obj, urls, *args, **kwargs)
return wrapper
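`is_valid_host` simply ORs together per-format validators from the `validators` package. A rough sketch of that composition with simplified stand-in checks (these are approximations, not the real `validators.ipv4`/`validators.domain`):

```python
import re

def looks_like_ipv4(value):
    # Four dot-separated decimal octets in 0..255 (leading zeros tolerated).
    parts = value.split(".")
    return len(parts) == 4 and all(p.isdigit() and int(p) <= 255 for p in parts)

def looks_like_domain(value):
    # One or more labels followed by an alphabetic TLD.
    return re.match(r"^(?:[a-z0-9-]+\.)+[a-z]{2,}$", value, re.IGNORECASE) is not None

def is_valid_host_sketch(value):
    # Same shape as is_valid_host: any() over a tuple of validators.
    return any(f(value) for f in (looks_like_ipv4, looks_like_domain))
```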
|
piotr-rusin/spam-lists | spam_lists/validation.py | is_valid_url | python | def is_valid_url(value):
match = URL_REGEX.match(value)
host_str = urlparse(value).hostname
return match and is_valid_host(host_str) | Check if given value is a valid URL string.
:param value: a value to test
:returns: True if the value is valid | train | https://github.com/piotr-rusin/spam-lists/blob/fd616e8761b28f3eaa503fee5e45f7748e8f88f2/spam_lists/validation.py#L33-L41 | [
"def is_valid_host(value):\n \"\"\"Check if given value is a valid host string.\n\n :param value: a value to test\n :returns: True if the value is valid\n \"\"\"\n host_validators = validators.ipv4, validators.ipv6, validators.domain\n return any(f(value) for f in host_validators)\n"
] | # -*- coding: utf-8 -*-
"""Function and method argument validators used by the library."""
from __future__ import unicode_literals
import functools
import re
from future.moves.urllib.parse import urlparse
import validators
from .exceptions import InvalidURLError, InvalidHostError
def is_valid_host(value):
"""Check if given value is a valid host string.
:param value: a value to test
:returns: True if the value is valid
"""
host_validators = validators.ipv4, validators.ipv6, validators.domain
return any(f(value) for f in host_validators)
URL_REGEX = re.compile(r'^[a-z0-9\.\-\+]*://' # scheme
r'(?:\S+(?::\S*)?@)?' # authentication
r'(?:[^/:]+|\[[0-9a-f:\.]+\])' # host
r'(?::\d{2,5})?' # port
r'(?:[/?#][^\s]*)?' # path, query or fragment
r'$', re.IGNORECASE)
def accepts_valid_host(func):
"""Return a wrapper that runs given method only for valid hosts.
:param func: a method to be wrapped
:returns: a wrapper that adds argument validation
"""
@functools.wraps(func)
def wrapper(obj, value, *args, **kwargs):
"""Run the function and return a value for a valid host.
:param obj: an object in whose class the func is defined
:param value: a value expected to be a valid host string
:returns: a return value of the function func
:raises InvalidHostError: if the value is not valid
"""
if not is_valid_host(value):
raise InvalidHostError
return func(obj, value, *args, **kwargs)
return wrapper
def accepts_valid_urls(func):
"""Return a wrapper that runs given method only for valid URLs.
:param func: a method to be wrapped
:returns: a wrapper that adds argument validation
"""
@functools.wraps(func)
def wrapper(obj, urls, *args, **kwargs):
"""Run the function and return a value for valid URLs.
:param obj: an object in whose class f is defined
:param urls: an iterable containing URLs
:returns: a return value of the function f
:raises InvalidURLError: if the iterable contains invalid URLs
"""
invalid_urls = [u for u in urls if not is_valid_url(u)]
if invalid_urls:
msg_tpl = 'The values: {} are not valid URLs'
msg = msg_tpl.format(','.join(invalid_urls))
raise InvalidURLError(msg)
return func(obj, urls, *args, **kwargs)
return wrapper
|
piotr-rusin/spam-lists | spam_lists/validation.py | accepts_valid_host | python | def accepts_valid_host(func):
@functools.wraps(func)
def wrapper(obj, value, *args, **kwargs):
"""Run the function and return a value for a valid host.
:param obj: an object in whose class the func is defined
:param value: a value expected to be a valid host string
:returns: a return value of the function func
:raises InvalidHostError: if the value is not valid
"""
if not is_valid_host(value):
raise InvalidHostError
return func(obj, value, *args, **kwargs)
return wrapper | Return a wrapper that runs given method only for valid hosts.
:param func: a method to be wrapped
:returns: a wrapper that adds argument validation | train | https://github.com/piotr-rusin/spam-lists/blob/fd616e8761b28f3eaa503fee5e45f7748e8f88f2/spam_lists/validation.py#L44-L62 | null | # -*- coding: utf-8 -*-
"""Function and method argument validators used by the library."""
from __future__ import unicode_literals
import functools
import re
from future.moves.urllib.parse import urlparse
import validators
from .exceptions import InvalidURLError, InvalidHostError
def is_valid_host(value):
"""Check if given value is a valid host string.
:param value: a value to test
:returns: True if the value is valid
"""
host_validators = validators.ipv4, validators.ipv6, validators.domain
return any(f(value) for f in host_validators)
URL_REGEX = re.compile(r'^[a-z0-9\.\-\+]*://' # scheme
r'(?:\S+(?::\S*)?@)?' # authentication
r'(?:[^/:]+|\[[0-9a-f:\.]+\])' # host
r'(?::\d{2,5})?' # port
r'(?:[/?#][^\s]*)?' # path, query or fragment
r'$', re.IGNORECASE)
def is_valid_url(value):
"""Check if given value is a valid URL string.
:param value: a value to test
:returns: True if the value is valid
"""
match = URL_REGEX.match(value)
host_str = urlparse(value).hostname
return match and is_valid_host(host_str)
def accepts_valid_urls(func):
"""Return a wrapper that runs given method only for valid URLs.
:param func: a method to be wrapped
:returns: a wrapper that adds argument validation
"""
@functools.wraps(func)
def wrapper(obj, urls, *args, **kwargs):
"""Run the function and return a value for valid URLs.
:param obj: an object in whose class f is defined
:param urls: an iterable containing URLs
:returns: a return value of the function f
:raises InvalidURLError: if the iterable contains invalid URLs
"""
invalid_urls = [u for u in urls if not is_valid_url(u)]
if invalid_urls:
msg_tpl = 'The values: {} are not valid URLs'
msg = msg_tpl.format(','.join(invalid_urls))
raise InvalidURLError(msg)
return func(obj, urls, *args, **kwargs)
return wrapper
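The decorator pattern used by `accepts_valid_host` and `accepts_valid_urls` — validate the argument, then delegate, preserving metadata via `functools.wraps` — can be shown standalone with a toy validator (all names here are illustrative):

```python
import functools

class InvalidHostError(ValueError):
    pass

def accepts_nonempty(func):
    """Toy analogue of accepts_valid_host: reject bad input before calling."""
    @functools.wraps(func)
    def wrapper(obj, value, *args, **kwargs):
        if not value:
            raise InvalidHostError("empty host")
        return func(obj, value, *args, **kwargs)
    return wrapper

class Client:
    @accepts_nonempty
    def lookup(self, host):
        return "looked up " + host
```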
|
piotr-rusin/spam-lists | spam_lists/validation.py | accepts_valid_urls | python | def accepts_valid_urls(func):
@functools.wraps(func)
def wrapper(obj, urls, *args, **kwargs):
"""Run the function and return a value for valid URLs.
:param obj: an object in whose class f is defined
:param urls: an iterable containing URLs
:returns: a return value of the function f
:raises InvalidURLError: if the iterable contains invalid URLs
"""
invalid_urls = [u for u in urls if not is_valid_url(u)]
if invalid_urls:
msg_tpl = 'The values: {} are not valid URLs'
msg = msg_tpl.format(','.join(invalid_urls))
raise InvalidURLError(msg)
return func(obj, urls, *args, **kwargs)
return wrapper | Return a wrapper that runs given method only for valid URLs.
:param func: a method to be wrapped
:returns: a wrapper that adds argument validation | train | https://github.com/piotr-rusin/spam-lists/blob/fd616e8761b28f3eaa503fee5e45f7748e8f88f2/spam_lists/validation.py#L65-L86 | null | # -*- coding: utf-8 -*-
"""Function and method argument validators used by the library."""
from __future__ import unicode_literals
import functools
import re
from future.moves.urllib.parse import urlparse
import validators
from .exceptions import InvalidURLError, InvalidHostError
def is_valid_host(value):
"""Check if given value is a valid host string.
:param value: a value to test
:returns: True if the value is valid
"""
host_validators = validators.ipv4, validators.ipv6, validators.domain
return any(f(value) for f in host_validators)
URL_REGEX = re.compile(r'^[a-z0-9\.\-\+]*://' # scheme
r'(?:\S+(?::\S*)?@)?' # authentication
r'(?:[^/:]+|\[[0-9a-f:\.]+\])' # host
r'(?::\d{2,5})?' # port
r'(?:[/?#][^\s]*)?' # path, query or fragment
r'$', re.IGNORECASE)
def is_valid_url(value):
"""Check if given value is a valid URL string.
:param value: a value to test
:returns: True if the value is valid
"""
match = URL_REGEX.match(value)
host_str = urlparse(value).hostname
return match and is_valid_host(host_str)
def accepts_valid_host(func):
"""Return a wrapper that runs given method only for valid hosts.
:param func: a method to be wrapped
:returns: a wrapper that adds argument validation
"""
@functools.wraps(func)
def wrapper(obj, value, *args, **kwargs):
"""Run the function and return a value for a valid host.
:param obj: an object in whose class the func is defined
:param value: a value expected to be a valid host string
:returns: a return value of the function func
:raises InvalidHostError: if the value is not valid
"""
if not is_valid_host(value):
raise InvalidHostError
return func(obj, value, *args, **kwargs)
return wrapper
|
piotr-rusin/spam-lists | spam_lists/structures.py | create_host | python | def create_host(factories, value):
data = [value]
for func in factories:
try:
return func(value)
except InvalidHostError as ex:
data.append(str(ex))
msg_tpl = (
"Failed to create a host object for '{}', raising the following errors"
" in the process:" + "\n".join(data)
)
raise InvalidHostError(msg_tpl.format(value)) | Use the factories to create a host object.
:param factories: a list of functions that return host objects
(Hostname, IPv4Address, IPv6Address) for valid arguments
:param value: a value to be passed as argument to factories
:returns: an object representing the value, created by one of
the factories.
It is a return value of the first factory that could create it for
the given argument.
:raises InvalidHostError: if the value is not a valid input for any
factory used by this function | train | https://github.com/piotr-rusin/spam-lists/blob/fd616e8761b28f3eaa503fee5e45f7748e8f88f2/spam_lists/structures.py#L190-L216 | null | # -*- coding: utf-8 -*-
"""Classes representing values used by the library, and their factories."""
from __future__ import unicode_literals
from collections import namedtuple
import ipaddress
from builtins import str, object # pylint: disable=redefined-builtin
from dns import name
from dns.reversename import (
ipv4_reverse_domain, ipv6_reverse_domain, from_address as name_from_ip
)
from future.utils import raise_with_traceback
import tldextract
import validators
from .exceptions import (
InvalidHostError, InvalidHostnameError, InvalidIPv4Error, InvalidIPv6Error
)
from .compat import lru_cache
class Host(object):
"""A base class for host objects."""
def __lt__(self, other):
"""Check if self is less than the other.
This method is necessary for sorting and search algorithms
using bisect_right.
:param other: a value to be compared
:returns: result of comparison between value attributes of both
this object and the other, or of comparison between their
unicode string representations.
:raises TypeError: in case of the other not having either value
or to_unicode attributes
"""
try:
try:
result = self.value.__lt__(other.value)
except TypeError:
return self._compare_strings(other)
else:
if result == NotImplemented:
result = self._compare_strings(other)
return result
except AttributeError:
msg = 'Unorderable types: {}() < {}()'.format(
self.__class__.__name__,
other.__class__.__name__
)
raise TypeError(msg)
def _compare_strings(self, other):
return self.to_unicode() < other.to_unicode()
class Hostname(Host):
"""A class of objects representing hostname values.
The instances are used as values tested by clients of
hostname-listing services or as items stored by custom host list
objects.
"""
def __init__(self, value):
"""Initialize a new instance.
:param value: a string representing a hostname
:raises InvalidHostnameError: if value parameter is not
a valid domain
"""
value = str(value)
if not validators.domain(value):
msg = "'{}' is not a valid hostname".format(value)
raise_with_traceback(InvalidHostnameError(msg))
hostname = name.Name(value.split('.'))
self.value = hostname
self.relative_domain = hostname
def is_subdomain(self, other):
"""Test if the object is a subdomain of the other.
:param other: the object to which we compare this instance
:returns: True if this instance is a subdomain of the other
"""
compared = other.value if hasattr(other, 'value') else other
try:
return self.value.is_subdomain(compared)
except AttributeError:
return False
is_match = is_subdomain
def to_unicode(self):
"""Get a string value of the object.
:returns: the hostname as a unicode string
"""
return self.value.to_unicode()
class IPAddress(Host):
"""A class of objects representing IP address values.
The instances are used as values tested by clients of
IP-address-listing services or as items stored by custom host list
objects.
"""
reverse_domain = None
def __init__(self, value):
"""Initialize a new instance.
:param value: a valid ip address for this class
:raises self.invalid_ip_error_type: if the value is not
a valid ip address for this class
"""
try:
self.value = self.factory(value)
except ValueError:
msg_tpl = '{} is not a valid ip address for {}'
msg = msg_tpl.format(value, self.__class__)
raise_with_traceback(self.invalid_ip_error_type(msg))
@property
def relative_domain(self):
"""Get a relative domain name representing the ip address.
:returns: the reverse pointer relative to the common root
depending on the version of ip address represented by
this object
"""
return name_from_ip(str(self.value)).relativize(self.reverse_domain)
def is_subdomain(self, _):
# pylint: disable=no-self-use
"""Check if this object is a subdomain of the other.
:param other: another host
:returns: False, because ip address is not a domain
"""
return False
def to_unicode(self):
"""Get unicode string representing the object.
:returns: the ip value as unicode string
"""
return str(self.value)
def is_match(self, other):
"""Check if self matches the other.
:param other: the object to which this instance is compared
"""
return self == other
class IPv4Address(IPAddress):
"""A class of objects representing IPv4 addresses."""
factory = ipaddress.IPv4Address
reverse_domain = ipv4_reverse_domain
invalid_ip_error_type = InvalidIPv4Error
class IPv6Address(IPAddress):
"""A class of objects representing IPv6 addresses."""
factory = ipaddress.IPv6Address
reverse_domain = ipv6_reverse_domain
invalid_ip_error_type = InvalidIPv6Error
def cached(function):
return lru_cache()(function)
hostname = cached(Hostname)
ip_v4 = cached(IPv4Address)
ip_v6 = cached(IPv6Address)
@cached
def ip_address(value):
"""Create an IP address object.
:param value: a valid IP address
:returns: an instance of a subclass of .structures.IPAddress
:raises InvalidHostError: if the value is not a valid IPv4 or
IPv6 address
"""
factories = ip_v4, ip_v6
return create_host(factories, value)
def hostname_or_ip(value):
"""Create a hostname or an IP address object.
:param value: a valid host string
:returns: a host object for given value
:raises InvalidHostError: if the value is not a valid hostname or
IP address
"""
factories = ip_v4, ip_v6, hostname
return create_host(factories, value)
TLD_EXTRACTOR = tldextract.TLDExtract()
def registered_domain(value):
"""Create a Hostname instance representing a registered domain.
:param value: a valid host string
:returns: a Hostname instance representing a registered domain
extracted from the given value.
:raises InvalidHostnameError: if the value is not a valid hostname
"""
registered_domain_string = TLD_EXTRACTOR(value).registered_domain
return hostname(registered_domain_string)
def registered_domain_or_ip(value):
"""Get a host object for a registered domain or an ip address.
:param value: a valid hostname or ip string
:returns: a host object representing a registered domain extracted
from the given hostname, or an ip address
:raises InvalidHostError: if the value is not a valid host
"""
factories = ip_v4, ip_v6, registered_domain
return create_host(factories, value)
def non_ipv6_host(value):
"""Get a host object for a registered domain or an IPv4 address.
:param value: a valid hostname or IPv4 string
:returns: a host object representing a registered domain extracted
from the given hostname, or an IPv4 address
:raises InvalidHostError: if the value is not a valid hostname or
IPv4 address
"""
factories = ip_v4, registered_domain
return create_host(factories, value)
AddressListItem = namedtuple('AddressListItem', 'value source classification')
"""A container for data of an item listed by services or custom lists."""
|
piotr-rusin/spam-lists | spam_lists/structures.py | Hostname.is_subdomain | python | def is_subdomain(self, other):
compared = other.value if hasattr(other, 'value') else other
try:
return self.value.is_subdomain(compared)
except AttributeError:
return False | Test if the object is a subdomain of the other.
:param other: the object to which we compare this instance
:returns: True if this instance is a subdomain of the other | train | https://github.com/piotr-rusin/spam-lists/blob/fd616e8761b28f3eaa503fee5e45f7748e8f88f2/spam_lists/structures.py#L84-L94 | null | class Hostname(Host):
"""A class of objects representing hostname values.
The instances are used as values tested by clients of
hostname-listing services or as items stored by custom host list
objects.
"""
def __init__(self, value):
"""Initialize a new instance.
:param value: a string representing a hostname
:raises InvalidHostnameError: if value parameter is not
a valid domain
"""
value = str(value)
if not validators.domain(value):
msg = "'{}' is not a valid hostname".format(value)
raise_with_traceback(InvalidHostnameError(msg))
hostname = name.Name(value.split('.'))
self.value = hostname
self.relative_domain = hostname
is_match = is_subdomain
def to_unicode(self):
"""Get a string value of the object.
:returns: the hostname as a unicode string
"""
return self.value.to_unicode()
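`Hostname.is_subdomain` delegates to `dns.name.Name.is_subdomain`, which compares label sequences. A stdlib-only approximation (as in dnspython, a name counts as a subdomain of itself):

```python
def is_subdomain(child, parent):
    """True if child's label sequence ends with parent's labels."""
    child_labels = child.lower().rstrip(".").split(".")
    parent_labels = parent.lower().rstrip(".").split(".")
    return child_labels[-len(parent_labels):] == parent_labels
```

Note the comparison is label-wise, so `badexample.com` is correctly rejected as a subdomain of `example.com`.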
|
piotr-rusin/spam-lists | spam_lists/composites.py | RedirectURLResolver.get_locations | python | def get_locations(self, url):
if not is_valid_url(url):
raise InvalidURLError('{} is not a valid URL'.format(url))
try:
response = self.session.head(url)
except (ConnectionError, InvalidSchema, Timeout):
return
try:
generator = self.session.resolve_redirects(
response,
response.request
)
for response in generator:
yield response.url
except InvalidURL:
pass
except (ConnectionError, InvalidSchema, Timeout) as error:
last_url = response.headers['location']
if isinstance(error, Timeout) or is_valid_url(last_url):
yield last_url | Get valid location header values from responses.
:param url: a URL address. If a HEAD request sent to it
fails because the address has invalid schema, times out
or there is a connection error, the generator yields nothing.
:returns: valid redirection addresses. If a request for
a redirection address fails, and the address is still a valid
URL string, it's included as the last yielded value. If it's
not, the previous value is the last one.
:raises InvalidURLError: if the argument is not a valid URL
"def is_valid_url(value):\n \"\"\"Check if given value is a valid URL string.\n\n :param value: a value to test\n :returns: True if the value is valid\n \"\"\"\n match = URL_REGEX.match(value)\n host_str = urlparse(value).hostname\n return match and is_valid_host(host_str)\n"
] | class RedirectURLResolver(object):
"""Extracts URL addresses from responses and location headers.
Instances of this class can be used to acquire the following:
* URL addresses of all responses acquired for a HEAD request in its
response history
* value of location header for the last response, if it is a valid
URL but we still couldn't get a response for it
"""
def __init__(self, requests_session=Session()):
"""Initialize a new instance.
:param requests_session: a session object implementing
methods:
* head(url) (for HEAD request)
* resolve_redirects(response, request)
"""
self.session = requests_session
def get_new_locations(self, urls):
"""Get valid location header values for all given URLs.
The returned values are new, that is: they do not repeat any
value contained in the original input. Only unique values
are yielded.
:param urls: a list of URL addresses
:returns: valid location header values from responses
to the URLs
"""
seen = set(urls)
for i in urls:
for k in self.get_locations(i):
if k not in seen:
seen.add(k)
yield k
def get_urls_and_locations(self, urls):
"""Get URLs and their redirection addresses.
:param urls: a list of URL addresses
:returns: an instance of CachedIterable containing given URLs
and valid location header values of their responses
"""
location_generator = self.get_new_locations(urls)
initial_cache = list(set(urls))
return CachedIterable(location_generator, initial_cache)
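`get_locations` walks the redirect chain with `session.head` and `session.resolve_redirects`, yielding each location until the chain ends or a request fails. The traversal can be sketched without the network by using an in-memory redirect table standing in for HTTP responses (the table and URLs are made up for illustration):

```python
# Hypothetical redirect table: url -> value of its Location header.
REDIRECTS = {
    "http://a.example/": "http://b.example/",
    "http://b.example/": "http://c.example/",
}

def get_locations(url, table=REDIRECTS):
    # Follow redirections until a URL is not redirected further,
    # yielding each intermediate address; the seen-set guards
    # against redirect loops.
    seen = set()
    while url in table and url not in seen:
        seen.add(url)
        url = table[url]
        yield url

print(list(get_locations("http://a.example/")))
# ['http://b.example/', 'http://c.example/']
```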
|
piotr-rusin/spam-lists | spam_lists/composites.py | RedirectURLResolver.get_new_locations | python | def get_new_locations(self, urls):
seen = set(urls)
for i in urls:
for k in self.get_locations(i):
if k not in seen:
seen.add(k)
yield k | Get valid location header values for all given URLs.
The returned values are new, that is: they do not repeat any
value contained in the original input. Only unique values
are yielded.
:param urls: a list of URL addresses
:returns: valid location header values from responses
to the URLs | train | https://github.com/piotr-rusin/spam-lists/blob/fd616e8761b28f3eaa503fee5e45f7748e8f88f2/spam_lists/composites.py#L100-L116 | [
"def get_locations(self, url):\n \"\"\"Get valid location header values from responses.\n\n :param url: a URL address. If a HEAD request sent to it\n fails because the address has invalid schema, times out\n or there is a connection error, the generator yields nothing.\n :returns: valid redirection a... | class RedirectURLResolver(object):
"""Extracts URL addresses from responses and location headers.
Instances of this class can be used to acquire the following:
* URL addresses of all responses acquired for a HEAD request in its
response history
* value of location header for the last response, if it is a valid
URL but we still couldn't get a response for it
"""
def __init__(self, requests_session=Session()):
"""Initialize a new instance.
:param requests_session: a session object implementing
methods:
* head(url) (for HEAD request)
* resolve_redirects(response, request)
"""
self.session = requests_session
def get_locations(self, url):
"""Get valid location header values from responses.
:param url: a URL address. If a HEAD request sent to it
fails because the address has invalid schema, times out
or there is a connection error, the generator yields nothing.
:returns: valid redirection addresses. If a request for
a redirection address fails, and the address is still a valid
URL string, it's included as the last yielded value. If it's
not, the previous value is the last one.
:raises InvalidURLError: if the argument is not a valid URL
"""
if not is_valid_url(url):
raise InvalidURLError('{} is not a valid URL'.format(url))
try:
response = self.session.head(url)
except (ConnectionError, InvalidSchema, Timeout):
return
try:
generator = self.session.resolve_redirects(
response,
response.request
)
for response in generator:
yield response.url
except InvalidURL:
pass
except (ConnectionError, InvalidSchema, Timeout) as error:
last_url = response.headers['location']
if isinstance(error, Timeout) or is_valid_url(last_url):
yield last_url
def get_urls_and_locations(self, urls):
"""Get URLs and their redirection addresses.
:param urls: a list of URL addresses
:returns: an instance of CachedIterable containing given URLs
and valid location header values of their responses
"""
location_generator = self.get_new_locations(urls)
initial_cache = list(set(urls))
return CachedIterable(location_generator, initial_cache)
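The seen-set pattern in `get_new_locations` guarantees that each yielded location is unique and not already among the input URLs. A minimal stand-alone version of that pattern (the redirect table and the `get_locations` callable are hypothetical stand-ins):

```python
def get_new_locations(urls, get_locations):
    # Yield only locations that are neither in the input URLs
    # nor already yielded.
    seen = set(urls)
    for url in urls:
        for location in get_locations(url):
            if location not in seen:
                seen.add(location)
                yield location

table = {
    "http://a.example/": ["http://b.example/", "http://c.example/"],
    "http://b.example/": ["http://c.example/"],
}
new = list(get_new_locations(list(table), lambda u: table.get(u, [])))
print(new)  # ['http://c.example/'] -- b.example is already an input
```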
|
piotr-rusin/spam-lists | spam_lists/composites.py | RedirectURLResolver.get_urls_and_locations | python | def get_urls_and_locations(self, urls):
location_generator = self.get_new_locations(urls)
initial_cache = list(set(urls))
return CachedIterable(location_generator, initial_cache) | Get URLs and their redirection addresses.
:param urls: a list of URL addresses
:returns: an instance of CachedIterable containing given URLs
and valid location header values of their responses | train | https://github.com/piotr-rusin/spam-lists/blob/fd616e8761b28f3eaa503fee5e45f7748e8f88f2/spam_lists/composites.py#L118-L127 | [
"def get_new_locations(self, urls):\n \"\"\"Get valid location header values for all given URLs.\n\n The returned values are new, that is: they do not repeat any\n value contained in the original input. Only unique values\n are yielded.\n\n :param urls: a list of URL addresses\n :returns: valid lo... | class RedirectURLResolver(object):
"""Extracts URL addresses from responses and location headers.
Instances of this class can be used to acquire the following:
* URL addresses of all responses acquired for a HEAD request in its
response history
* value of location header for the last response, if it is a valid
URL but we still couldn't get a response for it
"""
def __init__(self, requests_session=Session()):
"""Initialize a new instance.
:param requests_session: a session object implementing
methods:
* head(url) (for HEAD request)
* resolve_redirects(response, request)
"""
self.session = requests_session
def get_locations(self, url):
"""Get valid location header values from responses.
:param url: a URL address. If a HEAD request sent to it
fails because the address has invalid schema, times out
or there is a connection error, the generator yields nothing.
:returns: valid redirection addresses. If a request for
a redirection address fails, and the address is still a valid
URL string, it's included as the last yielded value. If it's
not, the previous value is the last one.
:raises InvalidURLError: if the argument is not a valid URL
"""
if not is_valid_url(url):
raise InvalidURLError('{} is not a valid URL'.format(url))
try:
response = self.session.head(url)
except (ConnectionError, InvalidSchema, Timeout):
return
try:
generator = self.session.resolve_redirects(
response,
response.request
)
for response in generator:
yield response.url
except InvalidURL:
pass
except (ConnectionError, InvalidSchema, Timeout) as error:
last_url = response.headers['location']
if isinstance(error, Timeout) or is_valid_url(last_url):
yield last_url
def get_new_locations(self, urls):
"""Get valid location header values for all given URLs.
The returned values are new, that is: they do not repeat any
value contained in the original input. Only unique values
are yielded.
:param urls: a list of URL addresses
:returns: valid location header values from responses
to the URLs
"""
seen = set(urls)
for i in urls:
for k in self.get_locations(i):
if k not in seen:
seen.add(k)
yield k
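`get_urls_and_locations` wraps the generator in a `CachedIterable` seeded with the deduplicated input URLs, so the result can be iterated repeatedly while the redirection lookups run only once. A minimal sketch of what such a cached iterable might look like (an assumption, not the library's actual implementation):

```python
class CachedIterable:
    """Iterate over a cache first, then drain the wrapped iterator,
    appending each new item to the cache for later iterations."""

    def __init__(self, iterator, initial_cache=None):
        self._iterator = iterator
        self._cache = list(initial_cache or [])

    def __iter__(self):
        for item in self._cache:
            yield item
        for item in self._iterator:
            self._cache.append(item)
            yield item

fetches = []
def locations():
    for url in ("http://b.example/",):
        fetches.append(url)  # record that the generator actually ran
        yield url

result = CachedIterable(locations(), ["http://a.example/"])
print(list(result))  # ['http://a.example/', 'http://b.example/']
print(list(result))  # same again -- served from the cache
print(len(fetches))  # the generator was consumed only once
```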
|
piotr-rusin/spam-lists | spam_lists/host_collections.py | BaseHostCollection.add | python | def add(self, host_value):
host_obj = self._host_factory(host_value)
if self._get_match(host_obj) is not None:
return
self._add_new(host_obj) | Add the given value to the collection.
:param host_value: an IP address or a hostname
:raises InvalidHostError: raised when the given value
is not a valid IP address or hostname
"""Base class for containers storing ip addresses and domain names."""
def __init__(
self,
identifier,
classification,
hosts=None,
host_factory=hostname_or_ip
):
"""Initialize a new instance.
:param identifier: an identifier of this instance of host collection
:param classification: a list or tuple containing strings representing
types of items, assigned to each element of the collection
:param hosts: an object storing IP addresses and hostnames.
It must have the following methods:
* __getitem__
* __len__
* pop
* append
:param host_factory: a callable used to create hosts objects stored
in the collection or representing values searched in it.
"""
self.identifier = identifier
self.classification = set(classification)
self.hosts = hosts if hosts is not None else []
super(BaseHostCollection, self).__init__(host_factory)
def __len__(self):
"""Get the number of elements in the collection."""
return len(self.hosts)
def __getitem__(self, index):
"""Get an element of the collection with given index."""
if isinstance(index, slice):
return self.__class__(
self.identifier,
self.classification,
self.hosts[index],
self._host_factory
)
return self._host_factory(self.hosts[index])
def _contains(self, host_object):
match = self._get_match(host_object)
return match is not None
def _get_match_and_classification(self, host_object):
match = self._get_match(host_object)
_class = None if match is None else self.classification
return match, _class
def _add_new(self, host_object):
"""Add a new host to the collection.
A new host is defined as a value not currently listed
(in case of both hostnames and IP addresses) or not currently
covered by another value, for example: a hostname whose parent
domain is not yet listed.
Before a new hostname can be added, all its subdomains already
present in the collection must be removed.
:param host_obj: an object representing value to be added.
It is assumed that, during execution of this method,
the value to be added is not currently listed.
"""
raise NotImplementedError
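`add` only delegates to `_add_new` when `_get_match` finds nothing, so values already covered by an existing entry are silently skipped. A toy, unsorted stand-in illustrating that flow (the matching rule here is a simplified assumption, not the library's host comparison):

```python
class ToyHostCollection:
    def __init__(self):
        self.hosts = []

    def _get_match(self, value):
        # A value matches if it is listed or if a parent domain is listed.
        for listed in self.hosts:
            if value == listed or value.endswith("." + listed):
                return listed
        return None

    def add(self, value):
        if self._get_match(value) is not None:
            return  # already covered, nothing to do
        self.hosts.append(value)

c = ToyHostCollection()
c.add("example.com")
c.add("mail.example.com")  # covered by example.com, not added
print(c.hosts)  # ['example.com']
```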
|
piotr-rusin/spam-lists | spam_lists/host_collections.py | SortedHostCollection._get_match | python | def _get_match(self, host_object):
i = self._get_insertion_point(host_object)
potential_match = None
try:
potential_match = self[i-1]
except IndexError:
pass
if host_object.is_match(potential_match):
return potential_match
return None | Get an item matching the given host object.
The item may be either a parent domain or identical value.
Parent domains and existing identical values always precede
insertion point for given value - therefore, we treat
an item just before insertion point as potential match.
:param host_object: an object representing ip address
or hostname whose match we are trying to find | train | https://github.com/piotr-rusin/spam-lists/blob/fd616e8761b28f3eaa503fee5e45f7748e8f88f2/spam_lists/host_collections.py#L118-L138 | null | class SortedHostCollection(BaseHostCollection):
"""Represents a custom sorted collection of hosts."""
def _get_insertion_point(self, host_obj):
return bisect_right(self, host_obj)
def _add_new(self, host_object):
"""Add a new host to the collection.
Before a new hostname can be added, all its subdomains already
present in the collection must be removed. Since the collection
is sorted, we can limit our search for them to a slice of
the collection starting from insertion point and ending with
the last detected subdomain.
:param host_obj: an object representing value to be added.
It is assumed that, during execution of this method,
the value to be added is not currently listed.
"""
i = self._get_insertion_point(host_object)
for listed in self[i:]:
if not listed.is_subdomain(host_object):
break
self.hosts.pop(i)
self.hosts.insert(i, host_object.to_unicode())
|
piotr-rusin/spam-lists | spam_lists/host_collections.py | SortedHostCollection._add_new | python | def _add_new(self, host_object):
i = self._get_insertion_point(host_object)
for listed in self[i:]:
if not listed.is_subdomain(host_object):
break
self.hosts.pop(i)
self.hosts.insert(i, host_object.to_unicode()) | Add a new host to the collection.
Before a new hostname can be added, all its subdomains already
present in the collection must be removed. Since the collection
is sorted, we can limit our search for them to a slice of
the collection starting from insertion point and ending with
the last detected subdomain.
:param host_obj: an object representing value to be added.
It is assumed that, during execution of this method,
the value to be added is not currently listed. | train | https://github.com/piotr-rusin/spam-lists/blob/fd616e8761b28f3eaa503fee5e45f7748e8f88f2/spam_lists/host_collections.py#L140-L160 | null | class SortedHostCollection(BaseHostCollection):
"""Represents a custom sorted collection of hosts."""
def _get_insertion_point(self, host_obj):
return bisect_right(self, host_obj)
def _get_match(self, host_object):
"""Get an item matching the given host object.
The item may be either a parent domain or identical value.
Parent domains and existing identical values always precede
insertion point for given value - therefore, we treat
an item just before insertion point as potential match.
:param host_object: an object representing ip address
or hostname whose match we are trying to find
"""
i = self._get_insertion_point(host_object)
potential_match = None
try:
potential_match = self[i-1]
except IndexError:
pass
if host_object.is_match(potential_match):
return potential_match
return None
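Because parent domains sort immediately before their subdomains when hostnames are compared as reversed label tuples, the item just left of `bisect_right`'s insertion point is the only possible match, and any listed subdomains of a new value form a contiguous run starting at that point. A self-contained sketch of both operations (the label-tuple encoding is an assumption standing in for the library's host objects):

```python
from bisect import bisect_right

def labels(host):
    # "mail.example.com" -> ("com", "example", "mail")
    return tuple(reversed(host.split(".")))

def get_match(sorted_labels, host):
    key = labels(host)
    i = bisect_right(sorted_labels, key)
    if i:
        candidate = sorted_labels[i - 1]
        if key[:len(candidate)] == candidate:  # equal or a parent domain
            return candidate
    return None

def add_new(sorted_labels, host):
    key = labels(host)
    i = bisect_right(sorted_labels, key)
    # Drop already-listed subdomains of the new value...
    while i < len(sorted_labels) and sorted_labels[i][:len(key)] == key:
        sorted_labels.pop(i)
    # ...then insert the new value itself.
    sorted_labels.insert(i, key)

hosts = []
add_new(hosts, "mail.example.com")
add_new(hosts, "example.com")  # prunes mail.example.com
print(get_match(hosts, "smtp.example.com"))  # ('com', 'example')
print(hosts)                                 # [('com', 'example')]
```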
|
piotr-rusin/spam-lists | spam_lists/host_list.py | HostList.lookup | python | def lookup(self, host_value):
try:
host_object = self._host_factory(host_value)
except InvalidHostError:
return None
result = self._get_match_and_classification(
host_object
)
host_item, classification = result
if host_item is not None:
return AddressListItem(
host_item.to_unicode(),
self,
classification
)
return None | Get a host value matching the given value.
:param host_value: a value of the host of a type that can be
listed by the service
:returns: an instance of AddressListItem representing
a matched value
:raises InvalidHostError: if the argument is not a valid
host string | train | https://github.com/piotr-rusin/spam-lists/blob/fd616e8761b28f3eaa503fee5e45f7748e8f88f2/spam_lists/host_list.py#L66-L90 | [
"def _get_match_and_classification(self, host_value):\n \"\"\"Get value and data stored for the given value.\n\n :param host_value: a host value\n :returns: a tuple containing listed item and its classification\n as a tuple containing all classification groups to which\n the item belongs\n \"\"\"\... | class HostList(object):
"""A base class for objects representing host lists.
Objects representing host lists are defined as custom host
whitelists and blacklists or clients of online host blacklists.
"""
def __init__(self, host_factory):
"""Initialize a new instance.
:param host_factory: a function responsible for creating valid
host objects. It may raise InvalidHostError (or its subclasses)
if a value passed to it is not a valid host of type accepted by
the factory.
"""
self._host_factory = host_factory
def _contains(self, host_value):
"""Check if host list contains a match for the given value.
:param host_value: a host value
:returns: True if the service lists a matching value
"""
raise NotImplementedError
def _get_match_and_classification(self, host_value):
"""Get value and data stored for the given value.
:param host_value: a host value
:returns: a tuple containing listed item and its classification
as a tuple containing all classification groups to which
the item belongs
"""
raise NotImplementedError
@accepts_valid_host
def __contains__(self, host_value):
"""Check if the given host value is listed by the host list.
:param host_value: a string representing a valid host
:returns: True if the host is listed
:raises InvalidHostError: if the argument is not a valid
host string
"""
try:
host_object = self._host_factory(host_value)
except InvalidHostError:
return False
return self._contains(host_object)
@accepts_valid_urls
def any_match(self, urls):
"""Check if any of the given URLs has a matching host.
:param urls: an iterable containing URLs
:returns: True if any host has a listed match
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
return any(urlparse(u).hostname in self for u in urls)
@accepts_valid_urls
def lookup_matching(self, urls):
"""Get matching hosts for the given URLs.
:param urls: an iterable containing URLs
:returns: instances of AddressListItem representing listed
hosts matching the ones used by the given URLs
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
hosts = (urlparse(u).hostname for u in urls)
for val in hosts:
item = self.lookup(val)
if item is not None:
yield item
@accepts_valid_urls
def filter_matching(self, urls):
"""Get URLs with hosts matching any listed ones.
:param urls: an iterable containing URLs to filter
:returns: a generator yielding matching URLs
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
for url in urls:
if urlparse(url).hostname in self:
yield url
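`lookup` turns a raw host string into a host object, swallows `InvalidHostError` as a miss, and packages a hit as an `AddressListItem`. A dict-backed sketch of that flow (the listed data and the factory's validity rule are made up for illustration):

```python
from collections import namedtuple

AddressListItem = namedtuple("AddressListItem", "value source classification")

class InvalidHostError(ValueError):
    pass

LISTED = {"spam.example": {"phishing"}}  # hypothetical listing

def host_factory(value):
    # Crude validity check standing in for the real host factories.
    if not value or " " in value:
        raise InvalidHostError(value)
    return value

def lookup(host_value, source="toy-list"):
    try:
        host = host_factory(host_value)
    except InvalidHostError:
        return None
    classification = LISTED.get(host)
    if classification is None:
        return None
    return AddressListItem(host, source, classification)

print(lookup("spam.example"))  # AddressListItem(value='spam.example', ...)
print(lookup("ok.example"))    # None
```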
|
piotr-rusin/spam-lists | spam_lists/host_list.py | HostList.any_match | python | def any_match(self, urls):
return any(urlparse(u).hostname in self for u in urls) | Check if any of the given URLs has a matching host.
:param urls: an iterable containing URLs
:returns: True if any host has a listed match
:raises InvalidURLError: if there are any invalid URLs in
the sequence | train | https://github.com/piotr-rusin/spam-lists/blob/fd616e8761b28f3eaa503fee5e45f7748e8f88f2/spam_lists/host_list.py#L93-L101 | null | class HostList(object):
"""A base class for objects representing host lists.
Objects representing host lists are defined as custom host
whitelists and blacklists or clients of online host blacklists.
"""
def __init__(self, host_factory):
"""Initialize a new instance.
:param host_factory: a function responsible for creating valid
host objects. It may raise InvalidHostError (or its subclasses)
if a value passed to it is not a valid host of type accepted by
the factory.
"""
self._host_factory = host_factory
def _contains(self, host_value):
"""Check if host list contains a match for the given value.
:param host_value: a host value
:returns: True if the service lists a matching value
"""
raise NotImplementedError
def _get_match_and_classification(self, host_value):
"""Get value and data stored for the given value.
:param host_value: a host value
:returns: a tuple containing listed item and its classification
as a tuple containing all classification groups to which
the item belongs
"""
raise NotImplementedError
@accepts_valid_host
def __contains__(self, host_value):
"""Check if the given host value is listed by the host list.
:param host_value: a string representing a valid host
:returns: True if the host is listed
:raises InvalidHostError: if the argument is not a valid
host string
"""
try:
host_object = self._host_factory(host_value)
except InvalidHostError:
return False
return self._contains(host_object)
@accepts_valid_host
def lookup(self, host_value):
"""Get a host value matching the given value.
:param host_value: a value of the host of a type that can be
listed by the service
:returns: an instance of AddressListItem representing
a matched value
:raises InvalidHostError: if the argument is not a valid
host string
"""
try:
host_object = self._host_factory(host_value)
except InvalidHostError:
return None
result = self._get_match_and_classification(
host_object
)
host_item, classification = result
if host_item is not None:
return AddressListItem(
host_item.to_unicode(),
self,
classification
)
return None
@accepts_valid_urls
def lookup_matching(self, urls):
"""Get matching hosts for the given URLs.
:param urls: an iterable containing URLs
:returns: instances of AddressListItem representing listed
hosts matching the ones used by the given URLs
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
hosts = (urlparse(u).hostname for u in urls)
for val in hosts:
item = self.lookup(val)
if item is not None:
yield item
@accepts_valid_urls
def filter_matching(self, urls):
"""Get URLs with hosts matching any listed ones.
:param urls: an iterable containing URLs to filter
:returns: a generator yielding matching URLs
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
for url in urls:
if urlparse(url).hostname in self:
yield url
|
piotr-rusin/spam-lists | spam_lists/host_list.py | HostList.lookup_matching | python | def lookup_matching(self, urls):
hosts = (urlparse(u).hostname for u in urls)
for val in hosts:
item = self.lookup(val)
if item is not None:
yield item | Get matching hosts for the given URLs.
:param urls: an iterable containing URLs
:returns: instances of AddressListItem representing listed
hosts matching the ones used by the given URLs
:raises InvalidURLError: if there are any invalid URLs in
the sequence | train | https://github.com/piotr-rusin/spam-lists/blob/fd616e8761b28f3eaa503fee5e45f7748e8f88f2/spam_lists/host_list.py#L104-L117 | null | class HostList(object):
"""A base class for objects representing host lists.
Objects representing host lists are defined as custom host
whitelists and blacklists or clients of online host blacklists.
"""
def __init__(self, host_factory):
"""Initialize a new instance.
:param host_factory: a function responsible for creating valid
host objects. It may raise InvalidHostError (or its subclasses)
if a value passed to it is not a valid host of type accepted by
the factory.
"""
self._host_factory = host_factory
def _contains(self, host_value):
"""Check if host list contains a match for the given value.
:param host_value: a host value
:returns: True if the service lists a matching value
"""
raise NotImplementedError
def _get_match_and_classification(self, host_value):
"""Get value and data stored for the given value.
:param host_value: a host value
:returns: a tuple containing listed item and its classification
as a tuple containing all classification groups to which
the item belongs
"""
raise NotImplementedError
@accepts_valid_host
def __contains__(self, host_value):
"""Check if the given host value is listed by the host list.
:param host_value: a string representing a valid host
:returns: True if the host is listed
:raises InvalidHostError: if the argument is not a valid
host string
"""
try:
host_object = self._host_factory(host_value)
except InvalidHostError:
return False
return self._contains(host_object)
@accepts_valid_host
def lookup(self, host_value):
"""Get a host value matching the given value.
:param host_value: a value of the host of a type that can be
listed by the service
:returns: an instance of AddressListItem representing
a matched value
:raises InvalidHostError: if the argument is not a valid
host string
"""
try:
host_object = self._host_factory(host_value)
except InvalidHostError:
return None
result = self._get_match_and_classification(
host_object
)
host_item, classification = result
if host_item is not None:
return AddressListItem(
host_item.to_unicode(),
self,
classification
)
return None
@accepts_valid_urls
def any_match(self, urls):
"""Check if any of the given URLs has a matching host.
:param urls: an iterable containing URLs
:returns: True if any host has a listed match
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
return any(urlparse(u).hostname in self for u in urls)
@accepts_valid_urls
def filter_matching(self, urls):
"""Get URLs with hosts matching any listed ones.
:param urls: an iterable containing URLs to filter
:returns: a generator yielding matching URLs
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
for url in urls:
if urlparse(url).hostname in self:
yield url
|
piotr-rusin/spam-lists | spam_lists/host_list.py | HostList.filter_matching | python | def filter_matching(self, urls):
for url in urls:
if urlparse(url).hostname in self:
yield url | Get URLs with hosts matching any listed ones.
:param urls: an iterable containing URLs to filter
:returns: a generator yielding matching URLs
:raises InvalidURLError: if there are any invalid URLs in
the sequence | train | https://github.com/piotr-rusin/spam-lists/blob/fd616e8761b28f3eaa503fee5e45f7748e8f88f2/spam_lists/host_list.py#L120-L130 | null | class HostList(object):
"""A base class for objects representing host lists.
Objects representing host lists are defined as custom host
whitelists and blacklists or clients of online host blacklists.
"""
def __init__(self, host_factory):
"""Initialize a new instance.
:param host_factory: a function responsible for creating valid
host objects. It may raise InvalidHostError (or its subclasses)
if a value passed to it is not a valid host of type accepted by
the factory.
"""
self._host_factory = host_factory
def _contains(self, host_value):
"""Check if host list contains a match for the given value.
:param host_value: a host value
:returns: True if the service lists a matching value
"""
raise NotImplementedError
def _get_match_and_classification(self, host_value):
"""Get value and data stored for the given value.
:param host_value: a host value
:returns: a tuple containing listed item and its classification
as a tuple containing all classification groups to which
the item belongs
"""
raise NotImplementedError
@accepts_valid_host
def __contains__(self, host_value):
"""Check if the given host value is listed by the host list.
:param host_value: a string representing a valid host
:returns: True if the host is listed
:raises InvalidHostError: if the argument is not a valid
host string
"""
try:
host_object = self._host_factory(host_value)
except InvalidHostError:
return False
return self._contains(host_object)
@accepts_valid_host
def lookup(self, host_value):
"""Get a host value matching the given value.
:param host_value: a value of the host of a type that can be
listed by the service
:returns: an instance of AddressListItem representing
a matched value
:raises InvalidHostError: if the argument is not a valid
host string
"""
try:
host_object = self._host_factory(host_value)
except InvalidHostError:
return None
result = self._get_match_and_classification(
host_object
)
host_item, classification = result
if host_item is not None:
return AddressListItem(
host_item.to_unicode(),
self,
classification
)
return None
@accepts_valid_urls
def any_match(self, urls):
"""Check if any of the given URLs has a matching host.
:param urls: an iterable containing URLs
:returns: True if any host has a listed match
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
return any(urlparse(u).hostname in self for u in urls)
@accepts_valid_urls
def lookup_matching(self, urls):
"""Get matching hosts for the given URLs.
:param urls: an iterable containing URLs
:returns: instances of AddressListItem representing listed
hosts matching the ones used by the given URLs
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
hosts = (urlparse(u).hostname for u in urls)
for val in hosts:
item = self.lookup(val)
if item is not None:
yield item
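`filter_matching` relies on `urlparse(url).hostname` to extract the host for the membership test. A runnable sketch with the listing reduced to a plain set (the URLs are illustrative):

```python
from urllib.parse import urlparse

listed = {"spam.example"}

def filter_matching(urls, listed_hosts):
    # Yield only URLs whose hostname is on the list; the real class
    # delegates the membership test to __contains__ instead of a set.
    for url in urls:
        if urlparse(url).hostname in listed_hosts:
            yield url

urls = ["http://spam.example/a", "https://ok.example/b"]
print(list(filter_matching(urls, listed)))  # ['http://spam.example/a']
```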
|
piotr-rusin/spam-lists | spam_lists/clients.py | get_powers_of_2 | python | def get_powers_of_2(_sum):
return [2**y for y, x in enumerate(bin(_sum)[:1:-1]) if int(x)] | Get powers of 2 that sum up to the given number.
This function transforms the given integer into a binary string.
The string is reversed and stripped of its '0b' prefix, and each
of the remaining binary digits is enumerated.
Each digit is tested for not being 0. If the test passes, the index
associated with the digit is used as an exponent to get the next
value in the sequence to be returned.
:param _sum: a sum of all elements of the sequence to be returned
:returns: a list of powers of two whose sum is given | train | https://github.com/piotr-rusin/spam-lists/blob/fd616e8761b28f3eaa503fee5e45f7748e8f88f2/spam_lists/clients.py#L101-L115 | null | # -*- coding: utf-8 -*-
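The slice trick above (`bin(_sum)[:1:-1]`) reverses the binary string while dropping the `0b` prefix, so each character's index is the exponent of the corresponding bit. The same decomposition written with explicit bit tests:

```python
def powers_of_2(total):
    # Collect each set bit of `total` as the power of two it represents.
    powers = []
    bit = 1
    while bit <= total:
        if total & bit:
            powers.append(bit)
        bit <<= 1
    return powers

print(powers_of_2(6))   # [2, 4]
print(powers_of_2(35))  # [1, 2, 32]
```

This is the decoding step used by `BitmaskingDNSBL`, where a DNSBL return code is a bit vector and each set bit selects one classification group.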
"""Clients of online blacklist services.
This module contains classes of clients of online services recognizing
hostnames, IP addresses or URLs as malicious.
It also contains instances of those of the classes for which no
user-specific information (like application identifiers or API codes)
is required.
"""
from __future__ import unicode_literals
# pylint: disable=redefined-builtin
from builtins import zip, str, range, object
from dns import name
from dns.resolver import NXDOMAIN, query
from future.utils import raise_from
from requests import get, post
from requests.exceptions import HTTPError
from .exceptions import UnathorizedAPIKeyError, UnknownCodeError
from .host_list import HostList
from .structures import (
AddressListItem, non_ipv6_host, ip_address, registered_domain,
registered_domain_or_ip
)
from .validation import accepts_valid_urls
class DNSBL(HostList):
"""Represents a DNSBL service client."""
def __init__(
self,
identifier,
query_suffix,
classification_map,
host_factory
):
"""Initialize a new DNSBL object.
:param identifier: a value designating DNSBL service provider:
its name or url address.
:param query_suffix: a suffix added to DNSBL query address
:param classification_map: item classes associated with
DNSBL query return codes
:param host_factory: a callable object that returns an object
representing host and providing method for getting a relative
domain pertaining to it.
"""
self._identifier = identifier
self._query_suffix = name.from_text(query_suffix)
self._classification_map = classification_map
self._host_factory = host_factory
super(DNSBL, self).__init__(host_factory)
def _query(self, host_object):
"""Query the DNSBL service for given value.
:param host_object: an object representing host, created by
self._host_factory
:returns: an instance of dns.resolver.Answer for given value if
it is listed. Otherwise, it returns None.
"""
host_to_query = host_object.relative_domain
query_name = host_to_query.derelativize(self._query_suffix)
try:
return query(query_name)
except NXDOMAIN:
return None
def __str__(self):
"""Convert the client to a string."""
return str(self._identifier)
def _contains(self, host_object):
return bool(self._query(host_object))
def _get_entry_classification(self, code):
return [self._classification_map[code]]
def _get_match_and_classification(self, host_object):
answers = self._query(host_object)
if answers is None:
return None, None
try:
classification = set()
for answer in answers:
last_octet = answer.to_text().split('.')[-1]
classes = self._get_entry_classification(int(last_octet))
classification.update(classes)
return host_object, classification
except KeyError as ex:
msg_tpl = "The code '{}' has no corresponding classification value"
msg = msg_tpl.format(ex.args[0])
raise_from(UnknownCodeError(msg), ex)
class BitmaskingDNSBL(DNSBL):
"""A class of clients of DNSBL services using bitmasking.
This class represents clients of DNSBL services mapping listed
items to numbers representing bit vectors whose bit values
mark membership of an item in a sublist or a taxonomic group.
"""
def _get_entry_classification(self, code):
codes = get_powers_of_2(code)
return [cl for c in codes for cl
in DNSBL._get_entry_classification(self, c)]
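BitmaskingDNSBL relies on a `get_powers_of_2` helper imported elsewhere in the package; its implementation is not shown in this file. The sketch below is an assumption about what such a decomposition could look like, based solely on how `_get_entry_classification` uses it:

```python
# Hypothetical sketch of the bitmask decomposition used by BitmaskingDNSBL.
# The real get_powers_of_2 lives elsewhere in spam-lists and may differ.

def get_powers_of_2(value):
    """Return the powers of two set in the binary representation of value."""
    return [2 ** i for i in range(value.bit_length()) if value & (2 ** i)]
```

For SURBL's multi list, a return code of 24 would then decompose into [8, 16], i.e. the phishing and malware classes.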
class HpHosts(HostList):
"""A class of clients of hpHosts service."""
identifier = ' http://www.hosts-file.net/'
_NOT_LISTED = 'Not Listed'
def __init__(self, client_name):
"""Initialize a new instance.
:param client_name: name of client using the service
"""
self.app_id = client_name
super(HpHosts, self).__init__(non_ipv6_host)
def _query(self, host_object, classification=False):
"""Query the client for data of given host.
:param host_object: an object representing a host value
:param classification: if True: hpHosts is queried also
for classification for given host, if listed
:returns: content of response to GET request to hpHosts
for data on the given host
"""
template = 'http://verify.hosts-file.net/?v={}&s={}'
url = template.format(self.app_id, host_object.to_unicode())
url = url + '&class=true' if classification else url
return get(url).text
def _contains(self, host_object):
return self._NOT_LISTED not in self._query(host_object)
def _get_match_and_classification(self, host_object):
data = self._query(host_object, True)
if self._NOT_LISTED in data:
return None, None
elements = data.split(',')
classification = set(elements[1:])
return host_object, classification
class GoogleSafeBrowsing(object):
"""A class of clients of Google Safe Browsing Lookup API."""
protocol_version = '3.1'
max_urls_per_request = 500
def __init__(self, client_name, app_version, api_key):
"""Initialize a new instance.
:param client_name: name of an application using the API
:param app_version: version of the application
:param api_key: API key given by Google:
https://developers.google.com/safe-browsing/key_signup
"""
self.api_key = api_key
self.client_name = client_name
self.app_version = app_version
self._request_address_val = ''
@property
def _request_address(self):
"""Get address of a POST request to the service."""
if not self._request_address_val:
template = (
'https://sb-ssl.google.com/safebrowsing/api/lookup'
'?client={0}&key={1}&appver={2}&pver={3}'
)
self._request_address_val = template.format(
self.client_name,
self.api_key,
self.app_version,
self.protocol_version
)
return self._request_address_val
def _query_once(self, urls):
"""Perform a single POST request using lookup API.
:param urls: a sequence of URLs to put in request body
:returns: a response object
:raises UnathorizedAPIKeyError: when the API key for this
instance is not valid
:raises HTTPError: if the HTTPError was raised for a HTTP code
other than 401, the exception is reraised
"""
request_body = '{}\n{}'.format(len(urls), '\n'.join(urls))
response = post(self._request_address, request_body)
try:
response.raise_for_status()
except HTTPError as error:
if response.status_code == 401:
msg = 'The API key is not authorized'
raise_from(UnathorizedAPIKeyError(msg), error)
else:
raise
return response
def _query(self, urls):
"""Test URLs for being listed by the service.
:param urls: a sequence of URLs to be tested
:returns: a tuple containing a chunk of URLs and the response
pertaining to them, yielded when the response code was 200, which
means at least one of the queried URLs is matched in either
the phishing, malware, or unwanted software lists.
"""
urls = list(set(urls))
for i in range(0, len(urls), self.max_urls_per_request):
chunk = urls[i:i+self.max_urls_per_request]
response = self._query_once(chunk)
if response.status_code == 200:
yield chunk, response
@accepts_valid_urls
def any_match(self, urls):
"""Check if the service recognizes any of given URLs as spam.
:param urls: a sequence of URLs to be tested
:returns: True if any of the URLs was recognized as spam
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
return any(self._query(urls))
def _get_match_and_classification(self, urls):
"""Get classification for all matching URLs.
:param urls: a sequence of URLs to test
:return: a tuple containing matching URL and classification
string pertaining to it
"""
for url_list, response in self._query(urls):
classification_set = response.text.splitlines()
for url, _class in zip(url_list, classification_set):
if _class != 'ok':
yield url, _class
@accepts_valid_urls
def lookup_matching(self, urls):
"""Get items for all listed URLs.
:param urls: a sequence of URLs to be tested
:returns: objects representing listed URLs
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
for url, _class in self._get_match_and_classification(urls):
classification = set(_class.split(','))
yield AddressListItem(url, self, classification)
@accepts_valid_urls
def filter_matching(self, urls):
"""Get all listed URLs.
:param urls: a sequence of URLs to be tested
:returns: spam URLs
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
for url, _ in self._get_match_and_classification(urls):
yield url
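`GoogleSafeBrowsing._query` deduplicates the input URLs and slices them into batches of at most `max_urls_per_request` items before POSTing. The helper below is purely illustrative and not part of the library; it only mirrors that slicing step:

```python
# Sketch of the deduplicate-and-slice behaviour of GoogleSafeBrowsing._query.
# chunked() is illustrative and not part of the library.

def chunked(items, size):
    """Split items into consecutive lists of at most `size` elements."""
    items = list(items)
    return [items[i:i + size] for i in range(0, len(items), size)]
```

With `max_urls_per_request = 500`, a batch of 1001 unique URLs would therefore result in three POST requests.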
SPAMHAUS_XBL_CLASSIFICATION = (
'CBL (3rd party exploits such as proxies, trojans, etc.)'
)
SPAMHAUS_PBL_CLASSIFICATION = (
'End-user Non-MTA IP addresses set by ISP outbound mail policy'
)
SPAMHAUS_ZEN_CLASSIFICATION = {
2: 'Direct UBE sources, spam operations & spam services',
3: 'Direct snowshoe spam sources detected via automation',
4: SPAMHAUS_XBL_CLASSIFICATION,
5: SPAMHAUS_XBL_CLASSIFICATION,
6: SPAMHAUS_XBL_CLASSIFICATION,
7: SPAMHAUS_XBL_CLASSIFICATION,
10: SPAMHAUS_PBL_CLASSIFICATION,
11: SPAMHAUS_PBL_CLASSIFICATION
}
SPAMHAUS_ZEN = DNSBL(
'spamhaus_zen',
'zen.spamhaus.org',
SPAMHAUS_ZEN_CLASSIFICATION,
ip_address
)
SPAMHAUS_DBL_CLASSIFICATION = {
2: 'spam domain',
4: 'phishing domain',
5: 'malware domain',
6: 'botnet C&C domain',
102: 'abused legit spam',
103: 'abused spammed redirector domain',
104: 'abused legit phishing',
105: 'abused legit malware',
106: 'abused legit botnet C&C',
}
SPAMHAUS_DBL = DNSBL(
'spamhaus_dbl',
'dbl.spamhaus.org',
SPAMHAUS_DBL_CLASSIFICATION,
registered_domain
)
SURBL_MULTI_CLASSIFICATION = {
2: 'deprecated (previously SpamCop web sites)',
4: 'listed on WS (will migrate to ABUSE on 1 May 2016)',
8: 'phishing',
16: 'malware',
32: 'deprecated (previously AbuseButler web sites)',
64: 'spam and other abuse sites: (previously jwSpamSpy + Prolocation'
' sites, SpamCop web sites, AbuseButler web sites)',
128: 'Cracked sites'
}
SURBL_MULTI = BitmaskingDNSBL(
'surbl_multi',
'multi.surbl.org',
SURBL_MULTI_CLASSIFICATION,
registered_domain_or_ip
)
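The preconfigured clients above translate DNSBL answers into classifications by reading the last octet of each returned address, as done in `DNSBL._get_match_and_classification`. A self-contained sketch of that decoding step follows; the answer string `'127.0.0.2'` is a hypothetical listed response and `ZEN_CODES` is a trimmed copy of `SPAMHAUS_ZEN_CLASSIFICATION`:

```python
# Illustrative decoding of a DNSBL answer against a classification map.
# ZEN_CODES is a trimmed copy of SPAMHAUS_ZEN_CLASSIFICATION; the answer
# strings below are hypothetical listed responses.

ZEN_CODES = {
    2: 'Direct UBE sources, spam operations & spam services',
    4: 'CBL (3rd party exploits such as proxies, trojans, etc.)',
}

def classify(answer_text, classification_map):
    """Map the last octet of a DNSBL answer to its classification."""
    last_octet = int(answer_text.split('.')[-1])
    return classification_map[last_octet]
```

An unknown octet raises `KeyError`, which the real client wraps in `UnknownCodeError`.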
|
piotr-rusin/spam-lists | spam_lists/clients.py | DNSBL._query | python | def _query(self, host_object):
host_to_query = host_object.relative_domain
query_name = host_to_query.derelativize(self._query_suffix)
try:
return query(query_name)
except NXDOMAIN:
return None | Query the DNSBL service for given value.
:param host_object: an object representing host, created by
self._host_factory
:returns: an instance of dns.resolver.Answer for given value if
it is listed. Otherwise, it returns None. | train | https://github.com/piotr-rusin/spam-lists/blob/fd616e8761b28f3eaa503fee5e45f7748e8f88f2/spam_lists/clients.py#L59-L72 | null | class DNSBL(HostList):
"""Represents a DNSBL service client."""
def __init__(
self,
identifier,
query_suffix,
classification_map,
host_factory
):
"""Initialize a new DNSBL object.
:param identifier: a value designating DNSBL service provider:
its name or url address.
:param query_suffix: a suffix added to DNSBL query address
:param classification_map: item classes associated with
DNSBL query return codes
:param host_factory: a callable object that returns an object
representing host and providing method for getting a relative
domain pertaining to it.
"""
self._identifier = identifier
self._query_suffix = name.from_text(query_suffix)
self._classification_map = classification_map
self._host_factory = host_factory
super(DNSBL, self).__init__(host_factory)
def __str__(self):
"""Convert the client to a string."""
return str(self._identifier)
def _contains(self, host_object):
return bool(self._query(host_object))
def _get_entry_classification(self, code):
return [self._classification_map[code]]
def _get_match_and_classification(self, host_object):
answers = self._query(host_object)
if answers is None:
return None, None
try:
classification = set()
for answer in answers:
last_octet = answer.to_text().split('.')[-1]
classes = self._get_entry_classification(int(last_octet))
classification.update(classes)
return host_object, classification
except KeyError as ex:
msg_tpl = "The code '{}' has no corresponding classification value"
msg = msg_tpl.format(ex.args[0])
raise_from(UnknownCodeError(msg), ex)
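`DNSBL._query` builds the query name with dns.name objects via `derelativize()`. Stripped of dnspython, the construction amounts to appending the configured suffix to the host's relative domain; the helper below only mirrors that idea and is not part of the library:

```python
# Plain-string illustration of the query name built in DNSBL._query.
# The real code uses dns.name.Name.derelativize(); this only mirrors the idea.

def build_query_name(relative_domain, query_suffix):
    """Join a relative domain with a DNSBL query suffix."""
    return '{}.{}'.format(relative_domain.rstrip('.'), query_suffix)
```

So a host 'example.com' checked against SPAMHAUS_DBL would be looked up as 'example.com.dbl.spamhaus.org'.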
|
piotr-rusin/spam-lists | spam_lists/clients.py | HpHosts._query | python | def _query(self, host_object, classification=False):
template = 'http://verify.hosts-file.net/?v={}&s={}'
url = template.format(self.app_id, host_object.to_unicode())
url = url + '&class=true' if classification else url
return get(url).text | Query the client for data of given host.
:param host_object: an object representing a host value
:param classification: if True: hpHosts is queried also
for classification for given host, if listed
:returns: content of response to GET request to hpHosts
for data on the given host | train | https://github.com/piotr-rusin/spam-lists/blob/fd616e8761b28f3eaa503fee5e45f7748e8f88f2/spam_lists/clients.py#L146-L158 | null | class HpHosts(HostList):
"""A class of clients of hpHosts service."""
identifier = ' http://www.hosts-file.net/'
_NOT_LISTED = 'Not Listed'
def __init__(self, client_name):
"""Initialize a new instance.
:param client_name: name of client using the service
"""
self.app_id = client_name
super(HpHosts, self).__init__(non_ipv6_host)
def _contains(self, host_object):
return self._NOT_LISTED not in self._query(host_object)
def _get_match_and_classification(self, host_object):
data = self._query(host_object, True)
if self._NOT_LISTED in data:
return None, None
elements = data.split(',')
classification = set(elements[1:])
return host_object, classification
|
piotr-rusin/spam-lists | spam_lists/clients.py | GoogleSafeBrowsing._request_address | python | def _request_address(self):
if not self._request_address_val:
template = (
'https://sb-ssl.google.com/safebrowsing/api/lookup'
'?client={0}&key={1}&appver={2}&pver={3}'
)
self._request_address_val = template.format(
self.client_name,
self.api_key,
self.app_version,
self.protocol_version
)
return self._request_address_val | Get address of a POST request to the service. | train | https://github.com/piotr-rusin/spam-lists/blob/fd616e8761b28f3eaa503fee5e45f7748e8f88f2/spam_lists/clients.py#L192-L205 | null | class GoogleSafeBrowsing(object):
"""A class of clients of Google Safe Browsing Lookup API."""
protocol_version = '3.1'
max_urls_per_request = 500
def __init__(self, client_name, app_version, api_key):
"""Initialize a new instance.
:param client_name: name of an application using the API
:param app_version: version of the application
:param api_key: API key given by Google:
https://developers.google.com/safe-browsing/key_signup
"""
self.api_key = api_key
self.client_name = client_name
self.app_version = app_version
self._request_address_val = ''
@property
def _query_once(self, urls):
"""Perform a single POST request using lookup API.
:param urls: a sequence of URLs to put in request body
:returns: a response object
:raises UnathorizedAPIKeyError: when the API key for this
instance is not valid
:raises HTTPError: if the HTTPError was raised for a HTTP code
other than 401, the exception is reraised
"""
request_body = '{}\n{}'.format(len(urls), '\n'.join(urls))
response = post(self._request_address, request_body)
try:
response.raise_for_status()
except HTTPError as error:
if response.status_code == 401:
msg = 'The API key is not authorized'
raise_from(UnathorizedAPIKeyError(msg), error)
else:
raise
return response
def _query(self, urls):
"""Test URLs for being listed by the service.
:param urls: a sequence of URLs to be tested
:returns: a tuple containing chunk of URLs and a response
pertaining to them if the code of response was 200, which
means at least one of the queried URLs is matched in either
the phishing, malware, or unwanted software lists.
"""
urls = list(set(urls))
for i in range(0, len(urls), self.max_urls_per_request):
chunk = urls[i:i+self.max_urls_per_request]
response = self._query_once(chunk)
if response.status_code == 200:
yield chunk, response
@accepts_valid_urls
def any_match(self, urls):
"""Check if the service recognizes any of given URLs as spam.
:param urls: a sequence of URLs to be tested
:returns: True if any of the URLs was recognized as spam
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
return any(self._query(urls))
def _get_match_and_classification(self, urls):
"""Get classification for all matching URLs.
:param urls: a sequence of URLs to test
:return: a tuple containing matching URL and classification
string pertaining to it
"""
for url_list, response in self._query(urls):
classification_set = response.text.splitlines()
for url, _class in zip(url_list, classification_set):
if _class != 'ok':
yield url, _class
@accepts_valid_urls
def lookup_matching(self, urls):
"""Get items for all listed URLs.
:param urls: a sequence of URLs to be tested
:returns: objects representing listed URLs
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
for url, _class in self._get_match_and_classification(urls):
classification = set(_class.split(','))
yield AddressListItem(url, self, classification)
@accepts_valid_urls
def filter_matching(self, urls):
"""Get all listed URLs.
:param urls: a sequence of URLs to be tested
:returns: spam URLs
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
for url, _ in self._get_match_and_classification(urls):
yield url
|
piotr-rusin/spam-lists | spam_lists/clients.py | GoogleSafeBrowsing._query_once | python | def _query_once(self, urls):
request_body = '{}\n{}'.format(len(urls), '\n'.join(urls))
response = post(self._request_address, request_body)
try:
response.raise_for_status()
except HTTPError as error:
if response.status_code == 401:
msg = 'The API key is not authorized'
raise_from(UnathorizedAPIKeyError(msg), error)
else:
raise
return response | Perform a single POST request using lookup API.
:param urls: a sequence of URLs to put in request body
:returns: a response object
:raises UnathorizedAPIKeyError: when the API key for this
instance is not valid
:raises HTTPError: if the HTTPError was raised for a HTTP code
other than 401, the exception is reraised | train | https://github.com/piotr-rusin/spam-lists/blob/fd616e8761b28f3eaa503fee5e45f7748e8f88f2/spam_lists/clients.py#L207-L227 | null | class GoogleSafeBrowsing(object):
"""A class of clients of Google Safe Browsing Lookup API."""
protocol_version = '3.1'
max_urls_per_request = 500
def __init__(self, client_name, app_version, api_key):
"""Initialize a new instance.
:param client_name: name of an application using the API
:param app_version: version of the application
:param api_key: API key given by Google:
https://developers.google.com/safe-browsing/key_signup
"""
self.api_key = api_key
self.client_name = client_name
self.app_version = app_version
self._request_address_val = ''
@property
def _request_address(self):
"""Get address of a POST request to the service."""
if not self._request_address_val:
template = (
'https://sb-ssl.google.com/safebrowsing/api/lookup'
'?client={0}&key={1}&appver={2}&pver={3}'
)
self._request_address_val = template.format(
self.client_name,
self.api_key,
self.app_version,
self.protocol_version
)
return self._request_address_val
def _query(self, urls):
"""Test URLs for being listed by the service.
:param urls: a sequence of URLs to be tested
:returns: a tuple containing chunk of URLs and a response
pertaining to them if the code of response was 200, which
means at least one of the queried URLs is matched in either
the phishing, malware, or unwanted software lists.
"""
urls = list(set(urls))
for i in range(0, len(urls), self.max_urls_per_request):
chunk = urls[i:i+self.max_urls_per_request]
response = self._query_once(chunk)
if response.status_code == 200:
yield chunk, response
@accepts_valid_urls
def any_match(self, urls):
"""Check if the service recognizes any of given URLs as spam.
:param urls: a sequence of URLs to be tested
:returns: True if any of the URLs was recognized as spam
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
return any(self._query(urls))
def _get_match_and_classification(self, urls):
"""Get classification for all matching URLs.
:param urls: a sequence of URLs to test
:return: a tuple containing matching URL and classification
string pertaining to it
"""
for url_list, response in self._query(urls):
classification_set = response.text.splitlines()
for url, _class in zip(url_list, classification_set):
if _class != 'ok':
yield url, _class
@accepts_valid_urls
def lookup_matching(self, urls):
"""Get items for all listed URLs.
:param urls: a sequence of URLs to be tested
:returns: objects representing listed URLs
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
for url, _class in self._get_match_and_classification(urls):
classification = set(_class.split(','))
yield AddressListItem(url, self, classification)
@accepts_valid_urls
def filter_matching(self, urls):
"""Get all listed URLs.
:param urls: a sequence of URLs to be tested
:returns: spam URLs
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
for url, _ in self._get_match_and_classification(urls):
yield url
|
piotr-rusin/spam-lists | spam_lists/clients.py | GoogleSafeBrowsing._query | python | def _query(self, urls):
urls = list(set(urls))
for i in range(0, len(urls), self.max_urls_per_request):
chunk = urls[i:i+self.max_urls_per_request]
response = self._query_once(chunk)
if response.status_code == 200:
yield chunk, response | Test URLs for being listed by the service.
:param urls: a sequence of URLs to be tested
:returns: a tuple containing a chunk of URLs and the response
pertaining to them, yielded when the response code was 200, which
means at least one of the queried URLs is matched in either
the phishing, malware, or unwanted software lists. | train | https://github.com/piotr-rusin/spam-lists/blob/fd616e8761b28f3eaa503fee5e45f7748e8f88f2/spam_lists/clients.py#L229-L243 | [
"def _query_once(self, urls):\n \"\"\"Perform a single POST request using lookup API.\n\n :param urls: a sequence of URLs to put in request body\n :returns: a response object\n :raises UnathorizedAPIKeyError: when the API key for this\n instance is not valid\n :raises HTTPError: if the HTTPError w... | class GoogleSafeBrowsing(object):
"""A class of clients of Google Safe Browsing Lookup API."""
protocol_version = '3.1'
max_urls_per_request = 500
def __init__(self, client_name, app_version, api_key):
"""Initialize a new instance.
:param client_name: name of an application using the API
:param app_version: version of the application
:param api_key: API key given by Google:
https://developers.google.com/safe-browsing/key_signup
"""
self.api_key = api_key
self.client_name = client_name
self.app_version = app_version
self._request_address_val = ''
@property
def _request_address(self):
"""Get address of a POST request to the service."""
if not self._request_address_val:
template = (
'https://sb-ssl.google.com/safebrowsing/api/lookup'
'?client={0}&key={1}&appver={2}&pver={3}'
)
self._request_address_val = template.format(
self.client_name,
self.api_key,
self.app_version,
self.protocol_version
)
return self._request_address_val
def _query_once(self, urls):
"""Perform a single POST request using lookup API.
:param urls: a sequence of URLs to put in request body
:returns: a response object
:raises UnathorizedAPIKeyError: when the API key for this
instance is not valid
:raises HTTPError: if the HTTPError was raised for a HTTP code
other than 401, the exception is reraised
"""
request_body = '{}\n{}'.format(len(urls), '\n'.join(urls))
response = post(self._request_address, request_body)
try:
response.raise_for_status()
except HTTPError as error:
if response.status_code == 401:
msg = 'The API key is not authorized'
raise_from(UnathorizedAPIKeyError(msg), error)
else:
raise
return response
@accepts_valid_urls
def any_match(self, urls):
"""Check if the service recognizes any of given URLs as spam.
:param urls: a sequence of URLs to be tested
:returns: True if any of the URLs was recognized as spam
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
return any(self._query(urls))
def _get_match_and_classification(self, urls):
"""Get classification for all matching URLs.
:param urls: a sequence of URLs to test
:return: a tuple containing matching URL and classification
string pertaining to it
"""
for url_list, response in self._query(urls):
classification_set = response.text.splitlines()
for url, _class in zip(url_list, classification_set):
if _class != 'ok':
yield url, _class
@accepts_valid_urls
def lookup_matching(self, urls):
"""Get items for all listed URLs.
:param urls: a sequence of URLs to be tested
:returns: objects representing listed URLs
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
for url, _class in self._get_match_and_classification(urls):
classification = set(_class.split(','))
yield AddressListItem(url, self, classification)
@accepts_valid_urls
def filter_matching(self, urls):
"""Get all listed URLs.
:param urls: a sequence of URLs to be tested
:returns: spam URLs
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
for url, _ in self._get_match_and_classification(urls):
yield url
|
piotr-rusin/spam-lists | spam_lists/clients.py | GoogleSafeBrowsing._get_match_and_classification | python | def _get_match_and_classification(self, urls):
for url_list, response in self._query(urls):
classification_set = response.text.splitlines()
for url, _class in zip(url_list, classification_set):
if _class != 'ok':
yield url, _class | Get classification for all matching URLs.
:param urls: a sequence of URLs to test
:return: a tuple containing matching URL and classification
string pertaining to it | train | https://github.com/piotr-rusin/spam-lists/blob/fd616e8761b28f3eaa503fee5e45f7748e8f88f2/spam_lists/clients.py#L256-L267 | [
"def _query(self, urls):\n \"\"\"Test URLs for being listed by the service.\n\n :param urls: a sequence of URLs to be tested\n :returns: a tuple containing chunk of URLs and a response\n pertaining to them if the code of response was 200, which\n means at least one of the queried URLs is matched in ... | class GoogleSafeBrowsing(object):
"""A class of clients of Google Safe Browsing Lookup API."""
protocol_version = '3.1'
max_urls_per_request = 500
def __init__(self, client_name, app_version, api_key):
"""Initialize a new instance.
:param client_name: name of an application using the API
:param app_version: version of the application
:param api_key: API key given by Google:
https://developers.google.com/safe-browsing/key_signup
"""
self.api_key = api_key
self.client_name = client_name
self.app_version = app_version
self._request_address_val = ''
@property
def _request_address(self):
"""Get address of a POST request to the service."""
if not self._request_address_val:
template = (
'https://sb-ssl.google.com/safebrowsing/api/lookup'
'?client={0}&key={1}&appver={2}&pver={3}'
)
self._request_address_val = template.format(
self.client_name,
self.api_key,
self.app_version,
self.protocol_version
)
return self._request_address_val
def _query_once(self, urls):
"""Perform a single POST request using lookup API.
:param urls: a sequence of URLs to put in request body
:returns: a response object
:raises UnathorizedAPIKeyError: when the API key for this
instance is not valid
:raises HTTPError: if the HTTPError was raised for a HTTP code
other than 401, the exception is reraised
"""
request_body = '{}\n{}'.format(len(urls), '\n'.join(urls))
response = post(self._request_address, request_body)
try:
response.raise_for_status()
except HTTPError as error:
if response.status_code == 401:
msg = 'The API key is not authorized'
raise_from(UnathorizedAPIKeyError(msg), error)
else:
raise
return response
def _query(self, urls):
"""Test URLs for being listed by the service.
:param urls: a sequence of URLs to be tested
:returns: a tuple containing chunk of URLs and a response
pertaining to them if the code of response was 200, which
means at least one of the queried URLs is matched in either
the phishing, malware, or unwanted software lists.
"""
urls = list(set(urls))
for i in range(0, len(urls), self.max_urls_per_request):
chunk = urls[i:i+self.max_urls_per_request]
response = self._query_once(chunk)
if response.status_code == 200:
yield chunk, response
@accepts_valid_urls
def any_match(self, urls):
"""Check if the service recognizes any of given URLs as spam.
:param urls: a sequence of URLs to be tested
:returns: True if any of the URLs was recognized as spam
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
return any(self._query(urls))
@accepts_valid_urls
def lookup_matching(self, urls):
"""Get items for all listed URLs.
:param urls: a sequence of URLs to be tested
:returns: objects representing listed URLs
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
for url, _class in self._get_match_and_classification(urls):
classification = set(_class.split(','))
yield AddressListItem(url, self, classification)
@accepts_valid_urls
def filter_matching(self, urls):
"""Get all listed URLs.
:param urls: a sequence of URLs to be tested
:returns: spam URLs
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
for url, _ in self._get_match_and_classification(urls):
yield url
|
piotr-rusin/spam-lists | spam_lists/clients.py | GoogleSafeBrowsing.lookup_matching | python | def lookup_matching(self, urls):
for url, _class in self._get_match_and_classification(urls):
classification = set(_class.split(','))
yield AddressListItem(url, self, classification) | Get items for all listed URLs.
:param urls: a sequence of URLs to be tested
:returns: objects representing listed URLs
:raises InvalidURLError: if there are any invalid URLs in
the sequence | train | https://github.com/piotr-rusin/spam-lists/blob/fd616e8761b28f3eaa503fee5e45f7748e8f88f2/spam_lists/clients.py#L270-L280 | [
"def _get_match_and_classification(self, urls):\n \"\"\"Get classification for all matching URLs.\n\n :param urls: a sequence of URLs to test\n :return: a tuple containing matching URL and classification\n string pertaining to it\n \"\"\"\n for url_list, response in self._query(urls):\n cla... | class GoogleSafeBrowsing(object):
"""A class of clients of Google Safe Browsing Lookup API."""
protocol_version = '3.1'
max_urls_per_request = 500
def __init__(self, client_name, app_version, api_key):
"""Initialize a new instance.
:param client_name: name of an application using the API
:param app_version: version of the application
:param api_key: API key given by Google:
https://developers.google.com/safe-browsing/key_signup
"""
self.api_key = api_key
self.client_name = client_name
self.app_version = app_version
self._request_address_val = ''
@property
def _request_address(self):
"""Get address of a POST request to the service."""
if not self._request_address_val:
template = (
'https://sb-ssl.google.com/safebrowsing/api/lookup'
'?client={0}&key={1}&appver={2}&pver={3}'
)
self._request_address_val = template.format(
self.client_name,
self.api_key,
self.app_version,
self.protocol_version
)
return self._request_address_val
def _query_once(self, urls):
"""Perform a single POST request using lookup API.
:param urls: a sequence of URLs to put in request body
:returns: a response object
:raises UnathorizedAPIKeyError: when the API key for this
instance is not valid
:raises HTTPError: if the HTTPError was raised for a HTTP code
other than 401, the exception is reraised
"""
request_body = '{}\n{}'.format(len(urls), '\n'.join(urls))
response = post(self._request_address, request_body)
try:
response.raise_for_status()
except HTTPError as error:
if response.status_code == 401:
msg = 'The API key is not authorized'
raise_from(UnathorizedAPIKeyError(msg), error)
else:
raise
return response
def _query(self, urls):
"""Test URLs for being listed by the service.
:param urls: a sequence of URLs to be tested
:returns: a tuple containing chunk of URLs and a response
pertaining to them if the code of response was 200, which
means at least one of the queried URLs is matched in either
the phishing, malware, or unwanted software lists.
"""
urls = list(set(urls))
for i in range(0, len(urls), self.max_urls_per_request):
chunk = urls[i:i+self.max_urls_per_request]
response = self._query_once(chunk)
if response.status_code == 200:
yield chunk, response
@accepts_valid_urls
def any_match(self, urls):
"""Check if the service recognizes any of given URLs as spam.
:param urls: a sequence of URLs to be tested
:returns: True if any of the URLs was recognized as spam
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
return any(self._query(urls))
def _get_match_and_classification(self, urls):
"""Get classification for all matching URLs.
:param urls: a sequence of URLs to test
:return: a tuple containing matching URL and classification
string pertaining to it
"""
for url_list, response in self._query(urls):
classification_set = response.text.splitlines()
for url, _class in zip(url_list, classification_set):
if _class != 'ok':
yield url, _class
@accepts_valid_urls
@accepts_valid_urls
def filter_matching(self, urls):
"""Get all listed URLs.
:param urls: a sequence of URLs to be tested
:returns: spam URLs
:raises InvalidURLError: if there are any invalid URLs in
the sequence
"""
for url, _ in self._get_match_and_classification(urls):
yield url
|
druids/django-chamber | chamber/utils/__init__.py | get_class_method | python | def get_class_method(cls_or_inst, method_name):
cls = cls_or_inst if isinstance(cls_or_inst, type) else cls_or_inst.__class__
meth = getattr(cls, method_name, None)
if isinstance(meth, property):
meth = meth.fget
elif isinstance(meth, cached_property):
meth = meth.func
return meth | Returns a method from a given class or instance. When the method does not exist, it returns `None`. Also works with
properties and cached properties. | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/utils/__init__.py#L17-L28 | null | import inspect
import unicodedata
from django.utils.functional import cached_property
from django.utils.html import escape
from django.utils.safestring import SafeData, mark_safe
from django.utils.text import normalize_newlines
def remove_accent(string_with_diacritics):
"""
Removes diacritics from a given string.
"""
return unicodedata.normalize('NFKD', string_with_diacritics).encode('ASCII', 'ignore').decode('ASCII')
def keep_spacing(value, autoescape=True):
"""
When a given `str` contains multiple spaces, it keeps the first space and the others are replaced by `&nbsp;`. Newlines
are converted into HTML's <br>.
"""
autoescape = autoescape and not isinstance(value, SafeData)
value = normalize_newlines(value)
if autoescape:
value = escape(value)
return mark_safe(value.replace('  ', ' &nbsp;').replace('\n', '<br />'))
def call_method_with_unknown_input(method, **fun_kwargs):
method_kwargs_names = inspect.getargspec(method)[0][1:]
method_kwargs = {arg_name: fun_kwargs[arg_name] for arg_name in method_kwargs_names if arg_name in fun_kwargs}
if len(method_kwargs_names) == len(method_kwargs):
return method(**method_kwargs)
else:
raise RuntimeError('Invalid method parameters')
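The `get_class_method` record above shows that properties are unwrapped to their `fget`. A self-contained illustration of that lookup follows; the `cached_property` branch is omitted here to avoid the django import, and the `Example` class is hypothetical:

```python
# Self-contained illustration of the attribute lookup get_class_method performs.
# The django cached_property branch is omitted for brevity.

def get_class_method(cls_or_inst, method_name):
    """Return an unbound method or property getter, or None if missing."""
    cls = cls_or_inst if isinstance(cls_or_inst, type) else cls_or_inst.__class__
    meth = getattr(cls, method_name, None)
    if isinstance(meth, property):
        meth = meth.fget
    return meth

class Example:
    def plain(self):
        return 'plain'

    @property
    def prop(self):
        return 'prop'
```

Because the property is unwrapped to its getter, `get_class_method(Example(), 'prop')` returns a plain function that can be called with an instance explicitly.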
|
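The record above resolves plain methods, properties and cached properties to their underlying function. A runnable sketch of the same logic, substituting the stdlib `functools.cached_property` (Python 3.8+) for Django's implementation — both expose the wrapped function as `.func` — with a made-up `Invoice` class for illustration:

```python
from functools import cached_property

def get_class_method(cls_or_inst, method_name):
    # Same logic as chamber.utils.get_class_method, but using the stdlib
    # cached_property in place of django.utils.functional.cached_property.
    cls = cls_or_inst if isinstance(cls_or_inst, type) else cls_or_inst.__class__
    meth = getattr(cls, method_name, None)
    if isinstance(meth, property):
        meth = meth.fget        # unwrap a property to its getter
    elif isinstance(meth, cached_property):
        meth = meth.func        # unwrap a cached_property to its function
    return meth

class Invoice:
    def total(self):
        return 100

    @property
    def vat_rate(self):
        return 21

    @cached_property
    def total_with_vat(self):
        return self.total() + self.total() * self.vat_rate // 100
```

All three attribute kinds resolve to a plain callable that takes the instance as its first argument; a missing name resolves to `None`.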
druids/django-chamber | chamber/models/__init__.py | field_to_dict | python | def field_to_dict(field, instance):
# avoid a circular import
from django.db.models.fields.related import ManyToManyField
return (many_to_many_field_to_dict(field, instance) if isinstance(field, ManyToManyField)
else field.value_from_object(instance)) | Converts a model field to a dictionary | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/models/__init__.py#L37-L45 | [
"def many_to_many_field_to_dict(field, instance):\n if instance.pk is None:\n # If the object doesn't have a primary key yet, just use an empty\n # list for its m2m fields. Calling f.value_from_object will raise\n # an exception.\n return []\n else:\n # MultipleChoiceWidget ... | import collections
from itertools import chain
from distutils.version import StrictVersion
import django
from django.db import models, transaction
from django.db.models.base import ModelBase
from django.utils.translation import ugettext_lazy as _
from django.utils.functional import cached_property
from chamber.exceptions import PersistenceException
from chamber.patch import Options
from chamber.shortcuts import change_and_save, change, bulk_change_and_save
from chamber.utils.decorators import singleton
from .fields import * # NOQA exposing classes and functions as a module API
from .signals import dispatcher_post_save, dispatcher_pre_save
def many_to_many_field_to_dict(field, instance):
if instance.pk is None:
# If the object doesn't have a primary key yet, just use an empty
# list for its m2m fields. Calling f.value_from_object will raise
# an exception.
return []
else:
# MultipleChoiceWidget needs a list of pks, not object instances.
return list(field.value_from_object(instance).values_list('pk', flat=True))
def should_exclude_field(field, fields, exclude):
return (fields and field.name not in fields) or (exclude and field.name in exclude)
@singleton
class UnknownSingleton:
def __repr__(self):
return 'unknown'
def __bool__(self):
return False
Unknown = UnknownSingleton()
def unknown_model_fields_to_dict(instance, fields=None, exclude=None):
return {
field.name: Unknown
for field in chain(instance._meta.concrete_fields, instance._meta.many_to_many) # pylint: disable=W0212
if not should_exclude_field(field, fields, exclude)
}
def model_to_dict(instance, fields=None, exclude=None):
"""
The same implementation as Django's `model_to_dict`, but non-editable fields are included as well
"""
return {
field.name: field_to_dict(field, instance)
for field in chain(instance._meta.concrete_fields, instance._meta.many_to_many) # pylint: disable=W0212
if not should_exclude_field(field, fields, exclude)
}
ValueChange = collections.namedtuple('ValueChange', ('initial', 'current'))
class ChangedFields:
"""
Stores changed fields together with their initial and current values.
"""
def __init__(self, initial_dict):
self._initial_dict = initial_dict
@property
def initial_values(self):
return self._initial_dict
@property
def current_values(self):
raise NotImplementedError
@property
def changed_values(self):
return {k: value_change.current for k, value_change in self.diff.items()}
@property
def diff(self):
d1 = self.initial_values
d2 = self.current_values
return {k: ValueChange(v, d2[k]) for k, v in d1.items() if v != d2[k]}
def __setitem__(self, key, item):
raise AttributeError('Object is readonly')
def __getitem__(self, key):
return self.diff[key]
def __repr__(self):
return repr(self.diff)
def __len__(self):
return len(self.diff)
def __delitem__(self, key):
raise AttributeError('Object is readonly')
def clear(self):
raise AttributeError('Object is readonly')
def has_key(self, k):
return k in self.diff
def has_any_key(self, *keys):
return bool(set(self.keys()) & set(keys))
def update(self, *args, **kwargs):
raise AttributeError('Object is readonly')
def keys(self):
return self.diff.keys()
def values(self):
return self.diff.values()
def items(self):
return self.diff.items()
def pop(self, *args, **kwargs):
raise AttributeError('Object is readonly')
def __cmp__(self, dictionary):
# Python 2 leftover: ``cmp`` is not defined in Python 3, so calling this there raises NameError.
return cmp(self.diff, dictionary)
def __contains__(self, item):
return item in self.diff
def __iter__(self):
return iter(self.diff)
def __str__(self):
return repr(self.diff)
class DynamicChangedFields(ChangedFields):
"""
Dynamic changed fields update as the instance changes.
"""
def __init__(self, instance):
super().__init__(
self._get_unknown_dict(instance) if instance.is_adding else self._get_instance_dict(instance)
)
self.instance = instance
def _get_unknown_dict(self, instance):
return unknown_model_fields_to_dict(
instance, fields=(field.name for field in instance._meta.fields)
)
def _get_instance_dict(self, instance):
return model_to_dict(
instance, fields=(field.name for field in instance._meta.fields)
)
@property
def current_values(self):
return self._get_instance_dict(self.instance)
def get_static_changes(self):
return StaticChangedFields(self.initial_values, self.current_values)
class StaticChangedFields(ChangedFields):
"""
Static changed fields are immutable. Subsequent changes to the origin instance have no effect.
"""
def __init__(self, initial_dict, current_dict):
super().__init__(initial_dict)
self._current_dict = current_dict
@property
def current_values(self):
return self._current_dict
class ComparableModelMixin:
def equals(self, obj, comparator):
"""
Use the comparator to evaluate whether the objects are the same.
"""
return comparator.compare(self, obj)
class Comparator:
def compare(self, a, b):
"""
Return True if the objects are the same, otherwise False.
"""
raise NotImplementedError
class AuditModel(models.Model):
created_at = models.DateTimeField(verbose_name=_('created at'), null=False, blank=False, auto_now_add=True,
db_index=True)
changed_at = models.DateTimeField(verbose_name=_('changed at'), null=False, blank=False, auto_now=True,
db_index=True)
class Meta:
abstract = True
class Signal:
def __init__(self, obj):
self.connected_functions = []
self.obj = obj
def connect(self, fun):
self.connected_functions.append(fun)
def send(self):
[fun(self.obj) for fun in self.connected_functions]
class SmartQuerySet(models.QuerySet):
def fast_distinct(self):
"""
Because the standard distinct over all fields is very slow and works only with a PostgreSQL database,
this method provides an alternative to the standard distinct method.
:return: qs with unique objects
"""
return self.model.objects.filter(pk__in=self.values_list('pk', flat=True))
def change_and_save(self, update_only_changed_fields=False, **changed_fields):
"""
Changes a given `changed_fields` on each object in the queryset, saves objects
and returns the changed objects in the queryset.
"""
bulk_change_and_save(self, update_only_changed_fields=update_only_changed_fields, **changed_fields)
return self.filter()
class SmartModelBase(ModelBase):
"""
Smart model metaclass that registers dispatchers on the pre- and post-save signals.
"""
def __new__(cls, name, bases, attrs):
new_cls = super().__new__(cls, name, bases, attrs)
for dispatcher in new_cls.dispatchers:
dispatcher.connect(new_cls)
return new_cls
class SmartModel(AuditModel, metaclass=SmartModelBase):
objects = SmartQuerySet.as_manager()
dispatchers = []
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.is_adding = True
self.is_changing = False
self.changed_fields = DynamicChangedFields(self)
self.post_save = Signal(self)
@classmethod
def from_db(cls, db, field_names, values):
new = super().from_db(db, field_names, values)
new.is_adding = False
new.is_changing = True
new.changed_fields = DynamicChangedFields(new)
return new
@property
def has_changed(self):
return bool(self.changed_fields)
@property
def initial_values(self):
return self.changed_fields.initial_values
def full_clean(self, exclude=None, *args, **kwargs):
errors = {}
for field in self._meta.fields:
if (not exclude or field.name not in exclude) and hasattr(self, 'clean_{}'.format(field.name)):
try:
getattr(self, 'clean_{}'.format(field.name))()
except ValidationError as er:
errors[field.name] = er
if errors:
raise ValidationError(errors)
super().full_clean(exclude=exclude, *args, **kwargs)
def _clean_save(self, *args, **kwargs):
self._persistence_clean(*args, **kwargs)
def _clean_delete(self, *args, **kwargs):
self._persistence_clean(*args, **kwargs)
def _clean_pre_save(self, *args, **kwargs):
self._clean_save(*args, **kwargs)
def _clean_pre_delete(self, *args, **kwargs):
self._clean_delete(*args, **kwargs)
def _clean_post_save(self, *args, **kwargs):
self._clean_save(*args, **kwargs)
def _clean_post_delete(self, *args, **kwargs):
self._clean_delete(*args, **kwargs)
def _persistence_clean(self, *args, **kwargs):
exclude = kwargs.pop('exclude', None)
try:
self.full_clean(exclude=exclude)
except ValidationError as er:
if hasattr(er, 'error_dict'):
raise PersistenceException(', '.join(
('%s: %s' % (key, ', '.join(map(force_text, val))) for key, val in er.message_dict.items())))
else:
raise PersistenceException(', '.join(map(force_text, er.messages)))
def _get_save_extra_kwargs(self):
return {}
def _pre_save(self, *args, **kwargs):
pass
def _call_pre_save(self, *args, **kwargs):
self._pre_save(*args, **kwargs)
def _save(self, update_only_changed_fields=False, is_cleaned_pre_save=None, is_cleaned_post_save=None,
force_insert=False, force_update=False, using=None, update_fields=None, *args, **kwargs):
is_cleaned_pre_save = (
self._smart_meta.is_cleaned_pre_save if is_cleaned_pre_save is None else is_cleaned_pre_save
)
is_cleaned_post_save = (
self._smart_meta.is_cleaned_post_save if is_cleaned_post_save is None else is_cleaned_post_save
)
origin = self.__class__
kwargs.update(self._get_save_extra_kwargs())
self._call_pre_save(self.is_changing, self.changed_fields, *args, **kwargs)
if is_cleaned_pre_save:
self._clean_pre_save(*args, **kwargs)
dispatcher_pre_save.send(sender=origin, instance=self, change=self.is_changing,
changed_fields=self.changed_fields.get_static_changes(),
*args, **kwargs)
if not update_fields and update_only_changed_fields:
update_fields = list(self.changed_fields.keys()) + ['changed_at']
# remove primary key from updating fields
if self._meta.pk.name in update_fields:
update_fields.remove(self._meta.pk.name)
super().save(force_insert=force_insert, force_update=force_update, using=using,
update_fields=update_fields)
self._call_post_save(self.is_changing, self.changed_fields, *args, **kwargs)
if is_cleaned_post_save:
self._clean_post_save(*args, **kwargs)
dispatcher_post_save.send(sender=origin, instance=self, change=self.is_changing,
changed_fields=self.changed_fields.get_static_changes(),
*args, **kwargs)
self.post_save.send()
def _post_save(self, *args, **kwargs):
pass
def _call_post_save(self, *args, **kwargs):
self._post_save(*args, **kwargs)
def save_simple(self, *args, **kwargs):
super().save(*args, **kwargs)
def save(self, update_only_changed_fields=False, *args, **kwargs):
if self._smart_meta.is_save_atomic:
with transaction.atomic():
self._save(update_only_changed_fields=update_only_changed_fields, *args, **kwargs)
else:
self._save(update_only_changed_fields=update_only_changed_fields, *args, **kwargs)
self.is_adding = False
self.is_changing = True
self.changed_fields = DynamicChangedFields(self)
def _pre_delete(self, *args, **kwargs):
pass
def _delete(self, is_cleaned_pre_delete=None, is_cleaned_post_delete=None, *args, **kwargs):
is_cleaned_pre_delete = (
self._smart_meta.is_cleaned_pre_delete if is_cleaned_pre_delete is None else is_cleaned_pre_delete
)
is_cleaned_post_delete = (
self._smart_meta.is_cleaned_post_delete if is_cleaned_post_delete is None else is_cleaned_post_delete
)
self._pre_delete(*args, **kwargs)
if is_cleaned_pre_delete:
self._clean_pre_delete(*args, **kwargs)
super().delete(*args, **kwargs)
self._post_delete(*args, **kwargs)
if is_cleaned_post_delete:
self._clean_post_delete(*args, **kwargs)
def _post_delete(self, *args, **kwargs):
pass
def delete(self, *args, **kwargs):
if self._smart_meta.is_delete_atomic:
with transaction.atomic():
self._delete(*args, **kwargs)
else:
self._delete(*args, **kwargs)
def refresh_from_db(self, *args, **kwargs):
super().refresh_from_db(*args, **kwargs)
for key, value in self.__class__.__dict__.items():
if isinstance(value, cached_property):
self.__dict__.pop(key, None)
self.is_adding = False
self.is_changing = True
self.changed_fields = DynamicChangedFields(self)
if StrictVersion(django.get_version()) < StrictVersion('2.0'):
for field in [f for f in self._meta.get_fields() if f.is_relation]:
# For Generic relation related model is None
# https://docs.djangoproject.com/en/2.1/ref/models/meta/#migrating-from-the-old-api
cache_key = field.get_cache_name() if field.related_model else field.cache_attr
if cache_key in self.__dict__:
del self.__dict__[cache_key]
return self
def change(self, **changed_fields):
"""
Changes a given `changed_fields` on this object and returns itself.
:param changed_fields: fields to change
:return: self
"""
change(self, **changed_fields)
return self
def change_and_save(self, update_only_changed_fields=False, **changed_fields):
"""
Changes a given `changed_fields` on this object, saves it and returns itself.
:param update_only_changed_fields: only changed fields will be updated in the database.
:param changed_fields: fields to change.
:return: self
"""
change_and_save(self, update_only_changed_fields=update_only_changed_fields, **changed_fields)
return self
class Meta:
abstract = True
class SmartOptions(Options):
meta_class_name = 'SmartMeta'
meta_name = '_smart_meta'
model_class = SmartModel
attributes = {
'is_cleaned_pre_save': True,
'is_cleaned_post_save': False,
'is_cleaned_pre_delete': False,
'is_cleaned_post_delete': False,
'is_save_atomic': False,
'is_delete_atomic': False,
}
|
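The `ChangedFields.diff` property in the scope above boils down to comparing two snapshots of field values. A stripped-down, stdlib-only sketch of that comparison (the field names and values are made up for illustration):

```python
import collections

# Same shape as the ValueChange namedtuple used by ChangedFields.
ValueChange = collections.namedtuple('ValueChange', ('initial', 'current'))

def diff(initial, current):
    """Map each field whose value differs to an (initial, current) pair."""
    return {k: ValueChange(v, current[k]) for k, v in initial.items() if v != current[k]}

initial = {'name': 'Alice', 'email': 'a@example.com'}
current = {'name': 'Alice', 'email': 'alice@example.com'}
changes = diff(initial, current)
```

Unchanged fields (here `name`) drop out of the result, which is exactly why `has_changed` can be computed as `bool(self.changed_fields)`.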
druids/django-chamber | chamber/models/__init__.py | model_to_dict | python | def model_to_dict(instance, fields=None, exclude=None):
return {
field.name: field_to_dict(field, instance)
for field in chain(instance._meta.concrete_fields, instance._meta.many_to_many) # pylint: disable=W0212
if not should_exclude_field(field, fields, exclude)
} | The same implementation as Django's `model_to_dict`, but non-editable fields are included as well | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/models/__init__.py#L70-L79 | null | import collections
from itertools import chain
from distutils.version import StrictVersion
import django
from django.db import models, transaction
from django.db.models.base import ModelBase
from django.utils.translation import ugettext_lazy as _
from django.utils.functional import cached_property
from chamber.exceptions import PersistenceException
from chamber.patch import Options
from chamber.shortcuts import change_and_save, change, bulk_change_and_save
from chamber.utils.decorators import singleton
from .fields import * # NOQA exposing classes and functions as a module API
from .signals import dispatcher_post_save, dispatcher_pre_save
def many_to_many_field_to_dict(field, instance):
if instance.pk is None:
# If the object doesn't have a primary key yet, just use an empty
# list for its m2m fields. Calling f.value_from_object will raise
# an exception.
return []
else:
# MultipleChoiceWidget needs a list of pks, not object instances.
return list(field.value_from_object(instance).values_list('pk', flat=True))
def should_exclude_field(field, fields, exclude):
return (fields and field.name not in fields) or (exclude and field.name in exclude)
def field_to_dict(field, instance):
"""
Converts a model field to a dictionary
"""
# avoid a circular import
from django.db.models.fields.related import ManyToManyField
return (many_to_many_field_to_dict(field, instance) if isinstance(field, ManyToManyField)
else field.value_from_object(instance))
@singleton
class UnknownSingleton:
def __repr__(self):
return 'unknown'
def __bool__(self):
return False
Unknown = UnknownSingleton()
def unknown_model_fields_to_dict(instance, fields=None, exclude=None):
return {
field.name: Unknown
for field in chain(instance._meta.concrete_fields, instance._meta.many_to_many) # pylint: disable=W0212
if not should_exclude_field(field, fields, exclude)
}
ValueChange = collections.namedtuple('ValueChange', ('initial', 'current'))
class ChangedFields:
"""
Stores changed fields together with their initial and current values.
"""
def __init__(self, initial_dict):
self._initial_dict = initial_dict
@property
def initial_values(self):
return self._initial_dict
@property
def current_values(self):
raise NotImplementedError
@property
def changed_values(self):
return {k: value_change.current for k, value_change in self.diff.items()}
@property
def diff(self):
d1 = self.initial_values
d2 = self.current_values
return {k: ValueChange(v, d2[k]) for k, v in d1.items() if v != d2[k]}
def __setitem__(self, key, item):
raise AttributeError('Object is readonly')
def __getitem__(self, key):
return self.diff[key]
def __repr__(self):
return repr(self.diff)
def __len__(self):
return len(self.diff)
def __delitem__(self, key):
raise AttributeError('Object is readonly')
def clear(self):
raise AttributeError('Object is readonly')
def has_key(self, k):
return k in self.diff
def has_any_key(self, *keys):
return bool(set(self.keys()) & set(keys))
def update(self, *args, **kwargs):
raise AttributeError('Object is readonly')
def keys(self):
return self.diff.keys()
def values(self):
return self.diff.values()
def items(self):
return self.diff.items()
def pop(self, *args, **kwargs):
raise AttributeError('Object is readonly')
def __cmp__(self, dictionary):
# Python 2 leftover: ``cmp`` is not defined in Python 3, so calling this there raises NameError.
return cmp(self.diff, dictionary)
def __contains__(self, item):
return item in self.diff
def __iter__(self):
return iter(self.diff)
def __str__(self):
return repr(self.diff)
class DynamicChangedFields(ChangedFields):
"""
Dynamic changed fields update as the instance changes.
"""
def __init__(self, instance):
super().__init__(
self._get_unknown_dict(instance) if instance.is_adding else self._get_instance_dict(instance)
)
self.instance = instance
def _get_unknown_dict(self, instance):
return unknown_model_fields_to_dict(
instance, fields=(field.name for field in instance._meta.fields)
)
def _get_instance_dict(self, instance):
return model_to_dict(
instance, fields=(field.name for field in instance._meta.fields)
)
@property
def current_values(self):
return self._get_instance_dict(self.instance)
def get_static_changes(self):
return StaticChangedFields(self.initial_values, self.current_values)
class StaticChangedFields(ChangedFields):
"""
Static changed fields are immutable. Subsequent changes to the origin instance have no effect.
"""
def __init__(self, initial_dict, current_dict):
super().__init__(initial_dict)
self._current_dict = current_dict
@property
def current_values(self):
return self._current_dict
class ComparableModelMixin:
def equals(self, obj, comparator):
"""
Use the comparator to evaluate whether the objects are the same.
"""
return comparator.compare(self, obj)
class Comparator:
def compare(self, a, b):
"""
Return True if the objects are the same, otherwise False.
"""
raise NotImplementedError
class AuditModel(models.Model):
created_at = models.DateTimeField(verbose_name=_('created at'), null=False, blank=False, auto_now_add=True,
db_index=True)
changed_at = models.DateTimeField(verbose_name=_('changed at'), null=False, blank=False, auto_now=True,
db_index=True)
class Meta:
abstract = True
class Signal:
def __init__(self, obj):
self.connected_functions = []
self.obj = obj
def connect(self, fun):
self.connected_functions.append(fun)
def send(self):
[fun(self.obj) for fun in self.connected_functions]
class SmartQuerySet(models.QuerySet):
def fast_distinct(self):
"""
Because the standard distinct over all fields is very slow and works only with a PostgreSQL database,
this method provides an alternative to the standard distinct method.
:return: qs with unique objects
"""
return self.model.objects.filter(pk__in=self.values_list('pk', flat=True))
def change_and_save(self, update_only_changed_fields=False, **changed_fields):
"""
Changes a given `changed_fields` on each object in the queryset, saves objects
and returns the changed objects in the queryset.
"""
bulk_change_and_save(self, update_only_changed_fields=update_only_changed_fields, **changed_fields)
return self.filter()
class SmartModelBase(ModelBase):
"""
Smart model metaclass that registers dispatchers on the pre- and post-save signals.
"""
def __new__(cls, name, bases, attrs):
new_cls = super().__new__(cls, name, bases, attrs)
for dispatcher in new_cls.dispatchers:
dispatcher.connect(new_cls)
return new_cls
class SmartModel(AuditModel, metaclass=SmartModelBase):
objects = SmartQuerySet.as_manager()
dispatchers = []
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.is_adding = True
self.is_changing = False
self.changed_fields = DynamicChangedFields(self)
self.post_save = Signal(self)
@classmethod
def from_db(cls, db, field_names, values):
new = super().from_db(db, field_names, values)
new.is_adding = False
new.is_changing = True
new.changed_fields = DynamicChangedFields(new)
return new
@property
def has_changed(self):
return bool(self.changed_fields)
@property
def initial_values(self):
return self.changed_fields.initial_values
def full_clean(self, exclude=None, *args, **kwargs):
errors = {}
for field in self._meta.fields:
if (not exclude or field.name not in exclude) and hasattr(self, 'clean_{}'.format(field.name)):
try:
getattr(self, 'clean_{}'.format(field.name))()
except ValidationError as er:
errors[field.name] = er
if errors:
raise ValidationError(errors)
super().full_clean(exclude=exclude, *args, **kwargs)
def _clean_save(self, *args, **kwargs):
self._persistence_clean(*args, **kwargs)
def _clean_delete(self, *args, **kwargs):
self._persistence_clean(*args, **kwargs)
def _clean_pre_save(self, *args, **kwargs):
self._clean_save(*args, **kwargs)
def _clean_pre_delete(self, *args, **kwargs):
self._clean_delete(*args, **kwargs)
def _clean_post_save(self, *args, **kwargs):
self._clean_save(*args, **kwargs)
def _clean_post_delete(self, *args, **kwargs):
self._clean_delete(*args, **kwargs)
def _persistence_clean(self, *args, **kwargs):
exclude = kwargs.pop('exclude', None)
try:
self.full_clean(exclude=exclude)
except ValidationError as er:
if hasattr(er, 'error_dict'):
raise PersistenceException(', '.join(
('%s: %s' % (key, ', '.join(map(force_text, val))) for key, val in er.message_dict.items())))
else:
raise PersistenceException(', '.join(map(force_text, er.messages)))
def _get_save_extra_kwargs(self):
return {}
def _pre_save(self, *args, **kwargs):
pass
def _call_pre_save(self, *args, **kwargs):
self._pre_save(*args, **kwargs)
def _save(self, update_only_changed_fields=False, is_cleaned_pre_save=None, is_cleaned_post_save=None,
force_insert=False, force_update=False, using=None, update_fields=None, *args, **kwargs):
is_cleaned_pre_save = (
self._smart_meta.is_cleaned_pre_save if is_cleaned_pre_save is None else is_cleaned_pre_save
)
is_cleaned_post_save = (
self._smart_meta.is_cleaned_post_save if is_cleaned_post_save is None else is_cleaned_post_save
)
origin = self.__class__
kwargs.update(self._get_save_extra_kwargs())
self._call_pre_save(self.is_changing, self.changed_fields, *args, **kwargs)
if is_cleaned_pre_save:
self._clean_pre_save(*args, **kwargs)
dispatcher_pre_save.send(sender=origin, instance=self, change=self.is_changing,
changed_fields=self.changed_fields.get_static_changes(),
*args, **kwargs)
if not update_fields and update_only_changed_fields:
update_fields = list(self.changed_fields.keys()) + ['changed_at']
# remove primary key from updating fields
if self._meta.pk.name in update_fields:
update_fields.remove(self._meta.pk.name)
super().save(force_insert=force_insert, force_update=force_update, using=using,
update_fields=update_fields)
self._call_post_save(self.is_changing, self.changed_fields, *args, **kwargs)
if is_cleaned_post_save:
self._clean_post_save(*args, **kwargs)
dispatcher_post_save.send(sender=origin, instance=self, change=self.is_changing,
changed_fields=self.changed_fields.get_static_changes(),
*args, **kwargs)
self.post_save.send()
def _post_save(self, *args, **kwargs):
pass
def _call_post_save(self, *args, **kwargs):
self._post_save(*args, **kwargs)
def save_simple(self, *args, **kwargs):
super().save(*args, **kwargs)
def save(self, update_only_changed_fields=False, *args, **kwargs):
if self._smart_meta.is_save_atomic:
with transaction.atomic():
self._save(update_only_changed_fields=update_only_changed_fields, *args, **kwargs)
else:
self._save(update_only_changed_fields=update_only_changed_fields, *args, **kwargs)
self.is_adding = False
self.is_changing = True
self.changed_fields = DynamicChangedFields(self)
def _pre_delete(self, *args, **kwargs):
pass
def _delete(self, is_cleaned_pre_delete=None, is_cleaned_post_delete=None, *args, **kwargs):
is_cleaned_pre_delete = (
self._smart_meta.is_cleaned_pre_delete if is_cleaned_pre_delete is None else is_cleaned_pre_delete
)
is_cleaned_post_delete = (
self._smart_meta.is_cleaned_post_delete if is_cleaned_post_delete is None else is_cleaned_post_delete
)
self._pre_delete(*args, **kwargs)
if is_cleaned_pre_delete:
self._clean_pre_delete(*args, **kwargs)
super().delete(*args, **kwargs)
self._post_delete(*args, **kwargs)
if is_cleaned_post_delete:
self._clean_post_delete(*args, **kwargs)
def _post_delete(self, *args, **kwargs):
pass
def delete(self, *args, **kwargs):
if self._smart_meta.is_delete_atomic:
with transaction.atomic():
self._delete(*args, **kwargs)
else:
self._delete(*args, **kwargs)
def refresh_from_db(self, *args, **kwargs):
super().refresh_from_db(*args, **kwargs)
for key, value in self.__class__.__dict__.items():
if isinstance(value, cached_property):
self.__dict__.pop(key, None)
self.is_adding = False
self.is_changing = True
self.changed_fields = DynamicChangedFields(self)
if StrictVersion(django.get_version()) < StrictVersion('2.0'):
for field in [f for f in self._meta.get_fields() if f.is_relation]:
# For Generic relation related model is None
# https://docs.djangoproject.com/en/2.1/ref/models/meta/#migrating-from-the-old-api
cache_key = field.get_cache_name() if field.related_model else field.cache_attr
if cache_key in self.__dict__:
del self.__dict__[cache_key]
return self
def change(self, **changed_fields):
"""
Changes a given `changed_fields` on this object and returns itself.
:param changed_fields: fields to change
:return: self
"""
change(self, **changed_fields)
return self
def change_and_save(self, update_only_changed_fields=False, **changed_fields):
"""
Changes a given `changed_fields` on this object, saves it and returns itself.
:param update_only_changed_fields: only changed fields will be updated in the database.
:param changed_fields: fields to change.
:return: self
"""
change_and_save(self, update_only_changed_fields=update_only_changed_fields, **changed_fields)
return self
class Meta:
abstract = True
class SmartOptions(Options):
meta_class_name = 'SmartMeta'
meta_name = '_smart_meta'
model_class = SmartModel
attributes = {
'is_cleaned_pre_save': True,
'is_cleaned_post_save': False,
'is_cleaned_pre_delete': False,
'is_cleaned_post_delete': False,
'is_save_atomic': False,
'is_delete_atomic': False,
}
|
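The `Unknown` sentinel in the scope above relies on `chamber.utils.decorators.singleton`, whose source is not included in this record. A plausible stand-in (an assumption about its behavior, not the library's actual code) that makes the `Unknown = UnknownSingleton()` line work as written:

```python
def singleton(cls):
    """Hypothetical sketch: replace the class with a factory returning one shared instance."""
    instance = cls()

    def get_instance():
        return instance

    return get_instance

@singleton
class UnknownSingleton:
    def __repr__(self):
        return 'unknown'

    def __bool__(self):
        return False

# Every "construction" hands back the same falsy sentinel object.
Unknown = UnknownSingleton()
```

A falsy sentinel distinct from `None` lets `unknown_model_fields_to_dict` mark field values as "not yet loaded" without colliding with legitimate `None` field values.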
druids/django-chamber | chamber/models/__init__.py | SmartQuerySet.fast_distinct | python | def fast_distinct(self):
return self.model.objects.filter(pk__in=self.values_list('pk', flat=True)) | Because the standard distinct over all fields is very slow and works only with a PostgreSQL database,
this method provides an alternative to the standard distinct method.
:return: qs with unique objects | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/models/__init__.py#L250-L256 | null | class SmartQuerySet(models.QuerySet):
def change_and_save(self, update_only_changed_fields=False, **changed_fields):
"""
Changes a given `changed_fields` on each object in the queryset, saves objects
and returns the changed objects in the queryset.
"""
bulk_change_and_save(self, update_only_changed_fields=update_only_changed_fields, **changed_fields)
return self.filter()
|
druids/django-chamber | chamber/models/__init__.py | SmartQuerySet.change_and_save | python | def change_and_save(self, update_only_changed_fields=False, **changed_fields):
bulk_change_and_save(self, update_only_changed_fields=update_only_changed_fields, **changed_fields)
return self.filter() | Changes a given `changed_fields` on each object in the queryset, saves objects
and returns the changed objects in the queryset. | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/models/__init__.py#L258-L264 | [
"def bulk_change_and_save(iterable, update_only_changed_fields=False, save_kwargs=None, **changed_fields):\n \"\"\"\n Changes a given `changed_fields` on each object in a given `iterable`, saves objects\n and returns the changed objects.\n \"\"\"\n return [\n change_and_save(obj, update_only_c... | class SmartQuerySet(models.QuerySet):
def fast_distinct(self):
"""
Because the standard distinct over all fields is very slow and works only with a PostgreSQL database,
this method provides an alternative to the standard distinct method.
:return: qs with unique objects
"""
return self.model.objects.filter(pk__in=self.values_list('pk', flat=True))
|
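On both the model and the queryset, `change_and_save` is simply `change` followed by `save`. A Django-free sketch of that pattern — the `Record` class here is hypothetical, standing in for a model instance:

```python
class Record:
    """Tiny stand-in for a model: tracks attributes and whether save() ran."""

    def __init__(self, **fields):
        self.__dict__.update(fields)
        self.saved = False

    def save(self):
        self.saved = True

    def change(self, **changed_fields):
        # Mirror of chamber's change(): set each field, return self for chaining.
        for name, value in changed_fields.items():
            setattr(self, name, value)
        return self

    def change_and_save(self, **changed_fields):
        self.change(**changed_fields).save()
        return self

r = Record(status='new')
r.change_and_save(status='done')
```

Returning `self` from both helpers is what allows the fluent call chains (`obj.change(...).change_and_save(...)`) that chamber's shortcuts encourage.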
druids/django-chamber | chamber/models/__init__.py | SmartModel.change_and_save | python | def change_and_save(self, update_only_changed_fields=False, **changed_fields):
change_and_save(self, update_only_changed_fields=update_only_changed_fields, **changed_fields)
return self | Changes a given `changed_fields` on this object, saves it and returns itself.
:param update_only_changed_fields: only changed fields will be updated in the database.
:param changed_fields: fields to change.
:return: self | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/models/__init__.py#L475-L483 | [
"def change_and_save(obj, update_only_changed_fields=False, save_kwargs=None, **changed_fields):\n \"\"\"\n Changes a given `changed_fields` on object, saves it and returns changed object.\n \"\"\"\n from chamber.models import SmartModel\n\n change(obj, **changed_fields)\n if update_only_changed_f... | class SmartModel(AuditModel, metaclass=SmartModelBase):
objects = SmartQuerySet.as_manager()
dispatchers = []
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.is_adding = True
self.is_changing = False
self.changed_fields = DynamicChangedFields(self)
self.post_save = Signal(self)
@classmethod
def from_db(cls, db, field_names, values):
new = super().from_db(db, field_names, values)
new.is_adding = False
new.is_changing = True
new.changed_fields = DynamicChangedFields(new)
return new
@property
def has_changed(self):
return bool(self.changed_fields)
@property
def initial_values(self):
return self.changed_fields.initial_values
def full_clean(self, exclude=None, *args, **kwargs):
errors = {}
for field in self._meta.fields:
if (not exclude or field.name not in exclude) and hasattr(self, 'clean_{}'.format(field.name)):
try:
getattr(self, 'clean_{}'.format(field.name))()
except ValidationError as er:
errors[field.name] = er
if errors:
raise ValidationError(errors)
super().full_clean(exclude=exclude, *args, **kwargs)
def _clean_save(self, *args, **kwargs):
self._persistence_clean(*args, **kwargs)
def _clean_delete(self, *args, **kwargs):
self._persistence_clean(*args, **kwargs)
def _clean_pre_save(self, *args, **kwargs):
self._clean_save(*args, **kwargs)
def _clean_pre_delete(self, *args, **kwargs):
self._clean_delete(*args, **kwargs)
def _clean_post_save(self, *args, **kwargs):
self._clean_save(*args, **kwargs)
def _clean_post_delete(self, *args, **kwargs):
self._clean_delete(*args, **kwargs)
def _persistence_clean(self, *args, **kwargs):
exclude = kwargs.pop('exclude', None)
try:
self.full_clean(exclude=exclude)
except ValidationError as er:
if hasattr(er, 'error_dict'):
raise PersistenceException(', '.join(
('%s: %s' % (key, ', '.join(map(force_text, val))) for key, val in er.message_dict.items())))
else:
raise PersistenceException(', '.join(map(force_text, er.messages)))
def _get_save_extra_kwargs(self):
return {}
def _pre_save(self, *args, **kwargs):
pass
def _call_pre_save(self, *args, **kwargs):
self._pre_save(*args, **kwargs)
def _save(self, update_only_changed_fields=False, is_cleaned_pre_save=None, is_cleaned_post_save=None,
force_insert=False, force_update=False, using=None, update_fields=None, *args, **kwargs):
is_cleaned_pre_save = (
self._smart_meta.is_cleaned_pre_save if is_cleaned_pre_save is None else is_cleaned_pre_save
)
is_cleaned_post_save = (
self._smart_meta.is_cleaned_post_save if is_cleaned_post_save is None else is_cleaned_post_save
)
origin = self.__class__
kwargs.update(self._get_save_extra_kwargs())
self._call_pre_save(self.is_changing, self.changed_fields, *args, **kwargs)
if is_cleaned_pre_save:
self._clean_pre_save(*args, **kwargs)
dispatcher_pre_save.send(sender=origin, instance=self, change=self.is_changing,
changed_fields=self.changed_fields.get_static_changes(),
*args, **kwargs)
if not update_fields and update_only_changed_fields:
update_fields = list(self.changed_fields.keys()) + ['changed_at']
# remove primary key from updating fields
if self._meta.pk.name in update_fields:
update_fields.remove(self._meta.pk.name)
super().save(force_insert=force_insert, force_update=force_update, using=using,
update_fields=update_fields)
self._call_post_save(self.is_changing, self.changed_fields, *args, **kwargs)
if is_cleaned_post_save:
self._clean_post_save(*args, **kwargs)
dispatcher_post_save.send(sender=origin, instance=self, change=self.is_changing,
changed_fields=self.changed_fields.get_static_changes(),
*args, **kwargs)
self.post_save.send()
def _post_save(self, *args, **kwargs):
pass
def _call_post_save(self, *args, **kwargs):
self._post_save(*args, **kwargs)
def save_simple(self, *args, **kwargs):
super().save(*args, **kwargs)
def save(self, update_only_changed_fields=False, *args, **kwargs):
if self._smart_meta.is_save_atomic:
with transaction.atomic():
self._save(update_only_changed_fields=update_only_changed_fields, *args, **kwargs)
else:
self._save(update_only_changed_fields=update_only_changed_fields, *args, **kwargs)
self.is_adding = False
self.is_changing = True
self.changed_fields = DynamicChangedFields(self)
def _pre_delete(self, *args, **kwargs):
pass
def _delete(self, is_cleaned_pre_delete=None, is_cleaned_post_delete=None, *args, **kwargs):
is_cleaned_pre_delete = (
self._smart_meta.is_cleaned_pre_delete if is_cleaned_pre_delete is None else is_cleaned_pre_delete
)
is_cleaned_post_delete = (
self._smart_meta.is_cleaned_post_delete if is_cleaned_post_delete is None else is_cleaned_post_delete
)
self._pre_delete(*args, **kwargs)
if is_cleaned_pre_delete:
self._clean_pre_delete(*args, **kwargs)
super().delete(*args, **kwargs)
self._post_delete(*args, **kwargs)
if is_cleaned_post_delete:
self._clean_post_delete(*args, **kwargs)
def _post_delete(self, *args, **kwargs):
pass
def delete(self, *args, **kwargs):
if self._smart_meta.is_delete_atomic:
with transaction.atomic():
self._delete(*args, **kwargs)
else:
self._delete(*args, **kwargs)
def refresh_from_db(self, *args, **kwargs):
super().refresh_from_db(*args, **kwargs)
for key, value in self.__class__.__dict__.items():
if isinstance(value, cached_property):
self.__dict__.pop(key, None)
self.is_adding = False
self.is_changing = True
self.changed_fields = DynamicChangedFields(self)
if StrictVersion(django.get_version()) < StrictVersion('2.0'):
for field in [f for f in self._meta.get_fields() if f.is_relation]:
# For Generic relation related model is None
# https://docs.djangoproject.com/en/2.1/ref/models/meta/#migrating-from-the-old-api
cache_key = field.get_cache_name() if field.related_model else field.cache_attr
if cache_key in self.__dict__:
del self.__dict__[cache_key]
return self
def change(self, **changed_fields):
"""
Changes a given `changed_fields` on this object and returns itself.
:param changed_fields: fields to change
:return: self
"""
change(self, **changed_fields)
return self
class Meta:
abstract = True
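The `changed_fields`/`initial_values` machinery above is backed by `DynamicChangedFields`, which compares the instance against a snapshot taken when the model is loaded. A minimal stand-alone sketch of that idea (the `ChangeTracker` and `Invoice` names and the plain-attribute model are illustrative, not chamber's implementation):

```python
class ChangeTracker:
    """Track which attributes of an object differ from an initial snapshot."""

    def __init__(self, obj, field_names):
        self._obj = obj
        self._field_names = field_names
        # Snapshot taken when the tracker is created (e.g. right after load).
        self.initial_values = {name: getattr(obj, name) for name in field_names}

    def changed(self):
        """Return {field: (old, new)} for every attribute that changed."""
        return {
            name: (self.initial_values[name], getattr(self._obj, name))
            for name in self._field_names
            if getattr(self._obj, name) != self.initial_values[name]
        }


class Invoice:
    def __init__(self, number, paid):
        self.number = number
        self.paid = paid


invoice = Invoice(number="2024-001", paid=False)
tracker = ChangeTracker(invoice, ("number", "paid"))
invoice.paid = True
```

Resetting the tracker after a successful save (as `SmartModel.save` does with a fresh `DynamicChangedFields`) is just a matter of constructing a new `ChangeTracker` from the saved instance.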
|
druids/django-chamber | chamber/multidomains/auth/backends.py | ModelBackend.get_group_permissions | python | def get_group_permissions(self, user_obj, obj=None):
if user_obj.is_anonymous() or obj is not None:
return set()
if not hasattr(user_obj, '_group_perm_cache'):
if user_obj.is_superuser:
perms = Permission.objects.all()
else:
user_groups_field = get_user_class()._meta.get_field('groups') # pylint: disable=W0212
user_groups_query = 'group__%s' % user_groups_field.related_query_name()
perms = Permission.objects.filter(**{user_groups_query: user_obj})
perms = perms.values_list('content_type__app_label', 'codename').order_by()
user_obj._group_perm_cache = set(["%s.%s" % (ct, name) for ct, name in perms]) # pylint: disable=W0212
return user_obj._group_perm_cache | Returns a set of permission strings that this user has through his/her
groups. | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/multidomains/auth/backends.py#L25-L41 | null | class ModelBackend(OriginModelBackend):
"""
Authenticates against settings.AUTH_USER_MODEL.
"""
def authenticate(self, username=None, password=None, **kwargs):
UserModel = get_user_class()
if username is None:
username = kwargs.get(UserModel.USERNAME_FIELD)
try:
user = UserModel._default_manager.get_by_natural_key(username) # pylint: disable=W0212
if user.check_password(password):
return user
except UserModel.DoesNotExist:
# Run the default password hasher once to reduce the timing
# difference between an existing and a non-existing user (#20760).
UserModel().set_password(password)
# pylint: disable=W0212
def get_user(self, user_id):
UserModel = get_user_class()
try:
return UserModel._default_manager.get(pk=user_id)
except UserModel.DoesNotExist:
return None
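The `_group_perm_cache` built in `get_group_permissions` flattens `(app_label, codename)` pairs into Django-style `"app_label.codename"` permission strings. That flattening step, isolated from the ORM (the function name is illustrative):

```python
def build_perm_strings(pairs):
    """Flatten (app_label, codename) pairs into 'app_label.codename' strings."""
    return {"%s.%s" % (app_label, codename) for app_label, codename in pairs}


perms = build_perm_strings([("auth", "add_user"), ("blog", "change_post")])
```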
|
druids/django-chamber | chamber/formatters/__init__.py | natural_number_with_currency | python | def natural_number_with_currency(number, currency, show_decimal_place=True, use_nbsp=True):
humanized = '{} {}'.format(
numberformat.format(
number=number,
decimal_sep=',',
decimal_pos=2 if show_decimal_place else 0,
grouping=3,
thousand_sep=' ',
force_grouping=True
),
force_text(currency)
)
return mark_safe(humanized.replace(' ', '\u00a0')) if use_nbsp else humanized | Return a given `number` formatted as a price for humans. | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/formatters/__init__.py#L6-L21 | null | from django.utils import numberformat
from django.utils.encoding import force_text
from django.utils.safestring import mark_safe
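`natural_number_with_currency` delegates the grouping to Django's `numberformat`. A dependency-free approximation of the same output — comma as decimal separator, space-grouped thousands, optional non-breaking spaces — is sketched below; this is not Django's implementation and skips `mark_safe`:

```python
def format_price(number, currency, show_decimal_place=True, use_nbsp=True):
    """Approximate chamber's price formatting without Django."""
    decimal_pos = 2 if show_decimal_place else 0
    # Format with ','-grouped thousands, then swap separators: ',' -> ' ', '.' -> ','
    text = "{:,.{prec}f}".format(number, prec=decimal_pos)
    text = text.replace(",", " ").replace(".", ",")
    humanized = "{} {}".format(text, currency)
    # \u00a0 is a non-breaking space, so "1 234 CZK" never wraps mid-price.
    return humanized.replace(" ", "\u00a0") if use_nbsp else humanized
```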
|
druids/django-chamber | chamber/utils/decorators.py | singleton | python | def singleton(klass):
instances = {}
def getinstance(*args, **kwargs):
if klass not in instances:
instances[klass] = klass(*args, **kwargs)
return instances[klass]
return wraps(klass)(getinstance) | Create singleton from class | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/utils/decorators.py#L13-L23 | null | from functools import wraps
from django.conf import settings
from django.utils import translation
class classproperty(property):
def __get__(self, cls, owner):
return self.fget.__get__(None, owner)()
def translation_activate_block(function=None, language=None):
"""
Activate language only for one method or function
"""
def _translation_activate_block(function):
def _decorator(*args, **kwargs):
tmp_language = translation.get_language()
try:
translation.activate(language or settings.LANGUAGE_CODE)
return function(*args, **kwargs)
finally:
translation.activate(tmp_language)
return wraps(function)(_decorator)
if function:
return _translation_activate_block(function)
else:
return _translation_activate_block
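The `singleton` decorator above caches one instance per class, so every call returns the same object. A runnable usage sketch (the `Config` class is illustrative):

```python
from functools import wraps


def singleton(klass):
    """Return a wrapper that creates at most one instance of klass."""
    instances = {}

    def getinstance(*args, **kwargs):
        if klass not in instances:
            instances[klass] = klass(*args, **kwargs)
        return instances[klass]

    return wraps(klass)(getinstance)


@singleton
class Config:
    def __init__(self):
        self.values = {}


a = Config()
b = Config()
```

Note that after decoration `Config` is actually the `getinstance` closure; `wraps` copies the class's name and docstring onto it so introspection stays reasonable.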
|
druids/django-chamber | chamber/utils/decorators.py | translation_activate_block | python | def translation_activate_block(function=None, language=None):
def _translation_activate_block(function):
def _decorator(*args, **kwargs):
tmp_language = translation.get_language()
try:
translation.activate(language or settings.LANGUAGE_CODE)
return function(*args, **kwargs)
finally:
translation.activate(tmp_language)
return wraps(function)(_decorator)
if function:
return _translation_activate_block(function)
else:
return _translation_activate_block | Activate language only for one method or function | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/utils/decorators.py#L26-L44 | [
"def _translation_activate_block(function):\n def _decorator(*args, **kwargs):\n tmp_language = translation.get_language()\n try:\n translation.activate(language or settings.LANGUAGE_CODE)\n return function(*args, **kwargs)\n finally:\n translation.activate(t... | from functools import wraps
from django.conf import settings
from django.utils import translation
class classproperty(property):
def __get__(self, cls, owner):
return self.fget.__get__(None, owner)()
def singleton(klass):
"""
Create singleton from class
"""
instances = {}
def getinstance(*args, **kwargs):
if klass not in instances:
instances[klass] = klass(*args, **kwargs)
return instances[klass]
return wraps(klass)(getinstance)
|
druids/django-chamber | chamber/importers/__init__.py | AbstractCSVImporter.get_fields_dict | python | def get_fields_dict(self, row):
return {k: getattr(self, 'clean_{}'.format(k), lambda x: x)(v.strip() if isinstance(v, str)
else None)
for k, v in zip_longest(self.get_fields(), row)} | Returns a dict of field name and cleaned value pairs to initialize the model.
Beware, it aligns the lists of fields and row values with Nones to allow for adding fields not found in the CSV.
Whitespace around the value of the cell is stripped. | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/importers/__init__.py#L90-L98 | [
"def get_fields(self):\n return self.fields\n"
] | class AbstractCSVImporter:
"""
Abstract CSV importer provides an easy way to implement loading a CSV file into a Django model.
* To alter or validate a field value, define a clean_field_name() method, much as in Django forms.
* The class implements the __call__ method so that concrete importers can be called as regular functions.
* __call__ accepts a custom CSV path, so different CSV files can be imported with the same importer instance.
* All class properties can be set dynamically via getters.
"""
skip_header = True # By default first line of CSV is assumed to contain headers and is skipped
fields = () # Must correspond to columns in the CSV but columns set dynamically in clean methods can be appended.
csv_path = '' # Path to the CSV file relative to Django PROJECT_DIR
delimiter = ';'
encoding = 'utf-8'
def __call__(self, csv_path):
"""csv_path is a required parameter as calling the function without CSV does not make sense"""
self.import_csv(custom_csv_path=csv_path)
def import_csv(self, custom_csv_path=None):
with io.open(self.get_filename(custom_csv_path=custom_csv_path), encoding=self.get_encoding()) as f:
reader = csv.reader(f, delimiter=self.get_delimiter())
if self.get_skip_header():
next(reader, None)
self.import_rows(
reader,
row_count=simple_count(self.get_filename(custom_csv_path=custom_csv_path), encoding=self.get_encoding())
)
def import_rows(self, reader, row_count=0):
raise NotImplementedError
@property
def out_stream(self):
"""
By default, output stream is essentially turned off by supplying dummy StringIO.
Override this property if you want to direct output somewhere, e.g. in Django commands.
"""
return DummyOutputStream()
def get_filename(self, custom_csv_path=None):
return os.path.join(settings.PROJECT_DIR, custom_csv_path or self.get_csv_path())
def get_csv_path(self):
"""Override this in case you need to set the CSV path dynamically."""
return self.csv_path
def get_encoding(self):
return self.encoding
def get_delimiter(self):
return str(self.delimiter)
def get_skip_header(self):
return self.skip_header
def get_fields(self):
return self.fields
def _pre_import_rows(self, row_count):
pass
def _post_import_rows(self, created_count, updated_count=0):
pass
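`get_fields_dict` aligns the declared fields with the CSV cells via `zip_longest`, strips whitespace, and routes each value through an optional `clean_<field>` hook. A stand-alone sketch of the same alignment, with the importer reduced to a plain object (the `RowCleaner` class and its fields are illustrative):

```python
from itertools import zip_longest


class RowCleaner:
    fields = ("name", "price", "note")  # "note" has no matching column in the CSV

    def clean_price(self, value):
        return int(value)

    def get_fields_dict(self, row):
        # Fields without a matching cell (and extra cells without a field) pair with None.
        return {
            field: getattr(self, "clean_{}".format(field), lambda x: x)(
                value.strip() if isinstance(value, str) else None
            )
            for field, value in zip_longest(self.fields, row)
        }


cleaner = RowCleaner()
result = cleaner.get_fields_dict(["  Widget ", " 42 "])
```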
|
druids/django-chamber | chamber/models/humanized_helpers/__init__.py | price_humanized | python | def price_humanized(value, inst, currency=None):
return (natural_number_with_currency(value, ugettext('CZK') if currency is None else currency) if value is not None
else ugettext('(None)')) | Return a humanized price | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/models/humanized_helpers/__init__.py#L6-L11 | [
"def natural_number_with_currency(number, currency, show_decimal_place=True, use_nbsp=True):\n \"\"\"\n Return a given `number` formatter a price for humans.\n \"\"\"\n humanized = '{} {}'.format(\n numberformat.format(\n number=number,\n decimal_sep=',',\n decima... | from django.utils.translation import ugettext
from chamber.formatters import natural_number_with_currency
|
druids/django-chamber | chamber/shortcuts.py | change | python | def change(obj, **changed_fields):
obj_field_names = {
field.name for field in obj._meta.fields
} | {
field.attname for field in obj._meta.fields
} | {'pk'}
for field_name, value in changed_fields.items():
if field_name not in obj_field_names:
raise ValueError("'{}' is an invalid field name".format(field_name))
setattr(obj, field_name, value)
return obj | Changes a given `changed_fields` on object and returns changed object. | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/shortcuts.py#L54-L68 | null | from datetime import date, datetime, time
from django.http.response import Http404
from django.shortcuts import _get_queryset
from django.utils import timezone
from django.core.exceptions import ValidationError
def get_object_or_none(klass, *args, **kwargs):
queryset = _get_queryset(klass)
try:
return queryset.get(*args, **kwargs)
except (queryset.model.DoesNotExist, ValueError, ValidationError):
return None
def get_object_or_404(klass, *args, **kwargs):
queryset = _get_queryset(klass)
try:
return queryset.get(*args, **kwargs)
except (queryset.model.DoesNotExist, ValueError, ValidationError):
raise Http404
def distinct_field(klass, *args, **kwargs):
return _get_queryset(klass).order_by().values_list(*args, **kwargs).distinct()
def filter_or_exclude_by_date(negate, klass, **kwargs):
filter_kwargs = {}
for key, date_value in kwargs.items():
assert isinstance(date_value, date)
date_range = (
timezone.make_aware(datetime.combine(date_value, time.min), timezone.get_current_timezone()),
timezone.make_aware(datetime.combine(date_value, time.max), timezone.get_current_timezone())
)
filter_kwargs['%s__range' % key] = date_range
if negate:
return _get_queryset(klass).exclude(**filter_kwargs)
else:
return _get_queryset(klass).filter(**filter_kwargs)
def filter_by_date(klass, **kwargs):
return filter_or_exclude_by_date(False, klass, **kwargs)
def exclude_by_date(klass, **kwargs):
return filter_or_exclude_by_date(True, klass, **kwargs)
def change_and_save(obj, update_only_changed_fields=False, save_kwargs=None, **changed_fields):
"""
Changes a given `changed_fields` on object, saves it and returns changed object.
"""
from chamber.models import SmartModel
change(obj, **changed_fields)
if update_only_changed_fields and not isinstance(obj, SmartModel):
raise TypeError('update_only_changed_fields can be used only with SmartModel')
save_kwargs = save_kwargs if save_kwargs is not None else {}
if update_only_changed_fields:
save_kwargs['update_only_changed_fields'] = True
obj.save(**save_kwargs)
return obj
def bulk_change(iterable, **changed_fields):
"""
Changes a given `changed_fields` on each object in a given `iterable`, returns the changed objects.
"""
return [change(obj, **changed_fields) for obj in iterable]
def bulk_change_and_save(iterable, update_only_changed_fields=False, save_kwargs=None, **changed_fields):
"""
Changes a given `changed_fields` on each object in a given `iterable`, saves objects
and returns the changed objects.
"""
return [
change_and_save(obj, update_only_changed_fields=update_only_changed_fields, save_kwargs=save_kwargs,
**changed_fields)
for obj in iterable
]
def bulk_save(iterable):
"""
Saves objects in a given `iterable`.
"""
return [obj.save() for obj in iterable]
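`change` validates field names against the model's `_meta` before assigning. A simplified stand-alone version of the same guard, with an explicit allowed-name set in place of Django's `_meta` (the `allowed_fields` parameter and `Article` class are illustrative):

```python
def change(obj, allowed_fields, **changed_fields):
    """Set changed_fields on obj, rejecting names outside allowed_fields."""
    for field_name, value in changed_fields.items():
        if field_name not in allowed_fields:
            raise ValueError("'{}' is an invalid field name".format(field_name))
        setattr(obj, field_name, value)
    return obj


class Article:
    def __init__(self):
        self.title = ""
        self.published = False


article = change(Article(), {"title", "published"}, title="Hello")
```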
|
druids/django-chamber | chamber/shortcuts.py | change_and_save | python | def change_and_save(obj, update_only_changed_fields=False, save_kwargs=None, **changed_fields):
from chamber.models import SmartModel
change(obj, **changed_fields)
if update_only_changed_fields and not isinstance(obj, SmartModel):
raise TypeError('update_only_changed_fields can be used only with SmartModel')
save_kwargs = save_kwargs if save_kwargs is not None else {}
if update_only_changed_fields:
save_kwargs['update_only_changed_fields'] = True
obj.save(**save_kwargs)
return obj | Changes a given `changed_fields` on object, saves it and returns changed object. | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/shortcuts.py#L71-L86 | [
"def change(obj, **changed_fields):\n \"\"\"\n Changes a given `changed_fields` on object and returns changed object.\n \"\"\"\n obj_field_names = {\n field.name for field in obj._meta.fields\n } | {\n field.attname for field in obj._meta.fields\n } | {'pk'}\n\n for field_name, va... | from datetime import date, datetime, time
from django.http.response import Http404
from django.shortcuts import _get_queryset
from django.utils import timezone
from django.core.exceptions import ValidationError
def get_object_or_none(klass, *args, **kwargs):
queryset = _get_queryset(klass)
try:
return queryset.get(*args, **kwargs)
except (queryset.model.DoesNotExist, ValueError, ValidationError):
return None
def get_object_or_404(klass, *args, **kwargs):
queryset = _get_queryset(klass)
try:
return queryset.get(*args, **kwargs)
except (queryset.model.DoesNotExist, ValueError, ValidationError):
raise Http404
def distinct_field(klass, *args, **kwargs):
return _get_queryset(klass).order_by().values_list(*args, **kwargs).distinct()
def filter_or_exclude_by_date(negate, klass, **kwargs):
filter_kwargs = {}
for key, date_value in kwargs.items():
assert isinstance(date_value, date)
date_range = (
timezone.make_aware(datetime.combine(date_value, time.min), timezone.get_current_timezone()),
timezone.make_aware(datetime.combine(date_value, time.max), timezone.get_current_timezone())
)
filter_kwargs['%s__range' % key] = date_range
if negate:
return _get_queryset(klass).exclude(**filter_kwargs)
else:
return _get_queryset(klass).filter(**filter_kwargs)
def filter_by_date(klass, **kwargs):
return filter_or_exclude_by_date(False, klass, **kwargs)
def exclude_by_date(klass, **kwargs):
return filter_or_exclude_by_date(True, klass, **kwargs)
def change(obj, **changed_fields):
"""
Changes a given `changed_fields` on object and returns changed object.
"""
obj_field_names = {
field.name for field in obj._meta.fields
} | {
field.attname for field in obj._meta.fields
} | {'pk'}
for field_name, value in changed_fields.items():
if field_name not in obj_field_names:
raise ValueError("'{}' is an invalid field name".format(field_name))
setattr(obj, field_name, value)
return obj
def bulk_change(iterable, **changed_fields):
"""
Changes a given `changed_fields` on each object in a given `iterable`, returns the changed objects.
"""
return [change(obj, **changed_fields) for obj in iterable]
def bulk_change_and_save(iterable, update_only_changed_fields=False, save_kwargs=None, **changed_fields):
"""
Changes a given `changed_fields` on each object in a given `iterable`, saves objects
and returns the changed objects.
"""
return [
change_and_save(obj, update_only_changed_fields=update_only_changed_fields, save_kwargs=save_kwargs,
**changed_fields)
for obj in iterable
]
def bulk_save(iterable):
"""
Saves objects in a given `iterable`.
"""
return [obj.save() for obj in iterable]
|
druids/django-chamber | chamber/shortcuts.py | bulk_change_and_save | python | def bulk_change_and_save(iterable, update_only_changed_fields=False, save_kwargs=None, **changed_fields):
return [
change_and_save(obj, update_only_changed_fields=update_only_changed_fields, save_kwargs=save_kwargs,
**changed_fields)
for obj in iterable
] | Changes a given `changed_fields` on each object in a given `iterable`, saves objects
and returns the changed objects. | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/shortcuts.py#L96-L105 | null | from datetime import date, datetime, time
from django.http.response import Http404
from django.shortcuts import _get_queryset
from django.utils import timezone
from django.core.exceptions import ValidationError
def get_object_or_none(klass, *args, **kwargs):
queryset = _get_queryset(klass)
try:
return queryset.get(*args, **kwargs)
except (queryset.model.DoesNotExist, ValueError, ValidationError):
return None
def get_object_or_404(klass, *args, **kwargs):
queryset = _get_queryset(klass)
try:
return queryset.get(*args, **kwargs)
except (queryset.model.DoesNotExist, ValueError, ValidationError):
raise Http404
def distinct_field(klass, *args, **kwargs):
return _get_queryset(klass).order_by().values_list(*args, **kwargs).distinct()
def filter_or_exclude_by_date(negate, klass, **kwargs):
filter_kwargs = {}
for key, date_value in kwargs.items():
assert isinstance(date_value, date)
date_range = (
timezone.make_aware(datetime.combine(date_value, time.min), timezone.get_current_timezone()),
timezone.make_aware(datetime.combine(date_value, time.max), timezone.get_current_timezone())
)
filter_kwargs['%s__range' % key] = date_range
if negate:
return _get_queryset(klass).exclude(**filter_kwargs)
else:
return _get_queryset(klass).filter(**filter_kwargs)
def filter_by_date(klass, **kwargs):
return filter_or_exclude_by_date(False, klass, **kwargs)
def exclude_by_date(klass, **kwargs):
return filter_or_exclude_by_date(True, klass, **kwargs)
def change(obj, **changed_fields):
"""
Changes a given `changed_fields` on object and returns changed object.
"""
obj_field_names = {
field.name for field in obj._meta.fields
} | {
field.attname for field in obj._meta.fields
} | {'pk'}
for field_name, value in changed_fields.items():
if field_name not in obj_field_names:
raise ValueError("'{}' is an invalid field name".format(field_name))
setattr(obj, field_name, value)
return obj
def change_and_save(obj, update_only_changed_fields=False, save_kwargs=None, **changed_fields):
"""
Changes a given `changed_fields` on object, saves it and returns changed object.
"""
from chamber.models import SmartModel
change(obj, **changed_fields)
if update_only_changed_fields and not isinstance(obj, SmartModel):
raise TypeError('update_only_changed_fields can be used only with SmartModel')
save_kwargs = save_kwargs if save_kwargs is not None else {}
if update_only_changed_fields:
save_kwargs['update_only_changed_fields'] = True
obj.save(**save_kwargs)
return obj
def bulk_change(iterable, **changed_fields):
"""
Changes a given `changed_fields` on each object in a given `iterable`, returns the changed objects.
"""
return [change(obj, **changed_fields) for obj in iterable]
def bulk_save(iterable):
"""
Saves objects in a given `iterable`.
"""
return [obj.save() for obj in iterable]
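`bulk_change_and_save` is just "apply the same changes to each object, then save each". A simplified stand-alone sketch of that loop — it omits the field-name validation and the SmartModel-only `update_only_changed_fields` switch, and the `Record` model with its flag-setting `save()` is illustrative:

```python
def bulk_change_and_save(iterable, **changed_fields):
    """Apply changed_fields to every object, save each, return the list."""
    changed = []
    for obj in iterable:
        for name, value in changed_fields.items():
            setattr(obj, name, value)
        obj.save()
        changed.append(obj)
    return changed


class Record:
    def __init__(self, status):
        self.status = status
        self.saved = False

    def save(self):
        self.saved = True


records = bulk_change_and_save([Record("new"), Record("new")], status="done")
```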
|
druids/django-chamber | chamber/models/fields.py | generate_random_upload_path | python | def generate_random_upload_path(instance, filename):
return os.path.join(instance.__class__.__name__.lower(), uuid().hex, filename) | Pass this function to upload_to argument of FileField to store the file on an unguessable path.
The format of the path is class_name/hash/original_filename. | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/models/fields.py#L89-L94 | null | import os
from decimal import Decimal
from uuid import uuid4 as uuid
from django.core.exceptions import ValidationError
from django.core.validators import MaxValueValidator, MinValueValidator
from django.db import models
from django.db.models import FileField as OriginFileField
from django.db.models.fields import DecimalField as OriginDecimalField
from django.forms import forms
from django.utils.encoding import force_text
from django.utils.translation import ugettext
from chamber import config
from chamber.forms import fields as chamber_fields
from chamber.forms.validators import (
RestrictedFileValidator, AllowedContentTypesByFilenameFileValidator, AllowedContentTypesByContentFileValidator
)
from chamber.models.humanized_helpers import price_humanized
from chamber.utils.datastructures import SequenceChoicesEnumMixin, SubstatesChoicesNumEnum
try:
from sorl.thumbnail import ImageField as OriginImageField
except ImportError:
from django.db.models import ImageField as OriginImageField
class DecimalField(OriginDecimalField):
def __init__(self, *args, **kwargs):
self.step = kwargs.pop('step', 'any')
self.min = kwargs.pop('min', None)
self.max = kwargs.pop('max', None)
kwargs['validators'] = kwargs.get('validators', [])
if self.min is not None:
kwargs['validators'].append(MinValueValidator(self.min))
if self.max is not None:
kwargs['validators'].append(MaxValueValidator(self.max))
super().__init__(*args, **kwargs)
def formfield(self, **kwargs):
defaults = {
'form_class': chamber_fields.DecimalField,
'step': self.step,
'min': self.min,
'max': self.max,
}
defaults.update(kwargs)
return super().formfield(**defaults)
class RestrictedFileFieldMixin:
"""
Same as FileField, but you can specify:
* allowed_content_types - list of allowed content types. Example: ['application/json', 'image/jpeg']
* max_upload_size - a number indicating the maximum file size allowed for upload in MB.
"""
def __init__(self, *args, **kwargs):
max_upload_size = kwargs.pop('max_upload_size', config.CHAMBER_MAX_FILE_UPLOAD_SIZE) * 1024 * 1024
allowed_content_types = kwargs.pop('allowed_content_types', None)
super().__init__(*args, **kwargs)
self.validators.append(RestrictedFileValidator(max_upload_size))
if allowed_content_types:
self.validators = tuple(self.validators) + (
AllowedContentTypesByFilenameFileValidator(allowed_content_types),
AllowedContentTypesByContentFileValidator(allowed_content_types),
)
def generate_filename(self, instance, filename):
"""
Removes non-ASCII characters from the filename.
"""
from unidecode import unidecode
return super().generate_filename(instance, unidecode(force_text(filename)))
class FileField(RestrictedFileFieldMixin, OriginFileField):
pass
class ImageField(RestrictedFileFieldMixin, OriginImageField):
def __init__(self, *args, **kwargs):
allowed_content_types = kwargs.pop('allowed_content_types', config.CHAMBER_DEFAULT_IMAGE_ALLOWED_CONTENT_TYPES)
super().__init__(allowed_content_types=allowed_content_types, *args, **kwargs)
class PrevValuePositiveIntegerField(models.PositiveIntegerField):
def __init__(self, *args, **kwargs):
self.copy_field_name = kwargs.pop('copy_field_name', None)
super().__init__(*args, **kwargs)
def pre_save(self, model_instance, add):
# During migrations no changed_fields is set for a model
if hasattr(model_instance, 'changed_fields') and self.copy_field_name in model_instance.changed_fields:
setattr(
model_instance, self.attname,
getattr(model_instance, self.copy_field_name)
if model_instance.is_adding else model_instance.initial_values[self.copy_field_name]
)
return super().pre_save(model_instance, add)
class SubchoicesPositiveIntegerField(models.PositiveIntegerField):
empty_values = ()
def __init__(self, *args, **kwargs):
self.enum = kwargs.pop('enum', None)
self.supchoices_field_name = kwargs.pop('supchoices_field_name', None)
assert self.enum is None or isinstance(self.enum, SubstatesChoicesNumEnum)
if self.enum:
kwargs['choices'] = self.enum.choices
super().__init__(*args, **kwargs)
def _get_supvalue(self, model_instance):
return getattr(model_instance, self.supchoices_field_name)
def clean(self, value, model_instance):
if self.enum and self._get_supvalue(model_instance) not in self.enum.categories:
return None
else:
return super().clean(value, model_instance)
def _raise_error_if_value_should_be_empty(self, value, subvalue):
if self.enum and subvalue not in self.enum.categories and value is not None:
raise ValidationError(ugettext('Value must be empty'))
def _raise_error_if_value_not_allowed(self, value, subvalue, model_instance):
allowed_values = self.enum.get_allowed_states(getattr(model_instance, self.supchoices_field_name))
if subvalue in self.enum.categories and value not in allowed_values:
raise ValidationError(ugettext('Allowed choices are {}.').format(
', '.join(('{} ({})'.format(*(self.enum.get_label(val), val)) for val in allowed_values))
))
def validate(self, value, model_instance):
if not self.enum:
return
self._raise_error_if_value_should_be_empty(value, self._get_supvalue(model_instance))
self._raise_error_if_value_not_allowed(value, self._get_supvalue(model_instance), model_instance)
class EnumSequenceFieldMixin:
# TODO Once SmartWidget mixin is not in is-core, add formfield method with the appropriate widget
def __init__(self, *args, **kwargs):
self.enum = kwargs.pop('enum', None)
self.prev_field_name = kwargs.pop('prev_field', None)
assert self.enum is None or isinstance(self.enum, SequenceChoicesEnumMixin)
if self.enum:
kwargs['choices'] = self.enum.choices
super().__init__(*args, **kwargs)
def validate(self, value, model_instance):
super().validate(value, model_instance)
if self.enum:
prev_value = model_instance.initial_values[self.attname] if model_instance.is_changing else None
allowed_next_values = self.enum.get_allowed_next_states(prev_value, model_instance)
if ((self.name in model_instance.changed_fields or model_instance.is_adding) and
value not in allowed_next_values):
raise ValidationError(
ugettext('Allowed choices are {}.').format(
', '.join(('{} ({})'.format(*(self.enum.get_label(val), val)) for val in allowed_next_values))))
class EnumSequencePositiveIntegerField(EnumSequenceFieldMixin, models.PositiveIntegerField):
pass
class EnumSequenceCharField(EnumSequenceFieldMixin, models.CharField):
pass
class PriceField(DecimalField):
def __init__(self, *args, **kwargs):
self.currency = kwargs.pop('currency', ugettext('CZK'))
super().__init__(*args, **{
'decimal_places': 2,
'max_digits': 10,
'humanized': lambda val, inst, field: price_humanized(val, inst, currency=field.currency),
**kwargs
})
def formfield(self, **kwargs):
return super(DecimalField, self).formfield(
**{
'form_class': chamber_fields.PriceField,
'currency': self.currency,
**kwargs
}
)
def deconstruct(self):
name, path, args, kwargs = super().deconstruct()
del kwargs['max_digits']
del kwargs['decimal_places']
return name, path, args, kwargs
class PositivePriceField(PriceField):
def __init__(self, *args, **kwargs):
kwargs['validators'] = kwargs.get('validators', [])
kwargs['validators'].append(MinValueValidator(Decimal('0.00')))
super().__init__(*args, **kwargs)
def deconstruct(self):
name, path, args, kwargs = super().deconstruct()
del kwargs['validators']
return name, path, args, kwargs
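`generate_random_upload_path` builds a `class_name/<random hex>/<original filename>` path so uploaded files land at unguessable locations. The same idea stand-alone, outside a Django `FileField` (the `Invoice` class is illustrative):

```python
import os
from uuid import uuid4


def generate_random_upload_path(instance, filename):
    """Store the file under class_name/<32-char random hex>/<original filename>."""
    return os.path.join(instance.__class__.__name__.lower(), uuid4().hex, filename)


class Invoice:
    pass


path = generate_random_upload_path(Invoice(), "scan.pdf")
parts = path.split(os.sep)
```

Because the middle segment is a fresh UUID hex, two uploads of the same filename never collide.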
|
druids/django-chamber | chamber/models/fields.py | RestrictedFileFieldMixin.generate_filename | python | def generate_filename(self, instance, filename):
from unidecode import unidecode
return super().generate_filename(instance, unidecode(force_text(filename))) | Removes non-ASCII characters from the filename. | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/models/fields.py#L69-L75 | null | class RestrictedFileFieldMixin:
"""
Same as FileField, but you can specify:
* allowed_content_types - list of allowed content types. Example: ['application/json', 'image/jpeg']
* max_upload_size - a number indicating the maximum file size allowed for upload in MB.
"""
def __init__(self, *args, **kwargs):
max_upload_size = kwargs.pop('max_upload_size', config.CHAMBER_MAX_FILE_UPLOAD_SIZE) * 1024 * 1024
allowed_content_types = kwargs.pop('allowed_content_types', None)
super().__init__(*args, **kwargs)
self.validators.append(RestrictedFileValidator(max_upload_size))
if allowed_content_types:
self.validators = tuple(self.validators) + (
AllowedContentTypesByFilenameFileValidator(allowed_content_types),
AllowedContentTypesByContentFileValidator(allowed_content_types),
)
|
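`generate_filename` above delegates to the third-party `unidecode` package to strip accents before passing the name on to Django. A rough stdlib-only approximation of that transliteration (real `unidecode` handles far more scripts than combining-mark removal):

```python
import unicodedata

def ascii_filename(filename: str) -> str:
    # Decompose accented characters (NFKD splits "ř" into "r" plus a
    # combining caron), then drop whatever cannot be encoded as ASCII.
    decomposed = unicodedata.normalize("NFKD", filename)
    return decomposed.encode("ascii", "ignore").decode("ascii")

print(ascii_filename("Příloha č. 1.pdf"))  # Priloha c. 1.pdf
```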
druids/django-chamber | chamber/commands/__init__.py | ProgressBarStream.write | python | def write(self, *args, **kwargs):
return self.stream.write(ending="", *args, **kwargs) | Call the stream's write method without linebreaks at line endings. | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/commands/__init__.py#L19-L23 | null | class ProgressBarStream:
"""
OutputStream wrapper to remove default linebreak at line endings.
"""
def __init__(self, stream):
"""
Wrap the given stream.
"""
self.stream = stream
def flush(self):
"""
Call the stream's flush method without any extra arguments.
"""
return self.stream.flush()
|
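The `ProgressBarStream` record wraps a Django management-command output stream whose `write` appends an `ending` (newline by default), overriding it with `ending=""` so progress bars can redraw in place. A self-contained sketch — `FakeOutputWrapper` is a hypothetical stand-in for Django's `OutputWrapper`:

```python
class FakeOutputWrapper:
    # Hypothetical minimal stand-in for Django's OutputWrapper:
    # every write gets `ending` appended.
    def __init__(self):
        self.buffer = []

    def write(self, msg="", ending="\n"):
        self.buffer.append(msg + ending)

    def flush(self):
        pass


class ProgressBarStream:
    """Wrapper removing the default linebreak, as in the record above."""
    def __init__(self, stream):
        self.stream = stream

    def write(self, *args, **kwargs):
        return self.stream.write(ending="", *args, **kwargs)

    def flush(self):
        return self.stream.flush()


raw = FakeOutputWrapper()
ProgressBarStream(raw).write("50%")  # no newline appended
raw.write("50%")                     # newline appended
print(raw.buffer)  # ['50%', '50%\n']
```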
druids/django-chamber | chamber/multidomains/auth/middleware.py | get_token | python | def get_token(request):
if (not request.META.get(header_name_to_django(auth_token_settings.HEADER_NAME)) and
config.CHAMBER_MULTIDOMAINS_OVERTAKER_AUTH_COOKIE_NAME):
ovetaker_auth_token = request.COOKIES.get(config.CHAMBER_MULTIDOMAINS_OVERTAKER_AUTH_COOKIE_NAME)
token = get_object_or_none(Token, key=ovetaker_auth_token, is_active=True)
if utils.get_user_from_token(token).is_authenticated():
return token
return utils.get_token(request) | Returns the token model instance associated with the given request token key.
If no user is retrieved AnonymousToken is returned. | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/multidomains/auth/middleware.py#L13-L25 | [
"def get_object_or_none(klass, *args, **kwargs):\n queryset = _get_queryset(klass)\n try:\n return queryset.get(*args, **kwargs)\n except (queryset.model.DoesNotExist, ValueError, ValidationError):\n return None\n"
] | from django.utils.functional import SimpleLazyObject
from auth_token import utils # pylint: disable=E0401
from auth_token.config import settings as auth_token_settings # pylint: disable=E0401
from auth_token.middleware import TokenAuthenticationMiddleware, get_user # pylint: disable=E0401
from auth_token.models import Token # pylint: disable=E0401
from auth_token.utils import dont_enforce_csrf_checks, header_name_to_django # pylint: disable=E0401
from chamber import config
from chamber.shortcuts import get_object_or_none
class MultiDomainsTokenAuthenticationMiddleware(TokenAuthenticationMiddleware):
def process_request(self, request):
"""
Lazy set user and token
"""
request.token = get_token(request)
request.user = SimpleLazyObject(lambda: get_user(request))
request._dont_enforce_csrf_checks = dont_enforce_csrf_checks(request) # pylint: disable=W0212
|
druids/django-chamber | chamber/multidomains/auth/middleware.py | MultiDomainsTokenAuthenticationMiddleware.process_request | python | def process_request(self, request):
request.token = get_token(request)
request.user = SimpleLazyObject(lambda: get_user(request))
request._dont_enforce_csrf_checks = dont_enforce_csrf_checks(request) | Lazy set user and token | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/multidomains/auth/middleware.py#L30-L36 | [
"def get_token(request):\n \"\"\"\n Returns the token model instance associated with the given request token key.\n If no user is retrieved AnonymousToken is returned.\n \"\"\"\n if (not request.META.get(header_name_to_django(auth_token_settings.HEADER_NAME)) and\n config.CHAMBER_MULTIDOMA... | class MultiDomainsTokenAuthenticationMiddleware(TokenAuthenticationMiddleware):
# pylint: disable=W0212
|
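`process_request` above wraps user resolution in `SimpleLazyObject`, so the token/user lookup only runs if `request.user` is actually touched. A framework-free sketch of that deferred-evaluation proxy (Django's `SimpleLazyObject` proxies many more dunder methods than this):

```python
class LazyObject:
    """Defer calling `factory` until the first attribute access."""
    def __init__(self, factory):
        self._factory = factory
        self._wrapped = None

    def __getattr__(self, name):
        # Only reached for attributes not set in __init__.
        if self._wrapped is None:
            self._wrapped = self._factory()
        return getattr(self._wrapped, name)


calls = []

class User:
    def __init__(self):
        calls.append("resolved")
        self.name = "alice"

user = LazyObject(User)
print(calls)      # [] -- factory not called yet
print(user.name)  # alice -- first access resolves it
print(calls)      # ['resolved']
```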
druids/django-chamber | chamber/utils/transaction.py | atomic | python | def atomic(func):
try:
from reversion.revisions import create_revision
return transaction.atomic(create_revision()(func))
except ImportError:
return transaction.atomic(func) | Decorator helper that overrides django atomic decorator and automatically adds create revision. | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/utils/transaction.py#L14-L23 | null | import logging
from collections import OrderedDict
from django.conf import settings
from django.db import transaction, DEFAULT_DB_ALIAS
from django.db.transaction import get_connection
from django.utils.decorators import ContextDecorator
logger = logging.getLogger(__name__)
class TransactionSignalsContext:
"""
Context object that stores handlers and calls them after a successful pass through the code block
surrounded by the "transaction_signals" decorator. Handlers can be unique or standard. Unique handlers are registered
and executed only once.
"""
def __init__(self):
self._unique_handlers = OrderedDict()
self._handlers = []
def register(self, handler):
if getattr(handler, 'is_unique', False):
if hash(handler) in self._unique_handlers:
self._unique_handlers.get(hash(handler)).join(handler)
else:
self._unique_handlers[hash(handler)] = handler
self._handlers.append(handler)
else:
self._handlers.append(handler)
def handle_all(self):
for handler in self._handlers:
handler()
def join(self, transaction_signals_context):
for handler in transaction_signals_context._handlers:
self.register(handler)
class TransactionSignals(ContextDecorator):
"""
Context decorator that supports usage python keyword "with".
Decorator that adds transaction context to the connection on input.
Finally handlers are called on the output.
"""
def __init__(self, using):
self.using = using
def __enter__(self):
connection = get_connection(self.using)
if not hasattr(connection, 'transaction_signals_context_list'):
connection.transaction_signals_context_list = []
connection.transaction_signals_context_list.append(TransactionSignalsContext())
def __exit__(self, exc_type, exc_value, traceback):
connection = get_connection(self.using)
transaction_signals_context = connection.transaction_signals_context_list.pop()
if not exc_value:
if len(connection.transaction_signals_context_list) == 0:
transaction_signals_context.handle_all()
else:
connection.transaction_signals_context_list[-1].join(transaction_signals_context)
def on_success(handler, using=None):
"""
Register a handler or a function to be called after successful code pass.
If transaction signals are not active the handler/function is called immediately.
:param handler: handler or function that will be called.
:param using: name of the database
"""
connection = get_connection(using)
if getattr(connection, 'transaction_signals_context_list', False):
connection.transaction_signals_context_list[-1].register(handler)
else:
if settings.DEBUG:
logger.warning(
'For on success signal should be activated transaction signals via transaction_signals decorator.'
'Function is called immediately now.'
)
handler()
def transaction_signals(using=None):
"""
Decorator that adds transaction context to the connection on input.
Finally handlers are called on the output.
:param using: name of the database
"""
if callable(using):
return TransactionSignals(DEFAULT_DB_ALIAS)(using)
else:
return TransactionSignals(using)
def atomic_with_signals(func):
"""
Atomic decorator with transaction signals.
"""
try:
from reversion.revisions import create_revision
return transaction.atomic(create_revision()(transaction_signals(func)))
except ImportError:
return transaction.atomic(transaction_signals(func))
class OnSuccessHandler:
"""
Handler class that is used for performing on success operations.
"""
is_unique = False
def __init__(self, using=None, **kwargs):
self.kwargs = kwargs
on_success(self, using=using)
def __call__(self):
self.handle(**self.kwargs)
def handle(self, **kwargs):
"""
Handler operations should be implemented here.
:param kwargs: input data that was sent during handler creation.
"""
raise NotImplementedError
class OneTimeOnSuccessHandler(OnSuccessHandler):
"""
One time handler class that is used for performing on success operations.
Handler is called only once, but data of all calls are stored inside list (kwargs_list).
"""
is_unique = True
def __init__(self, using=None, **kwargs):
self.kwargs_list = (kwargs,)
on_success(self, using=using)
def join(self, handler):
"""
Joins two unique handlers.
"""
self.kwargs_list += handler.kwargs_list
def _get_unique_id(self):
"""
Unique handler must be identified in some way
:return:
"""
return None
def __hash__(self):
return hash((self.__class__, self._get_unique_id()))
def __call__(self):
self.handle(self.kwargs_list)
def handle(self, kwargs_list):
raise NotImplementedError
class InstanceOneTimeOnSuccessHandler(OneTimeOnSuccessHandler):
"""
Use this class to create handler that will be unique per instance and will be called only once per instance.
"""
def _get_instance(self):
instance = self.kwargs_list[0]['instance']
return instance.__class__.objects.get(pk=instance.pk)
def _get_unique_id(self):
instance = self.kwargs_list[0]['instance']
return hash((instance.__class__, instance.pk))
|
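The transaction module above layers a handler queue on top of Django's connection object: callables registered via `on_success` run only after the outermost `transaction_signals` block exits without an exception, and nested blocks hand their handlers up to the enclosing one. A framework-free sketch of that deferred-handler pattern:

```python
class SignalContext:
    # Collects callables per block; fires them only when the outermost
    # block exits cleanly, discarding them on exception.
    def __init__(self):
        self._stack = []

    def __enter__(self):
        self._stack.append([])
        return self

    def __exit__(self, exc_type, exc, tb):
        handlers = self._stack.pop()
        if exc_type is None:
            if self._stack:
                # Nested block: defer to the enclosing context.
                self._stack[-1].extend(handlers)
            else:
                for handler in handlers:
                    handler()
        return False  # never swallow exceptions


calls = []
ctx = SignalContext()
with ctx:
    with ctx:
        ctx._stack[-1].append(lambda: calls.append("inner"))
    ctx._stack[-1].append(lambda: calls.append("outer"))
    assert calls == []  # nothing runs before the outermost exit
print(calls)  # ['inner', 'outer']
```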
druids/django-chamber | chamber/utils/transaction.py | on_success | python | def on_success(handler, using=None):
connection = get_connection(using)
if getattr(connection, 'transaction_signals_context_list', False):
connection.transaction_signals_context_list[-1].register(handler)
else:
if settings.DEBUG:
logger.warning(
'For on success signal should be activated transaction signals via transaction_signals decorator.'
'Function is called immediately now.'
)
handler() | Register a handler or a function to be called after successful code pass.
If transaction signals are not active the handler/function is called immediately.
:param handler: handler or function that will be called.
:param using: name of the database | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/utils/transaction.py#L84-L101 | [
"on_success(lambda: add_number(numbers_list, 0))\n",
"on_success(lambda: add_number(numbers_list, 0))\n",
"on_success(lambda: add_number(numbers_list, 0))\n",
"on_success(lambda: add_number(numbers_list, 1))\n",
"on_success(lambda: add_number(numbers_list, 2))\n"
] | import logging
from collections import OrderedDict
from django.conf import settings
from django.db import transaction, DEFAULT_DB_ALIAS
from django.db.transaction import get_connection
from django.utils.decorators import ContextDecorator
logger = logging.getLogger(__name__)
def atomic(func):
"""
Decorator helper that overrides django atomic decorator and automatically adds create revision.
"""
try:
from reversion.revisions import create_revision
return transaction.atomic(create_revision()(func))
except ImportError:
return transaction.atomic(func)
class TransactionSignalsContext:
"""
Context object that stores handlers and calls them after a successful pass through the code block
surrounded by the "transaction_signals" decorator. Handlers can be unique or standard. Unique handlers are registered
and executed only once.
"""
def __init__(self):
self._unique_handlers = OrderedDict()
self._handlers = []
def register(self, handler):
if getattr(handler, 'is_unique', False):
if hash(handler) in self._unique_handlers:
self._unique_handlers.get(hash(handler)).join(handler)
else:
self._unique_handlers[hash(handler)] = handler
self._handlers.append(handler)
else:
self._handlers.append(handler)
def handle_all(self):
for handler in self._handlers:
handler()
def join(self, transaction_signals_context):
for handler in transaction_signals_context._handlers:
self.register(handler)
class TransactionSignals(ContextDecorator):
"""
Context decorator that supports usage python keyword "with".
Decorator that adds transaction context to the connection on input.
Finally handlers are called on the output.
"""
def __init__(self, using):
self.using = using
def __enter__(self):
connection = get_connection(self.using)
if not hasattr(connection, 'transaction_signals_context_list'):
connection.transaction_signals_context_list = []
connection.transaction_signals_context_list.append(TransactionSignalsContext())
def __exit__(self, exc_type, exc_value, traceback):
connection = get_connection(self.using)
transaction_signals_context = connection.transaction_signals_context_list.pop()
if not exc_value:
if len(connection.transaction_signals_context_list) == 0:
transaction_signals_context.handle_all()
else:
connection.transaction_signals_context_list[-1].join(transaction_signals_context)
def transaction_signals(using=None):
"""
Decorator that adds transaction context to the connection on input.
Finally handlers are called on the output.
:param using: name of the database
"""
if callable(using):
return TransactionSignals(DEFAULT_DB_ALIAS)(using)
else:
return TransactionSignals(using)
def atomic_with_signals(func):
"""
Atomic decorator with transaction signals.
"""
try:
from reversion.revisions import create_revision
return transaction.atomic(create_revision()(transaction_signals(func)))
except ImportError:
return transaction.atomic(transaction_signals(func))
class OnSuccessHandler:
"""
Handler class that is used for performing on success operations.
"""
is_unique = False
def __init__(self, using=None, **kwargs):
self.kwargs = kwargs
on_success(self, using=using)
def __call__(self):
self.handle(**self.kwargs)
def handle(self, **kwargs):
"""
Handler operations should be implemented here.
:param kwargs: input data that was sent during handler creation.
"""
raise NotImplementedError
class OneTimeOnSuccessHandler(OnSuccessHandler):
"""
One time handler class that is used for performing on success operations.
Handler is called only once, but data of all calls are stored inside list (kwargs_list).
"""
is_unique = True
def __init__(self, using=None, **kwargs):
self.kwargs_list = (kwargs,)
on_success(self, using=using)
def join(self, handler):
"""
Joins two unique handlers.
"""
self.kwargs_list += handler.kwargs_list
def _get_unique_id(self):
"""
Unique handler must be identified in some way
:return:
"""
return None
def __hash__(self):
return hash((self.__class__, self._get_unique_id()))
def __call__(self):
self.handle(self.kwargs_list)
def handle(self, kwargs_list):
raise NotImplementedError
class InstanceOneTimeOnSuccessHandler(OneTimeOnSuccessHandler):
"""
Use this class to create handler that will be unique per instance and will be called only once per instance.
"""
def _get_instance(self):
instance = self.kwargs_list[0]['instance']
return instance.__class__.objects.get(pk=instance.pk)
def _get_unique_id(self):
instance = self.kwargs_list[0]['instance']
return hash((instance.__class__, instance.pk))
|
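`OneTimeOnSuccessHandler` above deduplicates handlers by hash and merges their payloads via `join()`, so e.g. one notification fires per instance no matter how many times it was queued. A simplified sketch of that merge-by-key registry:

```python
from collections import OrderedDict

class UniqueRegistry:
    # Keeps one entry per key; re-registering a key merges the payload
    # instead of queuing a second handler (mirrors the is_unique path).
    def __init__(self):
        self._unique = OrderedDict()

    def register(self, key, kwargs):
        if key in self._unique:
            self._unique[key].append(kwargs)
        else:
            self._unique[key] = [kwargs]

    def handle_all(self, fn):
        for key, kwargs_list in self._unique.items():
            fn(key, kwargs_list)


seen = []
reg = UniqueRegistry()
reg.register("instance-1", {"n": 1})
reg.register("instance-1", {"n": 2})  # merged, not duplicated
reg.register("instance-2", {"n": 3})
reg.handle_all(lambda key, kwargs_list: seen.append((key, kwargs_list)))
print(seen)
# [('instance-1', [{'n': 1}, {'n': 2}]), ('instance-2', [{'n': 3}])]
```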
druids/django-chamber | chamber/utils/transaction.py | atomic_with_signals | python | def atomic_with_signals(func):
try:
from reversion.revisions import create_revision
return transaction.atomic(create_revision()(transaction_signals(func)))
except ImportError:
return transaction.atomic(transaction_signals(func)) | Atomic decorator with transaction signals. | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/utils/transaction.py#L116-L125 | [
"def transaction_signals(using=None):\n \"\"\"\n Decorator that adds transaction context to the connection on input.\n Finally handlers are called on the output.\n :param using: name of the database\n \"\"\"\n if callable(using):\n return TransactionSignals(DEFAULT_DB_ALIAS)(using)\n els... | import logging
from collections import OrderedDict
from django.conf import settings
from django.db import transaction, DEFAULT_DB_ALIAS
from django.db.transaction import get_connection
from django.utils.decorators import ContextDecorator
logger = logging.getLogger(__name__)
def atomic(func):
"""
Decorator helper that overrides django atomic decorator and automatically adds create revision.
"""
try:
from reversion.revisions import create_revision
return transaction.atomic(create_revision()(func))
except ImportError:
return transaction.atomic(func)
class TransactionSignalsContext:
"""
Context object that stores handlers and calls them after a successful pass through the code block
surrounded by the "transaction_signals" decorator. Handlers can be unique or standard. Unique handlers are registered
and executed only once.
"""
def __init__(self):
self._unique_handlers = OrderedDict()
self._handlers = []
def register(self, handler):
if getattr(handler, 'is_unique', False):
if hash(handler) in self._unique_handlers:
self._unique_handlers.get(hash(handler)).join(handler)
else:
self._unique_handlers[hash(handler)] = handler
self._handlers.append(handler)
else:
self._handlers.append(handler)
def handle_all(self):
for handler in self._handlers:
handler()
def join(self, transaction_signals_context):
for handler in transaction_signals_context._handlers:
self.register(handler)
class TransactionSignals(ContextDecorator):
"""
Context decorator that supports usage python keyword "with".
Decorator that adds transaction context to the connection on input.
Finally handlers are called on the output.
"""
def __init__(self, using):
self.using = using
def __enter__(self):
connection = get_connection(self.using)
if not hasattr(connection, 'transaction_signals_context_list'):
connection.transaction_signals_context_list = []
connection.transaction_signals_context_list.append(TransactionSignalsContext())
def __exit__(self, exc_type, exc_value, traceback):
connection = get_connection(self.using)
transaction_signals_context = connection.transaction_signals_context_list.pop()
if not exc_value:
if len(connection.transaction_signals_context_list) == 0:
transaction_signals_context.handle_all()
else:
connection.transaction_signals_context_list[-1].join(transaction_signals_context)
def on_success(handler, using=None):
"""
Register a handler or a function to be called after successful code pass.
If transaction signals are not active the handler/function is called immediately.
:param handler: handler or function that will be called.
:param using: name of the database
"""
connection = get_connection(using)
if getattr(connection, 'transaction_signals_context_list', False):
connection.transaction_signals_context_list[-1].register(handler)
else:
if settings.DEBUG:
logger.warning(
'For on success signal should be activated transaction signals via transaction_signals decorator.'
'Function is called immediately now.'
)
handler()
def transaction_signals(using=None):
"""
Decorator that adds transaction context to the connection on input.
Finally handlers are called on the output.
:param using: name of the database
"""
if callable(using):
return TransactionSignals(DEFAULT_DB_ALIAS)(using)
else:
return TransactionSignals(using)
class OnSuccessHandler:
"""
Handler class that is used for performing on success operations.
"""
is_unique = False
def __init__(self, using=None, **kwargs):
self.kwargs = kwargs
on_success(self, using=using)
def __call__(self):
self.handle(**self.kwargs)
def handle(self, **kwargs):
"""
Handler operations should be implemented here.
:param kwargs: input data that was sent during handler creation.
"""
raise NotImplementedError
class OneTimeOnSuccessHandler(OnSuccessHandler):
"""
One time handler class that is used for performing on success operations.
Handler is called only once, but data of all calls are stored inside list (kwargs_list).
"""
is_unique = True
def __init__(self, using=None, **kwargs):
self.kwargs_list = (kwargs,)
on_success(self, using=using)
def join(self, handler):
"""
Joins two unique handlers.
"""
self.kwargs_list += handler.kwargs_list
def _get_unique_id(self):
"""
Unique handler must be identified in some way
:return:
"""
return None
def __hash__(self):
return hash((self.__class__, self._get_unique_id()))
def __call__(self):
self.handle(self.kwargs_list)
def handle(self, kwargs_list):
raise NotImplementedError
class InstanceOneTimeOnSuccessHandler(OneTimeOnSuccessHandler):
"""
Use this class to create handler that will be unique per instance and will be called only once per instance.
"""
def _get_instance(self):
instance = self.kwargs_list[0]['instance']
return instance.__class__.objects.get(pk=instance.pk)
def _get_unique_id(self):
instance = self.kwargs_list[0]['instance']
return hash((instance.__class__, instance.pk))
|
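`atomic_with_signals` above tries to import `reversion` and degrades gracefully when it is absent. That optional-dependency pattern in isolation — `nonexistent_extension` is deliberately a module that does not exist, so the fallback branch runs:

```python
def with_optional_extension(func):
    # Prefer the enhanced decorator when the optional package is
    # installed; otherwise return the function undecorated.
    try:
        from nonexistent_extension import enhance  # hypothetical optional dep
    except ImportError:
        return func
    return enhance(func)


@with_optional_extension
def add(a, b):
    return a + b

print(add(2, 3))  # 5
```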
druids/django-chamber | chamber/patch.py | field_init | python | def field_init(self, *args, **kwargs):
humanize_func = kwargs.pop('humanized', None)
if humanize_func:
def humanize(val, inst, *args, **kwargs):
return humanize_func(val, inst, field=self, *args, **kwargs)
self.humanized = humanize
else:
self.humanized = self.default_humanized
getattr(self, '_init_chamber_patch_')(*args, **kwargs) | Patches a Django Field's `__init__` method for easier usage of optional `kwargs`. It defines a `humanized` attribute
on a field for better display of its value. | train | https://github.com/druids/django-chamber/blob/eef4169923557e96877a664fa254e8c0814f3f23/chamber/patch.py#L50-L62 | null | from django.db.models import Model
from django.db.models.fields import Field
class OptionsLazy:
def __init__(self, name, klass):
self.name = name
self.klass = klass
def __get__(self, instance=None, owner=None):
return self.klass(owner)
class OptionsBase(type):
def __new__(cls, *args, **kwargs):
new_class = super().__new__(cls, *args, **kwargs)
if new_class.model_class and new_class.meta_name:
setattr(new_class.model_class, new_class.meta_name, OptionsLazy(new_class.meta_name, new_class))
return new_class
class Options(metaclass=OptionsBase):
meta_class_name = None
meta_name = None
attributes = None
model_class = None
def __init__(self, model):
self.model = model
for key, default_value in self._get_attributes(model).items():
setattr(self, key, self._getattr(key, default_value))
def _get_attributes(self, model):
return self.attributes
def _getattr(self, name, default_value):
meta_models = [b for b in self.model.__mro__ if issubclass(b, Model)]
for model in meta_models:
meta = getattr(model, self.meta_class_name, None)
if meta:
value = getattr(meta, name, None)
if value is not None:
return value
return default_value
Field.default_humanized = None
Field._init_chamber_patch_ = Field.__init__ # pylint: disable=W0212
Field.__init__ = field_init
|
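`patch.py` above monkey-patches `Field.__init__`, stashing the original under `_init_chamber_patch_` and popping an extra `humanized` kwarg before delegating. A self-contained sketch of that wrap-and-delegate patch, using a hypothetical `Widget` class in place of Django's `Field`:

```python
class Widget:
    def __init__(self, name):
        self.name = name


# Stash the original initializer, then install a wrapper that understands
# an extra keyword the original class never declared.
Widget._orig_init = Widget.__init__

def patched_init(self, *args, **kwargs):
    self.humanized = kwargs.pop("humanized", None)
    self._orig_init(*args, **kwargs)

Widget.__init__ = patched_init

w = Widget("price", humanized=lambda v: f"{v} CZK")
print(w.name, w.humanized(10))  # price 10 CZK
```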
jwodder/javaproperties | javaproperties/xmlprops.py | load_xml | python | def load_xml(fp, object_pairs_hook=dict):
r"""
Parse the contents of the file-like object ``fp`` as an XML properties file
and return a `dict` of the key-value pairs.
Beyond basic XML well-formedness, `load_xml` only checks that the root
element is named "``properties``" and that all of its ``<entry>`` children
have ``key`` attributes. No further validation is performed; if any
``<entry>``\ s happen to contain nested tags, the behavior is undefined.
By default, the key-value pairs extracted from ``fp`` are combined into a
`dict` with later occurrences of a key overriding previous occurrences of
the same key. To change this behavior, pass a callable as the
``object_pairs_hook`` argument; it will be called with one argument, a
generator of ``(key, value)`` pairs representing the key-value entries in
``fp`` (including duplicates) in order of occurrence. `load_xml` will then
return the value returned by ``object_pairs_hook``.
.. note::
This uses `xml.etree.ElementTree` for parsing, which does not have
decent support for |unicode|_ input in Python 2. Files containing
non-ASCII characters need to be opened in binary mode in Python 2,
while Python 3 accepts both binary and text input.
:param fp: the file from which to read the XML properties document
:type fp: file-like object
:param callable object_pairs_hook: class or function for combining the
key-value pairs
:rtype: `dict` or the return value of ``object_pairs_hook``
:raises ValueError: if the root of the XML tree is not a ``<properties>``
tag or an ``<entry>`` element is missing a ``key`` attribute
"""
tree = ET.parse(fp)
return object_pairs_hook(_fromXML(tree.getroot())) | r"""
Parse the contents of the file-like object ``fp`` as an XML properties file
and return a `dict` of the key-value pairs.
Beyond basic XML well-formedness, `load_xml` only checks that the root
element is named "``properties``" and that all of its ``<entry>`` children
have ``key`` attributes. No further validation is performed; if any
``<entry>``\ s happen to contain nested tags, the behavior is undefined.
By default, the key-value pairs extracted from ``fp`` are combined into a
`dict` with later occurrences of a key overriding previous occurrences of
the same key. To change this behavior, pass a callable as the
``object_pairs_hook`` argument; it will be called with one argument, a
generator of ``(key, value)`` pairs representing the key-value entries in
``fp`` (including duplicates) in order of occurrence. `load_xml` will then
return the value returned by ``object_pairs_hook``.
.. note::
This uses `xml.etree.ElementTree` for parsing, which does not have
decent support for |unicode|_ input in Python 2. Files containing
non-ASCII characters need to be opened in binary mode in Python 2,
while Python 3 accepts both binary and text input.
:param fp: the file from which to read the XML properties document
:type fp: file-like object
:param callable object_pairs_hook: class or function for combining the
key-value pairs
:rtype: `dict` or the return value of ``object_pairs_hook``
:raises ValueError: if the root of the XML tree is not a ``<properties>``
tag or an ``<entry>`` element is missing a ``key`` attribute | train | https://github.com/jwodder/javaproperties/blob/8b48f040305217ebeb80c98c4354691bbb01429b/javaproperties/xmlprops.py#L7-L41 | [
"def _fromXML(root):\n if root.tag != 'properties':\n raise ValueError('XML tree is not rooted at <properties>')\n for entry in root.findall('entry'):\n key = entry.get('key')\n if key is None:\n raise ValueError('<entry> is missing \"key\" attribute')\n yield (key, entr... | from __future__ import print_function, unicode_literals
import codecs
import xml.etree.ElementTree as ET
from xml.sax.saxutils import escape, quoteattr
from .util import itemize
def loads_xml(s, object_pairs_hook=dict):
r"""
Parse the contents of the string ``s`` as an XML properties document and
return a `dict` of the key-value pairs.
Beyond basic XML well-formedness, `loads_xml` only checks that the root
element is named "``properties``" and that all of its ``<entry>`` children
have ``key`` attributes. No further validation is performed; if any
``<entry>``\ s happen to contain nested tags, the behavior is undefined.
By default, the key-value pairs extracted from ``s`` are combined into a
`dict` with later occurrences of a key overriding previous occurrences of
the same key. To change this behavior, pass a callable as the
``object_pairs_hook`` argument; it will be called with one argument, a
generator of ``(key, value)`` pairs representing the key-value entries in
``s`` (including duplicates) in order of occurrence. `loads_xml` will then
return the value returned by ``object_pairs_hook``.
.. note::
This uses `xml.etree.ElementTree` for parsing, which does not have
decent support for |unicode|_ input in Python 2. Strings containing
non-ASCII characters need to be encoded as bytes in Python 2 (Use
either UTF-8 or UTF-16 if the XML document does not contain an encoding
declaration), while Python 3 accepts both binary and text input.
:param string s: the string from which to read the XML properties document
:param callable object_pairs_hook: class or function for combining the
key-value pairs
:rtype: `dict` or the return value of ``object_pairs_hook``
:raises ValueError: if the root of the XML tree is not a ``<properties>``
tag or an ``<entry>`` element is missing a ``key`` attribute
"""
elem = ET.fromstring(s)
return object_pairs_hook(_fromXML(elem))
def _fromXML(root):
if root.tag != 'properties':
raise ValueError('XML tree is not rooted at <properties>')
for entry in root.findall('entry'):
key = entry.get('key')
if key is None:
raise ValueError('<entry> is missing "key" attribute')
yield (key, entry.text)
def dump_xml(props, fp, comment=None, encoding='UTF-8', sort_keys=False):
"""
Write a series ``props`` of key-value pairs to a binary filehandle ``fp``
in the format of an XML properties file. The file will include both an XML
declaration and a doctype declaration.
:param props: A mapping or iterable of ``(key, value)`` pairs to write to
``fp``. All keys and values in ``props`` must be text strings. If
``sort_keys`` is `False`, the entries are output in iteration order.
:param fp: a file-like object to write the values of ``props`` to
:type fp: binary file-like object
:param comment: if non-`None`, ``comment`` will be output as a
``<comment>`` element before the ``<entry>`` elements
:type comment: text string or `None`
:param string encoding: the name of the encoding to use for the XML
document (also included in the XML declaration)
:param bool sort_keys: if true, the elements of ``props`` are sorted
lexicographically by key in the output
:return: `None`
"""
fp = codecs.lookup(encoding).streamwriter(fp, errors='xmlcharrefreplace')
print('<?xml version="1.0" encoding={0} standalone="no"?>'
.format(quoteattr(encoding)), file=fp)
for s in _stream_xml(props, comment, sort_keys):
print(s, file=fp)
def dumps_xml(props, comment=None, sort_keys=False):
"""
Convert a series ``props`` of key-value pairs to a text string containing
an XML properties document. The document will include a doctype
declaration but not an XML declaration.
:param props: A mapping or iterable of ``(key, value)`` pairs to serialize.
All keys and values in ``props`` must be text strings. If
``sort_keys`` is `False`, the entries are output in iteration order.
:param comment: if non-`None`, ``comment`` will be output as a
``<comment>`` element before the ``<entry>`` elements
:type comment: text string or `None`
:param bool sort_keys: if true, the elements of ``props`` are sorted
lexicographically by key in the output
:rtype: text string
"""
return ''.join(s + '\n' for s in _stream_xml(props, comment, sort_keys))
def _stream_xml(props, comment=None, sort_keys=False):
yield '<!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">'
yield '<properties>'
if comment is not None:
yield '<comment>' + escape(comment) + '</comment>'
for k,v in itemize(props, sort_keys=sort_keys):
yield '<entry key={0}>{1}</entry>'.format(quoteattr(k), escape(v))
yield '</properties>'
|
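`load_xml` above delegates to `xml.etree.ElementTree` and a `dict` hook in which later duplicate keys win. The same extraction works standalone with only the stdlib:

```python
import xml.etree.ElementTree as ET

doc = """<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">
<properties>
    <comment>demo</comment>
    <entry key="host">example.com</entry>
    <entry key="port">8080</entry>
    <entry key="host">override.example.com</entry>
</properties>"""

root = ET.fromstring(doc)
assert root.tag == "properties"  # the only structural check load_xml makes
pairs = ((e.get("key"), e.text) for e in root.findall("entry"))
props = dict(pairs)  # later duplicates win, as with the default hook
print(props)  # {'host': 'override.example.com', 'port': '8080'}
```

Passing an `OrderedDict`-style or multidict constructor instead of `dict` is what the `object_pairs_hook` parameter enables.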
jwodder/javaproperties | javaproperties/xmlprops.py | loads_xml | python | def loads_xml(s, object_pairs_hook=dict):
r"""
Parse the contents of the string ``s`` as an XML properties document and
return a `dict` of the key-value pairs.
Beyond basic XML well-formedness, `loads_xml` only checks that the root
element is named "``properties``" and that all of its ``<entry>`` children
have ``key`` attributes. No further validation is performed; if any
``<entry>``\ s happen to contain nested tags, the behavior is undefined.
By default, the key-value pairs extracted from ``s`` are combined into a
`dict` with later occurrences of a key overriding previous occurrences of
the same key. To change this behavior, pass a callable as the
``object_pairs_hook`` argument; it will be called with one argument, a
generator of ``(key, value)`` pairs representing the key-value entries in
``s`` (including duplicates) in order of occurrence. `loads_xml` will then
return the value returned by ``object_pairs_hook``.
.. note::
This uses `xml.etree.ElementTree` for parsing, which does not have
decent support for |unicode|_ input in Python 2. Strings containing
non-ASCII characters need to be encoded as bytes in Python 2 (Use
either UTF-8 or UTF-16 if the XML document does not contain an encoding
declaration), while Python 3 accepts both binary and text input.
:param string s: the string from which to read the XML properties document
:param callable object_pairs_hook: class or function for combining the
key-value pairs
:rtype: `dict` or the return value of ``object_pairs_hook``
:raises ValueError: if the root of the XML tree is not a ``<properties>``
tag or an ``<entry>`` element is missing a ``key`` attribute
"""
elem = ET.fromstring(s)
return object_pairs_hook(_fromXML(elem)) | r"""
Parse the contents of the string ``s`` as an XML properties document and
return a `dict` of the key-value pairs.
Beyond basic XML well-formedness, `loads_xml` only checks that the root
element is named "``properties``" and that all of its ``<entry>`` children
have ``key`` attributes. No further validation is performed; if any
``<entry>``\ s happen to contain nested tags, the behavior is undefined.
By default, the key-value pairs extracted from ``s`` are combined into a
`dict` with later occurrences of a key overriding previous occurrences of
the same key. To change this behavior, pass a callable as the
``object_pairs_hook`` argument; it will be called with one argument, a
generator of ``(key, value)`` pairs representing the key-value entries in
``s`` (including duplicates) in order of occurrence. `loads_xml` will then
return the value returned by ``object_pairs_hook``.
.. note::
This uses `xml.etree.ElementTree` for parsing, which does not have
decent support for |unicode|_ input in Python 2. Strings containing
non-ASCII characters need to be encoded as bytes in Python 2 (Use
either UTF-8 or UTF-16 if the XML document does not contain an encoding
declaration), while Python 3 accepts both binary and text input.
:param string s: the string from which to read the XML properties document
:param callable object_pairs_hook: class or function for combining the
key-value pairs
:rtype: `dict` or the return value of ``object_pairs_hook``
:raises ValueError: if the root of the XML tree is not a ``<properties>``
tag or an ``<entry>`` element is missing a ``key`` attribute | train | https://github.com/jwodder/javaproperties/blob/8b48f040305217ebeb80c98c4354691bbb01429b/javaproperties/xmlprops.py#L43-L77 | [
"def _fromXML(root):\n if root.tag != 'properties':\n raise ValueError('XML tree is not rooted at <properties>')\n for entry in root.findall('entry'):\n key = entry.get('key')\n if key is None:\n raise ValueError('<entry> is missing \"key\" attribute')\n yield (key, entr... | from __future__ import print_function, unicode_literals
import codecs
import xml.etree.ElementTree as ET
from xml.sax.saxutils import escape, quoteattr
from .util import itemize
def load_xml(fp, object_pairs_hook=dict):
r"""
Parse the contents of the file-like object ``fp`` as an XML properties file
and return a `dict` of the key-value pairs.
Beyond basic XML well-formedness, `load_xml` only checks that the root
element is named "``properties``" and that all of its ``<entry>`` children
have ``key`` attributes. No further validation is performed; if any
``<entry>``\ s happen to contain nested tags, the behavior is undefined.
By default, the key-value pairs extracted from ``fp`` are combined into a
`dict` with later occurrences of a key overriding previous occurrences of
the same key. To change this behavior, pass a callable as the
``object_pairs_hook`` argument; it will be called with one argument, a
generator of ``(key, value)`` pairs representing the key-value entries in
``fp`` (including duplicates) in order of occurrence. `load_xml` will then
return the value returned by ``object_pairs_hook``.
.. note::
This uses `xml.etree.ElementTree` for parsing, which does not have
decent support for |unicode|_ input in Python 2. Files containing
non-ASCII characters need to be opened in binary mode in Python 2,
while Python 3 accepts both binary and text input.
:param fp: the file from which to read the XML properties document
:type fp: file-like object
:param callable object_pairs_hook: class or function for combining the
key-value pairs
:rtype: `dict` or the return value of ``object_pairs_hook``
:raises ValueError: if the root of the XML tree is not a ``<properties>``
tag or an ``<entry>`` element is missing a ``key`` attribute
"""
tree = ET.parse(fp)
return object_pairs_hook(_fromXML(tree.getroot()))
def _fromXML(root):
if root.tag != 'properties':
raise ValueError('XML tree is not rooted at <properties>')
for entry in root.findall('entry'):
key = entry.get('key')
if key is None:
raise ValueError('<entry> is missing "key" attribute')
yield (key, entry.text)
def dump_xml(props, fp, comment=None, encoding='UTF-8', sort_keys=False):
"""
Write a series ``props`` of key-value pairs to a binary filehandle ``fp``
in the format of an XML properties file. The file will include both an XML
declaration and a doctype declaration.
:param props: A mapping or iterable of ``(key, value)`` pairs to write to
``fp``. All keys and values in ``props`` must be text strings. If
``sort_keys`` is `False`, the entries are output in iteration order.
:param fp: a file-like object to write the values of ``props`` to
:type fp: binary file-like object
:param comment: if non-`None`, ``comment`` will be output as a
``<comment>`` element before the ``<entry>`` elements
:type comment: text string or `None`
:param string encoding: the name of the encoding to use for the XML
document (also included in the XML declaration)
:param bool sort_keys: if true, the elements of ``props`` are sorted
lexicographically by key in the output
:return: `None`
"""
fp = codecs.lookup(encoding).streamwriter(fp, errors='xmlcharrefreplace')
print('<?xml version="1.0" encoding={0} standalone="no"?>'
.format(quoteattr(encoding)), file=fp)
for s in _stream_xml(props, comment, sort_keys):
print(s, file=fp)
def dumps_xml(props, comment=None, sort_keys=False):
"""
Convert a series ``props`` of key-value pairs to a text string containing
an XML properties document. The document will include a doctype
declaration but not an XML declaration.
:param props: A mapping or iterable of ``(key, value)`` pairs to serialize.
All keys and values in ``props`` must be text strings. If
``sort_keys`` is `False`, the entries are output in iteration order.
:param comment: if non-`None`, ``comment`` will be output as a
``<comment>`` element before the ``<entry>`` elements
:type comment: text string or `None`
:param bool sort_keys: if true, the elements of ``props`` are sorted
lexicographically by key in the output
:rtype: text string
"""
return ''.join(s + '\n' for s in _stream_xml(props, comment, sort_keys))
def _stream_xml(props, comment=None, sort_keys=False):
yield '<!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">'
yield '<properties>'
if comment is not None:
yield '<comment>' + escape(comment) + '</comment>'
for k,v in itemize(props, sort_keys=sort_keys):
yield '<entry key={0}>{1}</entry>'.format(quoteattr(k), escape(v))
yield '</properties>'
|
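The `loads_xml` record above builds on the `_fromXML` helper. A stdlib-only sketch of the same parsing behavior is below; `XML_DOC` and `iter_entries` are illustrative names invented here, not part of the library.

```python
import xml.etree.ElementTree as ET

# Illustrative sample document (not from the library's test suite).
# Note: ET.fromstring() rejects str input that carries an encoding
# declaration, so the sample omits the <?xml ...?> line.
XML_DOC = """\
<!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">
<properties>
  <comment>example</comment>
  <entry key="host">localhost</entry>
  <entry key="port">8080</entry>
  <entry key="host">example.com</entry>
</properties>
"""

def iter_entries(doc):
    # Same checks as the library's _fromXML helper: the root must be
    # <properties>, and every <entry> needs a "key" attribute.
    root = ET.fromstring(doc)
    if root.tag != 'properties':
        raise ValueError('XML tree is not rooted at <properties>')
    for entry in root.findall('entry'):
        key = entry.get('key')
        if key is None:
            raise ValueError('<entry> is missing "key" attribute')
        yield key, entry.text

# dict() is the default object_pairs_hook: later keys win.
props = dict(iter_entries(XML_DOC))
print(props)  # {'host': 'example.com', 'port': '8080'}
```

The duplicate `host` entry demonstrates the last-occurrence-wins policy the docstring describes.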
jwodder/javaproperties | javaproperties/xmlprops.py | dump_xml | python | def dump_xml(props, fp, comment=None, encoding='UTF-8', sort_keys=False):
fp = codecs.lookup(encoding).streamwriter(fp, errors='xmlcharrefreplace')
print('<?xml version="1.0" encoding={0} standalone="no"?>'
.format(quoteattr(encoding)), file=fp)
for s in _stream_xml(props, comment, sort_keys):
print(s, file=fp) | Write a series ``props`` of key-value pairs to a binary filehandle ``fp``
in the format of an XML properties file. The file will include both an XML
declaration and a doctype declaration.
:param props: A mapping or iterable of ``(key, value)`` pairs to write to
``fp``. All keys and values in ``props`` must be text strings. If
``sort_keys`` is `False`, the entries are output in iteration order.
:param fp: a file-like object to write the values of ``props`` to
:type fp: binary file-like object
:param comment: if non-`None`, ``comment`` will be output as a
``<comment>`` element before the ``<entry>`` elements
:type comment: text string or `None`
:param string encoding: the name of the encoding to use for the XML
document (also included in the XML declaration)
:param bool sort_keys: if true, the elements of ``props`` are sorted
lexicographically by key in the output
:return: `None` | train | https://github.com/jwodder/javaproperties/blob/8b48f040305217ebeb80c98c4354691bbb01429b/javaproperties/xmlprops.py#L88-L112 | [
"def _stream_xml(props, comment=None, sort_keys=False):\n yield '<!DOCTYPE properties SYSTEM \"http://java.sun.com/dtd/properties.dtd\">'\n yield '<properties>'\n if comment is not None:\n yield '<comment>' + escape(comment) + '</comment>'\n for k,v in itemize(props, sort_keys=sort_keys):\n ... | from __future__ import print_function, unicode_literals
import codecs
import xml.etree.ElementTree as ET
from xml.sax.saxutils import escape, quoteattr
from .util import itemize
def load_xml(fp, object_pairs_hook=dict):
r"""
Parse the contents of the file-like object ``fp`` as an XML properties file
and return a `dict` of the key-value pairs.
Beyond basic XML well-formedness, `load_xml` only checks that the root
element is named "``properties``" and that all of its ``<entry>`` children
have ``key`` attributes. No further validation is performed; if any
``<entry>``\ s happen to contain nested tags, the behavior is undefined.
By default, the key-value pairs extracted from ``fp`` are combined into a
`dict` with later occurrences of a key overriding previous occurrences of
the same key. To change this behavior, pass a callable as the
``object_pairs_hook`` argument; it will be called with one argument, a
generator of ``(key, value)`` pairs representing the key-value entries in
``fp`` (including duplicates) in order of occurrence. `load_xml` will then
return the value returned by ``object_pairs_hook``.
.. note::
This uses `xml.etree.ElementTree` for parsing, which does not have
decent support for |unicode|_ input in Python 2. Files containing
non-ASCII characters need to be opened in binary mode in Python 2,
while Python 3 accepts both binary and text input.
:param fp: the file from which to read the XML properties document
:type fp: file-like object
:param callable object_pairs_hook: class or function for combining the
key-value pairs
:rtype: `dict` or the return value of ``object_pairs_hook``
:raises ValueError: if the root of the XML tree is not a ``<properties>``
tag or an ``<entry>`` element is missing a ``key`` attribute
"""
tree = ET.parse(fp)
return object_pairs_hook(_fromXML(tree.getroot()))
def loads_xml(s, object_pairs_hook=dict):
r"""
Parse the contents of the string ``s`` as an XML properties document and
return a `dict` of the key-value pairs.
Beyond basic XML well-formedness, `loads_xml` only checks that the root
element is named "``properties``" and that all of its ``<entry>`` children
have ``key`` attributes. No further validation is performed; if any
``<entry>``\ s happen to contain nested tags, the behavior is undefined.
By default, the key-value pairs extracted from ``s`` are combined into a
`dict` with later occurrences of a key overriding previous occurrences of
the same key. To change this behavior, pass a callable as the
``object_pairs_hook`` argument; it will be called with one argument, a
generator of ``(key, value)`` pairs representing the key-value entries in
``s`` (including duplicates) in order of occurrence. `loads_xml` will then
return the value returned by ``object_pairs_hook``.
.. note::
This uses `xml.etree.ElementTree` for parsing, which does not have
decent support for |unicode|_ input in Python 2. Strings containing
non-ASCII characters need to be encoded as bytes in Python 2 (Use
either UTF-8 or UTF-16 if the XML document does not contain an encoding
declaration), while Python 3 accepts both binary and text input.
:param string s: the string from which to read the XML properties document
:param callable object_pairs_hook: class or function for combining the
key-value pairs
:rtype: `dict` or the return value of ``object_pairs_hook``
:raises ValueError: if the root of the XML tree is not a ``<properties>``
tag or an ``<entry>`` element is missing a ``key`` attribute
"""
elem = ET.fromstring(s)
return object_pairs_hook(_fromXML(elem))
def _fromXML(root):
if root.tag != 'properties':
raise ValueError('XML tree is not rooted at <properties>')
for entry in root.findall('entry'):
key = entry.get('key')
if key is None:
raise ValueError('<entry> is missing "key" attribute')
yield (key, entry.text)
def dumps_xml(props, comment=None, sort_keys=False):
"""
Convert a series ``props`` of key-value pairs to a text string containing
an XML properties document. The document will include a doctype
declaration but not an XML declaration.
:param props: A mapping or iterable of ``(key, value)`` pairs to serialize.
All keys and values in ``props`` must be text strings. If
``sort_keys`` is `False`, the entries are output in iteration order.
:param comment: if non-`None`, ``comment`` will be output as a
``<comment>`` element before the ``<entry>`` elements
:type comment: text string or `None`
:param bool sort_keys: if true, the elements of ``props`` are sorted
lexicographically by key in the output
:rtype: text string
"""
return ''.join(s + '\n' for s in _stream_xml(props, comment, sort_keys))
def _stream_xml(props, comment=None, sort_keys=False):
yield '<!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">'
yield '<properties>'
if comment is not None:
yield '<comment>' + escape(comment) + '</comment>'
for k,v in itemize(props, sort_keys=sort_keys):
yield '<entry key={0}>{1}</entry>'.format(quoteattr(k), escape(v))
yield '</properties>'
|
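The `dump_xml` record above delegates to `_stream_xml`, which escapes keys with `quoteattr` and values with `escape`. A minimal self-contained sketch of the same output shape, assuming only the stdlib (`dumps_props` is an illustrative name, not a library function):

```python
from xml.sax.saxutils import escape, quoteattr

def dumps_props(props, comment=None):
    # Mirrors _stream_xml: doctype line, optional <comment>, one
    # <entry> per pair, with XML-special characters escaped.
    lines = ['<!DOCTYPE properties SYSTEM '
             '"http://java.sun.com/dtd/properties.dtd">',
             '<properties>']
    if comment is not None:
        lines.append('<comment>' + escape(comment) + '</comment>')
    for k, v in props.items():
        lines.append('<entry key={0}>{1}</entry>'.format(quoteattr(k),
                                                         escape(v)))
    lines.append('</properties>')
    return ''.join(s + '\n' for s in lines)

doc = dumps_props({'a&b': '<value>'}, comment='demo')
print(doc)
```

Here `quoteattr('a&b')` yields `"a&amp;b"` (quotes included) and `escape('<value>')` yields `&lt;value&gt;`, which is why `_stream_xml` can interpolate them directly into the `<entry>` template.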
jwodder/javaproperties | javaproperties/xmlprops.py | dumps_xml | python | def dumps_xml(props, comment=None, sort_keys=False):
return ''.join(s + '\n' for s in _stream_xml(props, comment, sort_keys)) | Convert a series ``props`` of key-value pairs to a text string containing
an XML properties document. The document will include a doctype
declaration but not an XML declaration.
:param props: A mapping or iterable of ``(key, value)`` pairs to serialize.
All keys and values in ``props`` must be text strings. If
``sort_keys`` is `False`, the entries are output in iteration order.
:param comment: if non-`None`, ``comment`` will be output as a
``<comment>`` element before the ``<entry>`` elements
:type comment: text string or `None`
:param bool sort_keys: if true, the elements of ``props`` are sorted
lexicographically by key in the output
:rtype: text string | train | https://github.com/jwodder/javaproperties/blob/8b48f040305217ebeb80c98c4354691bbb01429b/javaproperties/xmlprops.py#L114-L130 | [
"def _stream_xml(props, comment=None, sort_keys=False):\n yield '<!DOCTYPE properties SYSTEM \"http://java.sun.com/dtd/properties.dtd\">'\n yield '<properties>'\n if comment is not None:\n yield '<comment>' + escape(comment) + '</comment>'\n for k,v in itemize(props, sort_keys=sort_keys):\n ... | from __future__ import print_function, unicode_literals
import codecs
import xml.etree.ElementTree as ET
from xml.sax.saxutils import escape, quoteattr
from .util import itemize
def load_xml(fp, object_pairs_hook=dict):
r"""
Parse the contents of the file-like object ``fp`` as an XML properties file
and return a `dict` of the key-value pairs.
Beyond basic XML well-formedness, `load_xml` only checks that the root
element is named "``properties``" and that all of its ``<entry>`` children
have ``key`` attributes. No further validation is performed; if any
``<entry>``\ s happen to contain nested tags, the behavior is undefined.
By default, the key-value pairs extracted from ``fp`` are combined into a
`dict` with later occurrences of a key overriding previous occurrences of
the same key. To change this behavior, pass a callable as the
``object_pairs_hook`` argument; it will be called with one argument, a
generator of ``(key, value)`` pairs representing the key-value entries in
``fp`` (including duplicates) in order of occurrence. `load_xml` will then
return the value returned by ``object_pairs_hook``.
.. note::
This uses `xml.etree.ElementTree` for parsing, which does not have
decent support for |unicode|_ input in Python 2. Files containing
non-ASCII characters need to be opened in binary mode in Python 2,
while Python 3 accepts both binary and text input.
:param fp: the file from which to read the XML properties document
:type fp: file-like object
:param callable object_pairs_hook: class or function for combining the
key-value pairs
:rtype: `dict` or the return value of ``object_pairs_hook``
:raises ValueError: if the root of the XML tree is not a ``<properties>``
tag or an ``<entry>`` element is missing a ``key`` attribute
"""
tree = ET.parse(fp)
return object_pairs_hook(_fromXML(tree.getroot()))
def loads_xml(s, object_pairs_hook=dict):
r"""
Parse the contents of the string ``s`` as an XML properties document and
return a `dict` of the key-value pairs.
Beyond basic XML well-formedness, `loads_xml` only checks that the root
element is named "``properties``" and that all of its ``<entry>`` children
have ``key`` attributes. No further validation is performed; if any
``<entry>``\ s happen to contain nested tags, the behavior is undefined.
By default, the key-value pairs extracted from ``s`` are combined into a
`dict` with later occurrences of a key overriding previous occurrences of
the same key. To change this behavior, pass a callable as the
``object_pairs_hook`` argument; it will be called with one argument, a
generator of ``(key, value)`` pairs representing the key-value entries in
``s`` (including duplicates) in order of occurrence. `loads_xml` will then
return the value returned by ``object_pairs_hook``.
.. note::
This uses `xml.etree.ElementTree` for parsing, which does not have
decent support for |unicode|_ input in Python 2. Strings containing
non-ASCII characters need to be encoded as bytes in Python 2 (Use
either UTF-8 or UTF-16 if the XML document does not contain an encoding
declaration), while Python 3 accepts both binary and text input.
:param string s: the string from which to read the XML properties document
:param callable object_pairs_hook: class or function for combining the
key-value pairs
:rtype: `dict` or the return value of ``object_pairs_hook``
:raises ValueError: if the root of the XML tree is not a ``<properties>``
tag or an ``<entry>`` element is missing a ``key`` attribute
"""
elem = ET.fromstring(s)
return object_pairs_hook(_fromXML(elem))
def _fromXML(root):
if root.tag != 'properties':
raise ValueError('XML tree is not rooted at <properties>')
for entry in root.findall('entry'):
key = entry.get('key')
if key is None:
raise ValueError('<entry> is missing "key" attribute')
yield (key, entry.text)
def dump_xml(props, fp, comment=None, encoding='UTF-8', sort_keys=False):
"""
Write a series ``props`` of key-value pairs to a binary filehandle ``fp``
in the format of an XML properties file. The file will include both an XML
declaration and a doctype declaration.
:param props: A mapping or iterable of ``(key, value)`` pairs to write to
``fp``. All keys and values in ``props`` must be text strings. If
``sort_keys`` is `False`, the entries are output in iteration order.
:param fp: a file-like object to write the values of ``props`` to
:type fp: binary file-like object
:param comment: if non-`None`, ``comment`` will be output as a
``<comment>`` element before the ``<entry>`` elements
:type comment: text string or `None`
:param string encoding: the name of the encoding to use for the XML
document (also included in the XML declaration)
:param bool sort_keys: if true, the elements of ``props`` are sorted
lexicographically by key in the output
:return: `None`
"""
fp = codecs.lookup(encoding).streamwriter(fp, errors='xmlcharrefreplace')
print('<?xml version="1.0" encoding={0} standalone="no"?>'
.format(quoteattr(encoding)), file=fp)
for s in _stream_xml(props, comment, sort_keys):
print(s, file=fp)
def _stream_xml(props, comment=None, sort_keys=False):
yield '<!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">'
yield '<properties>'
if comment is not None:
yield '<comment>' + escape(comment) + '</comment>'
for k,v in itemize(props, sort_keys=sort_keys):
yield '<entry key={0}>{1}</entry>'.format(quoteattr(k), escape(v))
yield '</properties>'
|
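Several docstrings in these records describe the `object_pairs_hook` contract: the hook is called once with a generator of `(key, value)` pairs, duplicates included, in order of occurrence. A small stdlib-only sketch of the two common policies (`multidict` is an illustrative hook written here, not a library helper):

```python
from collections import OrderedDict

pairs = [('key', 'first'), ('other', 'x'), ('key', 'second')]

# Default hook: dict() keeps only the last occurrence of each key.
last_wins = dict(pairs)

# A custom hook receives the pair iterable once and may keep
# every occurrence instead:
def multidict(items):
    out = OrderedDict()
    for k, v in items:
        out.setdefault(k, []).append(v)
    return out

all_values = multidict(pairs)
print(last_wins)         # {'key': 'second', 'other': 'x'}
print(dict(all_values))  # {'key': ['first', 'second'], 'other': ['x']}
```

Passing `multidict` as `object_pairs_hook` to any of the `load*` functions would preserve duplicate keys rather than discarding earlier values.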
jwodder/javaproperties | javaproperties/reading.py | load | python | def load(fp, object_pairs_hook=dict):
return object_pairs_hook((k,v) for k,v,_ in parse(fp) if k is not None) | Parse the contents of the `~io.IOBase.readline`-supporting file-like object
``fp`` as a simple line-oriented ``.properties`` file and return a `dict`
of the key-value pairs.
``fp`` may be either a text or binary filehandle, with or without universal
newlines enabled. If it is a binary filehandle, its contents are decoded
as Latin-1.
By default, the key-value pairs extracted from ``fp`` are combined into a
`dict` with later occurrences of a key overriding previous occurrences of
the same key. To change this behavior, pass a callable as the
``object_pairs_hook`` argument; it will be called with one argument, a
generator of ``(key, value)`` pairs representing the key-value entries in
``fp`` (including duplicates) in order of occurrence. `load` will then
return the value returned by ``object_pairs_hook``.
.. versionchanged:: 0.5.0
Invalid ``\\uXXXX`` escape sequences will now cause an
`InvalidUEscapeError` to be raised
:param fp: the file from which to read the ``.properties`` document
:type fp: file-like object
:param callable object_pairs_hook: class or function for combining the
key-value pairs
:rtype: `dict` of text strings or the return value of ``object_pairs_hook``
:raises InvalidUEscapeError: if an invalid ``\\uXXXX`` escape sequence
occurs in the input | train | https://github.com/jwodder/javaproperties/blob/8b48f040305217ebeb80c98c4354691bbb01429b/javaproperties/reading.py#L6-L36 | [
"def parse(fp):\n \"\"\"\n Parse the contents of the `~io.IOBase.readline`-supporting file-like object\n ``fp`` as a simple line-oriented ``.properties`` file and return a\n generator of ``(key, value, original_lines)`` triples for every entry in\n ``fp`` (including duplicate keys) in order of occurr... | from __future__ import unicode_literals
import re
from six import binary_type, StringIO, BytesIO, unichr
from .util import ascii_splitlines
def loads(s, object_pairs_hook=dict):
"""
Parse the contents of the string ``s`` as a simple line-oriented
``.properties`` file and return a `dict` of the key-value pairs.
``s`` may be either a text string or bytes string. If it is a bytes
string, its contents are decoded as Latin-1.
By default, the key-value pairs extracted from ``s`` are combined into a
`dict` with later occurrences of a key overriding previous occurrences of
the same key. To change this behavior, pass a callable as the
``object_pairs_hook`` argument; it will be called with one argument, a
generator of ``(key, value)`` pairs representing the key-value entries in
``s`` (including duplicates) in order of occurrence. `loads` will then
return the value returned by ``object_pairs_hook``.
.. versionchanged:: 0.5.0
Invalid ``\\uXXXX`` escape sequences will now cause an
`InvalidUEscapeError` to be raised
:param string s: the string from which to read the ``.properties`` document
:param callable object_pairs_hook: class or function for combining the
key-value pairs
:rtype: `dict` of text strings or the return value of ``object_pairs_hook``
:raises InvalidUEscapeError: if an invalid ``\\uXXXX`` escape sequence
occurs in the input
"""
fp = BytesIO(s) if isinstance(s, binary_type) else StringIO(s)
return load(fp, object_pairs_hook=object_pairs_hook)
def parse(fp):
"""
Parse the contents of the `~io.IOBase.readline`-supporting file-like object
``fp`` as a simple line-oriented ``.properties`` file and return a
generator of ``(key, value, original_lines)`` triples for every entry in
``fp`` (including duplicate keys) in order of occurrence. The third
element of each triple is the concatenation of the unmodified lines in
``fp`` (including trailing newlines) from which the key and value were
extracted. The generator also includes comments and blank/all-whitespace
lines found in ``fp``, one triple per line, with the first two elements of
the triples set to `None`. This is the only way to extract comments from a
``.properties`` file with this library.
``fp`` may be either a text or binary filehandle, with or without universal
newlines enabled. If it is a binary filehandle, its contents are decoded
as Latin-1.
.. versionchanged:: 0.5.0
Invalid ``\\uXXXX`` escape sequences will now cause an
`InvalidUEscapeError` to be raised
:param fp: the file from which to read the ``.properties`` document
:type fp: file-like object
:rtype: generator of triples of text strings
:raises InvalidUEscapeError: if an invalid ``\\uXXXX`` escape sequence
occurs in the input
"""
def lineiter():
while True:
ln = fp.readline()
if isinstance(ln, binary_type):
ln = ln.decode('iso-8859-1')
if ln == '':
return
for l in ascii_splitlines(ln):
yield l
liter = lineiter()
for source in liter:
line = source
if re.match(r'^[ \t\f]*(?:[#!]|\r?\n?$)', line):
yield (None, None, source)
continue
line = line.lstrip(' \t\f').rstrip('\r\n')
while re.search(r'(?<!\\)(?:\\\\)*\\$', line):
line = line[:-1]
nextline = next(liter, '')
source += nextline
line += nextline.lstrip(' \t\f').rstrip('\r\n')
if line == '': # series of otherwise-blank lines with continuations
yield (None, None, source)
continue
m = re.search(r'(?<!\\)(?:\\\\)*([ \t\f]*[=:]|[ \t\f])[ \t\f]*', line)
if m:
yield (unescape(line[:m.start(1)]),unescape(line[m.end():]),source)
else:
yield (unescape(line), '', source)
def unescape(field):
"""
Decode escape sequences in a ``.properties`` key or value. The following
escape sequences are recognized::
\\t \\n \\f \\r \\uXXXX \\\\
If a backslash is followed by any other character, the backslash is
dropped.
In addition, any valid UTF-16 surrogate pairs in the string after
escape-decoding are further decoded into the non-BMP characters they
represent. (Invalid & isolated surrogate code points are left as-is.)
.. versionchanged:: 0.5.0
Invalid ``\\uXXXX`` escape sequences will now cause an
`InvalidUEscapeError` to be raised
:param field: the string to decode
:type field: text string
:rtype: text string
:raises InvalidUEscapeError: if an invalid ``\\uXXXX`` escape sequence
occurs in the input
"""
return re.sub(r'[\uD800-\uDBFF][\uDC00-\uDFFF]', _unsurrogate,
re.sub(r'\\(u.{0,4}|.)', _unesc, field))
_unescapes = {'t': '\t', 'n': '\n', 'f': '\f', 'r': '\r'}
def _unesc(m):
esc = m.group(1)
if esc[0] == 'u':
if not re.match(r'^u[0-9A-Fa-f]{4}\Z', esc):
# We can't rely on `int` failing, because it succeeds when `esc`
# has trailing whitespace or a leading minus.
raise InvalidUEscapeError('\\' + esc)
return unichr(int(esc[1:], 16))
else:
return _unescapes.get(esc, esc)
def _unsurrogate(m):
c,d = map(ord, m.group())
return unichr(((c - 0xD800) << 10) + (d - 0xDC00) + 0x10000)
class InvalidUEscapeError(ValueError):
"""
.. versionadded:: 0.5.0
Raised when an invalid ``\\uXXXX`` escape sequence (i.e., a ``\\u`` not
immediately followed by four hexadecimal digits) is encountered in a simple
line-oriented ``.properties`` file
"""
def __init__(self, escape):
#: The invalid ``\uXXXX`` escape sequence encountered
self.escape = escape
super(InvalidUEscapeError, self).__init__(escape)
def __str__(self):
return 'Invalid \\u escape sequence: ' + self.escape
|
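The `parse`/`unescape` record above splits each logical line at the first unescaped `=`, `:`, or whitespace run, then decodes escapes. A minimal re-creation of that core logic is below; `parse_line` is an illustrative name, and error handling for malformed `\uXXXX` sequences and UTF-16 surrogate-pair decoding are deliberately omitted here (the library raises `InvalidUEscapeError` and recombines surrogates).

```python
import re

_unescapes = {'t': '\t', 'n': '\n', 'f': '\f', 'r': '\r'}

def unescape(field):
    def repl(m):
        esc = m.group(1)
        if esc[0] == 'u':
            return chr(int(esc[1:], 16))   # \uXXXX escape
        return _unescapes.get(esc, esc)    # unknown escapes drop the backslash
    return re.sub(r'\\(u[0-9A-Fa-f]{4}|.)', repl, field)

def parse_line(line):
    # Key ends at the first *unescaped* '=', ':', or whitespace run;
    # the lookbehind plus (?:\\\\)* skips over escaped separators.
    m = re.search(r'(?<!\\)(?:\\\\)*([ \t\f]*[=:]|[ \t\f])[ \t\f]*', line)
    if m:
        return unescape(line[:m.start(1)]), unescape(line[m.end():])
    return unescape(line), ''              # bare key => empty value

print(parse_line('key = value'))   # ('key', 'value')
print(parse_line('a\\ b:c\\td'))   # ('a b', 'c\td')
print(unescape('caf\\u00e9'))      # café
```

The second example shows why the separator regex needs the lookbehind: the escaped space in `a\ b` is part of the key, so the colon is the first real separator.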
jwodder/javaproperties | javaproperties/reading.py | loads | python | def loads(s, object_pairs_hook=dict):
fp = BytesIO(s) if isinstance(s, binary_type) else StringIO(s)
return load(fp, object_pairs_hook=object_pairs_hook) | Parse the contents of the string ``s`` as a simple line-oriented
``.properties`` file and return a `dict` of the key-value pairs.
``s`` may be either a text string or bytes string. If it is a bytes
string, its contents are decoded as Latin-1.
By default, the key-value pairs extracted from ``s`` are combined into a
`dict` with later occurrences of a key overriding previous occurrences of
the same key. To change this behavior, pass a callable as the
``object_pairs_hook`` argument; it will be called with one argument, a
generator of ``(key, value)`` pairs representing the key-value entries in
``s`` (including duplicates) in order of occurrence. `loads` will then
return the value returned by ``object_pairs_hook``.
.. versionchanged:: 0.5.0
Invalid ``\\uXXXX`` escape sequences will now cause an
`InvalidUEscapeError` to be raised
:param string s: the string from which to read the ``.properties`` document
:param callable object_pairs_hook: class or function for combining the
key-value pairs
:rtype: `dict` of text strings or the return value of ``object_pairs_hook``
:raises InvalidUEscapeError: if an invalid ``\\uXXXX`` escape sequence
occurs in the input | train | https://github.com/jwodder/javaproperties/blob/8b48f040305217ebeb80c98c4354691bbb01429b/javaproperties/reading.py#L38-L66 | [
"def load(fp, object_pairs_hook=dict):\n \"\"\"\n Parse the contents of the `~io.IOBase.readline`-supporting file-like object\n ``fp`` as a simple line-oriented ``.properties`` file and return a `dict`\n of the key-value pairs.\n\n ``fp`` may be either a text or binary filehandle, with or without uni... | from __future__ import unicode_literals
import re
from six import binary_type, StringIO, BytesIO, unichr
from .util import ascii_splitlines
def load(fp, object_pairs_hook=dict):
"""
Parse the contents of the `~io.IOBase.readline`-supporting file-like object
``fp`` as a simple line-oriented ``.properties`` file and return a `dict`
of the key-value pairs.
``fp`` may be either a text or binary filehandle, with or without universal
newlines enabled. If it is a binary filehandle, its contents are decoded
as Latin-1.
By default, the key-value pairs extracted from ``fp`` are combined into a
`dict` with later occurrences of a key overriding previous occurrences of
the same key. To change this behavior, pass a callable as the
``object_pairs_hook`` argument; it will be called with one argument, a
generator of ``(key, value)`` pairs representing the key-value entries in
``fp`` (including duplicates) in order of occurrence. `load` will then
return the value returned by ``object_pairs_hook``.
.. versionchanged:: 0.5.0
Invalid ``\\uXXXX`` escape sequences will now cause an
`InvalidUEscapeError` to be raised
:param fp: the file from which to read the ``.properties`` document
:type fp: file-like object
:param callable object_pairs_hook: class or function for combining the
key-value pairs
:rtype: `dict` of text strings or the return value of ``object_pairs_hook``
:raises InvalidUEscapeError: if an invalid ``\\uXXXX`` escape sequence
occurs in the input
"""
return object_pairs_hook((k,v) for k,v,_ in parse(fp) if k is not None)
def parse(fp):
"""
Parse the contents of the `~io.IOBase.readline`-supporting file-like object
``fp`` as a simple line-oriented ``.properties`` file and return a
generator of ``(key, value, original_lines)`` triples for every entry in
``fp`` (including duplicate keys) in order of occurrence. The third
element of each triple is the concatenation of the unmodified lines in
``fp`` (including trailing newlines) from which the key and value were
extracted. The generator also includes comments and blank/all-whitespace
lines found in ``fp``, one triple per line, with the first two elements of
the triples set to `None`. This is the only way to extract comments from a
``.properties`` file with this library.
``fp`` may be either a text or binary filehandle, with or without universal
newlines enabled. If it is a binary filehandle, its contents are decoded
as Latin-1.
.. versionchanged:: 0.5.0
Invalid ``\\uXXXX`` escape sequences will now cause an
`InvalidUEscapeError` to be raised
:param fp: the file from which to read the ``.properties`` document
:type fp: file-like object
:rtype: generator of triples of text strings
:raises InvalidUEscapeError: if an invalid ``\\uXXXX`` escape sequence
occurs in the input
"""
def lineiter():
while True:
ln = fp.readline()
if isinstance(ln, binary_type):
ln = ln.decode('iso-8859-1')
if ln == '':
return
for l in ascii_splitlines(ln):
yield l
liter = lineiter()
for source in liter:
line = source
if re.match(r'^[ \t\f]*(?:[#!]|\r?\n?$)', line):
yield (None, None, source)
continue
line = line.lstrip(' \t\f').rstrip('\r\n')
while re.search(r'(?<!\\)(?:\\\\)*\\$', line):
line = line[:-1]
nextline = next(liter, '')
source += nextline
line += nextline.lstrip(' \t\f').rstrip('\r\n')
if line == '': # series of otherwise-blank lines with continuations
yield (None, None, source)
continue
m = re.search(r'(?<!\\)(?:\\\\)*([ \t\f]*[=:]|[ \t\f])[ \t\f]*', line)
if m:
yield (unescape(line[:m.start(1)]),unescape(line[m.end():]),source)
else:
yield (unescape(line), '', source)
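To make the line-splitting rules in `parse` concrete, here is a deliberately simplified, self-contained sketch. It is not the library function: it handles comment lines, blank lines, and the `=`, `:`, and whitespace separators, but ignores line continuations and escaped separator characters.

```python
import re

def parse_simple(text):
    """Yield (key, value) pairs from a .properties-style string.

    Simplified sketch of the logic in parse() above; no support for
    line continuations or backslash-escaped separators.
    """
    for line in text.splitlines():
        # Skip comment ('#' or '!') lines and blank/whitespace-only lines
        if re.match(r'^[ \t\f]*(?:[#!]|$)', line):
            continue
        line = line.strip(' \t\f')
        # Separator: '=' or ':' with optional surrounding blanks, or blanks alone
        m = re.search(r'[ \t\f]*[=:][ \t\f]*|[ \t\f]+', line)
        if m:
            yield line[:m.start()], line[m.end():]
        else:
            yield line, ''  # a lone key maps to the empty string

pairs = list(parse_simple('# a comment\nkey = value\nhost:port\nbare\n'))
```

Running this yields `('key', 'value')`, `('host', 'port')`, and `('bare', '')`, mirroring the separator behavior the real parser implements with escape-aware regexes.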
def unescape(field):
"""
Decode escape sequences in a ``.properties`` key or value. The following
escape sequences are recognized::
\\t \\n \\f \\r \\uXXXX \\\\
If a backslash is followed by any other character, the backslash is
dropped.
In addition, any valid UTF-16 surrogate pairs in the string after
escape-decoding are further decoded into the non-BMP characters they
represent. (Invalid & isolated surrogate code points are left as-is.)
.. versionchanged:: 0.5.0
Invalid ``\\uXXXX`` escape sequences will now cause an
`InvalidUEscapeError` to be raised
:param field: the string to decode
:type field: text string
:rtype: text string
:raises InvalidUEscapeError: if an invalid ``\\uXXXX`` escape sequence
occurs in the input
"""
return re.sub(r'[\uD800-\uDBFF][\uDC00-\uDFFF]', _unsurrogate,
re.sub(r'\\(u.{0,4}|.)', _unesc, field))
_unescapes = {'t': '\t', 'n': '\n', 'f': '\f', 'r': '\r'}
def _unesc(m):
esc = m.group(1)
if esc[0] == 'u':
if not re.match(r'^u[0-9A-Fa-f]{4}\Z', esc):
# We can't rely on `int` failing, because it succeeds when `esc`
# has trailing whitespace or a leading minus.
raise InvalidUEscapeError('\\' + esc)
return unichr(int(esc[1:], 16))
else:
return _unescapes.get(esc, esc)
def _unsurrogate(m):
c,d = map(ord, m.group())
return unichr(((c - 0xD800) << 10) + (d - 0xDC00) + 0x10000)
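The escape-decoding pipeline above can be illustrated with a stripped-down, self-contained variant. Unlike the real `unescape`, this sketch does not raise on malformed `\uXXXX` sequences (they simply lose their backslash) and does not recombine surrogate pairs:

```python
import re

_UNESCAPES = {'t': '\t', 'n': '\n', 'f': '\f', 'r': '\r'}

def unescape_simple(field):
    """Sketch of unescape(): decode \\t \\n \\f \\r \\uXXXX and drop
    the backslash before any other character."""
    def repl(m):
        esc = m.group(1)
        if len(esc) == 5:  # a full uXXXX escape matched the first branch
            return chr(int(esc[1:], 16))
        return _UNESCAPES.get(esc, esc)  # known escape, or bare char
    return re.sub(r'\\(u[0-9A-Fa-f]{4}|.)', repl, field)
```

For example, `unescape_simple(r'\u0041BC')` gives `'ABC'` and `unescape_simple(r'escaped\=sep')` gives `'escaped=sep'`.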
class InvalidUEscapeError(ValueError):
"""
.. versionadded:: 0.5.0
Raised when an invalid ``\\uXXXX`` escape sequence (i.e., a ``\\u`` not
immediately followed by four hexadecimal digits) is encountered in a simple
line-oriented ``.properties`` file
"""
def __init__(self, escape):
#: The invalid ``\uXXXX`` escape sequence encountered
self.escape = escape
super(InvalidUEscapeError, self).__init__(escape)
def __str__(self):
return 'Invalid \\u escape sequence: ' + self.escape
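The validation concern noted in the `_unesc` comment ("we can't rely on `int` failing") can be exercised with a small self-contained mirror of the pattern. The class and helper below are re-declared locally for this sketch, not imported from the library:

```python
import re

class InvalidUEscapeError(ValueError):
    # Local mirror of the class above, for demonstration only.
    def __init__(self, escape):
        self.escape = escape
        super().__init__(escape)
    def __str__(self):
        return 'Invalid \\u escape sequence: ' + self.escape

def decode_u_escape(esc):
    """Decode the text after a backslash, e.g. 'u0041' -> 'A'."""
    if not re.match(r'^u[0-9A-Fa-f]{4}\Z', esc):
        # int(esc[1:], 16) alone is too lenient: it accepts ' 41 ' and '-041'
        raise InvalidUEscapeError('\\' + esc)
    return chr(int(esc[1:], 16))
```

Here `decode_u_escape('u0041')` returns `'A'`, while `decode_u_escape('u12')` raises, even though `int('12', 16)` would have happily succeeded.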
|
jwodder/javaproperties | javaproperties/reading.py | parse | python | def parse(fp):
def lineiter():
while True:
ln = fp.readline()
if isinstance(ln, binary_type):
ln = ln.decode('iso-8859-1')
if ln == '':
return
for l in ascii_splitlines(ln):
yield l
liter = lineiter()
for source in liter:
line = source
if re.match(r'^[ \t\f]*(?:[#!]|\r?\n?$)', line):
yield (None, None, source)
continue
line = line.lstrip(' \t\f').rstrip('\r\n')
while re.search(r'(?<!\\)(?:\\\\)*\\$', line):
line = line[:-1]
nextline = next(liter, '')
source += nextline
line += nextline.lstrip(' \t\f').rstrip('\r\n')
if line == '': # series of otherwise-blank lines with continuations
yield (None, None, source)
continue
m = re.search(r'(?<!\\)(?:\\\\)*([ \t\f]*[=:]|[ \t\f])[ \t\f]*', line)
if m:
yield (unescape(line[:m.start(1)]),unescape(line[m.end():]),source)
else:
yield (unescape(line), '', source) | Parse the contents of the `~io.IOBase.readline`-supporting file-like object
``fp`` as a simple line-oriented ``.properties`` file and return a
generator of ``(key, value, original_lines)`` triples for every entry in
``fp`` (including duplicate keys) in order of occurrence. The third
element of each triple is the concatenation of the unmodified lines in
``fp`` (including trailing newlines) from which the key and value were
extracted. The generator also includes comments and blank/all-whitespace
lines found in ``fp``, one triple per line, with the first two elements of
the triples set to `None`. This is the only way to extract comments from a
``.properties`` file with this library.
``fp`` may be either a text or binary filehandle, with or without universal
newlines enabled. If it is a binary filehandle, its contents are decoded
as Latin-1.
.. versionchanged:: 0.5.0
Invalid ``\\uXXXX`` escape sequences will now cause an
`InvalidUEscapeError` to be raised
:param fp: the file from which to read the ``.properties`` document
:type fp: file-like object
:rtype: generator of triples of text strings
:raises InvalidUEscapeError: if an invalid ``\\uXXXX`` escape sequence
occurs in the input | train | https://github.com/jwodder/javaproperties/blob/8b48f040305217ebeb80c98c4354691bbb01429b/javaproperties/reading.py#L68-L123 | [
"def lineiter():\n while True:\n ln = fp.readline()\n if isinstance(ln, binary_type):\n ln = ln.decode('iso-8859-1')\n if ln == '':\n return\n for l in ascii_splitlines(ln):\n yield l\n"
] | from __future__ import unicode_literals
import re
from six import binary_type, StringIO, BytesIO, unichr
from .util import ascii_splitlines
def load(fp, object_pairs_hook=dict):
"""
Parse the contents of the `~io.IOBase.readline`-supporting file-like object
``fp`` as a simple line-oriented ``.properties`` file and return a `dict`
of the key-value pairs.
``fp`` may be either a text or binary filehandle, with or without universal
newlines enabled. If it is a binary filehandle, its contents are decoded
as Latin-1.
By default, the key-value pairs extracted from ``fp`` are combined into a
`dict` with later occurrences of a key overriding previous occurrences of
the same key. To change this behavior, pass a callable as the
``object_pairs_hook`` argument; it will be called with one argument, a
generator of ``(key, value)`` pairs representing the key-value entries in
``fp`` (including duplicates) in order of occurrence. `load` will then
return the value returned by ``object_pairs_hook``.
.. versionchanged:: 0.5.0
Invalid ``\\uXXXX`` escape sequences will now cause an
`InvalidUEscapeError` to be raised
:param fp: the file from which to read the ``.properties`` document
:type fp: file-like object
:param callable object_pairs_hook: class or function for combining the
key-value pairs
:rtype: `dict` of text strings or the return value of ``object_pairs_hook``
:raises InvalidUEscapeError: if an invalid ``\\uXXXX`` escape sequence
occurs in the input
"""
return object_pairs_hook((k,v) for k,v,_ in parse(fp) if k is not None)
def loads(s, object_pairs_hook=dict):
"""
Parse the contents of the string ``s`` as a simple line-oriented
``.properties`` file and return a `dict` of the key-value pairs.
``s`` may be either a text string or bytes string. If it is a bytes
string, its contents are decoded as Latin-1.
By default, the key-value pairs extracted from ``s`` are combined into a
`dict` with later occurrences of a key overriding previous occurrences of
the same key. To change this behavior, pass a callable as the
``object_pairs_hook`` argument; it will be called with one argument, a
generator of ``(key, value)`` pairs representing the key-value entries in
``s`` (including duplicates) in order of occurrence. `loads` will then
return the value returned by ``object_pairs_hook``.
.. versionchanged:: 0.5.0
Invalid ``\\uXXXX`` escape sequences will now cause an
`InvalidUEscapeError` to be raised
:param string s: the string from which to read the ``.properties`` document
:param callable object_pairs_hook: class or function for combining the
key-value pairs
:rtype: `dict` of text strings or the return value of ``object_pairs_hook``
:raises InvalidUEscapeError: if an invalid ``\\uXXXX`` escape sequence
occurs in the input
"""
fp = BytesIO(s) if isinstance(s, binary_type) else StringIO(s)
return load(fp, object_pairs_hook=object_pairs_hook)
def unescape(field):
"""
Decode escape sequences in a ``.properties`` key or value. The following
escape sequences are recognized::
\\t \\n \\f \\r \\uXXXX \\\\
If a backslash is followed by any other character, the backslash is
dropped.
In addition, any valid UTF-16 surrogate pairs in the string after
escape-decoding are further decoded into the non-BMP characters they
represent. (Invalid & isolated surrogate code points are left as-is.)
.. versionchanged:: 0.5.0
Invalid ``\\uXXXX`` escape sequences will now cause an
`InvalidUEscapeError` to be raised
:param field: the string to decode
:type field: text string
:rtype: text string
:raises InvalidUEscapeError: if an invalid ``\\uXXXX`` escape sequence
occurs in the input
"""
return re.sub(r'[\uD800-\uDBFF][\uDC00-\uDFFF]', _unsurrogate,
re.sub(r'\\(u.{0,4}|.)', _unesc, field))
_unescapes = {'t': '\t', 'n': '\n', 'f': '\f', 'r': '\r'}
def _unesc(m):
esc = m.group(1)
if esc[0] == 'u':
if not re.match(r'^u[0-9A-Fa-f]{4}\Z', esc):
# We can't rely on `int` failing, because it succeeds when `esc`
# has trailing whitespace or a leading minus.
raise InvalidUEscapeError('\\' + esc)
return unichr(int(esc[1:], 16))
else:
return _unescapes.get(esc, esc)
def _unsurrogate(m):
c,d = map(ord, m.group())
return unichr(((c - 0xD800) << 10) + (d - 0xDC00) + 0x10000)
class InvalidUEscapeError(ValueError):
"""
.. versionadded:: 0.5.0
Raised when an invalid ``\\uXXXX`` escape sequence (i.e., a ``\\u`` not
immediately followed by four hexadecimal digits) is encountered in a simple
line-oriented ``.properties`` file
"""
def __init__(self, escape):
#: The invalid ``\uXXXX`` escape sequence encountered
self.escape = escape
super(InvalidUEscapeError, self).__init__(escape)
def __str__(self):
return 'Invalid \\u escape sequence: ' + self.escape
|
jwodder/javaproperties | javaproperties/reading.py | unescape | python | def unescape(field):
return re.sub(r'[\uD800-\uDBFF][\uDC00-\uDFFF]', _unsurrogate,
re.sub(r'\\(u.{0,4}|.)', _unesc, field)) | Decode escape sequences in a ``.properties`` key or value. The following
escape sequences are recognized::
\\t \\n \\f \\r \\uXXXX \\\\
If a backslash is followed by any other character, the backslash is
dropped.
In addition, any valid UTF-16 surrogate pairs in the string after
escape-decoding are further decoded into the non-BMP characters they
represent. (Invalid & isolated surrogate code points are left as-is.)
.. versionchanged:: 0.5.0
Invalid ``\\uXXXX`` escape sequences will now cause an
`InvalidUEscapeError` to be raised
:param field: the string to decode
:type field: text string
:rtype: text string
:raises InvalidUEscapeError: if an invalid ``\\uXXXX`` escape sequence
occurs in the input | train | https://github.com/jwodder/javaproperties/blob/8b48f040305217ebeb80c98c4354691bbb01429b/javaproperties/reading.py#L125-L150 | null |
jwodder/javaproperties | javaproperties/writing.py | dump | python | def dump(props, fp, separator='=', comments=None, timestamp=True,
sort_keys=False):
if comments is not None:
print(to_comment(comments), file=fp)
if timestamp is not None and timestamp is not False:
print(to_comment(java_timestamp(timestamp)), file=fp)
for k,v in itemize(props, sort_keys=sort_keys):
print(join_key_value(k, v, separator), file=fp) | Write a series of key-value pairs to a file in simple line-oriented
``.properties`` format.
:param props: A mapping or iterable of ``(key, value)`` pairs to write to
``fp``. All keys and values in ``props`` must be text strings. If
``sort_keys`` is `False`, the entries are output in iteration order.
:param fp: A file-like object to write the values of ``props`` to. It must
have been opened as a text file with a Latin-1-compatible encoding.
:param separator: The string to use for separating keys & values. Only
``" "``, ``"="``, and ``":"`` (possibly with added whitespace) should
ever be used as the separator.
:type separator: text string
:param comments: if non-`None`, ``comments`` will be written to ``fp`` as a
comment before any other content
:type comments: text string or `None`
:param timestamp: If neither `None` nor `False`, a timestamp in the form of
``Mon Sep 02 14:00:54 EDT 2016`` is written as a comment to ``fp``
after ``comments`` (if any) and before the key-value pairs. If
``timestamp`` is `True`, the current date & time is used. If it is a
number, it is converted from seconds since the epoch to local time. If
it is a `datetime.datetime` object, its value is used directly, with
naïve objects assumed to be in the local timezone.
:type timestamp: `None`, `bool`, number, or `datetime.datetime`
:param bool sort_keys: if true, the elements of ``props`` are sorted
lexicographically by key in the output
:return: `None` | train | https://github.com/jwodder/javaproperties/blob/8b48f040305217ebeb80c98c4354691bbb01429b/javaproperties/writing.py#L10-L45 | [
"def java_timestamp(timestamp=True):\n \"\"\"\n .. versionadded:: 0.2.0\n\n Returns a timestamp in the format produced by |date_tostring|_, e.g.::\n\n Mon Sep 02 14:00:54 EDT 2016\n\n If ``timestamp`` is `True` (the default), the current date & time is\n returned.\n\n If ``timestamp`` is `N... | # -*- coding: utf-8 -*-
from __future__ import print_function, unicode_literals
from datetime import datetime
import numbers
import re
import time
from six import StringIO
from .util import itemize
def dumps(props, separator='=', comments=None, timestamp=True, sort_keys=False):
"""
Convert a series of key-value pairs to a text string in simple
line-oriented ``.properties`` format.
:param props: A mapping or iterable of ``(key, value)`` pairs to serialize.
All keys and values in ``props`` must be text strings. If
``sort_keys`` is `False`, the entries are output in iteration order.
:param separator: The string to use for separating keys & values. Only
``" "``, ``"="``, and ``":"`` (possibly with added whitespace) should
ever be used as the separator.
:type separator: text string
:param comments: if non-`None`, ``comments`` will be output as a comment
before any other content
:type comments: text string or `None`
:param timestamp: If neither `None` nor `False`, a timestamp in the form of
``Mon Sep 02 14:00:54 EDT 2016`` is output as a comment after
``comments`` (if any) and before the key-value pairs. If ``timestamp``
is `True`, the current date & time is used. If it is a number, it is
converted from seconds since the epoch to local time. If it is a
`datetime.datetime` object, its value is used directly, with naïve
objects assumed to be in the local timezone.
:type timestamp: `None`, `bool`, number, or `datetime.datetime`
:param bool sort_keys: if true, the elements of ``props`` are sorted
lexicographically by key in the output
:rtype: text string
"""
s = StringIO()
dump(props, s, separator=separator, comments=comments, timestamp=timestamp,
sort_keys=sort_keys)
return s.getvalue()
def to_comment(comment):
"""
Convert a string to a ``.properties`` file comment. All non-Latin-1
characters in the string are escaped using ``\\uXXXX`` escapes (after
converting non-BMP characters to surrogate pairs), a ``#`` is prepended to
the string, any CR LF or CR line breaks in the string are converted to LF,
and a ``#`` is inserted after any line break not already followed by a
``#`` or ``!``. No trailing newline is added.
>>> to_comment('They say foo=bar,\\r\\nbut does bar=foo?')
'#They say foo=bar,\\n#but does bar=foo?'
:param comment: the string to convert to a comment
:type comment: text string
:rtype: text string
"""
return '#' + re.sub(r'[^\x00-\xFF]', _esc,
re.sub(r'\n(?![#!])', '\n#',
re.sub(r'\r\n?', '\n', comment)))
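For ASCII input, the transformation performed by `to_comment` reduces to the three substitutions shown above. The following self-contained sketch (omitting the non-Latin-1 `\uXXXX` escaping) reproduces the docstring example:

```python
import re

def to_comment_ascii(comment):
    # 1. Normalize CR and CR LF line breaks to LF
    comment = re.sub(r'\r\n?', '\n', comment)
    # 2. Insert '#' after any break not already followed by '#' or '!'
    comment = re.sub(r'\n(?![#!])', '\n#', comment)
    # 3. Prefix the whole string with '#'
    return '#' + comment

result = to_comment_ascii('They say foo=bar,\r\nbut does bar=foo?')
```

`result` is `'#They say foo=bar,\n#but does bar=foo?'`, matching the example in the docstring.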
def join_key_value(key, value, separator='='):
r"""
Join a key and value together into a single line suitable for adding to a
simple line-oriented ``.properties`` file. No trailing newline is added.
>>> join_key_value('possible separators', '= : space')
'possible\\ separators=\\= \\: space'
:param key: the key
:type key: text string
:param value: the value
:type value: text string
:param separator: the string to use for separating the key & value. Only
``" "``, ``"="``, and ``":"`` (possibly with added whitespace) should
ever be used as the separator.
:type separator: text string
:rtype: text string
"""
# Escapes `key` and `value` the same way as java.util.Properties.store()
return escape(key) \
+ separator \
+ re.sub(r'^ +', lambda m: r'\ ' * m.end(), _base_escape(value))
_escapes = {
'\t': r'\t',
'\n': r'\n',
'\f': r'\f',
'\r': r'\r',
'!': r'\!',
'#': r'\#',
':': r'\:',
'=': r'\=',
'\\': r'\\',
}
def _esc(m):
c = m.group()
try:
return _escapes[c]
except KeyError:
c = ord(c)
if c > 0xFFFF:
# Does Python really not have a decent builtin way to calculate
# surrogate pairs?
assert c <= 0x10FFFF
c -= 0x10000
return '\\u{0:04x}\\u{1:04x}'.format(
0xD800 + (c >> 10),
0xDC00 + (c & 0x3FF)
)
else:
return '\\u{0:04x}'.format(c)
def _base_escape(field):
return re.sub(r'[^\x20-\x7E]|[\\#!=:]', _esc, field)
def escape(field):
"""
Escape a string so that it can be safely used as either a key or value in a
``.properties`` file. All non-ASCII characters, all nonprintable or space
characters, and the characters ``\\ # ! = :`` are all escaped using either
the single-character escapes recognized by `unescape` (when they exist) or
``\\uXXXX`` escapes (after converting non-BMP characters to surrogate
pairs).
:param field: the string to escape
:type field: text string
:rtype: text string
"""
return _base_escape(field).replace(' ', r'\ ')
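Restricting attention to BMP input, the escaping pipeline above (single-character escapes first, `\uXXXX` for the remaining non-printables, then space escaping) can be sketched as a self-contained function. This is a simplified stand-in for `escape`, without the surrogate-pair handling of `_esc`:

```python
import re

_ESCAPES = {'\t': r'\t', '\n': r'\n', '\f': r'\f', '\r': r'\r',
            '!': r'\!', '#': r'\#', ':': r'\:', '=': r'\=', '\\': r'\\'}

def escape_bmp(field):
    """BMP-only sketch of escape(); no surrogate-pair support."""
    def repl(m):
        ch = m.group()
        # Use a single-character escape when one exists, else \uXXXX
        return _ESCAPES.get(ch, '\\u{0:04x}'.format(ord(ch)))
    escaped = re.sub(r'[^\x20-\x7E]|[\\#!=:]', repl, field)
    return escaped.replace(' ', r'\ ')
```

For instance, `escape_bmp('possible separators')` gives `'possible\\ separators'` and `escape_bmp('a=b')` gives `'a\\=b'`.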
DAYS_OF_WEEK = ['Mon', 'Tue', 'Wed', 'Thu', 'Fri', 'Sat', 'Sun']
MONTHS = [
'Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun',
'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec',
]
def java_timestamp(timestamp=True):
"""
.. versionadded:: 0.2.0
Returns a timestamp in the format produced by |date_tostring|_, e.g.::
Mon Sep 02 14:00:54 EDT 2016
If ``timestamp`` is `True` (the default), the current date & time is
returned.
If ``timestamp`` is `None` or `False`, an empty string is returned.
If ``timestamp`` is a number, it is converted from seconds since the epoch
to local time.
If ``timestamp`` is a `datetime.datetime` object, its value is used
directly, with naïve objects assumed to be in the local timezone.
The timestamp is always constructed using the C locale.
:param timestamp: the date & time to display
:type timestamp: `None`, `bool`, number, or `datetime.datetime`
:rtype: text string
.. |date_tostring| replace:: Java 8's ``Date.toString()``
.. _date_tostring: https://docs.oracle.com/javase/8/docs/api/java/util/Date.html#toString--
"""
if timestamp is None or timestamp is False:
return ''
if isinstance(timestamp, datetime) and timestamp.tzinfo is not None:
timebits = timestamp.timetuple()
# Assumes `timestamp.tzinfo.tzname()` is meaningful/useful
tzname = timestamp.tzname()
else:
if timestamp is True:
timestamp = None
elif isinstance(timestamp, datetime):
try:
# Use `datetime.timestamp()` if it's available, as it (unlike
# `datetime.timetuple()`) takes `fold` into account for naïve
# datetimes
timestamp = timestamp.timestamp()
except AttributeError: # Pre-Python 3.3
# Mapping `timetuple` through `mktime` and `localtime` is
# necessary for determining whether DST is in effect (which, in
# turn, is necessary for determining which timezone name to
# use). The only downside to using standard functions instead
# of `python-dateutil` is that `mktime`, apparently, handles
# times duplicated by DST non-deterministically (cf.
# <https://git.io/vixsE>), but there's no right way to deal
# with those anyway, so...
timestamp = time.mktime(timestamp.timetuple())
elif not isinstance(timestamp, numbers.Number):
raise TypeError('Timestamp must be number or datetime.datetime')
timebits = time.localtime(timestamp)
try:
tzname = timebits.tm_zone
except AttributeError:
# This assumes that `time.tzname` is meaningful/useful.
tzname = time.tzname[timebits.tm_isdst > 0]
assert 1 <= timebits.tm_mon <= 12, 'invalid month'
assert 0 <= timebits.tm_wday <= 6, 'invalid day of week'
return '{wday} {mon} {t.tm_mday:02d}' \
' {t.tm_hour:02d}:{t.tm_min:02d}:{t.tm_sec:02d}' \
' {tz} {t.tm_year:04d}'.format(
t=timebits,
tz=tzname,
mon=MONTHS[timebits.tm_mon-1],
wday=DAYS_OF_WEEK[timebits.tm_wday]
)
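The final `Date.toString()`-style layout is easiest to see with fixed components. The values below are hypothetical (the real function derives the fields and timezone name from the clock and `time`/`datetime` machinery):

```python
import time

DAYS = ['Mon', 'Tue', 'Wed', 'Thu', 'Fri', 'Sat', 'Sun']
MONTHS = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun',
          'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec']

# Fixed struct_time for 2016-09-02 14:00:54 (a Friday); 'EDT' is assumed
t = time.struct_time((2016, 9, 2, 14, 0, 54, 4, 246, 1))

stamp = ('{wday} {mon} {t.tm_mday:02d}'
         ' {t.tm_hour:02d}:{t.tm_min:02d}:{t.tm_sec:02d}'
         ' {tz} {t.tm_year:04d}').format(
             t=t, tz='EDT',
             mon=MONTHS[t.tm_mon - 1], wday=DAYS[t.tm_wday])
```

This produces `'Fri Sep 02 14:00:54 EDT 2016'`, the same layout `java_timestamp` emits.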
|
jwodder/javaproperties | javaproperties/writing.py | dumps | python | def dumps(props, separator='=', comments=None, timestamp=True, sort_keys=False):
s = StringIO()
dump(props, s, separator=separator, comments=comments, timestamp=timestamp,
sort_keys=sort_keys)
return s.getvalue() | Convert a series of key-value pairs to a text string in simple
line-oriented ``.properties`` format.
:param props: A mapping or iterable of ``(key, value)`` pairs to serialize.
All keys and values in ``props`` must be text strings. If
``sort_keys`` is `False`, the entries are output in iteration order.
:param separator: The string to use for separating keys & values. Only
``" "``, ``"="``, and ``":"`` (possibly with added whitespace) should
ever be used as the separator.
:type separator: text string
:param comments: if non-`None`, ``comments`` will be output as a comment
before any other content
:type comments: text string or `None`
:param timestamp: If neither `None` nor `False`, a timestamp in the form of
``Mon Sep 02 14:00:54 EDT 2016`` is output as a comment after
``comments`` (if any) and before the key-value pairs. If ``timestamp``
is `True`, the current date & time is used. If it is a number, it is
converted from seconds since the epoch to local time. If it is a
`datetime.datetime` object, its value is used directly, with naïve
objects assumed to be in the local timezone.
:type timestamp: `None`, `bool`, number, or `datetime.datetime`
:param bool sort_keys: if true, the elements of ``props`` are sorted
lexicographically by key in the output
:rtype: text string | train | https://github.com/jwodder/javaproperties/blob/8b48f040305217ebeb80c98c4354691bbb01429b/javaproperties/writing.py#L47-L77 | [
"def dump(props, fp, separator='=', comments=None, timestamp=True,\n sort_keys=False):\n \"\"\"\n Write a series of key-value pairs to a file in simple line-oriented\n ``.properties`` format.\n\n :param props: A mapping or iterable of ``(key, value)`` pairs to write to\n ``fp``. All keys... | # -*- coding: utf-8 -*-
from __future__ import print_function, unicode_literals
from datetime import datetime
import numbers
import re
import time
from six import StringIO
from .util import itemize
def dump(props, fp, separator='=', comments=None, timestamp=True,
sort_keys=False):
"""
Write a series of key-value pairs to a file in simple line-oriented
``.properties`` format.
:param props: A mapping or iterable of ``(key, value)`` pairs to write to
``fp``. All keys and values in ``props`` must be text strings. If
``sort_keys`` is `False`, the entries are output in iteration order.
:param fp: A file-like object to write the values of ``props`` to. It must
have been opened as a text file with a Latin-1-compatible encoding.
:param separator: The string to use for separating keys & values. Only
``" "``, ``"="``, and ``":"`` (possibly with added whitespace) should
ever be used as the separator.
:type separator: text string
:param comments: if non-`None`, ``comments`` will be written to ``fp`` as a
comment before any other content
:type comments: text string or `None`
:param timestamp: If neither `None` nor `False`, a timestamp in the form of
``Mon Sep 02 14:00:54 EDT 2016`` is written as a comment to ``fp``
after ``comments`` (if any) and before the key-value pairs. If
``timestamp`` is `True`, the current date & time is used. If it is a
number, it is converted from seconds since the epoch to local time. If
it is a `datetime.datetime` object, its value is used directly, with
naïve objects assumed to be in the local timezone.
:type timestamp: `None`, `bool`, number, or `datetime.datetime`
:param bool sort_keys: if true, the elements of ``props`` are sorted
lexicographically by key in the output
:return: `None`
"""
if comments is not None:
print(to_comment(comments), file=fp)
if timestamp is not None and timestamp is not False:
print(to_comment(java_timestamp(timestamp)), file=fp)
for k,v in itemize(props, sort_keys=sort_keys):
print(join_key_value(k, v, separator), file=fp)
def to_comment(comment):
"""
Convert a string to a ``.properties`` file comment. All non-Latin-1
characters in the string are escaped using ``\\uXXXX`` escapes (after
converting non-BMP characters to surrogate pairs), a ``#`` is prepended to
the string, any CR LF or CR line breaks in the string are converted to LF,
and a ``#`` is inserted after any line break not already followed by a
``#`` or ``!``. No trailing newline is added.
>>> to_comment('They say foo=bar,\\r\\nbut does bar=foo?')
'#They say foo=bar,\\n#but does bar=foo?'
:param comment: the string to convert to a comment
:type comment: text string
:rtype: text string
"""
return '#' + re.sub(r'[^\x00-\xFF]', _esc,
re.sub(r'\n(?![#!])', '\n#',
re.sub(r'\r\n?', '\n', comment)))
def join_key_value(key, value, separator='='):
r"""
Join a key and value together into a single line suitable for adding to a
simple line-oriented ``.properties`` file. No trailing newline is added.
>>> join_key_value('possible separators', '= : space')
'possible\\ separators=\\= \\: space'
:param key: the key
:type key: text string
:param value: the value
:type value: text string
:param separator: the string to use for separating the key & value. Only
``" "``, ``"="``, and ``":"`` (possibly with added whitespace) should
ever be used as the separator.
:type separator: text string
:rtype: text string
"""
# Escapes `key` and `value` the same way as java.util.Properties.store()
return escape(key) \
+ separator \
+ re.sub(r'^ +', lambda m: r'\ ' * m.end(), _base_escape(value))
_escapes = {
'\t': r'\t',
'\n': r'\n',
'\f': r'\f',
'\r': r'\r',
'!': r'\!',
'#': r'\#',
':': r'\:',
'=': r'\=',
'\\': r'\\',
}
def _esc(m):
c = m.group()
try:
return _escapes[c]
except KeyError:
c = ord(c)
if c > 0xFFFF:
# Does Python really not have a decent builtin way to calculate
# surrogate pairs?
assert c <= 0x10FFFF
c -= 0x10000
return '\\u{0:04x}\\u{1:04x}'.format(
0xD800 + (c >> 10),
0xDC00 + (c & 0x3FF)
)
else:
return '\\u{0:04x}'.format(c)
def _base_escape(field):
return re.sub(r'[^\x20-\x7E]|[\\#!=:]', _esc, field)
def escape(field):
"""
Escape a string so that it can be safely used as either a key or value in a
``.properties`` file. All non-ASCII characters, all nonprintable or space
characters, and the characters ``\\ # ! = :`` are all escaped using either
the single-character escapes recognized by `unescape` (when they exist) or
``\\uXXXX`` escapes (after converting non-BMP characters to surrogate
pairs).
:param field: the string to escape
:type field: text string
:rtype: text string
"""
return _base_escape(field).replace(' ', r'\ ')
DAYS_OF_WEEK = ['Mon', 'Tue', 'Wed', 'Thu', 'Fri', 'Sat', 'Sun']
MONTHS = [
'Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun',
'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec',
]
def java_timestamp(timestamp=True):
"""
.. versionadded:: 0.2.0
Returns a timestamp in the format produced by |date_tostring|_, e.g.::
Mon Sep 02 14:00:54 EDT 2016
If ``timestamp`` is `True` (the default), the current date & time is
returned.
If ``timestamp`` is `None` or `False`, an empty string is returned.
If ``timestamp`` is a number, it is converted from seconds since the epoch
to local time.
If ``timestamp`` is a `datetime.datetime` object, its value is used
directly, with naïve objects assumed to be in the local timezone.
The timestamp is always constructed using the C locale.
:param timestamp: the date & time to display
:type timestamp: `None`, `bool`, number, or `datetime.datetime`
:rtype: text string
.. |date_tostring| replace:: Java 8's ``Date.toString()``
.. _date_tostring: https://docs.oracle.com/javase/8/docs/api/java/util/Date.html#toString--
"""
if timestamp is None or timestamp is False:
return ''
if isinstance(timestamp, datetime) and timestamp.tzinfo is not None:
timebits = timestamp.timetuple()
# Assumes `timestamp.tzinfo.tzname()` is meaningful/useful
tzname = timestamp.tzname()
else:
if timestamp is True:
timestamp = None
elif isinstance(timestamp, datetime):
try:
# Use `datetime.timestamp()` if it's available, as it (unlike
# `datetime.timetuple()`) takes `fold` into account for naïve
# datetimes
timestamp = timestamp.timestamp()
except AttributeError: # Pre-Python 3.3
# Mapping `timetuple` through `mktime` and `localtime` is
# necessary for determining whether DST is in effect (which, in
# turn, is necessary for determining which timezone name to
# use). The only downside to using standard functions instead
# of `python-dateutil` is that `mktime`, apparently, handles
# times duplicated by DST non-deterministically (cf.
# <https://git.io/vixsE>), but there's no right way to deal
# with those anyway, so...
timestamp = time.mktime(timestamp.timetuple())
elif not isinstance(timestamp, numbers.Number):
raise TypeError('Timestamp must be number or datetime.datetime')
timebits = time.localtime(timestamp)
try:
tzname = timebits.tm_zone
except AttributeError:
# This assumes that `time.tzname` is meaningful/useful.
tzname = time.tzname[timebits.tm_isdst > 0]
assert 1 <= timebits.tm_mon <= 12, 'invalid month'
assert 0 <= timebits.tm_wday <= 6, 'invalid day of week'
return '{wday} {mon} {t.tm_mday:02d}' \
' {t.tm_hour:02d}:{t.tm_min:02d}:{t.tm_sec:02d}' \
' {tz} {t.tm_year:04d}'.format(
t=timebits,
tz=tzname,
mon=MONTHS[timebits.tm_mon-1],
wday=DAYS_OF_WEEK[timebits.tm_wday]
)
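The format string at the end of `java_timestamp` can be exercised deterministically by feeding it a fixed UTC `struct_time` instead of the local clock. A sketch of just the formatting step (not library code):

```python
import time

DAYS = ['Mon', 'Tue', 'Wed', 'Thu', 'Fri', 'Sat', 'Sun']
MONTHS = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun',
          'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec']

def format_java_style(t, tzname):
    # Same layout as java.util.Date.toString(); t is a time.struct_time.
    return ('{wday} {mon} {t.tm_mday:02d}'
            ' {t.tm_hour:02d}:{t.tm_min:02d}:{t.tm_sec:02d}'
            ' {tz} {t.tm_year:04d}').format(
        t=t, tz=tzname, mon=MONTHS[t.tm_mon - 1], wday=DAYS[t.tm_wday])

# A fixed instant in UTC keeps the example deterministic.
print(format_java_style(time.gmtime(0), 'UTC'))
# -> Thu Jan 01 00:00:00 UTC 1970
```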
|
jwodder/javaproperties | javaproperties/writing.py | to_comment | python | def to_comment(comment):
return '#' + re.sub(r'[^\x00-\xFF]', _esc,
re.sub(r'\n(?![#!])', '\n#',
re.sub(r'\r\n?', '\n', comment))) | Convert a string to a ``.properties`` file comment. All non-Latin-1
characters in the string are escaped using ``\\uXXXX`` escapes (after
converting non-BMP characters to surrogate pairs), a ``#`` is prepended to
the string, any CR LF or CR line breaks in the string are converted to LF,
and a ``#`` is inserted after any line break not already followed by a
``#`` or ``!``. No trailing newline is added.
>>> to_comment('They say foo=bar,\\r\\nbut does bar=foo?')
'#They say foo=bar,\\n#but does bar=foo?'
:param comment: the string to convert to a comment
:type comment: text string
:rtype: text string | train | https://github.com/jwodder/javaproperties/blob/8b48f040305217ebeb80c98c4354691bbb01429b/javaproperties/writing.py#L79-L97 | null | # -*- coding: utf-8 -*-
from __future__ import print_function, unicode_literals
from datetime import datetime
import numbers
import re
import time
from six import StringIO
from .util import itemize
def dump(props, fp, separator='=', comments=None, timestamp=True,
sort_keys=False):
"""
Write a series of key-value pairs to a file in simple line-oriented
``.properties`` format.
:param props: A mapping or iterable of ``(key, value)`` pairs to write to
``fp``. All keys and values in ``props`` must be text strings. If
``sort_keys`` is `False`, the entries are output in iteration order.
:param fp: A file-like object to write the values of ``props`` to. It must
have been opened as a text file with a Latin-1-compatible encoding.
:param separator: The string to use for separating keys & values. Only
``" "``, ``"="``, and ``":"`` (possibly with added whitespace) should
ever be used as the separator.
:type separator: text string
:param comments: if non-`None`, ``comments`` will be written to ``fp`` as a
comment before any other content
:type comments: text string or `None`
:param timestamp: If neither `None` nor `False`, a timestamp in the form of
``Mon Sep 02 14:00:54 EDT 2016`` is written as a comment to ``fp``
after ``comments`` (if any) and before the key-value pairs. If
``timestamp`` is `True`, the current date & time is used. If it is a
number, it is converted from seconds since the epoch to local time. If
it is a `datetime.datetime` object, its value is used directly, with
naïve objects assumed to be in the local timezone.
:type timestamp: `None`, `bool`, number, or `datetime.datetime`
:param bool sort_keys: if true, the elements of ``props`` are sorted
lexicographically by key in the output
:return: `None`
"""
if comments is not None:
print(to_comment(comments), file=fp)
if timestamp is not None and timestamp is not False:
print(to_comment(java_timestamp(timestamp)), file=fp)
for k,v in itemize(props, sort_keys=sort_keys):
print(join_key_value(k, v, separator), file=fp)
def dumps(props, separator='=', comments=None, timestamp=True, sort_keys=False):
"""
Convert a series of key-value pairs to a text string in simple
line-oriented ``.properties`` format.
:param props: A mapping or iterable of ``(key, value)`` pairs to serialize.
All keys and values in ``props`` must be text strings. If
``sort_keys`` is `False`, the entries are output in iteration order.
:param separator: The string to use for separating keys & values. Only
``" "``, ``"="``, and ``":"`` (possibly with added whitespace) should
ever be used as the separator.
:type separator: text string
:param comments: if non-`None`, ``comments`` will be output as a comment
before any other content
:type comments: text string or `None`
:param timestamp: If neither `None` nor `False`, a timestamp in the form of
``Mon Sep 02 14:00:54 EDT 2016`` is output as a comment after
``comments`` (if any) and before the key-value pairs. If ``timestamp``
is `True`, the current date & time is used. If it is a number, it is
converted from seconds since the epoch to local time. If it is a
`datetime.datetime` object, its value is used directly, with naïve
objects assumed to be in the local timezone.
:type timestamp: `None`, `bool`, number, or `datetime.datetime`
:param bool sort_keys: if true, the elements of ``props`` are sorted
lexicographically by key in the output
:rtype: text string
"""
s = StringIO()
dump(props, s, separator=separator, comments=comments, timestamp=timestamp,
sort_keys=sort_keys)
return s.getvalue()
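For intuition about the output shape `dumps` produces, here is a deliberately minimal serializer that handles only ASCII keys and values (a sketch, far less complete than the real `dumps` — it skips timestamps, comments, and `\uXXXX` escapes):

```python
def dumps_sketch(props, separator='='):
    # Escape backslashes first, then separator characters; keys also
    # need their spaces escaped so the parser can find the separator.
    def esc(s, spaces=True):
        s = s.replace('\\', '\\\\').replace('=', r'\=').replace(':', r'\:')
        return s.replace(' ', r'\ ') if spaces else s
    lines = []
    for k, v in sorted(props.items()):
        lines.append(esc(k) + separator + esc(v, spaces=False))
    return ''.join(line + '\n' for line in lines)

print(dumps_sketch({'host': 'example.com', 'app name': 'demo'}))
# app\ name=demo
# host=example.com
```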
def join_key_value(key, value, separator='='):
r"""
Join a key and value together into a single line suitable for adding to a
simple line-oriented ``.properties`` file. No trailing newline is added.
>>> join_key_value('possible separators', '= : space')
'possible\\ separators=\\= \\: space'
:param key: the key
:type key: text string
:param value: the value
:type value: text string
:param separator: the string to use for separating the key & value. Only
``" "``, ``"="``, and ``":"`` (possibly with added whitespace) should
ever be used as the separator.
:type separator: text string
:rtype: text string
"""
# Escapes `key` and `value` the same way as java.util.Properties.store()
return escape(key) \
+ separator \
+ re.sub(r'^ +', lambda m: r'\ ' * m.end(), _base_escape(value))
_escapes = {
'\t': r'\t',
'\n': r'\n',
'\f': r'\f',
'\r': r'\r',
'!': r'\!',
'#': r'\#',
':': r'\:',
'=': r'\=',
'\\': r'\\',
}
def _esc(m):
c = m.group()
try:
return _escapes[c]
except KeyError:
c = ord(c)
if c > 0xFFFF:
# Does Python really not have a decent builtin way to calculate
# surrogate pairs?
assert c <= 0x10FFFF
c -= 0x10000
return '\\u{0:04x}\\u{1:04x}'.format(
0xD800 + (c >> 10),
0xDC00 + (c & 0x3FF)
)
else:
return '\\u{0:04x}'.format(c)
def _base_escape(field):
return re.sub(r'[^\x20-\x7E]|[\\#!=:]', _esc, field)
def escape(field):
"""
Escape a string so that it can be safely used as either a key or value in a
``.properties`` file. All non-ASCII characters, all nonprintable or space
characters, and the characters ``\\ # ! = :`` are all escaped using either
the single-character escapes recognized by `unescape` (when they exist) or
``\\uXXXX`` escapes (after converting non-BMP characters to surrogate
pairs).
:param field: the string to escape
:type field: text string
:rtype: text string
"""
return _base_escape(field).replace(' ', r'\ ')
DAYS_OF_WEEK = ['Mon', 'Tue', 'Wed', 'Thu', 'Fri', 'Sat', 'Sun']
MONTHS = [
'Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun',
'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec',
]
def java_timestamp(timestamp=True):
"""
.. versionadded:: 0.2.0
Returns a timestamp in the format produced by |date_tostring|_, e.g.::
Mon Sep 02 14:00:54 EDT 2016
If ``timestamp`` is `True` (the default), the current date & time is
returned.
If ``timestamp`` is `None` or `False`, an empty string is returned.
If ``timestamp`` is a number, it is converted from seconds since the epoch
to local time.
If ``timestamp`` is a `datetime.datetime` object, its value is used
directly, with naïve objects assumed to be in the local timezone.
The timestamp is always constructed using the C locale.
:param timestamp: the date & time to display
:type timestamp: `None`, `bool`, number, or `datetime.datetime`
:rtype: text string
.. |date_tostring| replace:: Java 8's ``Date.toString()``
.. _date_tostring: https://docs.oracle.com/javase/8/docs/api/java/util/Date.html#toString--
"""
if timestamp is None or timestamp is False:
return ''
if isinstance(timestamp, datetime) and timestamp.tzinfo is not None:
timebits = timestamp.timetuple()
# Assumes `timestamp.tzinfo.tzname()` is meaningful/useful
tzname = timestamp.tzname()
else:
if timestamp is True:
timestamp = None
elif isinstance(timestamp, datetime):
try:
# Use `datetime.timestamp()` if it's available, as it (unlike
# `datetime.timetuple()`) takes `fold` into account for naïve
# datetimes
timestamp = timestamp.timestamp()
except AttributeError: # Pre-Python 3.3
# Mapping `timetuple` through `mktime` and `localtime` is
# necessary for determining whether DST is in effect (which, in
# turn, is necessary for determining which timezone name to
# use). The only downside to using standard functions instead
# of `python-dateutil` is that `mktime`, apparently, handles
# times duplicated by DST non-deterministically (cf.
# <https://git.io/vixsE>), but there's no right way to deal
# with those anyway, so...
timestamp = time.mktime(timestamp.timetuple())
elif not isinstance(timestamp, numbers.Number):
raise TypeError('Timestamp must be number or datetime.datetime')
timebits = time.localtime(timestamp)
try:
tzname = timebits.tm_zone
except AttributeError:
# This assumes that `time.tzname` is meaningful/useful.
tzname = time.tzname[timebits.tm_isdst > 0]
assert 1 <= timebits.tm_mon <= 12, 'invalid month'
assert 0 <= timebits.tm_wday <= 6, 'invalid day of week'
return '{wday} {mon} {t.tm_mday:02d}' \
' {t.tm_hour:02d}:{t.tm_min:02d}:{t.tm_sec:02d}' \
' {tz} {t.tm_year:04d}'.format(
t=timebits,
tz=tzname,
mon=MONTHS[timebits.tm_mon-1],
wday=DAYS_OF_WEEK[timebits.tm_wday]
)
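The `dump`/`dumps` pair above writes any comment lines first, then one key-value pair per line, buffering through a `StringIO` when a string result is wanted. The overall file shape can be sketched with a plain `StringIO` (the contents here are illustrative, not produced by the library):

```python
from io import StringIO

buf = StringIO()
# Comment lines come first, each prefixed with '#'.
print('#Generated for the demo', file=buf)
# Then one key=value pair per line, no trailing spaces.
for key, value in [('lang', 'en'), ('retries', '3')]:
    print(key + '=' + value, file=buf)
print(buf.getvalue(), end='')
```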
|
jwodder/javaproperties | javaproperties/writing.py | join_key_value | python | def join_key_value(key, value, separator='='):
r"""
Join a key and value together into a single line suitable for adding to a
simple line-oriented ``.properties`` file. No trailing newline is added.
>>> join_key_value('possible separators', '= : space')
'possible\\ separators=\\= \\: space'
:param key: the key
:type key: text string
:param value: the value
:type value: text string
:param separator: the string to use for separating the key & value. Only
``" "``, ``"="``, and ``":"`` (possibly with added whitespace) should
ever be used as the separator.
:type separator: text string
:rtype: text string
"""
# Escapes `key` and `value` the same way as java.util.Properties.store()
return escape(key) \
+ separator \
+ re.sub(r'^ +', lambda m: r'\ ' * m.end(), _base_escape(value)) | r"""
Join a key and value together into a single line suitable for adding to a
simple line-oriented ``.properties`` file. No trailing newline is added.
>>> join_key_value('possible separators', '= : space')
'possible\\ separators=\\= \\: space'
:param key: the key
:type key: text string
:param value: the value
:type value: text string
:param separator: the string to use for separating the key & value. Only
``" "``, ``"="``, and ``":"`` (possibly with added whitespace) should
ever be used as the separator.
:type separator: text string
:rtype: text string | train | https://github.com/jwodder/javaproperties/blob/8b48f040305217ebeb80c98c4354691bbb01429b/javaproperties/writing.py#L99-L120 | [
"def escape(field):\n \"\"\"\n Escape a string so that it can be safely used as either a key or value in a\n ``.properties`` file. All non-ASCII characters, all nonprintable or space\n characters, and the characters ``\\\\ # ! = :`` are all escaped using either\n the single-character escapes recogni... | # -*- coding: utf-8 -*-
from __future__ import print_function, unicode_literals
from datetime import datetime
import numbers
import re
import time
from six import StringIO
from .util import itemize
def dump(props, fp, separator='=', comments=None, timestamp=True,
sort_keys=False):
"""
Write a series of key-value pairs to a file in simple line-oriented
``.properties`` format.
:param props: A mapping or iterable of ``(key, value)`` pairs to write to
``fp``. All keys and values in ``props`` must be text strings. If
``sort_keys`` is `False`, the entries are output in iteration order.
:param fp: A file-like object to write the values of ``props`` to. It must
have been opened as a text file with a Latin-1-compatible encoding.
:param separator: The string to use for separating keys & values. Only
``" "``, ``"="``, and ``":"`` (possibly with added whitespace) should
ever be used as the separator.
:type separator: text string
:param comments: if non-`None`, ``comments`` will be written to ``fp`` as a
comment before any other content
:type comments: text string or `None`
:param timestamp: If neither `None` nor `False`, a timestamp in the form of
``Mon Sep 02 14:00:54 EDT 2016`` is written as a comment to ``fp``
after ``comments`` (if any) and before the key-value pairs. If
``timestamp`` is `True`, the current date & time is used. If it is a
number, it is converted from seconds since the epoch to local time. If
it is a `datetime.datetime` object, its value is used directly, with
naïve objects assumed to be in the local timezone.
:type timestamp: `None`, `bool`, number, or `datetime.datetime`
:param bool sort_keys: if true, the elements of ``props`` are sorted
lexicographically by key in the output
:return: `None`
"""
if comments is not None:
print(to_comment(comments), file=fp)
if timestamp is not None and timestamp is not False:
print(to_comment(java_timestamp(timestamp)), file=fp)
for k,v in itemize(props, sort_keys=sort_keys):
print(join_key_value(k, v, separator), file=fp)
def dumps(props, separator='=', comments=None, timestamp=True, sort_keys=False):
"""
Convert a series of key-value pairs to a text string in simple
line-oriented ``.properties`` format.
:param props: A mapping or iterable of ``(key, value)`` pairs to serialize.
All keys and values in ``props`` must be text strings. If
``sort_keys`` is `False`, the entries are output in iteration order.
:param separator: The string to use for separating keys & values. Only
``" "``, ``"="``, and ``":"`` (possibly with added whitespace) should
ever be used as the separator.
:type separator: text string
:param comments: if non-`None`, ``comments`` will be output as a comment
before any other content
:type comments: text string or `None`
:param timestamp: If neither `None` nor `False`, a timestamp in the form of
``Mon Sep 02 14:00:54 EDT 2016`` is output as a comment after
``comments`` (if any) and before the key-value pairs. If ``timestamp``
is `True`, the current date & time is used. If it is a number, it is
converted from seconds since the epoch to local time. If it is a
`datetime.datetime` object, its value is used directly, with naïve
objects assumed to be in the local timezone.
:type timestamp: `None`, `bool`, number, or `datetime.datetime`
:param bool sort_keys: if true, the elements of ``props`` are sorted
lexicographically by key in the output
:rtype: text string
"""
s = StringIO()
dump(props, s, separator=separator, comments=comments, timestamp=timestamp,
sort_keys=sort_keys)
return s.getvalue()
def to_comment(comment):
"""
Convert a string to a ``.properties`` file comment. All non-Latin-1
characters in the string are escaped using ``\\uXXXX`` escapes (after
converting non-BMP characters to surrogate pairs), a ``#`` is prepended to
the string, any CR LF or CR line breaks in the string are converted to LF,
and a ``#`` is inserted after any line break not already followed by a
``#`` or ``!``. No trailing newline is added.
>>> to_comment('They say foo=bar,\\r\\nbut does bar=foo?')
'#They say foo=bar,\\n#but does bar=foo?'
:param comment: the string to convert to a comment
:type comment: text string
:rtype: text string
"""
return '#' + re.sub(r'[^\x00-\xFF]', _esc,
re.sub(r'\n(?![#!])', '\n#',
re.sub(r'\r\n?', '\n', comment)))
_escapes = {
'\t': r'\t',
'\n': r'\n',
'\f': r'\f',
'\r': r'\r',
'!': r'\!',
'#': r'\#',
':': r'\:',
'=': r'\=',
'\\': r'\\',
}
def _esc(m):
c = m.group()
try:
return _escapes[c]
except KeyError:
c = ord(c)
if c > 0xFFFF:
# Does Python really not have a decent builtin way to calculate
# surrogate pairs?
assert c <= 0x10FFFF
c -= 0x10000
return '\\u{0:04x}\\u{1:04x}'.format(
0xD800 + (c >> 10),
0xDC00 + (c & 0x3FF)
)
else:
return '\\u{0:04x}'.format(c)
def _base_escape(field):
return re.sub(r'[^\x20-\x7E]|[\\#!=:]', _esc, field)
def escape(field):
"""
Escape a string so that it can be safely used as either a key or value in a
``.properties`` file. All non-ASCII characters, all nonprintable or space
characters, and the characters ``\\ # ! = :`` are all escaped using either
the single-character escapes recognized by `unescape` (when they exist) or
``\\uXXXX`` escapes (after converting non-BMP characters to surrogate
pairs).
:param field: the string to escape
:type field: text string
:rtype: text string
"""
return _base_escape(field).replace(' ', r'\ ')
DAYS_OF_WEEK = ['Mon', 'Tue', 'Wed', 'Thu', 'Fri', 'Sat', 'Sun']
MONTHS = [
'Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun',
'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec',
]
def java_timestamp(timestamp=True):
"""
.. versionadded:: 0.2.0
Returns a timestamp in the format produced by |date_tostring|_, e.g.::
Mon Sep 02 14:00:54 EDT 2016
If ``timestamp`` is `True` (the default), the current date & time is
returned.
If ``timestamp`` is `None` or `False`, an empty string is returned.
If ``timestamp`` is a number, it is converted from seconds since the epoch
to local time.
If ``timestamp`` is a `datetime.datetime` object, its value is used
directly, with naïve objects assumed to be in the local timezone.
The timestamp is always constructed using the C locale.
:param timestamp: the date & time to display
:type timestamp: `None`, `bool`, number, or `datetime.datetime`
:rtype: text string
.. |date_tostring| replace:: Java 8's ``Date.toString()``
.. _date_tostring: https://docs.oracle.com/javase/8/docs/api/java/util/Date.html#toString--
"""
if timestamp is None or timestamp is False:
return ''
if isinstance(timestamp, datetime) and timestamp.tzinfo is not None:
timebits = timestamp.timetuple()
# Assumes `timestamp.tzinfo.tzname()` is meaningful/useful
tzname = timestamp.tzname()
else:
if timestamp is True:
timestamp = None
elif isinstance(timestamp, datetime):
try:
# Use `datetime.timestamp()` if it's available, as it (unlike
# `datetime.timetuple()`) takes `fold` into account for naïve
# datetimes
timestamp = timestamp.timestamp()
except AttributeError: # Pre-Python 3.3
# Mapping `timetuple` through `mktime` and `localtime` is
# necessary for determining whether DST is in effect (which, in
# turn, is necessary for determining which timezone name to
# use). The only downside to using standard functions instead
# of `python-dateutil` is that `mktime`, apparently, handles
# times duplicated by DST non-deterministically (cf.
# <https://git.io/vixsE>), but there's no right way to deal
# with those anyway, so...
timestamp = time.mktime(timestamp.timetuple())
elif not isinstance(timestamp, numbers.Number):
raise TypeError('Timestamp must be number or datetime.datetime')
timebits = time.localtime(timestamp)
try:
tzname = timebits.tm_zone
except AttributeError:
# This assumes that `time.tzname` is meaningful/useful.
tzname = time.tzname[timebits.tm_isdst > 0]
assert 1 <= timebits.tm_mon <= 12, 'invalid month'
assert 0 <= timebits.tm_wday <= 6, 'invalid day of week'
return '{wday} {mon} {t.tm_mday:02d}' \
' {t.tm_hour:02d}:{t.tm_min:02d}:{t.tm_sec:02d}' \
' {tz} {t.tm_year:04d}'.format(
t=timebits,
tz=tzname,
mon=MONTHS[timebits.tm_mon-1],
wday=DAYS_OF_WEEK[timebits.tm_wday]
)
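The pre-Python-3.3 fallback above converts a naive datetime to epoch seconds with `time.mktime(timestamp.timetuple())`. A quick check of that round trip (reliable for times away from DST transitions, per the caveat in the comment above):

```python
import time
from datetime import datetime

# mktime interprets the struct_time in the local timezone, and
# fromtimestamp converts back to local time, so away from DST
# transitions the naive datetime survives the round trip.
dt = datetime(2016, 9, 2, 14, 0, 54)
epoch = time.mktime(dt.timetuple())
roundtrip = datetime.fromtimestamp(epoch)
print(roundtrip == dt)
```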
|
jwodder/javaproperties | javaproperties/writing.py | java_timestamp | python | def java_timestamp(timestamp=True):
if timestamp is None or timestamp is False:
return ''
if isinstance(timestamp, datetime) and timestamp.tzinfo is not None:
timebits = timestamp.timetuple()
# Assumes `timestamp.tzinfo.tzname()` is meaningful/useful
tzname = timestamp.tzname()
else:
if timestamp is True:
timestamp = None
elif isinstance(timestamp, datetime):
try:
# Use `datetime.timestamp()` if it's available, as it (unlike
# `datetime.timetuple()`) takes `fold` into account for naïve
# datetimes
timestamp = timestamp.timestamp()
except AttributeError: # Pre-Python 3.3
# Mapping `timetuple` through `mktime` and `localtime` is
# necessary for determining whether DST is in effect (which, in
# turn, is necessary for determining which timezone name to
# use). The only downside to using standard functions instead
# of `python-dateutil` is that `mktime`, apparently, handles
# times duplicated by DST non-deterministically (cf.
# <https://git.io/vixsE>), but there's no right way to deal
# with those anyway, so...
timestamp = time.mktime(timestamp.timetuple())
elif not isinstance(timestamp, numbers.Number):
raise TypeError('Timestamp must be number or datetime.datetime')
timebits = time.localtime(timestamp)
try:
tzname = timebits.tm_zone
except AttributeError:
# This assumes that `time.tzname` is meaningful/useful.
tzname = time.tzname[timebits.tm_isdst > 0]
assert 1 <= timebits.tm_mon <= 12, 'invalid month'
assert 0 <= timebits.tm_wday <= 6, 'invalid day of week'
return '{wday} {mon} {t.tm_mday:02d}' \
' {t.tm_hour:02d}:{t.tm_min:02d}:{t.tm_sec:02d}' \
' {tz} {t.tm_year:04d}'.format(
t=timebits,
tz=tzname,
mon=MONTHS[timebits.tm_mon-1],
wday=DAYS_OF_WEEK[timebits.tm_wday]
) | .. versionadded:: 0.2.0
Returns a timestamp in the format produced by |date_tostring|_, e.g.::
Mon Sep 02 14:00:54 EDT 2016
If ``timestamp`` is `True` (the default), the current date & time is
returned.
If ``timestamp`` is `None` or `False`, an empty string is returned.
If ``timestamp`` is a number, it is converted from seconds since the epoch
to local time.
If ``timestamp`` is a `datetime.datetime` object, its value is used
directly, with naïve objects assumed to be in the local timezone.
The timestamp is always constructed using the C locale.
:param timestamp: the date & time to display
:type timestamp: `None`, `bool`, number, or `datetime.datetime`
:rtype: text string
.. |date_tostring| replace:: Java 8's ``Date.toString()``
.. _date_tostring: https://docs.oracle.com/javase/8/docs/api/java/util/Date.html#toString-- | train | https://github.com/jwodder/javaproperties/blob/8b48f040305217ebeb80c98c4354691bbb01429b/javaproperties/writing.py#L177-L247 | null | # -*- coding: utf-8 -*-
from __future__ import print_function, unicode_literals
from datetime import datetime
import numbers
import re
import time
from six import StringIO
from .util import itemize
def dump(props, fp, separator='=', comments=None, timestamp=True,
sort_keys=False):
"""
Write a series of key-value pairs to a file in simple line-oriented
``.properties`` format.
:param props: A mapping or iterable of ``(key, value)`` pairs to write to
``fp``. All keys and values in ``props`` must be text strings. If
``sort_keys`` is `False`, the entries are output in iteration order.
:param fp: A file-like object to write the values of ``props`` to. It must
have been opened as a text file with a Latin-1-compatible encoding.
:param separator: The string to use for separating keys & values. Only
``" "``, ``"="``, and ``":"`` (possibly with added whitespace) should
ever be used as the separator.
:type separator: text string
:param comments: if non-`None`, ``comments`` will be written to ``fp`` as a
comment before any other content
:type comments: text string or `None`
:param timestamp: If neither `None` nor `False`, a timestamp in the form of
``Mon Sep 02 14:00:54 EDT 2016`` is written as a comment to ``fp``
after ``comments`` (if any) and before the key-value pairs. If
``timestamp`` is `True`, the current date & time is used. If it is a
number, it is converted from seconds since the epoch to local time. If
it is a `datetime.datetime` object, its value is used directly, with
naïve objects assumed to be in the local timezone.
:type timestamp: `None`, `bool`, number, or `datetime.datetime`
:param bool sort_keys: if true, the elements of ``props`` are sorted
lexicographically by key in the output
:return: `None`
"""
if comments is not None:
print(to_comment(comments), file=fp)
if timestamp is not None and timestamp is not False:
print(to_comment(java_timestamp(timestamp)), file=fp)
for k,v in itemize(props, sort_keys=sort_keys):
print(join_key_value(k, v, separator), file=fp)
def dumps(props, separator='=', comments=None, timestamp=True, sort_keys=False):
"""
Convert a series of key-value pairs to a text string in simple
line-oriented ``.properties`` format.
:param props: A mapping or iterable of ``(key, value)`` pairs to serialize.
All keys and values in ``props`` must be text strings. If
``sort_keys`` is `False`, the entries are output in iteration order.
:param separator: The string to use for separating keys & values. Only
``" "``, ``"="``, and ``":"`` (possibly with added whitespace) should
ever be used as the separator.
:type separator: text string
:param comments: if non-`None`, ``comments`` will be output as a comment
before any other content
:type comments: text string or `None`
:param timestamp: If neither `None` nor `False`, a timestamp in the form of
``Mon Sep 02 14:00:54 EDT 2016`` is output as a comment after
``comments`` (if any) and before the key-value pairs. If ``timestamp``
is `True`, the current date & time is used. If it is a number, it is
converted from seconds since the epoch to local time. If it is a
`datetime.datetime` object, its value is used directly, with naïve
objects assumed to be in the local timezone.
:type timestamp: `None`, `bool`, number, or `datetime.datetime`
:param bool sort_keys: if true, the elements of ``props`` are sorted
lexicographically by key in the output
:rtype: text string
"""
s = StringIO()
dump(props, s, separator=separator, comments=comments, timestamp=timestamp,
sort_keys=sort_keys)
return s.getvalue()
def to_comment(comment):
"""
Convert a string to a ``.properties`` file comment. All non-Latin-1
characters in the string are escaped using ``\\uXXXX`` escapes (after
converting non-BMP characters to surrogate pairs), a ``#`` is prepended to
the string, any CR LF or CR line breaks in the string are converted to LF,
and a ``#`` is inserted after any line break not already followed by a
``#`` or ``!``. No trailing newline is added.
>>> to_comment('They say foo=bar,\\r\\nbut does bar=foo?')
'#They say foo=bar,\\n#but does bar=foo?'
:param comment: the string to convert to a comment
:type comment: text string
:rtype: text string
"""
return '#' + re.sub(r'[^\x00-\xFF]', _esc,
re.sub(r'\n(?![#!])', '\n#',
re.sub(r'\r\n?', '\n', comment)))
def join_key_value(key, value, separator='='):
r"""
Join a key and value together into a single line suitable for adding to a
simple line-oriented ``.properties`` file. No trailing newline is added.
>>> join_key_value('possible separators', '= : space')
'possible\\ separators=\\= \\: space'
:param key: the key
:type key: text string
:param value: the value
:type value: text string
:param separator: the string to use for separating the key & value. Only
``" "``, ``"="``, and ``":"`` (possibly with added whitespace) should
ever be used as the separator.
:type separator: text string
:rtype: text string
"""
# Escapes `key` and `value` the same way as java.util.Properties.store()
return escape(key) \
+ separator \
+ re.sub(r'^ +', lambda m: r'\ ' * m.end(), _base_escape(value))
_escapes = {
'\t': r'\t',
'\n': r'\n',
'\f': r'\f',
'\r': r'\r',
'!': r'\!',
'#': r'\#',
':': r'\:',
'=': r'\=',
'\\': r'\\',
}
def _esc(m):
c = m.group()
try:
return _escapes[c]
except KeyError:
c = ord(c)
if c > 0xFFFF:
# Does Python really not have a decent builtin way to calculate
# surrogate pairs?
assert c <= 0x10FFFF
c -= 0x10000
return '\\u{0:04x}\\u{1:04x}'.format(
0xD800 + (c >> 10),
0xDC00 + (c & 0x3FF)
)
else:
return '\\u{0:04x}'.format(c)
def _base_escape(field):
return re.sub(r'[^\x20-\x7E]|[\\#!=:]', _esc, field)
def escape(field):
"""
Escape a string so that it can be safely used as either a key or value in a
``.properties`` file. All non-ASCII characters, all nonprintable or space
characters, and the characters ``\\ # ! = :`` are all escaped using either
the single-character escapes recognized by `unescape` (when they exist) or
``\\uXXXX`` escapes (after converting non-BMP characters to surrogate
pairs).
:param field: the string to escape
:type field: text string
:rtype: text string
"""
return _base_escape(field).replace(' ', r'\ ')
DAYS_OF_WEEK = ['Mon', 'Tue', 'Wed', 'Thu', 'Fri', 'Sat', 'Sun']
MONTHS = [
'Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun',
'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec',
]
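When the `struct_time` lacks `tm_zone`, `java_timestamp` falls back to indexing `time.tzname` by whether DST is in effect. That lookup, isolated:

```python
import time

# time.tzname is a (standard_name, dst_name) pair; tm_isdst > 0 picks
# the DST name when daylight saving is currently active.
t = time.localtime()
name = time.tzname[t.tm_isdst > 0]
print(isinstance(name, str))
```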
|
jwodder/javaproperties | javaproperties/propfile.py | PropertiesFile._check | python | def _check(self):
for k,ix in six.iteritems(self._indices):
assert k is not None, 'null key'
assert ix, 'Key does not map to any indices'
assert ix == sorted(ix), "Key's indices are not in order"
for i in ix:
assert i in self._lines, 'Key index does not map to line'
assert self._lines[i].key is not None, 'Key maps to comment'
assert self._lines[i].key == k, 'Key does not map to itself'
assert self._lines[i].value is not None, 'Key has null value'
prev = None
for i, line in six.iteritems(self._lines):
assert prev is None or prev < i, 'Line indices out of order'
prev = i
if line.key is None:
assert line.value is None, 'Comment/blank has value'
assert line.source is not None, 'Comment source not stored'
assert loads(line.source) == {}, 'Comment source is not comment'
else:
assert line.value is not None, 'Key has null value'
if line.source is not None:
assert loads(line.source) == {line.key: line.value}, \
'Key source does not deserialize to itself'
assert line.key in self._indices, 'Key is missing from map'
assert i in self._indices[line.key], \
'Key does not map to itself' | Assert the internal consistency of the instance's data structures.
This method is for debugging only. | train | https://github.com/jwodder/javaproperties/blob/8b48f040305217ebeb80c98c4354691bbb01429b/javaproperties/propfile.py#L64-L93 | [
"def loads(s, object_pairs_hook=dict):\n \"\"\"\n Parse the contents of the string ``s`` as a simple line-oriented\n ``.properties`` file and return a `dict` of the key-value pairs.\n\n ``s`` may be either a text string or bytes string. If it is a bytes\n string, its contents are decoded as Latin-1.... | class PropertiesFile(MutableMapping):
"""
.. versionadded:: 0.3.0
A custom mapping class for reading from, editing, and writing to a
``.properties`` file while preserving comments & whitespace in the original
input.
A `PropertiesFile` instance can be constructed from another mapping and/or
iterable of pairs, after which it will act like an
`~collections.OrderedDict`. Alternatively, an instance can be constructed
from a file or string with `PropertiesFile.load()` or
`PropertiesFile.loads()`, and the resulting instance will remember the
formatting of its input and retain that formatting when written back to a
file or string with the `~PropertiesFile.dump()` or
`~PropertiesFile.dumps()` method. The formatting information attached to
an instance ``pf`` can be forgotten by constructing another mapping from it
via ``dict(pf)``, ``OrderedDict(pf)``, or even ``PropertiesFile(pf)`` (Use
the `copy()` method if you want to create another `PropertiesFile` instance
with the same data & formatting).
When not reading or writing, `PropertiesFile` behaves like a normal
`~collections.abc.MutableMapping` class (i.e., you can do ``props[key] =
value`` and so forth), except that (a) like `~collections.OrderedDict`, key
insertion order is remembered and is used when iterating & dumping (and
`reversed` is supported), and (b) like `Properties`, it may only be used to
store strings and will raise a `TypeError` if passed a non-string object as
key or value.
Two `PropertiesFile` instances compare equal iff both their key-value pairs
and comment & whitespace lines are equal and in the same order. When
comparing a `PropertiesFile` to any other type of mapping, only the
key-value pairs are considered, and order is ignored.
`PropertiesFile` currently only supports reading & writing the simple
line-oriented format, not XML.
"""
def __init__(self, mapping=None, **kwargs):
#: mapping from keys to list of line numbers
self._indices = OrderedDict()
#: mapping from line numbers to (key, value, source) tuples
self._lines = OrderedDict()
if mapping is not None:
self.update(mapping)
self.update(kwargs)
def __getitem__(self, key):
if not isinstance(key, six.string_types):
raise TypeError(_type_err)
return self._lines[self._indices[key][-1]].value
def __setitem__(self, key, value):
if not isinstance(key, six.string_types) or \
not isinstance(value, six.string_types):
raise TypeError(_type_err)
try:
ixes = self._indices[key]
except KeyError:
try:
lasti = next(reversed(self._lines))
except StopIteration:
ix = 0
else:
ix = lasti + 1
# We're adding a line to the end of the file, so make sure the
# line before it ends with a newline and (if it's not a
# comment) doesn't end with a trailing line continuation.
lastline = self._lines[lasti]
if lastline.source is not None:
lastsrc = lastline.source
if lastline.key is not None:
lastsrc=re.sub(r'(?<!\\)((?:\\\\)*)\\$', r'\1', lastsrc)
if not lastsrc.endswith(('\r', '\n')):
lastsrc += '\n'
self._lines[lasti] = lastline._replace(source=lastsrc)
else:
# Update the first occurrence of the key and discard the rest.
# This way, the order in which the keys are listed in the file and
# dict will be preserved.
ix = ixes.pop(0)
for i in ixes:
del self._lines[i]
self._indices[key] = [ix]
self._lines[ix] = PropertyLine(key, value, None)
def __delitem__(self, key):
if not isinstance(key, six.string_types):
raise TypeError(_type_err)
for i in self._indices.pop(key):
del self._lines[i]
def __iter__(self):
return iter(self._indices)
def __reversed__(self):
return reversed(self._indices)
def __len__(self):
return len(self._indices)
def _comparable(self):
return [
(None, line.source) if line.key is None else (line.key, line.value)
for i, line in six.iteritems(self._lines)
### TODO: Also include non-final repeated keys???
if line.key is None or self._indices[line.key][-1] == i
]
def __eq__(self, other):
if isinstance(other, PropertiesFile):
return self._comparable() == other._comparable()
### TODO: Special-case OrderedDict?
elif isinstance(other, Mapping):
return dict(self) == other
else:
return NotImplemented
def __ne__(self, other):
return not (self == other)
@classmethod
def load(cls, fp):
"""
Parse the contents of the `~io.IOBase.readline`-supporting file-like
object ``fp`` as a simple line-oriented ``.properties`` file and return
a `PropertiesFile` instance.
``fp`` may be either a text or binary filehandle, with or without
universal newlines enabled. If it is a binary filehandle, its contents
are decoded as Latin-1.
.. versionchanged:: 0.5.0
Invalid ``\\uXXXX`` escape sequences will now cause an
`InvalidUEscapeError` to be raised
:param fp: the file from which to read the ``.properties`` document
:type fp: file-like object
:rtype: PropertiesFile
:raises InvalidUEscapeError: if an invalid ``\\uXXXX`` escape sequence
occurs in the input
"""
obj = cls()
for i, (k, v, src) in enumerate(parse(fp)):
if k is not None:
obj._indices.setdefault(k, []).append(i)
obj._lines[i] = PropertyLine(k, v, src)
return obj
@classmethod
def loads(cls, s):
"""
Parse the contents of the string ``s`` as a simple line-oriented
``.properties`` file and return a `PropertiesFile` instance.
``s`` may be either a text string or bytes string. If it is a bytes
string, its contents are decoded as Latin-1.
.. versionchanged:: 0.5.0
Invalid ``\\uXXXX`` escape sequences will now cause an
`InvalidUEscapeError` to be raised
:param string s: the string from which to read the ``.properties``
document
:rtype: PropertiesFile
:raises InvalidUEscapeError: if an invalid ``\\uXXXX`` escape sequence
occurs in the input
"""
if isinstance(s, six.binary_type):
fp = six.BytesIO(s)
else:
fp = six.StringIO(s)
return cls.load(fp)
def dump(self, fp, separator='='):
"""
Write the mapping to a file in simple line-oriented ``.properties``
format.
If the instance was originally created from a file or string with
`PropertiesFile.load()` or `PropertiesFile.loads()`, then the output
will include the comments and whitespace from the original input, and
any keys that haven't been deleted or reassigned will retain their
original formatting and multiplicity. Key-value pairs that have been
modified or added to the mapping will be reformatted with
`join_key_value()` using the given separator. All key-value pairs are
output in the order they were defined, with new keys added to the end.
.. note::
Serializing a `PropertiesFile` instance with the :func:`dump()`
function instead will cause all formatting information to be
ignored, as :func:`dump()` will treat the instance like a normal
mapping.
:param fp: A file-like object to write the mapping to. It must have
been opened as a text file with a Latin-1-compatible encoding.
:param separator: The string to use for separating new or modified keys
& values. Only ``" "``, ``"="``, and ``":"`` (possibly with added
whitespace) should ever be used as the separator.
:type separator: text string
:return: `None`
"""
### TODO: Support setting the timestamp
for line in six.itervalues(self._lines):
if line.source is None:
print(join_key_value(line.key, line.value, separator), file=fp)
else:
fp.write(line.source)
def dumps(self, separator='='):
"""
Convert the mapping to a text string in simple line-oriented
``.properties`` format.
If the instance was originally created from a file or string with
`PropertiesFile.load()` or `PropertiesFile.loads()`, then the output
will include the comments and whitespace from the original input, and
any keys that haven't been deleted or reassigned will retain their
original formatting and multiplicity. Key-value pairs that have been
modified or added to the mapping will be reformatted with
`join_key_value()` using the given separator. All key-value pairs are
output in the order they were defined, with new keys added to the end.
.. note::
Serializing a `PropertiesFile` instance with the :func:`dumps()`
function instead will cause all formatting information to be
ignored, as :func:`dumps()` will treat the instance like a normal
mapping.
:param separator: The string to use for separating new or modified keys
& values. Only ``" "``, ``"="``, and ``":"`` (possibly with added
whitespace) should ever be used as the separator.
:type separator: text string
:rtype: text string
"""
s = six.StringIO()
self.dump(s, separator=separator)
return s.getvalue()
def copy(self):
""" Create a copy of the mapping, including formatting information """
dup = type(self)()
dup._indices = OrderedDict(
(k, list(v)) for k,v in six.iteritems(self._indices)
)
dup._lines = self._lines.copy()
return dup
|
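The comment- and format-preserving round trip described in the `PropertiesFile` docstring above can be sketched with a deliberately simplified stand-in. `TinyPropsFile` is hypothetical: unlike the real class it only understands `=` separators and `#`/`!` comments, and it skips escaping entirely.

```python
class TinyPropsFile:
    """Minimal sketch of the format-preserving idea (not the real API)."""

    def __init__(self):
        # Each entry is (key_or_None, value_or_None, original_source_or_None).
        self.lines = []

    @classmethod
    def loads(cls, s):
        obj = cls()
        for raw in s.splitlines(True):
            stripped = raw.strip()
            if not stripped or stripped[0] in '#!':
                obj.lines.append((None, None, raw))   # comment/blank: keep verbatim
            else:
                k, _, v = stripped.partition('=')
                obj.lines.append((k.strip(), v.strip(), raw))
        return obj

    def __setitem__(self, key, value):
        for i, (k, _v, _src) in enumerate(self.lines):
            if k == key:
                # Reassigned keys lose their stored formatting, like the real class.
                self.lines[i] = (key, value, None)
                return
        self.lines.append((key, value, None))

    def dumps(self, separator='='):
        return ''.join(
            src if src is not None else k + separator + v + '\n'
            for k, v, src in self.lines
        )
```

A round trip keeps the comment verbatim while the modified key is reformatted with the separator:

```python
pf = TinyPropsFile.loads("# app config\nkey = value\n")
pf["key"] = "new"
print(pf.dumps())  # "# app config\nkey=new\n"
```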
jwodder/javaproperties | javaproperties/propfile.py | PropertiesFile.load | python | def load(cls, fp):
obj = cls()
for i, (k, v, src) in enumerate(parse(fp)):
if k is not None:
obj._indices.setdefault(k, []).append(i)
obj._lines[i] = PropertyLine(k, v, src)
return obj | Parse the contents of the `~io.IOBase.readline`-supporting file-like
object ``fp`` as a simple line-oriented ``.properties`` file and return
a `PropertiesFile` instance.
``fp`` may be either a text or binary filehandle, with or without
universal newlines enabled. If it is a binary filehandle, its contents
are decoded as Latin-1.
.. versionchanged:: 0.5.0
Invalid ``\\uXXXX`` escape sequences will now cause an
`InvalidUEscapeError` to be raised
:param fp: the file from which to read the ``.properties`` document
:type fp: file-like object
:rtype: PropertiesFile
:raises InvalidUEscapeError: if an invalid ``\\uXXXX`` escape sequence
occurs in the input | train | https://github.com/jwodder/javaproperties/blob/8b48f040305217ebeb80c98c4354691bbb01429b/javaproperties/propfile.py#L170-L195 | [
"def parse(fp):\n \"\"\"\n Parse the contents of the `~io.IOBase.readline`-supporting file-like object\n ``fp`` as a simple line-oriented ``.properties`` file and return a\n generator of ``(key, value, original_lines)`` triples for every entry in\n ``fp`` (including duplicate keys) in order of occurr... | class PropertiesFile(MutableMapping):
"""
.. versionadded:: 0.3.0
A custom mapping class for reading from, editing, and writing to a
``.properties`` file while preserving comments & whitespace in the original
input.
A `PropertiesFile` instance can be constructed from another mapping and/or
iterable of pairs, after which it will act like an
`~collections.OrderedDict`. Alternatively, an instance can be constructed
from a file or string with `PropertiesFile.load()` or
`PropertiesFile.loads()`, and the resulting instance will remember the
formatting of its input and retain that formatting when written back to a
file or string with the `~PropertiesFile.dump()` or
`~PropertiesFile.dumps()` method. The formatting information attached to
an instance ``pf`` can be forgotten by constructing another mapping from it
via ``dict(pf)``, ``OrderedDict(pf)``, or even ``PropertiesFile(pf)`` (Use
the `copy()` method if you want to create another `PropertiesFile` instance
with the same data & formatting).
When not reading or writing, `PropertiesFile` behaves like a normal
`~collections.abc.MutableMapping` class (i.e., you can do ``props[key] =
value`` and so forth), except that (a) like `~collections.OrderedDict`, key
insertion order is remembered and is used when iterating & dumping (and
`reversed` is supported), and (b) like `Properties`, it may only be used to
store strings and will raise a `TypeError` if passed a non-string object as
key or value.
Two `PropertiesFile` instances compare equal iff both their key-value pairs
and comment & whitespace lines are equal and in the same order. When
comparing a `PropertiesFile` to any other type of mapping, only the
key-value pairs are considered, and order is ignored.
`PropertiesFile` currently only supports reading & writing the simple
line-oriented format, not XML.
"""
def __init__(self, mapping=None, **kwargs):
#: mapping from keys to list of line numbers
self._indices = OrderedDict()
#: mapping from line numbers to (key, value, source) tuples
self._lines = OrderedDict()
if mapping is not None:
self.update(mapping)
self.update(kwargs)
def _check(self):
"""
Assert the internal consistency of the instance's data structures.
This method is for debugging only.
"""
for k,ix in six.iteritems(self._indices):
assert k is not None, 'null key'
assert ix, 'Key does not map to any indices'
assert ix == sorted(ix), "Key's indices are not in order"
for i in ix:
assert i in self._lines, 'Key index does not map to line'
assert self._lines[i].key is not None, 'Key maps to comment'
assert self._lines[i].key == k, 'Key does not map to itself'
assert self._lines[i].value is not None, 'Key has null value'
prev = None
for i, line in six.iteritems(self._lines):
assert prev is None or prev < i, 'Line indices out of order'
prev = i
if line.key is None:
assert line.value is None, 'Comment/blank has value'
assert line.source is not None, 'Comment source not stored'
assert loads(line.source) == {}, 'Comment source is not comment'
else:
assert line.value is not None, 'Key has null value'
if line.source is not None:
assert loads(line.source) == {line.key: line.value}, \
'Key source does not deserialize to itself'
assert line.key in self._indices, 'Key is missing from map'
assert i in self._indices[line.key], \
'Key does not map to itself'
def __getitem__(self, key):
if not isinstance(key, six.string_types):
raise TypeError(_type_err)
return self._lines[self._indices[key][-1]].value
def __setitem__(self, key, value):
if not isinstance(key, six.string_types) or \
not isinstance(value, six.string_types):
raise TypeError(_type_err)
try:
ixes = self._indices[key]
except KeyError:
try:
lasti = next(reversed(self._lines))
except StopIteration:
ix = 0
else:
ix = lasti + 1
# We're adding a line to the end of the file, so make sure the
# line before it ends with a newline and (if it's not a
# comment) doesn't end with a trailing line continuation.
lastline = self._lines[lasti]
if lastline.source is not None:
lastsrc = lastline.source
if lastline.key is not None:
lastsrc=re.sub(r'(?<!\\)((?:\\\\)*)\\$', r'\1', lastsrc)
if not lastsrc.endswith(('\r', '\n')):
lastsrc += '\n'
self._lines[lasti] = lastline._replace(source=lastsrc)
else:
# Update the first occurrence of the key and discard the rest.
# This way, the order in which the keys are listed in the file and
# dict will be preserved.
ix = ixes.pop(0)
for i in ixes:
del self._lines[i]
self._indices[key] = [ix]
self._lines[ix] = PropertyLine(key, value, None)
def __delitem__(self, key):
if not isinstance(key, six.string_types):
raise TypeError(_type_err)
for i in self._indices.pop(key):
del self._lines[i]
def __iter__(self):
return iter(self._indices)
def __reversed__(self):
return reversed(self._indices)
def __len__(self):
return len(self._indices)
def _comparable(self):
return [
(None, line.source) if line.key is None else (line.key, line.value)
for i, line in six.iteritems(self._lines)
### TODO: Also include non-final repeated keys???
if line.key is None or self._indices[line.key][-1] == i
]
def __eq__(self, other):
if isinstance(other, PropertiesFile):
return self._comparable() == other._comparable()
### TODO: Special-case OrderedDict?
elif isinstance(other, Mapping):
return dict(self) == other
else:
return NotImplemented
def __ne__(self, other):
return not (self == other)
@classmethod
def loads(cls, s):
"""
Parse the contents of the string ``s`` as a simple line-oriented
``.properties`` file and return a `PropertiesFile` instance.
``s`` may be either a text string or bytes string. If it is a bytes
string, its contents are decoded as Latin-1.
.. versionchanged:: 0.5.0
Invalid ``\\uXXXX`` escape sequences will now cause an
`InvalidUEscapeError` to be raised
:param string s: the string from which to read the ``.properties``
document
:rtype: PropertiesFile
:raises InvalidUEscapeError: if an invalid ``\\uXXXX`` escape sequence
occurs in the input
"""
if isinstance(s, six.binary_type):
fp = six.BytesIO(s)
else:
fp = six.StringIO(s)
return cls.load(fp)
def dump(self, fp, separator='='):
"""
Write the mapping to a file in simple line-oriented ``.properties``
format.
If the instance was originally created from a file or string with
`PropertiesFile.load()` or `PropertiesFile.loads()`, then the output
will include the comments and whitespace from the original input, and
any keys that haven't been deleted or reassigned will retain their
original formatting and multiplicity. Key-value pairs that have been
modified or added to the mapping will be reformatted with
`join_key_value()` using the given separator. All key-value pairs are
output in the order they were defined, with new keys added to the end.
.. note::
Serializing a `PropertiesFile` instance with the :func:`dump()`
function instead will cause all formatting information to be
ignored, as :func:`dump()` will treat the instance like a normal
mapping.
:param fp: A file-like object to write the mapping to. It must have
been opened as a text file with a Latin-1-compatible encoding.
:param separator: The string to use for separating new or modified keys
& values. Only ``" "``, ``"="``, and ``":"`` (possibly with added
whitespace) should ever be used as the separator.
:type separator: text string
:return: `None`
"""
### TODO: Support setting the timestamp
for line in six.itervalues(self._lines):
if line.source is None:
print(join_key_value(line.key, line.value, separator), file=fp)
else:
fp.write(line.source)
def dumps(self, separator='='):
"""
Convert the mapping to a text string in simple line-oriented
``.properties`` format.
If the instance was originally created from a file or string with
`PropertiesFile.load()` or `PropertiesFile.loads()`, then the output
will include the comments and whitespace from the original input, and
any keys that haven't been deleted or reassigned will retain their
original formatting and multiplicity. Key-value pairs that have been
modified or added to the mapping will be reformatted with
`join_key_value()` using the given separator. All key-value pairs are
output in the order they were defined, with new keys added to the end.
.. note::
Serializing a `PropertiesFile` instance with the :func:`dumps()`
function instead will cause all formatting information to be
ignored, as :func:`dumps()` will treat the instance like a normal
mapping.
:param separator: The string to use for separating new or modified keys
& values. Only ``" "``, ``"="``, and ``":"`` (possibly with added
whitespace) should ever be used as the separator.
:type separator: text string
:rtype: text string
"""
s = six.StringIO()
self.dump(s, separator=separator)
return s.getvalue()
def copy(self):
""" Create a copy of the mapping, including formatting information """
dup = type(self)()
dup._indices = OrderedDict(
(k, list(v)) for k,v in six.iteritems(self._indices)
)
dup._lines = self._lines.copy()
return dup
|
jwodder/javaproperties | javaproperties/propfile.py | PropertiesFile.loads | python | def loads(cls, s):
if isinstance(s, six.binary_type):
fp = six.BytesIO(s)
else:
fp = six.StringIO(s)
return cls.load(fp) | Parse the contents of the string ``s`` as a simple line-oriented
``.properties`` file and return a `PropertiesFile` instance.
``s`` may be either a text string or bytes string. If it is a bytes
string, its contents are decoded as Latin-1.
.. versionchanged:: 0.5.0
Invalid ``\\uXXXX`` escape sequences will now cause an
`InvalidUEscapeError` to be raised
:param string s: the string from which to read the ``.properties``
document
:rtype: PropertiesFile
:raises InvalidUEscapeError: if an invalid ``\\uXXXX`` escape sequence
occurs in the input | train | https://github.com/jwodder/javaproperties/blob/8b48f040305217ebeb80c98c4354691bbb01429b/javaproperties/propfile.py#L198-L220 | [
"def load(cls, fp):\n \"\"\"\n Parse the contents of the `~io.IOBase.readline`-supporting file-like\n object ``fp`` as a simple line-oriented ``.properties`` file and return\n a `PropertiesFile` instance.\n\n ``fp`` may be either a text or binary filehandle, with or without\n universal newlines en... | class PropertiesFile(MutableMapping):
"""
.. versionadded:: 0.3.0
A custom mapping class for reading from, editing, and writing to a
``.properties`` file while preserving comments & whitespace in the original
input.
A `PropertiesFile` instance can be constructed from another mapping and/or
iterable of pairs, after which it will act like an
`~collections.OrderedDict`. Alternatively, an instance can be constructed
from a file or string with `PropertiesFile.load()` or
`PropertiesFile.loads()`, and the resulting instance will remember the
formatting of its input and retain that formatting when written back to a
file or string with the `~PropertiesFile.dump()` or
`~PropertiesFile.dumps()` method. The formatting information attached to
an instance ``pf`` can be forgotten by constructing another mapping from it
via ``dict(pf)``, ``OrderedDict(pf)``, or even ``PropertiesFile(pf)`` (Use
the `copy()` method if you want to create another `PropertiesFile` instance
with the same data & formatting).
When not reading or writing, `PropertiesFile` behaves like a normal
`~collections.abc.MutableMapping` class (i.e., you can do ``props[key] =
value`` and so forth), except that (a) like `~collections.OrderedDict`, key
insertion order is remembered and is used when iterating & dumping (and
`reversed` is supported), and (b) like `Properties`, it may only be used to
store strings and will raise a `TypeError` if passed a non-string object as
key or value.
Two `PropertiesFile` instances compare equal iff both their key-value pairs
and comment & whitespace lines are equal and in the same order. When
comparing a `PropertiesFile` to any other type of mapping, only the
key-value pairs are considered, and order is ignored.
`PropertiesFile` currently only supports reading & writing the simple
line-oriented format, not XML.
"""
def __init__(self, mapping=None, **kwargs):
#: mapping from keys to list of line numbers
self._indices = OrderedDict()
#: mapping from line numbers to (key, value, source) tuples
self._lines = OrderedDict()
if mapping is not None:
self.update(mapping)
self.update(kwargs)
def _check(self):
"""
Assert the internal consistency of the instance's data structures.
This method is for debugging only.
"""
for k,ix in six.iteritems(self._indices):
assert k is not None, 'null key'
assert ix, 'Key does not map to any indices'
assert ix == sorted(ix), "Key's indices are not in order"
for i in ix:
assert i in self._lines, 'Key index does not map to line'
assert self._lines[i].key is not None, 'Key maps to comment'
assert self._lines[i].key == k, 'Key does not map to itself'
assert self._lines[i].value is not None, 'Key has null value'
prev = None
for i, line in six.iteritems(self._lines):
assert prev is None or prev < i, 'Line indices out of order'
prev = i
if line.key is None:
assert line.value is None, 'Comment/blank has value'
assert line.source is not None, 'Comment source not stored'
assert loads(line.source) == {}, 'Comment source is not comment'
else:
assert line.value is not None, 'Key has null value'
if line.source is not None:
assert loads(line.source) == {line.key: line.value}, \
'Key source does not deserialize to itself'
assert line.key in self._indices, 'Key is missing from map'
assert i in self._indices[line.key], \
'Key does not map to itself'
def __getitem__(self, key):
if not isinstance(key, six.string_types):
raise TypeError(_type_err)
return self._lines[self._indices[key][-1]].value
def __setitem__(self, key, value):
if not isinstance(key, six.string_types) or \
not isinstance(value, six.string_types):
raise TypeError(_type_err)
try:
ixes = self._indices[key]
except KeyError:
try:
lasti = next(reversed(self._lines))
except StopIteration:
ix = 0
else:
ix = lasti + 1
# We're adding a line to the end of the file, so make sure the
# line before it ends with a newline and (if it's not a
# comment) doesn't end with a trailing line continuation.
lastline = self._lines[lasti]
if lastline.source is not None:
lastsrc = lastline.source
if lastline.key is not None:
lastsrc=re.sub(r'(?<!\\)((?:\\\\)*)\\$', r'\1', lastsrc)
if not lastsrc.endswith(('\r', '\n')):
lastsrc += '\n'
self._lines[lasti] = lastline._replace(source=lastsrc)
else:
# Update the first occurrence of the key and discard the rest.
# This way, the order in which the keys are listed in the file and
# dict will be preserved.
ix = ixes.pop(0)
for i in ixes:
del self._lines[i]
self._indices[key] = [ix]
self._lines[ix] = PropertyLine(key, value, None)
def __delitem__(self, key):
if not isinstance(key, six.string_types):
raise TypeError(_type_err)
for i in self._indices.pop(key):
del self._lines[i]
def __iter__(self):
return iter(self._indices)
def __reversed__(self):
return reversed(self._indices)
def __len__(self):
return len(self._indices)
def _comparable(self):
return [
(None, line.source) if line.key is None else (line.key, line.value)
for i, line in six.iteritems(self._lines)
### TODO: Also include non-final repeated keys???
if line.key is None or self._indices[line.key][-1] == i
]
def __eq__(self, other):
if isinstance(other, PropertiesFile):
return self._comparable() == other._comparable()
### TODO: Special-case OrderedDict?
elif isinstance(other, Mapping):
return dict(self) == other
else:
return NotImplemented
def __ne__(self, other):
return not (self == other)
@classmethod
def load(cls, fp):
"""
Parse the contents of the `~io.IOBase.readline`-supporting file-like
object ``fp`` as a simple line-oriented ``.properties`` file and return
a `PropertiesFile` instance.
``fp`` may be either a text or binary filehandle, with or without
universal newlines enabled. If it is a binary filehandle, its contents
are decoded as Latin-1.
.. versionchanged:: 0.5.0
Invalid ``\\uXXXX`` escape sequences will now cause an
`InvalidUEscapeError` to be raised
:param fp: the file from which to read the ``.properties`` document
:type fp: file-like object
:rtype: PropertiesFile
:raises InvalidUEscapeError: if an invalid ``\\uXXXX`` escape sequence
occurs in the input
"""
obj = cls()
for i, (k, v, src) in enumerate(parse(fp)):
if k is not None:
obj._indices.setdefault(k, []).append(i)
obj._lines[i] = PropertyLine(k, v, src)
return obj
def dump(self, fp, separator='='):
"""
Write the mapping to a file in simple line-oriented ``.properties``
format.
If the instance was originally created from a file or string with
`PropertiesFile.load()` or `PropertiesFile.loads()`, then the output
will include the comments and whitespace from the original input, and
any keys that haven't been deleted or reassigned will retain their
original formatting and multiplicity. Key-value pairs that have been
modified or added to the mapping will be reformatted with
`join_key_value()` using the given separator. All key-value pairs are
output in the order they were defined, with new keys added to the end.
.. note::
Serializing a `PropertiesFile` instance with the :func:`dump()`
function instead will cause all formatting information to be
ignored, as :func:`dump()` will treat the instance like a normal
mapping.
:param fp: A file-like object to write the mapping to. It must have
been opened as a text file with a Latin-1-compatible encoding.
:param separator: The string to use for separating new or modified keys
& values. Only ``" "``, ``"="``, and ``":"`` (possibly with added
whitespace) should ever be used as the separator.
:type separator: text string
:return: `None`
"""
### TODO: Support setting the timestamp
for line in six.itervalues(self._lines):
if line.source is None:
print(join_key_value(line.key, line.value, separator), file=fp)
else:
fp.write(line.source)
def dumps(self, separator='='):
"""
Convert the mapping to a text string in simple line-oriented
``.properties`` format.
If the instance was originally created from a file or string with
`PropertiesFile.load()` or `PropertiesFile.loads()`, then the output
will include the comments and whitespace from the original input, and
any keys that haven't been deleted or reassigned will retain their
original formatting and multiplicity. Key-value pairs that have been
modified or added to the mapping will be reformatted with
`join_key_value()` using the given separator. All key-value pairs are
output in the order they were defined, with new keys added to the end.
.. note::
Serializing a `PropertiesFile` instance with the :func:`dumps()`
function instead will cause all formatting information to be
ignored, as :func:`dumps()` will treat the instance like a normal
mapping.
:param separator: The string to use for separating new or modified keys
& values. Only ``" "``, ``"="``, and ``":"`` (possibly with added
whitespace) should ever be used as the separator.
:type separator: text string
:rtype: text string
"""
s = six.StringIO()
self.dump(s, separator=separator)
return s.getvalue()
def copy(self):
""" Create a copy of the mapping, including formatting information """
dup = type(self)()
dup._indices = OrderedDict(
(k, list(v)) for k,v in six.iteritems(self._indices)
)
dup._lines = self._lines.copy()
return dup
|
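The duplicate-key policy that `__setitem__` comments on above — the first occurrence of a repeated key keeps its file position and takes the new value, while later occurrences are discarded — can be sketched over a plain list of pairs (the helper `set_key` is hypothetical, not part of the library):

```python
def set_key(lines, key, value):
    """Sketch of the __setitem__ duplicate-key policy: update the first
    occurrence in place, drop the rest, or append if the key is new."""
    indices = [i for i, (k, _v) in enumerate(lines) if k == key]
    if not indices:
        lines.append((key, value))
        return
    first, rest = indices[0], set(indices[1:])
    lines[first] = (key, value)
    # Remove the later duplicates while preserving the order of everything else.
    lines[:] = [ln for i, ln in enumerate(lines) if i not in rest]

lines = [('a', '1'), ('b', '2'), ('a', '3')]
set_key(lines, 'a', 'new')
print(lines)  # [('a', 'new'), ('b', '2')]
```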