instance_id stringlengths 13 45 | pull_number int64 7 30.1k | repo stringclasses 83 values | version stringclasses 68 values | base_commit stringlengths 40 40 | created_at stringdate 2013-05-16 18:15:55 2025-01-08 15:12:50 | patch stringlengths 347 35.2k | test_patch stringlengths 432 113k | non_py_patch stringlengths 0 18.3k | new_components listlengths 0 40 | FAIL_TO_PASS listlengths 1 2.53k | PASS_TO_PASS listlengths 0 1.7k | problem_statement stringlengths 607 52.7k | hints_text stringlengths 0 57.4k | environment_setup_commit stringclasses 167 values
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
sympy__sympy-13345 | 13,345 | sympy/sympy | 1.1 | 19c75f32f62e1d8900520861261d5eccda243194 | 2017-09-25T10:24:00Z | diff --git a/sympy/printing/repr.py b/sympy/printing/repr.py
index 132fff96f4d5..fe69c43ee5a8 100644
--- a/sympy/printing/repr.py
+++ b/sympy/printing/repr.py
@@ -199,6 +199,23 @@ def _print_FracElement(self, frac):
denom = self._print(denom_terms)
return "%s(%s, %s, %s)" % (frac.__class__.__name__, self._print(frac.field), numer, denom)
+ def _print_FractionField(self, domain):
+ cls = domain.__class__.__name__
+ field = self._print(domain.field)
+ return "%s(%s)" % (cls, field)
+
+ def _print_PolynomialRingBase(self, ring):
+ cls = ring.__class__.__name__
+ dom = self._print(ring.domain)
+ gens = ', '.join(map(self._print, ring.gens))
+ order = str(ring.order)
+ if order != ring.default_order:
+ orderstr = ", order=" + order
+ else:
+ orderstr = ""
+ return "%s(%s, %s%s)" % (cls, dom, gens, orderstr)
+
+
def srepr(expr, **settings):
"""return expr in repr form"""
return ReprPrinter(settings).doprint(expr)
| diff --git a/sympy/printing/tests/test_repr.py b/sympy/printing/tests/test_repr.py
index add77864bf0d..3fbe95003df1 100644
--- a/sympy/printing/tests/test_repr.py
+++ b/sympy/printing/tests/test_repr.py
@@ -213,6 +213,22 @@ def test_FracElement():
F, x, y = field("x,y", ZZ)
assert srepr((3*x**2*y + 1)/(x - y**2)) == "FracElement(FracField((Symbol('x'), Symbol('y')), ZZ, lex), [((2, 1), 3), ((0, 0), 1)], [((1, 0), 1), ((0, 2), -1)])"
+def test_FractionField():
+ assert srepr(QQ.frac_field(x)) == \
+ "FractionField(FracField((Symbol('x'),), QQ, lex))"
+ assert srepr(QQ.frac_field(x, y, order=grlex)) == \
+ "FractionField(FracField((Symbol('x'), Symbol('y')), QQ, grlex))"
+
+
+def test_PolynomialRingBase():
+ assert srepr(ZZ.old_poly_ring(x)) == \
+ "GlobalPolynomialRing(ZZ, Symbol('x'))"
+ assert srepr(ZZ[x].old_poly_ring(y)) == \
+ "GlobalPolynomialRing(ZZ[x], Symbol('y'))"
+ assert srepr(QQ.frac_field(x).old_poly_ring(y)) == \
+ "GlobalPolynomialRing(FractionField(FracField((Symbol('x'),), QQ, lex)), Symbol('y'))"
+
+
def test_BooleanAtom():
assert srepr(true) == "S.true"
assert srepr(false) == "S.false"
| [
{
"components": [
{
"doc": "",
"lines": [
202,
205
],
"name": "ReprPrinter._print_FractionField",
"signature": "def _print_FractionField(self, domain):",
"type": "function"
},
{
"doc": "",
"lines": [
... | [
"test_FractionField",
"test_PolynomialRingBase"
] | [
"test_printmethod",
"test_Add",
"test_Function",
"test_Geometry",
"test_Singletons",
"test_Integer",
"test_list",
"test_Matrix",
"test_empty_Matrix",
"test_Rational",
"test_Float",
"test_Symbol",
"test_Symbol_two_assumptions",
"test_Symbol_no_special_commutative_treatment",
"test_Wild",
... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
srepr of polynomial rings and fraction fields
The old polynomial ring currently prints in the same way as the
new one and its string representation only creates the new ring.
```
>>> A = ZZ.old_poly_ring(x)
>>> sstr(A)
'ZZ[x]'
>>> srepr(A)
'ZZ[x]'
>>> type(ZZ[x])
<class 'sympy.polys.domains.polynomialring.PolynomialRing'>
>>> type(A)
<class 'sympy.polys.domains.old_polynomialring.GlobalPolynomialRing'>
```
This commit defines a representation string that can be used to
construct the old polynomial ring.
```
>>> srepr(A)
"GlobalPolynomialRing(ZZ, Symbol('x'))"
```
With fraction fields, there is another issue. The printed string
of type QQ(x) cannot be used at all to create a SymPy object.
Therefore a new representation is defined.
```
>>> K = QQ.frac_field(x)
>>> sstr(K)
'QQ(x)'
>>> srepr(K)
"FractionField(FracField((Symbol('x'),), QQ, lex))"
```
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sympy/printing/repr.py]
(definition of ReprPrinter._print_FractionField:)
def _print_FractionField(self, domain):
(definition of ReprPrinter._print_PolynomialRingBase:)
def _print_PolynomialRingBase(self, ring):
[end of new definitions in sympy/printing/repr.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | ec9e3c0436fbff934fa84e22bf07f1b3ef5bfac3 | ||
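The patch above adds `_print_*` methods to `ReprPrinter`, which resolves a printer method from the object's class name and falls back along the MRO — that is why a single `_print_PolynomialRingBase` also covers `GlobalPolynomialRing`. A minimal sketch of that dispatch pattern; the stand-in classes here are hypothetical, not SymPy's, and the `order=` handling from the patch is omitted:

```python
class ReprPrinter:
    """Dispatch printing on the object's class name, walking up the MRO."""

    def doprint(self, obj):
        for cls in type(obj).__mro__:
            method = getattr(self, "_print_" + cls.__name__, None)
            if method is not None:
                return method(obj)
        return str(obj)  # generic fallback for leaf values

    def _print_FractionField(self, domain):
        return "%s(%s)" % (type(domain).__name__, self.doprint(domain.field))

    def _print_PolynomialRingBase(self, ring):
        # Matched for any subclass of PolynomialRingBase via the MRO walk.
        gens = ", ".join(self.doprint(g) for g in ring.gens)
        return "%s(%s, %s)" % (type(ring).__name__, self.doprint(ring.domain), gens)


class PolynomialRingBase:  # hypothetical stand-ins, not SymPy's classes
    pass


class GlobalPolynomialRing(PolynomialRingBase):
    def __init__(self, domain, gens):
        self.domain, self.gens = domain, gens


printer = ReprPrinter()
ring = GlobalPolynomialRing("ZZ", ["Symbol('x')"])
print(printer.doprint(ring))  # GlobalPolynomialRing(ZZ, Symbol('x'))
```

The sketch shows why the PR only needed two new methods: every old-ring subclass reaches `_print_PolynomialRingBase` through inheritance.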
sympy__sympy-13304 | 13,304 | sympy/sympy | 1.1 | 7e2f1b243e36ad0b5277b9b6f15cb79450c1f579 | 2017-09-13T07:54:49Z | diff --git a/sympy/polys/polyclasses.py b/sympy/polys/polyclasses.py
index bbf078f34249..c194d4eb7d19 100644
--- a/sympy/polys/polyclasses.py
+++ b/sympy/polys/polyclasses.py
@@ -114,6 +114,7 @@ def _perify_factors(per, result, include):
from sympy.polys.sqfreetools import (
dup_gff_list,
+ dmp_norm,
dmp_sqf_p,
dmp_sqf_norm,
dmp_sqf_part,
@@ -750,6 +751,11 @@ def gff_list(f):
else:
raise ValueError('univariate polynomial expected')
+ def norm(f):
+ """Computes ``Norm(f)``."""
+ r = dmp_norm(f.rep, f.lev, f.dom)
+ return f.per(r, dom=f.dom.dom)
+
def sqf_norm(f):
"""Computes square-free norm of ``f``. """
s, g, r = dmp_sqf_norm(f.rep, f.lev, f.dom)
diff --git a/sympy/polys/polytools.py b/sympy/polys/polytools.py
index ff6fedc17b71..90838dab01ac 100644
--- a/sympy/polys/polytools.py
+++ b/sympy/polys/polytools.py
@@ -3073,6 +3073,41 @@ def gff_list(f):
return [(f.per(g), k) for g, k in result]
+ def norm(f):
+ """
+ Computes the product, ``Norm(f)``, of the conjugates of
+ a polynomial ``f`` defined over a number field ``K``.
+
+ Examples
+ ========
+
+ >>> from sympy import Poly, sqrt
+ >>> from sympy.abc import x
+
+ >>> a, b = sqrt(2), sqrt(3)
+
+ A polynomial over a quadratic extension.
+ Two conjugates x - a and x + a.
+
+ >>> f = Poly(x - a, x, extension=a)
+ >>> f.norm()
+ Poly(x**2 - 2, x, domain='QQ')
+
+ A polynomial over a quartic extension.
+ Four conjugates x - a, x - a, x + a and x + a.
+
+ >>> f = Poly(x - a, x, extension=(a, b))
+ >>> f.norm()
+ Poly(x**4 - 4*x**2 + 4, x, domain='QQ')
+
+ """
+ if hasattr(f.rep, 'norm'):
+ r = f.rep.norm()
+ else: # pragma: no cover
+ raise OperationNotSupported(f, 'norm')
+
+ return f.per(r)
+
def sqf_norm(f):
"""
Computes square-free norm of ``f``.
diff --git a/sympy/polys/sqfreetools.py b/sympy/polys/sqfreetools.py
index adef947a7db2..b3446ee1cee8 100644
--- a/sympy/polys/sqfreetools.py
+++ b/sympy/polys/sqfreetools.py
@@ -174,6 +174,19 @@ def dmp_sqf_norm(f, u, K):
return s, f, r
+def dmp_norm(f, u, K):
+ """
+ Norm of ``f`` in ``K[X1, ..., Xn]``, often not square-free.
+ """
+ if not K.is_Algebraic:
+ raise DomainError("ground domain must be algebraic")
+
+ g = dmp_raise(K.mod.rep, u + 1, 0, K.dom)
+ h, _ = dmp_inject(f, u, K, front=True)
+
+ return dmp_resultant(g, h, u + 1, K.dom)
+
+
def dup_gf_sqf_part(f, K):
"""Compute square-free part of ``f`` in ``GF(p)[x]``. """
f = dup_convert(f, K, K.dom)
| diff --git a/sympy/polys/tests/test_polytools.py b/sympy/polys/tests/test_polytools.py
index 0af574396244..0b5a94267893 100644
--- a/sympy/polys/tests/test_polytools.py
+++ b/sympy/polys/tests/test_polytools.py
@@ -2211,6 +2211,12 @@ def test_gff():
raises(NotImplementedError, lambda: gff(f))
+def test_norm():
+ a, b = sqrt(2), sqrt(3)
+ f = Poly(a*x + b*y, x, y, extension=(a, b))
+ assert f.norm() == Poly(4*x**4 - 12*x**2*y**2 + 9*y**4, x, y, domain='QQ')
+
+
def test_sqf_norm():
assert sqf_norm(x**2 - 2, extension=sqrt(3)) == \
(1, x**2 - 2*sqrt(3)*x + 1, x**4 - 10*x**2 + 1)
| [
{
"components": [
{
"doc": "Computes ``Norm(f)``.",
"lines": [
754,
757
],
"name": "DMP.norm",
"signature": "def norm(f):",
"type": "function"
}
],
"file": "sympy/polys/polyclasses.py"
},
{
"components": [
{
... | [
"test_norm"
] | [
"test_Poly_from_dict",
"test_Poly_from_list",
"test_Poly_from_poly",
"test_Poly_from_expr",
"test_Poly__new__",
"test_Poly__args",
"test_Poly__gens",
"test_Poly_zero",
"test_Poly_one",
"test_Poly__unify",
"test_Poly_free_symbols",
"test_PurePoly_free_symbols",
"test_Poly__eq__",
"test_Pure... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
polys: Norm of polynomial over a number field
SymPy implements `sqf_norm`, the norm of a modified polynomial,
for the construction of primitive elements of number fields.
It is the same as `norm` if that is square-free but different
in general. This PR implements the standard norm in all cases.
The low-level function, `dmp_norm`, is in sqfreetools.py because
it is essentially a single step of `dmp_sqf_norm`.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sympy/polys/polyclasses.py]
(definition of DMP.norm:)
def norm(f):
"""Computes ``Norm(f)``."""
[end of new definitions in sympy/polys/polyclasses.py]
[start of new definitions in sympy/polys/polytools.py]
(definition of Poly.norm:)
def norm(f):
"""Computes the product, ``Norm(f)``, of the conjugates of
a polynomial ``f`` defined over a number field ``K``.
Examples
========
>>> from sympy import Poly, sqrt
>>> from sympy.abc import x
>>> a, b = sqrt(2), sqrt(3)
A polynomial over a quadratic extension.
Two conjugates x - a and x + a.
>>> f = Poly(x - a, x, extension=a)
>>> f.norm()
Poly(x**2 - 2, x, domain='QQ')
A polynomial over a quartic extension.
Four conjugates x - a, x - a, x + a and x + a.
>>> f = Poly(x - a, x, extension=(a, b))
>>> f.norm()
Poly(x**4 - 4*x**2 + 4, x, domain='QQ')"""
[end of new definitions in sympy/polys/polytools.py]
[start of new definitions in sympy/polys/sqfreetools.py]
(definition of dmp_norm:)
def dmp_norm(f, u, K):
"""Norm of ``f`` in ``K[X1, ..., Xn]``, often not square-free."""
[end of new definitions in sympy/polys/sqfreetools.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | ec9e3c0436fbff934fa84e22bf07f1b3ef5bfac3 | ||
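The `norm` added in this record is the product of a polynomial's conjugates over the extension field: over QQ(√2), Norm(x − √2) = (x − √2)(x + √2) = x² − 2, matching the doctest in the patch. A quick numeric sanity check of that identity with plain floats — an editorial illustration, not the resultant-based `dmp_norm` implementation:

```python
import math

SQRT2 = math.sqrt(2)

def norm_at(x):
    """Product over the two conjugates of sqrt(2): (x - sqrt2) * (x + sqrt2)."""
    return (x - SQRT2) * (x + SQRT2)

# Should agree with x**2 - 2, the norm polynomial over QQ, at any sample point.
for x in [0.0, 1.5, -3.0, 10.0]:
    assert math.isclose(norm_at(x), x * x - 2, abs_tol=1e-9)
```

The same idea scales to the quartic case in the doctest: with four conjugates ±√2 (each twice), the product is (x² − 2)² = x⁴ − 4x² + 4.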
sympy__sympy-13277 | 13,277 | sympy/sympy | 1.1 | 613373e31790f06b7324dd5ef887198538204bcc | 2017-09-08T11:49:12Z | diff --git a/sympy/polys/polytools.py b/sympy/polys/polytools.py
index ff6fedc17b71..c3ee2a0eafa9 100644
--- a/sympy/polys/polytools.py
+++ b/sympy/polys/polytools.py
@@ -4430,6 +4430,59 @@ def degree(f, gen=0):
return Integer(p.degree(gen))
+@public
+def total_degree(f, *gens):
+ """
+ Return the total_degree of ``f`` in the given variables.
+
+ Examples
+ ========
+ >>> from sympy import total_degree, Poly
+ >>> from sympy.abc import x, y, z
+
+ >>> total_degree(1)
+ 0
+ >>> total_degree(x + x*y)
+ 2
+ >>> total_degree(x + x*y, x)
+ 1
+
+ If the expression is a Poly and no variables are given
+ then the generators of the Poly will be used:
+
+ >>> p = Poly(x + x*y, y)
+ >>> total_degree(p)
+ 1
+
+ To deal with the underlying expression of the Poly, convert
+ it to an Expr:
+
+ >>> total_degree(p.as_expr())
+ 2
+
+ This is done automatically if any variables are given:
+
+ >>> total_degree(p, x)
+ 1
+
+ See also
+ ========
+ degree
+ """
+
+ p = sympify(f)
+ if p.is_Poly:
+ p = p.as_expr()
+ if p.is_Number:
+ rv = 0
+ else:
+ if f.is_Poly:
+ gens = gens or f.gens
+ rv = Poly(p, gens).total_degree()
+
+ return Integer(rv)
+
+
@public
def degree_list(f, *gens, **args):
"""
| diff --git a/sympy/polys/tests/test_polytools.py b/sympy/polys/tests/test_polytools.py
index 0af574396244..42373289fa0d 100644
--- a/sympy/polys/tests/test_polytools.py
+++ b/sympy/polys/tests/test_polytools.py
@@ -4,6 +4,7 @@
Poly, PurePoly, poly,
parallel_poly_from_expr,
degree, degree_list,
+ total_degree,
LC, LM, LT,
pdiv, prem, pquo, pexquo,
div, rem, quo, exquo,
@@ -1204,6 +1205,13 @@ def test_Poly_total_degree():
assert Poly(x*y*z + z**4).total_degree() == 4
assert Poly(x**3 + x + 1).total_degree() == 3
+ assert total_degree(x*y + z**3) == 3
+ assert total_degree(x*y + z**3, x, y) == 2
+ assert total_degree(1) == 0
+ assert total_degree(Poly(y**2 + x**3 + z**4)) == 4
+ assert total_degree(Poly(y**2 + x**3 + z**4, x)) == 3
+ assert total_degree(Poly(y**2 + x**3 + z**4, x), z) == 4
+ assert total_degree(Poly(x**9 + x*z*y + x**3*z**2 + z**7,x), z) == 7
def test_Poly_homogenize():
assert Poly(x**2+y).homogenize(z) == Poly(x**2+y*z)
| [
{
"components": [
{
"doc": "Return the total_degree of ``f`` in the given variables.\n\nExamples\n========\n>>> from sympy import total_degree, Poly\n>>> from sympy.abc import x, y, z\n\n>>> total_degree(1)\n0\n>>> total_degree(x + x*y)\n2\n>>> total_degree(x + x*y, x)\n1\n\nIf the expression is a... | [
"test_Poly_from_dict",
"test_Poly_from_list",
"test_Poly_from_poly",
"test_Poly_from_expr",
"test_Poly__new__",
"test_Poly__args",
"test_Poly__gens",
"test_Poly_zero",
"test_Poly_one",
"test_Poly__unify",
"test_Poly_free_symbols",
"test_PurePoly_free_symbols",
"test_Poly__eq__",
"test_Pure... | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
total_degree() implemented
As suggested in [issue#13179](https://github.com/sympy/sympy/issues/13179).
I will add docstring in the next commit.
ping @smichr @asmeurer
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sympy/polys/polytools.py]
(definition of total_degree:)
def total_degree(f, *gens):
"""Return the total_degree of ``f`` in the given variables.
Examples
========
>>> from sympy import total_degree, Poly
>>> from sympy.abc import x, y, z
>>> total_degree(1)
0
>>> total_degree(x + x*y)
2
>>> total_degree(x + x*y, x)
1
If the expression is a Poly and no variables are given
then the generators of the Poly will be used:
>>> p = Poly(x + x*y, y)
>>> total_degree(p)
1
To deal with the underlying expression of the Poly, convert
it to an Expr:
>>> total_degree(p.as_expr())
2
This is done automatically if any variables are given:
>>> total_degree(p, x)
1
See also
========
degree"""
[end of new definitions in sympy/polys/polytools.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | ec9e3c0436fbff934fa84e22bf07f1b3ef5bfac3 | ||
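`total_degree` returns the largest sum of exponents over all monomials, optionally counting only the given variables. A dict-based sketch of that computation, independent of SymPy's `Poly` internals (representation chosen for illustration only):

```python
def total_degree(monomials, gens=None):
    """Largest sum of exponents over all monomials.

    `monomials` is an iterable of exponent maps, e.g. x + x*y is
    [{'x': 1}, {'x': 1, 'y': 1}].  If `gens` is given, only those
    variables contribute, mirroring total_degree(f, x).
    """
    best = 0
    for exps in monomials:
        if gens is None:
            d = sum(exps.values())
        else:
            d = sum(e for v, e in exps.items() if v in gens)
        best = max(best, d)
    return best

poly = [{'x': 1}, {'x': 1, 'y': 1}]    # x + x*y
print(total_degree(poly))               # 2
print(total_degree(poly, gens={'x'}))   # 1
```

This reproduces the doctest cases above: `total_degree(x + x*y) == 2`, `total_degree(x + x*y, x) == 1`, and a constant (one monomial with no exponents) has degree 0.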
joke2k__faker-580 | 580 | joke2k/faker | null | aedf2246bf51a7a6c4d94484ed2e978c4c2abb96 | 2017-08-28T08:11:38Z | diff --git a/faker/providers/date_time/ko_KR/__init__.py b/faker/providers/date_time/ko_KR/__init__.py
new file mode 100644
index 0000000000..f3333b7477
--- /dev/null
+++ b/faker/providers/date_time/ko_KR/__init__.py
@@ -0,0 +1,40 @@
+# coding: utf-8
+from __future__ import unicode_literals
+
+from .. import Provider as DateTimeProvider
+
+
+class Provider(DateTimeProvider):
+
+ @classmethod
+ def day_of_week(cls):
+ day = cls.date('%w')
+ DAY_NAMES = {
+ "0": "일요일",
+ "1": "월요일",
+ "2": "화요일",
+ "3": "수요일",
+ "4": "목요일",
+ "5": "금요일",
+ "6": "토요일",
+ }
+ return DAY_NAMES[day]
+
+ @classmethod
+ def month_name(cls):
+ month = cls.month()
+ MONTH_NAMES = {
+ "01": "1월",
+ "02": "2월",
+ "03": "3월",
+ "04": "4월",
+ "05": "5월",
+ "06": "6월",
+ "07": "7월",
+ "08": "8월",
+ "09": "9월",
+ "10": "10월",
+ "11": "11월",
+ "12": "12월",
+ }
+ return MONTH_NAMES[month]
| diff --git a/tests/providers/date_time.py b/tests/providers/date_time.py
index 8371dc10cf..5c184bbd81 100644
--- a/tests/providers/date_time.py
+++ b/tests/providers/date_time.py
@@ -9,6 +9,7 @@
from faker import Factory
from faker.generator import random
from faker.providers.date_time import Provider as DatetimeProvider
+from faker.providers.date_time.ko_KR import Provider as koKRProvider
from .. import string_types
@@ -32,6 +33,18 @@ def dst(self, dt):
utc = UTC()
+class TestKoKR(unittest.TestCase):
+
+ def setUp(self):
+ self.factory = Factory.create('ko_KR')
+
+ def test_day(self):
+ day = self.factory.day_of_week()
+ assert isinstance(day, string_types)
+ def test_month(self):
+ month = self.factory.month()
+ assert isinstance(month, string_types)
+
class TestDateTime(unittest.TestCase):
| [
{
"components": [
{
"doc": "",
"lines": [
7,
40
],
"name": "Provider",
"signature": "class Provider(DateTimeProvider): @classmethod",
"type": "class"
},
{
"doc": "",
"lines": [
10,
21
... | [
"tests/providers/date_time.py::TestKoKR::test_day",
"tests/providers/date_time.py::TestKoKR::test_month",
"tests/providers/date_time.py::TestDateTime::test_date_object",
"tests/providers/date_time.py::TestDateTime::test_date_time_between_dates",
"tests/providers/date_time.py::TestDateTime::test_date_time_be... | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Add ko_KR datetime
added the datetime provider for ko_KR.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in faker/providers/date_time/ko_KR/__init__.py]
(definition of Provider:)
class Provider(DateTimeProvider): @classmethod
(definition of Provider.day_of_week:)
def day_of_week(cls):
(definition of Provider.month_name:)
def month_name(cls):
[end of new definitions in faker/providers/date_time/ko_KR/__init__.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 6edfdbf6ae90b0153309e3bf066aa3b2d16494a7 | ||
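The ko_KR provider above maps `strftime('%w')` output (where `0` is Sunday) to Korean day names through a plain dict lookup. A standalone sketch of that lookup on real dates rather than Faker's random ones:

```python
from datetime import date

DAY_NAMES = {
    "0": "일요일", "1": "월요일", "2": "화요일", "3": "수요일",
    "4": "목요일", "5": "금요일", "6": "토요일",
}

def day_of_week_ko(d):
    """Korean weekday name for a date, mirroring the provider's %w lookup."""
    return DAY_NAMES[d.strftime("%w")]

print(day_of_week_ko(date(2017, 8, 28)))  # 2017-08-28 was a Monday -> 월요일
```

The month table works the same way, keyed on the zero-padded `%m` month string.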
sympy__sympy-13200 | 13,200 | sympy/sympy | 1.1 | 83bcbc0f349ae0c90d47808327fedbc1c7c210d6 | 2017-08-26T15:37:39Z | diff --git a/sympy/codegen/approximations.py b/sympy/codegen/approximations.py
new file mode 100644
index 000000000000..5cf0d60dc618
--- /dev/null
+++ b/sympy/codegen/approximations.py
@@ -0,0 +1,185 @@
+# -*- coding: utf-8 -*-
+from __future__ import (absolute_import, division, print_function)
+
+import math
+from itertools import product
+from sympy import Add, Symbol, sin, Abs, oo, Interval
+from sympy.core.function import UndefinedFunction
+from sympy.calculus.singularities import is_increasing, is_decreasing
+from sympy.codegen.rewriting import Optimization
+
+"""
+This module collects classes useful for approximate rewriting of expressions.
+This can be beneficial when generating numeric code for which performance is
+of greater importance than precision (e.g. for preconditioners used in iterative
+methods).
+"""
+
+class SumApprox(Optimization):
+ """ Approximates sum by neglecting small terms
+
+ If terms are expressions which can be determined to be monotonic, then
+ bounds for those expressions are added.
+
+ Parameters
+ ==========
+ bounds : dict
+ Mapping expressions to length 2 tuple of bounds (low, high).
+ reltol : number
+ Threshold for when to ignore a term. Taken relative to the largest
+ lower bound among bounds.
+
+ Examples
+ ========
+
+ >>> from sympy import exp
+ >>> from sympy.abc import x, y, z
+ >>> from sympy.codegen.rewriting import optimize
+ >>> from sympy.codegen.approximations import SumApprox
+ >>> bounds = {x: (-1, 1), y: (1000, 2000), z: (-10, 3)}
+ >>> sum_approx3 = SumApprox(bounds, reltol=1e-3)
+ >>> sum_approx2 = SumApprox(bounds, reltol=1e-2)
+ >>> sum_approx1 = SumApprox(bounds, reltol=1e-1)
+ >>> expr = 3*(x + y + exp(z))
+ >>> optimize(expr, [sum_approx3])
+ 3*(x + y + exp(z))
+ >>> optimize(expr, [sum_approx2])
+ 3*y + 3*exp(z)
+ >>> optimize(expr, [sum_approx1])
+ 3*y
+
+ """
+
+ def __init__(self, bounds, reltol, **kwargs):
+ super(SumApprox, self).__init__(**kwargs)
+ self.bounds = bounds
+ self.reltol = reltol
+
+ def __call__(self, expr):
+ return expr.factor().replace(self.query, lambda arg: self.value(arg))
+
+ def query(self, expr):
+ return expr.is_Add
+
+ def value(self, add):
+ for term in add.args:
+ if term.is_number or term in self.bounds or len(term.free_symbols) != 1:
+ continue
+ fs, = term.free_symbols
+ if fs not in self.bounds:
+ continue
+ intrvl = Interval(*self.bounds[fs])
+ if is_increasing(term, intrvl, fs):
+ self.bounds[term] = (
+ term.subs({fs: self.bounds[fs][0]}),
+ term.subs({fs: self.bounds[fs][1]})
+ )
+ elif is_decreasing(term, intrvl, fs):
+ self.bounds[term] = (
+ term.subs({fs: self.bounds[fs][1]}),
+ term.subs({fs: self.bounds[fs][0]})
+ )
+ else:
+ return add
+
+ if all(term.is_number or term in self.bounds for term in add.args):
+ bounds = [(term, term) if term.is_number else self.bounds[term] for term in add.args]
+ largest_abs_guarantee = 0
+ for lo, hi in bounds:
+ if lo <= 0 <= hi:
+ continue
+ largest_abs_guarantee = max(largest_abs_guarantee,
+ min(abs(lo), abs(hi)))
+ new_terms = []
+ for term, (lo, hi) in zip(add.args, bounds):
+ if max(abs(lo), abs(hi)) >= largest_abs_guarantee*self.reltol:
+ new_terms.append(term)
+ return add.func(*new_terms)
+ else:
+ return add
+
+
+class SeriesApprox(Optimization):
+ """ Approximates functions by expanding them as a series
+
+ Parameters
+ ==========
+ bounds : dict
+ Mapping expressions to length 2 tuple of bounds (low, high).
+ reltol : number
+ Threshold for when to ignore a term. Taken relative to the largest
+ lower bound among bounds.
+ max_order : int
+ Largest order to include in series expansion
+ n_point_checks : int (even)
+ The validity of an expansion (with respect to reltol) is checked at
+ discrete points (linearly spaced over the bounds of the variable). The
+ number of points used in this numerical check is given by this number.
+
+ Examples
+ ========
+
+ >>> from sympy import sin, pi
+ >>> from sympy.abc import x, y
+ >>> from sympy.codegen.rewriting import optimize
+ >>> from sympy.codegen.approximations import SeriesApprox
+ >>> bounds = {x: (-.1, .1), y: (pi-1, pi+1)}
+ >>> series_approx2 = SeriesApprox(bounds, reltol=1e-2)
+ >>> series_approx3 = SeriesApprox(bounds, reltol=1e-3)
+ >>> series_approx8 = SeriesApprox(bounds, reltol=1e-8)
+ >>> expr = sin(x)*sin(y)
+ >>> optimize(expr, [series_approx2])
+ x*(-y + (y - pi)**3/6 + pi)
+ >>> optimize(expr, [series_approx3])
+ (-x**3/6 + x)*sin(y)
+ >>> optimize(expr, [series_approx8])
+ sin(x)*sin(y)
+
+ """
+ def __init__(self, bounds, reltol, max_order=4, n_point_checks=4, **kwargs):
+ super(SeriesApprox, self).__init__(**kwargs)
+ self.bounds = bounds
+ self.reltol = reltol
+ self.max_order = max_order
+ if n_point_checks % 2 == 1:
+ raise ValueError("Checking the solution at expansion point is not helpful")
+ self.n_point_checks = n_point_checks
+ self._prec = math.ceil(-math.log10(self.reltol))
+
+ def __call__(self, expr):
+ return expr.factor().replace(self.query, lambda arg: self.value(arg))
+
+ def query(self, expr):
+ return (expr.is_Function and not isinstance(expr, UndefinedFunction)
+ and len(expr.args) == 1)
+
+ def value(self, fexpr):
+ free_symbols = fexpr.free_symbols
+ if len(free_symbols) != 1:
+ return fexpr
+ symb, = free_symbols
+ if symb not in self.bounds:
+ return fexpr
+ lo, hi = self.bounds[symb]
+ x0 = (lo + hi)/2
+ cheapest = None
+ for n in range(self.max_order+1, 0, -1):
+ fseri = fexpr.series(symb, x0=x0, n=n).removeO()
+ n_ok = True
+ for idx in range(self.n_point_checks):
+ x = lo + idx*(hi - lo)/(self.n_point_checks - 1)
+ val = fseri.xreplace({symb: x})
+ ref = fexpr.xreplace({symb: x})
+ if abs((1 - val/ref).evalf(self._prec)) > self.reltol:
+ n_ok = False
+ break
+
+ if n_ok:
+ cheapest = fseri
+ else:
+ break
+
+ if cheapest is None:
+ return fexpr
+ else:
+ return cheapest
| diff --git a/sympy/codegen/tests/test_approximations.py b/sympy/codegen/tests/test_approximations.py
new file mode 100644
index 000000000000..ebeaf46f1a52
--- /dev/null
+++ b/sympy/codegen/tests/test_approximations.py
@@ -0,0 +1,55 @@
+# -*- coding: utf-8 -*-
+from __future__ import (absolute_import, division, print_function)
+
+import math
+from sympy import symbols, exp, S, Poly
+from sympy.codegen.rewriting import optimize
+from sympy.codegen.approximations import SumApprox, SeriesApprox
+
+
+def test_SumApprox_trivial():
+ x = symbols('x')
+ expr1 = 1 + x
+ sum_approx = SumApprox(bounds={x: (-1e-20, 1e-20)}, reltol=1e-16)
+ apx1 = optimize(expr1, [sum_approx])
+ assert apx1 - 1 == 0
+
+
+def test_SumApprox_monotone_terms():
+ x, y, z = symbols('x y z')
+ expr1 = exp(z)*(x**2 + y**2 + 1)
+ bnds1 = {x: (0, 1e-3), y: (100, 1000)}
+ sum_approx_m2 = SumApprox(bounds=bnds1, reltol=1e-2)
+ sum_approx_m5 = SumApprox(bounds=bnds1, reltol=1e-5)
+ sum_approx_m11 = SumApprox(bounds=bnds1, reltol=1e-11)
+ assert (optimize(expr1, [sum_approx_m2])/exp(z) - (y**2)).simplify() == 0
+ assert (optimize(expr1, [sum_approx_m5])/exp(z) - (y**2 + 1)).simplify() == 0
+ assert (optimize(expr1, [sum_approx_m11])/exp(z) - (y**2 + 1 + x**2)).simplify() == 0
+
+
+def test_SeriesApprox_trivial():
+ x, z = symbols('x z')
+ for factor in [1, exp(z)]:
+ x = symbols('x')
+ expr1 = exp(x)*factor
+ bnds1 = {x: (-1, 1)}
+ series_approx_50 = SeriesApprox(bounds=bnds1, reltol=0.50)
+ series_approx_10 = SeriesApprox(bounds=bnds1, reltol=0.10)
+ series_approx_05 = SeriesApprox(bounds=bnds1, reltol=0.05)
+ c = (bnds1[x][1] + bnds1[x][0])/2 # 0.0
+ f0 = math.exp(c) # 1.0
+
+ ref_50 = f0 + x + x**2/2
+ ref_10 = f0 + x + x**2/2 + x**3/6
+ ref_05 = f0 + x + x**2/2 + x**3/6 + x**4/24
+
+ res_50 = optimize(expr1, [series_approx_50])
+ res_10 = optimize(expr1, [series_approx_10])
+ res_05 = optimize(expr1, [series_approx_05])
+
+ assert (res_50/factor - ref_50).simplify() == 0
+ assert (res_10/factor - ref_10).simplify() == 0
+ assert (res_05/factor - ref_05).simplify() == 0
+
+ max_ord3 = SeriesApprox(bounds=bnds1, reltol=0.05, max_order=3)
+ assert optimize(expr1, [max_ord3]) == expr1
| [
{
"components": [
{
"doc": "Approximates sum by neglecting small terms\n\nIf terms are expressions which can be determined to be monotonic, then\nbounds for those expressions are added.\n\nParameters\n==========\nbounds : dict\n Mapping expressions to length 2 tuple of bounds (low, high).\nrelt... | [
"test_SumApprox_trivial",
"test_SumApprox_monotone_terms"
] | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Add .codegen.approximations
This PR introduced classes for approximative rewriting.
TODO
----
- [x] try to deduce bounds of terms of `add` in `SumApprox.__call__` if the term is monotonic (given `bounds`)
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sympy/codegen/approximations.py]
(definition of SumApprox:)
class SumApprox(Optimization):
"""Approximates sum by neglecting small terms
If terms are expressions which can be determined to be monotonic, then
bounds for those expressions are added.
Parameters
==========
bounds : dict
Mapping expressions to length 2 tuple of bounds (low, high).
reltol : number
Threshold for when to ignore a term. Taken relative to the largest
lower bound among bounds.
Examples
========
>>> from sympy import exp
>>> from sympy.abc import x, y, z
>>> from sympy.codegen.rewriting import optimize
>>> from sympy.codegen.approximations import SumApprox
>>> bounds = {x: (-1, 1), y: (1000, 2000), z: (-10, 3)}
>>> sum_approx3 = SumApprox(bounds, reltol=1e-3)
>>> sum_approx2 = SumApprox(bounds, reltol=1e-2)
>>> sum_approx1 = SumApprox(bounds, reltol=1e-1)
>>> expr = 3*(x + y + exp(z))
>>> optimize(expr, [sum_approx3])
3*(x + y + exp(z))
>>> optimize(expr, [sum_approx2])
3*y + 3*exp(z)
>>> optimize(expr, [sum_approx1])
3*y"""
(definition of SumApprox.__init__:)
def __init__(self, bounds, reltol, **kwargs):
(definition of SumApprox.__call__:)
def __call__(self, expr):
(definition of SumApprox.query:)
def query(self, expr):
(definition of SumApprox.value:)
def value(self, add):
(definition of SeriesApprox:)
class SeriesApprox(Optimization):
"""Approximates functions by expanding them as a series
Parameters
==========
bounds : dict
Mapping expressions to length 2 tuple of bounds (low, high).
reltol : number
Threshold for when to ignore a term. Taken relative to the largest
lower bound among bounds.
max_order : int
Largest order to include in series expansion
n_point_checks : int (even)
The validity of an expansion (with respect to reltol) is checked at
discrete points (linearly spaced over the bounds of the variable). The
number of points used in this numerical check is given by this number.
Examples
========
>>> from sympy import sin, pi
>>> from sympy.abc import x, y
>>> from sympy.codegen.rewriting import optimize
>>> from sympy.codegen.approximations import SeriesApprox
>>> bounds = {x: (-.1, .1), y: (pi-1, pi+1)}
>>> series_approx2 = SeriesApprox(bounds, reltol=1e-2)
>>> series_approx3 = SeriesApprox(bounds, reltol=1e-3)
>>> series_approx8 = SeriesApprox(bounds, reltol=1e-8)
>>> expr = sin(x)*sin(y)
>>> optimize(expr, [series_approx2])
x*(-y + (y - pi)**3/6 + pi)
>>> optimize(expr, [series_approx3])
(-x**3/6 + x)*sin(y)
>>> optimize(expr, [series_approx8])
sin(x)*sin(y)"""
(definition of SeriesApprox.__init__:)
def __init__(self, bounds, reltol, max_order=4, n_point_checks=4, **kwargs):
(definition of SeriesApprox.__call__:)
def __call__(self, expr):
(definition of SeriesApprox.query:)
def query(self, expr):
(definition of SeriesApprox.value:)
def value(self, fexpr):
[end of new definitions in sympy/codegen/approximations.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | ec9e3c0436fbff934fa84e22bf07f1b3ef5bfac3 | ||
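`SumApprox` above keeps a term only when its possible magnitude, max(|lo|, |hi|), is at least `reltol` times the largest magnitude some term is *guaranteed* to have (min(|lo|, |hi|), taken over terms whose interval excludes zero). A sketch of that filtering rule on plain `(name, (lo, hi))` pairs — not the SymPy `Optimization` class — using the bounds from the docstring, with exp(z) pre-bounded to roughly (4.5e-5, 20.1):

```python
def approx_sum(terms, reltol):
    """terms: list of (name, (lo, hi)) bounds. Returns names of kept terms."""
    # Largest magnitude any single term is *guaranteed* to contribute.
    guarantee = 0.0
    for _, (lo, hi) in terms:
        if lo <= 0 <= hi:
            continue  # interval contains zero: no guaranteed magnitude
        guarantee = max(guarantee, min(abs(lo), abs(hi)))
    # Keep a term only if it *could* matter relative to that guarantee.
    return [name for name, (lo, hi) in terms
            if max(abs(lo), abs(hi)) >= guarantee * reltol]

bounds = [("x", (-1, 1)), ("y", (1000, 2000)), ("exp(z)", (4.5e-5, 20.1))]
print(approx_sum(bounds, 1e-3))  # ['x', 'y', 'exp(z)']
print(approx_sum(bounds, 1e-2))  # ['y', 'exp(z)']
print(approx_sum(bounds, 1e-1))  # ['y']
```

This reproduces the three `optimize` doctest results: all terms survive at reltol 1e-3, `x` is dropped at 1e-2, and only `y` remains at 1e-1.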
scikit-learn__scikit-learn-9597 | 9,597 | scikit-learn/scikit-learn | 0.22 | 58e8a43b6062990dddeb3246aede909f0934dc33 | 2017-08-21T18:22:56Z | diff --git a/doc/whats_new/v0.22.rst b/doc/whats_new/v0.22.rst
index c28231adbc1cd..918696cbc83d2 100644
--- a/doc/whats_new/v0.22.rst
+++ b/doc/whats_new/v0.22.rst
@@ -261,10 +261,11 @@ Changelog
:mod:`sklearn.feature_selection`
................................
+
- |Fix| Fixed a bug where :class:`VarianceThreshold` with `threshold=0` did not
remove constant features due to numerical instability, by using range
rather than variance in this case.
- :pr:`13704` by `Roddy MacSween <rlms>`.
+ :pr:`13704` by :user:`Roddy MacSween <rlms>`.
:mod:`sklearn.utils`
....................
@@ -272,10 +273,20 @@ Changelog
- |Enhancement| :func:`utils.safe_indexing` accepts an ``axis`` parameter to
index array-like across rows and columns. The column indexing can be done on
NumPy array, SciPy sparse matrix, and Pandas DataFrame.
- :pr:`14035` by `Guillaume Lemaitre <glemaitre>`.
+ :pr:`14035` by :user:`Guillaume Lemaitre <glemaitre>`.
:mod:`sklearn.neighbors`
-.............................
+....................
+
+- |Feature| :class:`neighbors.RadiusNeighborsClassifier` now supports
+ predicting probabilities by using `predict_proba` and supports more
+ outlier_label options: 'most_frequent', or different outlier_labels
+ for multi-outputs.
+ :pr:`9597` by :user:`Wenbo Zhao <webber26232>`.
+
+- |Efficiency| Efficiency improvements for
+ :func:`neighbors.RadiusNeighborsClassifier.predict`.
+ :pr:`9597` by :user:`Wenbo Zhao <webber26232>`.
- |Fix| KNearestRegressor now throws error when fit on non-square data and
metric = precomputed. :class:`neighbors.NeighborsBase`
diff --git a/sklearn/neighbors/classification.py b/sklearn/neighbors/classification.py
index f0fd0b084365a..a72f710ae57ea 100644
--- a/sklearn/neighbors/classification.py
+++ b/sklearn/neighbors/classification.py
@@ -10,8 +10,11 @@
import numpy as np
from scipy import stats
+from six import string_types
from ..utils.extmath import weighted_mode
+from ..utils.validation import _is_arraylike, _num_samples
+import warnings
from .base import \
_check_weights, _get_weights, \
NeighborsBase, KNeighborsMixin,\
@@ -141,7 +144,6 @@ def __init__(self, n_neighbors=5,
weights='uniform', algorithm='auto', leaf_size=30,
p=2, metric='minkowski', metric_params=None, n_jobs=None,
**kwargs):
-
super().__init__(
n_neighbors=n_neighbors,
algorithm=algorithm,
@@ -151,7 +153,7 @@ def __init__(self, n_neighbors=5,
self.weights = _check_weights(weights)
def predict(self, X):
- """Predict the class labels for the provided data
+ """Predict the class labels for the provided data.
Parameters
----------
@@ -174,7 +176,7 @@ def predict(self, X):
classes_ = [self.classes_]
n_outputs = len(classes_)
- n_samples = X.shape[0]
+ n_samples = _num_samples(X)
weights = _get_weights(neigh_dist, self.weights)
y_pred = np.empty((n_samples, n_outputs), dtype=classes_[0].dtype)
@@ -218,7 +220,7 @@ def predict_proba(self, X):
_y = self._y.reshape((-1, 1))
classes_ = [self.classes_]
- n_samples = X.shape[0]
+ n_samples = _num_samples(X)
weights = _get_weights(neigh_dist, self.weights)
if weights is None:
@@ -302,10 +304,13 @@ class RadiusNeighborsClassifier(NeighborsBase, RadiusNeighborsMixin,
metric. See the documentation of the DistanceMetric class for a
list of available metrics.
- outlier_label : int, optional (default = None)
- Label, which is given for outlier samples (samples with no
- neighbors on given radius).
- If set to None, ValueError is raised, when outlier is detected.
+ outlier_label : {manual label, 'most_frequent'}, optional (default = None)
+ label for outlier samples (samples with no neighbors in given radius).
+
+ - manual label: str or int label (should be the same type as y)
+ or list of manual labels if multi-output is used.
+ - 'most_frequent' : assign the most frequent label of y to outliers.
+ - None : when any outlier is detected, ValueError will be raised.
metric_params : dict, optional (default = None)
Additional keyword arguments for the metric function.
@@ -346,6 +351,8 @@ class RadiusNeighborsClassifier(NeighborsBase, RadiusNeighborsMixin,
RadiusNeighborsClassifier(...)
>>> print(neigh.predict([[1.5]]))
[0]
+ >>> print(neigh.predict_proba([[1.0]]))
+ [[0.66666667 0.33333333]]
See also
--------
@@ -375,8 +382,69 @@ def __init__(self, radius=1.0, weights='uniform',
self.weights = _check_weights(weights)
self.outlier_label = outlier_label
+ def fit(self, X, y):
+ """Fit the model using X as training data and y as target values
+
+ Parameters
+ ----------
+ X : {array-like, sparse matrix, BallTree, KDTree}
+ Training data. If array or matrix, shape [n_samples, n_features],
+ or [n_samples, n_samples] if metric='precomputed'.
+
+ y : {array-like, sparse matrix}
+ Target values of shape = [n_samples] or [n_samples, n_outputs]
+
+ """
+
+ SupervisedIntegerMixin.fit(self, X, y)
+
+ classes_ = self.classes_
+ _y = self._y
+ if not self.outputs_2d_:
+ _y = self._y.reshape((-1, 1))
+ classes_ = [self.classes_]
+
+ if self.outlier_label is None:
+ outlier_label_ = None
+
+ elif self.outlier_label == 'most_frequent':
+ outlier_label_ = []
+ # iterate over multi-output, get the most frequent label for each
+ # output.
+ for k, classes_k in enumerate(classes_):
+ label_count = np.bincount(_y[:, k])
+ outlier_label_.append(classes_k[label_count.argmax()])
+
+ else:
+ if (_is_arraylike(self.outlier_label) and
+ not isinstance(self.outlier_label, string_types)):
+ if len(self.outlier_label) != len(classes_):
+ raise ValueError("The length of outlier_label: {} is "
+ "inconsistent with the output "
+ "length: {}".format(self.outlier_label,
+ len(classes_)))
+ outlier_label_ = self.outlier_label
+ else:
+ outlier_label_ = [self.outlier_label] * len(classes_)
+
+ for classes, label in zip(classes_, outlier_label_):
+ if (_is_arraylike(label) and
+ not isinstance(label, string_types)):
+ # ensure the outlier label for each output is a scalar.
+ raise TypeError("The outlier_label of classes {} is "
+ "supposed to be a scalar, got "
+ "{}.".format(classes, label))
+ if np.append(classes, label).dtype != classes.dtype:
+ # ensure the dtype of outlier label is consistent with y.
+ raise TypeError("The dtype of outlier_label {} is "
+ "inconsistent with classes {} in "
+ "y.".format(label, classes))
+
+ self.outlier_label_ = outlier_label_
+ return self
+
def predict(self, X):
- """Predict the class labels for the provided data
+ """Predict the class labels for the provided data.
Parameters
----------
@@ -388,54 +456,119 @@ def predict(self, X):
-------
y : array of shape [n_samples] or [n_samples, n_outputs]
Class labels for each data sample.
+ """
+
+ probs = self.predict_proba(X)
+ classes_ = self.classes_
+
+ if not self.outputs_2d_:
+ probs = [probs]
+ classes_ = [self.classes_]
+
+ n_outputs = len(classes_)
+ n_samples = probs[0].shape[0]
+ y_pred = np.empty((n_samples, n_outputs),
+ dtype=classes_[0].dtype)
+
+ for k, prob in enumerate(probs):
+ # iterate over multi-output, assign labels based on probabilities
+ # of each output.
+ max_prob_index = prob.argmax(axis=1)
+ y_pred[:, k] = classes_[k].take(max_prob_index)
+
+ outlier_zero_probs = (prob == 0).all(axis=1)
+ if outlier_zero_probs.any():
+ zero_prob_index = np.flatnonzero(outlier_zero_probs)
+ y_pred[zero_prob_index, k] = self.outlier_label_[k]
+
+ if not self.outputs_2d_:
+ y_pred = y_pred.ravel()
+
+ return y_pred
+ def predict_proba(self, X):
+ """Return probability estimates for the test data X.
+
+ Parameters
+ ----------
+ X : array-like, shape (n_query, n_features), \
+ or (n_query, n_indexed) if metric == 'precomputed'
+ Test samples.
+
+ Returns
+ -------
+ p : array of shape = [n_samples, n_classes], or a list of n_outputs
+ of such arrays if n_outputs > 1.
+ The class probabilities of the input samples. Classes are ordered
+ by lexicographic order.
"""
+
X = check_array(X, accept_sparse='csr')
- n_samples = X.shape[0]
+ n_samples = _num_samples(X)
neigh_dist, neigh_ind = self.radius_neighbors(X)
- inliers = [i for i, nind in enumerate(neigh_ind) if len(nind) != 0]
- outliers = [i for i, nind in enumerate(neigh_ind) if len(nind) == 0]
+ outlier_mask = np.zeros(n_samples, dtype=np.bool)
+ outlier_mask[:] = [len(nind) == 0 for nind in neigh_ind]
+ outliers = np.flatnonzero(outlier_mask)
+ inliers = np.flatnonzero(~outlier_mask)
classes_ = self.classes_
_y = self._y
if not self.outputs_2d_:
_y = self._y.reshape((-1, 1))
classes_ = [self.classes_]
- n_outputs = len(classes_)
- if self.outlier_label is not None:
- neigh_dist[outliers] = 1e-6
- elif outliers:
+ if self.outlier_label_ is None and outliers.size > 0:
raise ValueError('No neighbors found for test samples %r, '
'you can try using larger radius, '
- 'give a label for outliers, '
- 'or consider removing them from your dataset.'
+ 'giving a label for outliers, '
+ 'or considering removing them from your dataset.'
% outliers)
weights = _get_weights(neigh_dist, self.weights)
+ if weights is not None:
+ weights = weights[inliers]
- y_pred = np.empty((n_samples, n_outputs), dtype=classes_[0].dtype)
+ probabilities = []
+ # iterate over multi-output, measure probabilities of the k-th output.
for k, classes_k in enumerate(classes_):
pred_labels = np.zeros(len(neigh_ind), dtype=object)
pred_labels[:] = [_y[ind, k] for ind in neigh_ind]
+
+ proba_k = np.zeros((n_samples, classes_k.size))
+ proba_inl = np.zeros((len(inliers), classes_k.size))
+
+ # samples have different numbers of neighbors within the same radius
if weights is None:
- mode = np.array([stats.mode(pl)[0]
- for pl in pred_labels[inliers]], dtype=np.int)
+ for i, idx in enumerate(pred_labels[inliers]):
+ proba_inl[i, :] = np.bincount(idx,
+ minlength=classes_k.size)
else:
- mode = np.array(
- [weighted_mode(pl, w)[0]
- for (pl, w) in zip(pred_labels[inliers], weights[inliers])
- ], dtype=np.int)
+ for i, idx in enumerate(pred_labels[inliers]):
+ proba_inl[i, :] = np.bincount(idx,
+ weights[i],
+ minlength=classes_k.size)
+ proba_k[inliers, :] = proba_inl
+
+ if outliers.size > 0:
+ _outlier_label = self.outlier_label_[k]
+ label_index = np.flatnonzero(classes_k == _outlier_label)
+ if label_index.size == 1:
+ proba_k[outliers, label_index[0]] = 1.0
+ else:
+ warnings.warn('Outlier label {} is not in training '
+ 'classes. All class probabilities of '
+ 'outliers will be assigned with 0.'
+ ''.format(self.outlier_label_[k]))
- mode = mode.ravel()
-
- y_pred[inliers, k] = classes_k.take(mode)
+ # normalize 'votes' into real [0,1] probabilities
+ normalizer = proba_k.sum(axis=1)[:, np.newaxis]
+ normalizer[normalizer == 0.0] = 1.0
+ proba_k /= normalizer
- if outliers:
- y_pred[outliers, :] = self.outlier_label
+ probabilities.append(proba_k)
if not self.outputs_2d_:
- y_pred = y_pred.ravel()
+ probabilities = probabilities[0]
- return y_pred
+ return probabilities
diff --git a/sklearn/neighbors/regression.py b/sklearn/neighbors/regression.py
index 09143ad28b1bc..006f98171a95a 100644
--- a/sklearn/neighbors/regression.py
+++ b/sklearn/neighbors/regression.py
@@ -347,12 +347,11 @@ def predict(self, X):
if len(ind) else empty_obs
for (i, ind) in enumerate(neigh_ind)])
- if np.max(np.isnan(y_pred)):
+ if np.any(np.isnan(y_pred)):
empty_warning_msg = ("One or more samples have no neighbors "
"within specified radius; predicting NaN.")
warnings.warn(empty_warning_msg)
-
if self._y.ndim == 1:
y_pred = y_pred.ravel()
| diff --git a/sklearn/neighbors/tests/test_neighbors.py b/sklearn/neighbors/tests/test_neighbors.py
index d22c0d1d9acac..3da1c2579700f 100644
--- a/sklearn/neighbors/tests/test_neighbors.py
+++ b/sklearn/neighbors/tests/test_neighbors.py
@@ -385,6 +385,7 @@ def test_radius_neighbors_classifier_outlier_labeling():
z2 = np.array([[1.4, 1.4], [1.01, 1.01], [2.01, 2.01]]) # one outlier
correct_labels1 = np.array([1, 2])
correct_labels2 = np.array([-1, 1, 2])
+ outlier_proba = np.array([0, 0])
weight_func = _weight_func
@@ -397,6 +398,72 @@ def test_radius_neighbors_classifier_outlier_labeling():
clf.fit(X, y)
assert_array_equal(correct_labels1, clf.predict(z1))
assert_array_equal(correct_labels2, clf.predict(z2))
+ assert_array_equal(outlier_proba, clf.predict_proba(z2)[0])
+
+ # test outlier_labeling of using predict_proba()
+ RNC = neighbors.RadiusNeighborsClassifier
+ X = np.array([[0], [1], [2], [3], [4], [5], [6], [7], [8], [9]])
+ y = np.array([0, 2, 2, 1, 1, 1, 3, 3, 3, 3])
+
+ # test outlier_label scalar verification
+ def check_array_exception():
+ clf = RNC(radius=1, outlier_label=[[5]])
+ clf.fit(X, y)
+ assert_raises(TypeError, check_array_exception)
+
+ # test invalid outlier_label dtype
+ def check_dtype_exception():
+ clf = RNC(radius=1, outlier_label='a')
+ clf.fit(X, y)
+ assert_raises(TypeError, check_dtype_exception)
+
+ # test most frequent
+ clf = RNC(radius=1, outlier_label='most_frequent')
+ clf.fit(X, y)
+ proba = clf.predict_proba([[1], [15]])
+ assert_array_equal(proba[1, :], [0, 0, 0, 1])
+
+ # test manual label in y
+ clf = RNC(radius=1, outlier_label=1)
+ clf.fit(X, y)
+ proba = clf.predict_proba([[1], [15]])
+ assert_array_equal(proba[1, :], [0, 1, 0, 0])
+ pred = clf.predict([[1], [15]])
+ assert_array_equal(pred, [2, 1])
+
+ # test manual label out of y warning
+ def check_warning():
+ clf = RNC(radius=1, outlier_label=4)
+ clf.fit(X, y)
+ clf.predict_proba([[1], [15]])
+ assert_warns(UserWarning, check_warning)
+
+ # test multi output same outlier label
+ y_multi = [[0, 1], [2, 1], [2, 2], [1, 2], [1, 2],
+ [1, 3], [3, 3], [3, 3], [3, 0], [3, 0]]
+ clf = RNC(radius=1, outlier_label=1)
+ clf.fit(X, y_multi)
+ proba = clf.predict_proba([[7], [15]])
+ assert_array_equal(proba[1][1, :], [0, 1, 0, 0])
+ pred = clf.predict([[7], [15]])
+ assert_array_equal(pred[1, :], [1, 1])
+
+ # test multi output different outlier label
+ y_multi = [[0, 0], [2, 2], [2, 2], [1, 1], [1, 1],
+ [1, 1], [3, 3], [3, 3], [3, 3], [3, 3]]
+ clf = RNC(radius=1, outlier_label=[0, 1])
+ clf.fit(X, y_multi)
+ proba = clf.predict_proba([[7], [15]])
+ assert_array_equal(proba[0][1, :], [1, 0, 0, 0])
+ assert_array_equal(proba[1][1, :], [0, 1, 0, 0])
+ pred = clf.predict([[7], [15]])
+ assert_array_equal(pred[1, :], [0, 1])
+
+ # test inconsistent outlier label list length
+ def check_exception():
+ clf = RNC(radius=1, outlier_label=[0, 1, 2])
+ clf.fit(X, y_multi)
+ assert_raises(ValueError, check_exception)
def test_radius_neighbors_classifier_zero_distance():
@@ -1413,3 +1480,21 @@ def test_pairwise_boolean_distance():
nn1 = NN(metric="jaccard", algorithm='brute').fit(X)
nn2 = NN(metric="jaccard", algorithm='ball_tree').fit(X)
assert_array_equal(nn1.kneighbors(X)[0], nn2.kneighbors(X)[0])
+
+
+def test_radius_neighbors_predict_proba():
+ for seed in range(5):
+ X, y = datasets.make_classification(n_samples=50, n_features=5,
+ n_informative=3, n_redundant=0,
+ n_classes=3, random_state=seed)
+ X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
+ outlier_label = int(2 - seed)
+ clf = neighbors.RadiusNeighborsClassifier(radius=2,
+ outlier_label=outlier_label)
+ clf.fit(X_tr, y_tr)
+ pred = clf.predict(X_te)
+ proba = clf.predict_proba(X_te)
+ proba_label = proba.argmax(axis=1)
+ proba_label = np.where(proba.sum(axis=1) == 0,
+ outlier_label, proba_label)
+ assert_array_equal(pred, proba_label)
| diff --git a/doc/whats_new/v0.22.rst b/doc/whats_new/v0.22.rst
index c28231adbc1cd..918696cbc83d2 100644
--- a/doc/whats_new/v0.22.rst
+++ b/doc/whats_new/v0.22.rst
@@ -261,10 +261,11 @@ Changelog
:mod:`sklearn.feature_selection`
................................
+
- |Fix| Fixed a bug where :class:`VarianceThreshold` with `threshold=0` did not
remove constant features due to numerical instability, by using range
rather than variance in this case.
- :pr:`13704` by `Roddy MacSween <rlms>`.
+ :pr:`13704` by :user:`Roddy MacSween <rlms>`.
:mod:`sklearn.utils`
....................
@@ -272,10 +273,20 @@ Changelog
- |Enhancement| :func:`utils.safe_indexing` accepts an ``axis`` parameter to
index array-like across rows and columns. The column indexing can be done on
NumPy array, SciPy sparse matrix, and Pandas DataFrame.
- :pr:`14035` by `Guillaume Lemaitre <glemaitre>`.
+ :pr:`14035` by :user:`Guillaume Lemaitre <glemaitre>`.
:mod:`sklearn.neighbors`
-.............................
+....................
+
+- |Feature| :class:`neighbors.RadiusNeighborsClassifier` now supports
+ predicting probabilities by using `predict_proba` and supports more
+ outlier_label options: 'most_frequent', or different outlier_labels
+ for multi-outputs.
+ :pr:`9597` by :user:`Wenbo Zhao <webber26232>`.
+
+- |Efficiency| Efficiency improvements for
+ :func:`neighbors.RadiusNeighborsClassifier.predict`.
+ :pr:`9597` by :user:`Wenbo Zhao <webber26232>`.
- |Fix| KNearestRegressor now throws error when fit on non-square data and
metric = precomputed. :class:`neighbors.NeighborsBase`
| [
{
"components": [
{
"doc": "Fit the model using X as training data and y as target values\n\nParameters\n----------\nX : {array-like, sparse matrix, BallTree, KDTree}\n Training data. If array or matrix, shape [n_samples, n_features],\n or [n_samples, n_samples] if metric='precomputed'.\n\ny... | [
"sklearn/neighbors/tests/test_neighbors.py::test_radius_neighbors_classifier_outlier_labeling",
"sklearn/neighbors/tests/test_neighbors.py::test_radius_neighbors_predict_proba"
] | [
"sklearn/neighbors/tests/test_neighbors.py::test_unsupervised_kneighbors",
"sklearn/neighbors/tests/test_neighbors.py::test_unsupervised_inputs",
"sklearn/neighbors/tests/test_neighbors.py::test_n_neighbors_datatype",
"sklearn/neighbors/tests/test_neighbors.py::test_not_fitted_error_gets_raised",
"sklearn/n... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
[MRG+1] Add predict_proba(X) and outlier handler for RadiusNeighborsClassifier
<!--
Thanks for contributing a pull request! Please ensure you have taken a look at
the contribution guidelines: https://github.com/scikit-learn/scikit-learn/blob/master/CONTRIBUTING.md#Contributing-Pull-Requests
-->
#### Reference Issue
<!-- Example: Fixes #1234 -->
#### What does this implement/fix? Explain your changes.
Currently, RadiusNeighborsClassifier doesn't provide the function predict_proba(X). When an outlier (a sample that doesn't have any neighbors within a fixed radius) is detected, no class label can be assigned.
This branch implements predict_proba(X) for RadiusNeighborsClassifier. The class probabilities are generated from the samples' neighbors and weights within the radius r.
When outliers (samples with no neighbors within the radius r) are detected, four solutions are available, controlled by the parameter outlier_label:
solution | predict(X) | predict_proba(X)
----------|------------|--------------------
None | Exception will be raised. | Exception will be raised.
int, str or list of corresponding labels | Assign a manual label to outliers. | If the new label is in the 'y' of the training data, the probability will be 1 for this label and 0 for the rest. Otherwise, all probabilities will be 0.
'most_frequent' | Assign the most frequent label of the training target to outliers. | The most frequent label of the training target will be assigned probability 1 and all other labels probability 0.
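The rows of the table can be sketched as small standalone helpers (a hypothetical illustration only; these function names do not appear in the actual patch):

```python
from collections import Counter

def resolve_outlier_label(y, outlier_label):
    """Turn the outlier_label parameter into a concrete label:
    'most_frequent' becomes the most common label in y; anything
    else is passed through unchanged."""
    if outlier_label == 'most_frequent':
        return Counter(y).most_common(1)[0][0]
    return outlier_label

def outlier_probabilities(classes, outlier_label):
    """Probability row for a sample with no neighbors in the radius:
    raise if no label was given, put 1.0 on the label if it is a
    known class, otherwise return all zeros."""
    if outlier_label is None:
        raise ValueError("outlier detected and outlier_label is None")
    row = [0.0] * len(classes)
    if outlier_label in classes:
        row[classes.index(outlier_label)] = 1.0
    return row
```

For y = [0, 2, 2, 1, 1, 1, 3, 3, 3, 3], 'most_frequent' resolves to 3, a known label 3 yields [0, 0, 0, 1], and a label not present in y yields all zeros, as in the table.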
#### Any other comments?
Documentation is added for both the RadiusNeighborsClassifier class (outlier handling and an example) and the predict_proba function.
<!--
Please be aware that we are a loose team of volunteers so patience is
necessary; assistance handling other issues is very welcome. We value
all user contributions, no matter how minor they are. If we are slow to
review, either the pull request needs some benchmarking, tinkering,
convincing, etc. or more likely the reviewers are simply busy. In either
case, we ask for your understanding during the review process.
For more information, see our FAQ on this topic:
http://scikit-learn.org/dev/faq.html#why-is-my-pull-request-not-getting-any-attention.
Thanks for contributing!
-->
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sklearn/neighbors/classification.py]
(definition of RadiusNeighborsClassifier.fit:)
def fit(self, X, y):
"""Fit the model using X as training data and y as target values
Parameters
----------
X : {array-like, sparse matrix, BallTree, KDTree}
Training data. If array or matrix, shape [n_samples, n_features],
or [n_samples, n_samples] if metric='precomputed'.
y : {array-like, sparse matrix}
Target values of shape = [n_samples] or [n_samples, n_outputs]"""
(definition of RadiusNeighborsClassifier.predict_proba:)
def predict_proba(self, X):
"""Return probability estimates for the test data X.
Parameters
----------
X : array-like, shape (n_query, n_features), or (n_query, n_indexed) if metric == 'precomputed'
Test samples.
Returns
-------
p : array of shape = [n_samples, n_classes], or a list of n_outputs
of such arrays if n_outputs > 1.
The class probabilities of the input samples. Classes are ordered
by lexicographic order."""
[end of new definitions in sklearn/neighbors/classification.py]
</definitions>
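A minimal sketch of the probability estimate described in the `predict_proba` definition above, for a single query point (a hypothetical illustration of the stated semantics, not the scikit-learn code; `radius_predict_proba` is an invented name):

```python
def radius_predict_proba(neighbor_labels, classes, weights=None):
    """Class probabilities for one query point from the labels of its
    neighbors inside the radius; uniform weights if none are given."""
    counts = [0.0] * len(classes)
    if weights is None:
        weights = [1.0] * len(neighbor_labels)
    # accumulate (possibly weighted) votes per class
    for label, w in zip(neighbor_labels, weights):
        counts[classes.index(label)] += w
    total = sum(counts)
    if total == 0:  # outlier: no neighbors found in the radius
        return counts
    # normalize votes into probabilities summing to 1
    return [c / total for c in counts]
```

Three neighbors labelled [0, 0, 1] give [2/3, 1/3], matching the `predict_proba([[1.0]])` doctest in the patch; an empty neighbor list returns all zeros, the case that `predict` then resolves via the fitted outlier label.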
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | c96e0958da46ebef482a4084cdda3285d5f5ad23 | |
sympy__sympy-13104 | 13,104 | sympy/sympy | 1.1 | 280a0115f0d2061e5308860b59b88dc180b62038 | 2017-08-10T00:59:27Z | diff --git a/sympy/combinatorics/homomorphisms.py b/sympy/combinatorics/homomorphisms.py
index aa38e6224a5b..c94ddce4dc51 100644
--- a/sympy/combinatorics/homomorphisms.py
+++ b/sympy/combinatorics/homomorphisms.py
@@ -207,6 +207,39 @@ def is_trivial(self):
'''
return self.image().order() == 1
+ def restrict_to(self, H):
+ '''
+ Return the restriction of the homomorphism to the subgroup `H`
+ of the domain.
+
+ '''
+ if not isinstance(H, PermutationGroup) or not H.is_subgroup(self.domain):
+ raise ValueError("Given H is not a subgroup of the domain")
+ domain = H
+ images = {g: self(g) for g in H.generators}
+ return GroupHomomorphism(domain, self.codomain, images)
+
+ def invert_subgroup(self, H):
+ '''
+ Return the subgroup of the domain that is the inverse image
+ of the subgroup `H` of the homomorphism image
+
+ '''
+ if not H.is_subgroup(self.image()):
+ raise ValueError("Given H is not a subgroup of the image")
+ gens = []
+ P = PermutationGroup(self.image().identity)
+ for h in H.generators:
+ h_i = self.invert(h)
+ if h_i not in P:
+ gens.append(h_i)
+ P = PermutationGroup(gens)
+ for k in self.kernel().generators:
+ if k*h_i not in P:
+ gens.append(k*h_i)
+ P = PermutationGroup(gens)
+ return P
+
def homomorphism(domain, codomain, gens, images=[], check=True):
'''
Create (if possible) a group homomorphism from the group `domain`
@@ -303,3 +336,59 @@ def _image(r):
if not s:
return False
return True
+
+def orbit_homomorphism(group, omega):
+ '''
+ Return the homomorphism induced by the action of the permutation
+ group `group` on the set `omega` that is closed under the action.
+
+ '''
+ from sympy.combinatorics import Permutation
+ from sympy.combinatorics.named_groups import SymmetricGroup
+ codomain = SymmetricGroup(len(omega))
+ identity = codomain.identity
+ omega = list(omega)
+ images = {g: identity*Permutation([omega.index(o^g) for o in omega]) for g in group.generators}
+ group._schreier_sims(base=omega)
+ H = GroupHomomorphism(group, codomain, images)
+ if len(group.basic_stabilizers) > len(omega):
+ H._kernel = group.basic_stabilizers[len(omega)]
+ else:
+ H._kernel = PermutationGroup([group.identity])
+ return H
+
+def block_homomorphism(group, blocks):
+ '''
+ Return the homomorphism induced by the action of the permutation
+ group `group` on the block system `blocks`. The latter should be
+ of the same form as returned by the `minimal_block` method for
+ permutation groups, namely a list of length `group.degree` where
+ the i-th entry is a representative of the block i belongs to.
+
+ '''
+ from sympy.combinatorics import Permutation
+ from sympy.combinatorics.named_groups import SymmetricGroup
+
+ n = len(blocks)
+
+ # number the blocks; m is the total number,
+ # b is such that b[i] is the number of the block i belongs to,
+ # p is the list of length m such that p[i] is the representative
+ # of the i-th block
+ m = 0
+ p = []
+ b = [None]*n
+ for i in range(n):
+ if blocks[i] == i:
+ p.append(i)
+ b[i] = m
+ m += 1
+ for i in range(n):
+ b[i] = b[blocks[i]]
+
+ codomain = SymmetricGroup(m)
+ # the list corresponding to the identity permutation in codomain
+ identity = range(m)
+ images = {g: Permutation([b[p[i]^g] for i in identity]) for g in group.generators}
+ H = GroupHomomorphism(group, codomain, images)
+ return H
diff --git a/sympy/combinatorics/perm_groups.py b/sympy/combinatorics/perm_groups.py
index a0f29963bb94..af0e7765e018 100644
--- a/sympy/combinatorics/perm_groups.py
+++ b/sympy/combinatorics/perm_groups.py
@@ -1867,12 +1867,95 @@ def is_primitive(self, randomized=True):
orbits = stab.orbits()
for orb in orbits:
x = orb.pop()
- if x != 0 and self.minimal_block([0, x]) != [0]*n:
+ if x != 0 and any(e != 0 for e in self.minimal_block([0, x])):
self._is_primitive = False
return False
self._is_primitive = True
return True
+ def minimal_blocks(self, randomized=True):
+ '''
+ For a transitive group, return the list of all minimal
+ block systems. If a group is intransitive, return `False`.
+
+ Examples
+ ========
+ >>> from sympy.combinatorics import Permutation
+ >>> from sympy.combinatorics.perm_groups import PermutationGroup
+ >>> from sympy.combinatorics.named_groups import DihedralGroup
+ >>> DihedralGroup(6).minimal_blocks()
+ [[0, 3, 0, 3, 0, 3], [0, 4, 2, 0, 4, 2]]
+ >>> G = PermutationGroup(Permutation(1,2,5))
+ >>> G.minimal_blocks()
+ False
+
+ See Also
+ ========
+
+ minimal_block, is_transitive, is_primitive
+
+ '''
+ def _number_blocks(blocks):
+ # number the blocks of a block system
+ # in order and return the number of
+ # blocks and the tuple with the
+ # reordering
+ n = len(blocks)
+ appeared = {}
+ m = 0
+ b = [None]*n
+ for i in range(n):
+ if blocks[i] not in appeared:
+ appeared[blocks[i]] = m
+ b[i] = m
+ m += 1
+ else:
+ b[i] = appeared[blocks[i]]
+ return tuple(b), m
+
+ if not self.is_transitive():
+ return False
+ blocks = []
+ num_blocks = []
+ rep_blocks = []
+ if randomized:
+ random_stab_gens = []
+ v = self.schreier_vector(0)
+ for i in range(len(self)):
+ random_stab_gens.append(self.random_stab(0, v))
+ stab = PermutationGroup(random_stab_gens)
+ else:
+ stab = self.stabilizer(0)
+ orbits = stab.orbits()
+ for orb in orbits:
+ x = orb.pop()
+ if x != 0:
+ block = self.minimal_block([0, x])
+ num_block, m = _number_blocks(block)
+ # a representative block (containing 0)
+ rep = set(j for j in range(self.degree) if num_block[j] == 0)
+ # check if the system is minimal with
+ # respect to the already discovere ones
+ minimal = True
+ to_remove = []
+ for i, r in enumerate(rep_blocks):
+ if len(r) > len(rep) and rep.issubset(r):
+ # i-th block system is not minimal
+ del num_blocks[i], blocks[i]
+ to_remove.append(rep_blocks[i])
+ elif len(r) < len(rep) and r.issubset(rep):
+ # the system being checked is not minimal
+ minimal = False
+ break
+ # remove non-minimal representative blocks
+ rep_blocks = [r for r in rep_blocks if r not in to_remove]
+
+ if minimal and num_block not in num_blocks:
+ blocks.append(block)
+ num_blocks.append(num_block)
+ rep_blocks.append(rep)
+ return blocks
+
@property
def is_solvable(self):
"""Test if the group is solvable.
@@ -2676,9 +2759,11 @@ def schreier_sims(self):
return
def _schreier_sims(self, base=None):
- base, strong_gens = self.schreier_sims_incremental(base=base)
+ schreier = self.schreier_sims_incremental(base=base, slp_dict=True)
+ base, strong_gens = schreier[:2]
self._base = base
self._strong_gens = strong_gens
+ self._strong_gens_slp = schreier[2]
if not base:
self._transversals = []
self._basic_orbits = []
@@ -2698,7 +2783,7 @@ def _schreier_sims(self, base=None):
self._basic_orbits = [sorted(x) for x in basic_orbits]
self._transversal_slp = slps
- def schreier_sims_incremental(self, base=None, gens=None):
+ def schreier_sims_incremental(self, base=None, gens=None, slp_dict=False):
"""Extend a sequence of points and generating set to a base and strong
generating set.
@@ -2713,6 +2798,12 @@ def schreier_sims_incremental(self, base=None, gens=None):
relative to the base obtained. Optional parameter with default
value ``self.generators``.
+ slp_dict
+ If `True`, return a dictionary `{g: gens}` for each strong
+ generator `g` where `gens` is a list of strong generators
+ coming before `g` in `strong_gens`, such that the product
+ of the elements of `gens` is equal to `g`.
+
Returns
=======
@@ -2761,7 +2852,8 @@ def schreier_sims_incremental(self, base=None, gens=None):
id_af = list(range(degree))
# handle the trivial group
if len(gens) == 1 and gens[0].is_Identity:
- self._strong_gens_slp = {gens[0]: [gens[0]]}
+ if slp_dict:
+ return base, gens, {gens[0]: [gens[0]]}
return base, gens
# prevent side effects
_base, _gens = base[:], gens[:]
@@ -2854,22 +2946,27 @@ def schreier_sims_incremental(self, base=None, gens=None):
continue
i -= 1
- # create the list of the strong generators strong_gens and rewrite
- # the indices of strong_gens_slp in terms of the elements of strong_gens
strong_gens = _gens[:]
- for k, slp in strong_gens_slp:
- strong_gens.append(k)
- for i in range(len(slp)):
- s = slp[i]
- if isinstance(s[1], tuple):
- slp[i] = strong_gens_distr[s[0]][s[1][0]]**-1
- else:
- slp[i] = strong_gens_distr[s[0]][s[1]]
- strong_gens_slp = dict(strong_gens_slp)
- # add the original generators
- for g in _gens:
- strong_gens_slp[g] = [g]
- self._strong_gens_slp = strong_gens_slp
+
+ if slp_dict:
+ # create the list of the strong generators strong_gens and
+ # rewrite the indices of strong_gens_slp in terms of the
+ # elements of strong_gens
+ for k, slp in strong_gens_slp:
+ strong_gens.append(k)
+ for i in range(len(slp)):
+ s = slp[i]
+ if isinstance(s[1], tuple):
+ slp[i] = strong_gens_distr[s[0]][s[1][0]]**-1
+ else:
+ slp[i] = strong_gens_distr[s[0]][s[1]]
+ strong_gens_slp = dict(strong_gens_slp)
+ # add the original generators
+ for g in _gens:
+ strong_gens_slp[g] = [g]
+ return (_base, strong_gens, strong_gens_slp)
+
+ strong_gens.extend([k for k, _ in strong_gens_slp])
return _base, strong_gens
def schreier_sims_random(self, base=None, gens=None, consec_succ=10,
@@ -3360,7 +3457,7 @@ def update_nu(l):
else:
nu[l] = sorted_orbits[l][temp_index]
# line 29: set the next element from the current branch and update
- # accorndingly
+ # accordingly
c[l] += 1
if l == 0:
gamma = sorted_orbits[l][c[l]]
@@ -3421,6 +3518,177 @@ def transitivity_degree(self):
else:
return self._transitivity_degree
+ def _p_elements_group(G, p):
+ '''
+ For an abelian p-group G return the subgroup consisting of
+ all elements of order p (and the identity)
+
+ '''
+ gens = G.generators[:]
+ gens = sorted(gens, key=lambda x: x.order(), reverse=True)
+ gens_p = [g**(g.order()/p) for g in gens]
+ gens_r = []
+ for i in range(len(gens)):
+ x = gens[i]
+ x_order = x.order()
+ # x_p has order p
+ x_p = x**(x_order/p)
+ if i > 0:
+ P = PermutationGroup(gens_p[:i])
+ else:
+ P = PermutationGroup(G.identity)
+ if x**(x_order/p) not in P:
+ gens_r.append(x**(x_order/p))
+ else:
+ # replace x by an element of order (x.order()/p)
+ # so that gens still generates G
+ g = P.generator_product(x_p, original=True)
+ for s in g:
+ x = x*s**-1
+ x_order = x_order/p
+ # insert x to gens so that the sorting is preserved
+ del gens[i]
+ del gens_p[i]
+ j = i - 1
+ while j < len(gens) and gens[j].order() >= x_order:
+ j += 1
+ gens = gens[:j] + [x] + gens[j:]
+ gens_p = gens_p[:j] + [x] + gens_p[j:]
+ return PermutationGroup(gens_r)
+
+ def sylow_subgroup(self, p):
+ '''
+ Return a p-Sylow subgroup of the group.
+
+ The algorithm is described in [1], Chapter 4, Section 7
+
+ Examples
+ ========
+ >>> from sympy.combinatorics.named_groups import DihedralGroup
+ >>> from sympy.combinatorics.named_groups import SymmetricGroup
+ >>> from sympy.combinatorics.named_groups import AlternatingGroup
+
+ >>> D = DihedralGroup(6)
+ >>> S = D.sylow_subgroup(2)
+ >>> S.order()
+ 4
+ >>> G = SymmetricGroup(6)
+ >>> S = G.sylow_subgroup(5)
+ >>> S.order()
+ 5
+
+ >>> G1 = AlternatingGroup(3)
+ >>> G2 = AlternatingGroup(5)
+ >>> G3 = AlternatingGroup(9)
+
+ >>> S1 = G1.sylow_subgroup(3)
+ >>> S2 = G2.sylow_subgroup(3)
+ >>> S3 = G3.sylow_subgroup(3)
+
+ >>> len1 = len(S1.lower_central_series())
+ >>> len2 = len(S2.lower_central_series())
+ >>> len3 = len(S3.lower_central_series())
+
+ >>> len1 == len2
+ True
+ >>> len1 < len3
+ True
+
+ '''
+ from sympy.combinatorics.homomorphisms import (homomorphism,
+ orbit_homomorphism, block_homomorphism)
+ from sympy.ntheory.primetest import isprime
+
+ if not isprime(p):
+ raise ValueError("p must be a prime")
+
+ def is_p_group(G):
+ # check if the order of G is a power of p
+ # and return the power
+ m = G.order()
+ n = 0
+ while m % p == 0:
+ m = m/p
+ n += 1
+ if m == 1:
+ return True, n
+ return False, n
+
+ def _sylow_reduce(mu, nu):
+ # reduction based on two homomorphisms
+ # mu and nu with trivially intersecting
+ # kernels
+ Q = mu.image().sylow_subgroup(p)
+ Q = mu.invert_subgroup(Q)
+ nu = nu.restrict_to(Q)
+ R = nu.image().sylow_subgroup(p)
+ return nu.invert_subgroup(R)
+
+ order = self.order()
+ if order % p != 0:
+ return PermutationGroup([self.identity])
+ p_group, n = is_p_group(self)
+ if p_group:
+ return self
+
+ # if there is a non-trivial orbit with size not divisible
+ # by p, the sylow subgroup is contained in its stabilizer
+ # (by orbit-stabilizer theorem)
+ orbits = self.orbits()
+ non_p_orbits = [o for o in orbits if len(o) % p != 0 and len(o) != 1]
+ if non_p_orbits:
+ G = self.stabilizer(list(non_p_orbits[0]).pop())
+ return G.sylow_subgroup(p)
+
+ if not self.is_transitive():
+ # apply _sylow_reduce to orbit actions
+ orbits = sorted(orbits, key = lambda x: len(x))
+ omega1 = orbits.pop()
+ omega2 = orbits[0].union(*orbits)
+ mu = orbit_homomorphism(self, omega1)
+ nu = orbit_homomorphism(self, omega2)
+ return _sylow_reduce(mu, nu)
+
+ blocks = self.minimal_blocks()
+ if len(blocks) > 1:
+ # apply _sylow_reduce to block system actions
+ mu = block_homomorphism(self, blocks[0])
+ nu = block_homomorphism(self, blocks[1])
+ return _sylow_reduce(mu, nu)
+ elif len(blocks) == 1:
+ block = list(blocks)[0]
+ if any(e != 0 for e in block):
+ # self is imprimitive
+ mu = block_homomorphism(self, block)
+ if not is_p_group(mu.image())[0]:
+ S = mu.image().sylow_subgroup(p)
+ return mu.invert_subgroup(S).sylow_subgroup(p)
+
+ # find an element of order p
+ g = self.random()
+ g_order = g.order()
+ while g_order % p != 0 or g_order == 0:
+ g = self.random()
+ g_order = g.order()
+ g = g**(g_order // p)
+ if order % p**2 != 0:
+ return PermutationGroup(g)
+
+ C = self.centralizer(g)
+ while C.order() % p**n != 0:
+ S = C.sylow_subgroup(p)
+ s_order = S.order()
+ Z = S.center()
+ P = Z._p_elements_group(p)
+ h = P.random()
+ C_h = self.centralizer(h)
+ while C_h.order() % p*s_order != 0:
+ h = P.random()
+ C_h = self.centralizer(h)
+ C = C_h
+
+ return C.sylow_subgroup(p)
+
def presentation(G):
'''
Return an `FpGroup` presentation of the group.
@@ -3535,6 +3803,7 @@ def _rewrite(group, w):
return simplify_presentation(G_p)
+
def _orbit(degree, generators, alpha, action='tuples'):
r"""Compute the orbit of alpha `\{g(\alpha) | g \in G\}` as a set.
| diff --git a/sympy/combinatorics/tests/test_perm_groups.py b/sympy/combinatorics/tests/test_perm_groups.py
index 56d4e461c9d0..87da3838a93b 100644
--- a/sympy/combinatorics/tests/test_perm_groups.py
+++ b/sympy/combinatorics/tests/test_perm_groups.py
@@ -460,6 +460,15 @@ def test_minimal_block():
assert P1.minimal_block([0, 2]) == [0, 3, 0, 3, 0, 3]
assert P2.minimal_block([0, 2]) == [0, 3, 0, 3, 0, 3]
+def test_minimal_blocks():
+ P = PermutationGroup(Permutation(1, 5)(2, 4), Permutation(0, 1, 2, 3, 4, 5))
+ assert P.minimal_blocks() == [[0, 3, 0, 3, 0, 3], [0, 4, 5, 0, 4, 5]]
+
+ P = SymmetricGroup(5)
+ assert P.minimal_blocks() == [[0]*5]
+
+ P = PermutationGroup(Permutation(0, 3))
+ assert P.minimal_blocks() == False
def test_max_div():
S = SymmetricGroup(10)
@@ -778,6 +787,47 @@ def test_generator_product():
w = g*w
assert w == p
+def test_sylow_subgroup():
+ P = PermutationGroup(Permutation(1, 5)(2, 4), Permutation(0, 1, 2, 3, 4, 5))
+ S = P.sylow_subgroup(2)
+ assert S.order() == 4
+
+ P = DihedralGroup(12)
+ S = P.sylow_subgroup(3)
+ assert S.order() == 3
+
+ P = PermutationGroup(Permutation(1, 5)(2, 4), Permutation(0, 1, 2, 3, 4, 5), Permutation(0, 2))
+ S = P.sylow_subgroup(3)
+ assert S.order() == 9
+ S = P.sylow_subgroup(2)
+ assert S.order() == 8
+
+ P = SymmetricGroup(10)
+ S = P.sylow_subgroup(2)
+ assert S.order() == 256
+ S = P.sylow_subgroup(3)
+ assert S.order() == 81
+ S = P.sylow_subgroup(5)
+ assert S.order() == 25
+
+ # the length of the lower central series
+ # of a p-Sylow subgroup of Sym(n) grows with
+ # the highest exponent exp of p such
+ # that n >= p**exp
+ exp = 1
+ length = 0
+ for i in range(2, 9):
+ P = SymmetricGroup(i)
+ S = P.sylow_subgroup(2)
+ ls = S.lower_central_series()
+ if i // 2**exp > 0:
+ # length increases with exponent
+ assert len(ls) > length
+ length = len(ls)
+ exp += 1
+ else:
+ assert len(ls) == length
+
def test_presentation():
def _test(P):
G = P.presentation()
| [
{
"components": [
{
"doc": "Return the restriction of the homomorphism to the subgroup `H`\nof the domain.",
"lines": [
210,
220
],
"name": "GroupHomomorphism.restrict_to",
"signature": "def restrict_to(self, H):",
"type": "function"
... | [
"test_minimal_blocks",
"test_sylow_subgroup"
] | [
"test_has",
"test_generate",
"test_order",
"test_equality",
"test_stabilizer",
"test_center",
"test_centralizer",
"test_coset_rank",
"test_coset_factor",
"test_orbits",
"test_is_normal",
"test_eq",
"test_derived_subgroup",
"test_is_solvable",
"test_rubik1",
"test_direct_product",
"te... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
group theory: Implemented Sylow Subgroups
This PR adds the method `sylow_subgroup` to the `PermutationGroup` class for computing Sylow subgroups. As part of this:
- Implemented orbit and block system action homomorphisms (`orbit_homomorphism` and `block_homomorphism` in the homomorphisms module).
- Added the method `minimal_blocks` for returning all minimal block systems of a group
- Fixed a bug related to the `_strong_gens_slp` attribute: it used to be set inside `schreier_sims_incremental`; however, this method is called by the `normalizer` and `stabilizer` methods, which led to a key mismatch with respect to `strong_gens`.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sympy/combinatorics/homomorphisms.py]
(definition of GroupHomomorphism.restrict_to:)
def restrict_to(self, H):
"""Return the restriction of the homomorphism to the subgroup `H`
of the domain."""
(definition of GroupHomomorphism.invert_subgroup:)
def invert_subgroup(self, H):
"""Return the subgroup of the domain that is the inverse image
of the subgroup `H` of the homomorphism image"""
(definition of orbit_homomorphism:)
def orbit_homomorphism(group, omega):
"""Return the homomorphism induced by the action of the permutation
group `group` on the set `omega` that is closed under the action."""
(definition of block_homomorphism:)
def block_homomorphism(group, blocks):
"""Return the homomorphism induced by the action of the permutation
group `group` on the block system `blocks`. The latter should be
of the same form as returned by the `minimal_block` method for
permutation groups, namely a list of length `group.degree` where
the i-th entry is a representative of the block i belongs to."""
[end of new definitions in sympy/combinatorics/homomorphisms.py]
[start of new definitions in sympy/combinatorics/perm_groups.py]
(definition of PermutationGroup.minimal_blocks:)
def minimal_blocks(self, randomized=True):
"""For a transitive group, return the list of all minimal
block systems. If a group is intransitive, return `False`.
Examples
========
>>> from sympy.combinatorics import Permutation
>>> from sympy.combinatorics.perm_groups import PermutationGroup
>>> from sympy.combinatorics.named_groups import DihedralGroup
>>> DihedralGroup(6).minimal_blocks()
[[0, 3, 0, 3, 0, 3], [0, 4, 2, 0, 4, 2]]
>>> G = PermutationGroup(Permutation(1,2,5))
>>> G.minimal_blocks()
False
See Also
========
minimal_block, is_transitive, is_primitive"""
(definition of PermutationGroup.minimal_blocks._number_blocks:)
def _number_blocks(blocks):
(definition of PermutationGroup._p_elements_group:)
def _p_elements_group(G, p):
"""For an abelian p-group G return the subgroup consisting of
all elements of order p (and the identity)"""
(definition of PermutationGroup.sylow_subgroup:)
def sylow_subgroup(self, p):
"""Return a p-Sylow subgroup of the group.
The algorithm is described in [1], Chapter 4, Section 7
Examples
========
>>> from sympy.combinatorics.named_groups import DihedralGroup
>>> from sympy.combinatorics.named_groups import SymmetricGroup
>>> from sympy.combinatorics.named_groups import AlternatingGroup
>>> D = DihedralGroup(6)
>>> S = D.sylow_subgroup(2)
>>> S.order()
4
>>> G = SymmetricGroup(6)
>>> S = G.sylow_subgroup(5)
>>> S.order()
5
>>> G1 = AlternatingGroup(3)
>>> G2 = AlternatingGroup(5)
>>> G3 = AlternatingGroup(9)
>>> S1 = G1.sylow_subgroup(3)
>>> S2 = G2.sylow_subgroup(3)
>>> S3 = G3.sylow_subgroup(3)
>>> len1 = len(S1.lower_central_series())
>>> len2 = len(S2.lower_central_series())
>>> len3 = len(S3.lower_central_series())
>>> len1 == len2
True
>>> len1 < len3
True"""
(definition of PermutationGroup.sylow_subgroup.is_p_group:)
def is_p_group(G):
(definition of PermutationGroup.sylow_subgroup._sylow_reduce:)
def _sylow_reduce(mu, nu):
[end of new definitions in sympy/combinatorics/perm_groups.py]
</definitions>
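The block-system encoding used throughout these definitions (and by `minimal_block`/`minimal_blocks`) stores, for each point, a representative of its block, e.g. `[0, 3, 0, 3, 0, 3]`. A `block_homomorphism` implementation typically needs to relabel those representatives as consecutive integers 0..k-1 before building the image permutations. A minimal standalone sketch of that relabeling step (the helper name `block_labels` is illustrative, not part of the API):

```python
def block_labels(blocks):
    """Relabel block representatives as consecutive integers 0..k-1,
    in order of first appearance."""
    seen = {}
    return [seen.setdefault(b, len(seen)) for b in blocks]

print(block_labels([0, 3, 0, 3, 0, 3]))  # [0, 1, 0, 1, 0, 1]
```

With the labels in hand, the image of a group element under the block action is the permutation it induces on these k labels.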
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | ec9e3c0436fbff934fa84e22bf07f1b3ef5bfac3 | ||
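The Sylow subgroup orders asserted above for `SymmetricGroup(10)` (256, 81, 25) can be checked independently: the order of a p-Sylow subgroup of Sym(n) is `p**v`, where `v` is the p-adic valuation of `n!` given by Legendre's formula. A small standalone sketch (plain Python, no sympy needed):

```python
def sylow_order(n, p):
    """Order of a p-Sylow subgroup of Sym(n): p raised to
    v_p(n!) = n//p + n//p**2 + ... (Legendre's formula)."""
    exp, q = 0, p
    while q <= n:
        exp += n // q
        q *= p
    return p ** exp

print(sylow_order(10, 2))  # 256
print(sylow_order(10, 3))  # 81
print(sylow_order(10, 5))  # 25
```

These match the `test_sylow_subgroup` assertions for `SymmetricGroup(10)` above, and `sylow_order(6, 5) == 5` matches the `SymmetricGroup(6)` doctest.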
joke2k__faker-568 | 568 | joke2k/faker | null | c12a23f112265bf051d720a3758f9919631734ab | 2017-08-09T17:34:37Z | diff --git a/faker/providers/date_time/__init__.py b/faker/providers/date_time/__init__.py
index f4d2626bc3..3bb3c28b32 100644
--- a/faker/providers/date_time/__init__.py
+++ b/faker/providers/date_time/__init__.py
@@ -26,6 +26,20 @@ def datetime_to_timestamp(dt):
return timegm(dt.timetuple())
+def timestamp_to_datetime(timestamp, tzinfo):
+ if tzinfo is None:
+ pick = datetime.fromtimestamp(timestamp, tzlocal())
+ pick = pick.astimezone(tzutc()).replace(tzinfo=None)
+ else:
+ pick = datetime.fromtimestamp(timestamp, tzinfo)
+
+ return pick
+
+
+class ParseError(ValueError):
+ pass
+
+
timedelta_pattern = r''
for name, sym in [('years', 'y'), ('weeks', 'w'), ('days', 'd'), ('hours', 'h'), ('minutes', 'm'), ('seconds', 's')]:
timedelta_pattern += r'((?P<{0}>(?:\+|-)\d+?){1})?'.format(name, sym)
@@ -316,6 +330,37 @@ def time_object(cls):
"""
return cls.date_time().time()
+ @classmethod
+ def _parse_date_string(cls, value):
+ parts = cls.regex.match(value)
+ if not parts:
+ raise ParseError("Can't parse date string `{}`.".format(value))
+ parts = parts.groupdict()
+ time_params = {}
+ for (name, param) in parts.items():
+ if param:
+ time_params[name] = int(param)
+
+ if 'years' in time_params:
+ if 'days' not in time_params:
+ time_params['days'] = 0
+ time_params['days'] += 365.24 * time_params.pop('years')
+
+ if not time_params:
+ raise ParseError("Can't parse date string `{}`.".format(value))
+ return time_params
+
+ @classmethod
+ def _parse_timedelta(cls, value):
+ if isinstance(value, timedelta):
+ return value.total_seconds()
+ if is_string(value):
+ time_params = cls._parse_date_string(value)
+ return timedelta(**time_params).total_seconds()
+ if isinstance(value, (int, float)):
+ return value
+ raise ParseError("Invalid format for timedelta '{0}'".format(value))
+
@classmethod
def _parse_date_time(cls, text, tzinfo=None):
if isinstance(text, (datetime, date, real_datetime, real_date)):
@@ -326,24 +371,11 @@ def _parse_date_time(cls, text, tzinfo=None):
if is_string(text):
if text == 'now':
return datetime_to_timestamp(datetime.now(tzinfo))
- parts = cls.regex.match(text)
- if not parts:
- return
- parts = parts.groupdict()
- time_params = {}
- for (name, param) in parts.items():
- if param:
- time_params[name] = int(param)
-
- if 'years' in time_params:
- if 'days' not in time_params:
- time_params['days'] = 0
- time_params['days'] += 365.24 * time_params.pop('years')
-
+ time_params = cls._parse_date_string(text)
return datetime_to_timestamp(now + timedelta(**time_params))
if isinstance(text, int):
return datetime_to_timestamp(now + timedelta(text))
- raise ValueError("Invalid format for date '{0}'".format(text))
+ raise ParseError("Invalid format for date '{0}'".format(text))
@classmethod
def date_time_between(cls, start_date='-30y', end_date='now', tzinfo=None):
@@ -552,6 +584,35 @@ def date_time_this_month(cls, before_now=True, after_now=False, tzinfo=None):
else:
return now
+ @classmethod
+ def time_series(cls, start_date='-30d', end_date='now', precision=None, distrib=None, tzinfo=None):
+ """
+ Returns a generator yielding tuples of ``(<datetime>, <value>)``.
+
+ The data points will start at ``start_date``, and be at every time interval specified by
+ ``precision``.
+ """
+ start_date = cls._parse_date_time(start_date, tzinfo=tzinfo)
+ end_date = cls._parse_date_time(end_date, tzinfo=tzinfo)
+
+ if end_date < start_date:
+ raise ValueError("`end_date` must be greater than `start_date`.")
+
+ if precision is None:
+ precision = (end_date - start_date) / 30
+ precision = cls._parse_timedelta(precision)
+
+ if distrib is None:
+ distrib = lambda: random.uniform(0, precision) # noqa
+
+ if not callable(distrib):
+ raise ValueError("`distrib` must be a callable. Got {} instead.".format(distrib))
+
+ datapoint = start_date
+ while datapoint < end_date:
+ datapoint += precision
+ yield (timestamp_to_datetime(datapoint, tzinfo), distrib())
+
@classmethod
def am_pm(cls):
return cls.date('%p')
| diff --git a/tests/providers/date_time.py b/tests/providers/date_time.py
index c132384eeb..a4d405bc8a 100644
--- a/tests/providers/date_time.py
+++ b/tests/providers/date_time.py
@@ -231,6 +231,51 @@ def test_date_time_this_period_with_tzinfo(self):
datetime.now(utc).replace(second=0, microsecond=0)
)
+ def test_parse_timedelta(self):
+ from faker.providers.date_time import Provider
+
+ td = timedelta(days=7)
+ seconds = Provider._parse_timedelta(td)
+ self.assertEqual(seconds, 604800.0)
+
+ seconds = Provider._parse_timedelta('+1w')
+ self.assertEqual(seconds, 604800.0)
+
+ seconds = Provider._parse_timedelta('+1y')
+ self.assertEqual(seconds, 31556736.0)
+
+ with self.assertRaises(ValueError):
+ Provider._parse_timedelta('foobar')
+
+ def test_time_series(self):
+ from faker.providers.date_time import Provider
+
+ series = [i for i in Provider.time_series()]
+ self.assertTrue(len(series), 30)
+ self.assertTrue(series[1][0] - series[0][0], timedelta(days=1))
+
+ uniform = lambda: random.uniform(0, 5) # noqa
+ series = [i for i in Provider.time_series('now', '+1w', '+1d', uniform)]
+ self.assertTrue(len(series), 7)
+ self.assertTrue(series[1][0] - series[0][0], timedelta(days=1))
+
+ end = datetime.now() + timedelta(days=7)
+ series = [i for i in Provider.time_series('now', end, '+1d', uniform)]
+ self.assertTrue(len(series), 7)
+ self.assertTrue(series[1][0] - series[0][0], timedelta(days=1))
+
+ self.assertTrue(series[-1][0] <= end)
+
+ with self.assertRaises(ValueError):
+ [i for i in Provider.time_series('+1w', 'now', '+1d', uniform)]
+
+ with self.assertRaises(ValueError):
+ [i for i in Provider.time_series('now', '+1w', '+1d', 'uniform')]
+
+ series = [i for i in Provider.time_series('now', end, '+1d', uniform, tzinfo=utc)]
+ self.assertTrue(len(series), 7)
+ self.assertTrue(series[1][0] - series[0][0], timedelta(days=1))
+
class TestPlPL(unittest.TestCase):
| [
{
"components": [
{
"doc": "",
"lines": [
29,
36
],
"name": "timestamp_to_datetime",
"signature": "def timestamp_to_datetime(timestamp, tzinfo):",
"type": "function"
},
{
"doc": "",
"lines": [
39,
... | [
"tests/providers/date_time.py::TestDateTime::test_parse_timedelta",
"tests/providers/date_time.py::TestDateTime::test_time_series"
] | [
"tests/providers/date_time.py::TestDateTime::test_date_object",
"tests/providers/date_time.py::TestDateTime::test_date_time_between_dates",
"tests/providers/date_time.py::TestDateTime::test_date_time_between_dates_with_tzinfo",
"tests/providers/date_time.py::TestDateTime::test_date_time_this_period",
"tests... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Close #159. Add `time_series` provider
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in faker/providers/date_time/__init__.py]
(definition of timestamp_to_datetime:)
def timestamp_to_datetime(timestamp, tzinfo):
(definition of ParseError:)
class ParseError(ValueError):
(definition of Provider._parse_date_string:)
def _parse_date_string(cls, value):
(definition of Provider._parse_timedelta:)
def _parse_timedelta(cls, value):
(definition of Provider.time_series:)
def time_series(cls, start_date='-30d', end_date='now', precision=None, distrib=None, tzinfo=None):
"""Returns a generator yielding tuples of ``(<datetime>, <value>)``.
The data points will start at ``start_date``, and be at every time interval specified by
``precision``."""
[end of new definitions in faker/providers/date_time/__init__.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
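The behaviour pinned down by `test_parse_timedelta` above ('+1w' -> 604800.0, '+1y' -> 31556736.0, i.e. a year approximated as 365.24 days) can be sketched standalone. The regex below mirrors the `timedelta_pattern` built at the top of the provider in the patch, but the parsing is shown as a plain function rather than the provider classmethods:

```python
import re
from datetime import timedelta

# Same shape as the provider's timedelta_pattern: one optional signed
# integer per unit, each tagged with its suffix letter.
pattern = ''
for name, sym in [('years', 'y'), ('weeks', 'w'), ('days', 'd'),
                  ('hours', 'h'), ('minutes', 'm'), ('seconds', 's')]:
    pattern += r'((?P<{0}>(?:\+|-)\d+?){1})?'.format(name, sym)
regex = re.compile(pattern)

def parse_timedelta_string(value):
    """Convert a string like '+1w' or '-30d' to a number of seconds."""
    parts = {k: int(v) for k, v in regex.match(value).groupdict().items() if v}
    if not parts:
        raise ValueError("Can't parse date string `{}`.".format(value))
    # Years are approximated as 365.24 days, as in the provider.
    if 'years' in parts:
        parts['days'] = parts.get('days', 0) + 365.24 * parts.pop('years')
    return timedelta(**parts).total_seconds()

print(parse_timedelta_string('+1w'))  # 604800.0
print(parse_timedelta_string('+1y'))  # 31556736.0
```

Because every unit group in the regex is optional, a string like `'foobar'` matches with no captures, which is why the empty-`parts` check is what raises `ValueError`.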
<<END>> | Here is the discussion in the issues of the pull request.
<issues>
Is faking time series data in the scope of this project?
I wanted some fake time series data for a project and couldn't find anything suitable for my needs.
Is something like [this](http://www.xaprb.com/blog/2014/01/24/methods-generate-realistic-time-series-data/) in the scope of this project?
----------
Yes, it is absolutely in scope. Do you have some ideas on how to implement that?
Interested in this also -- it's been 3 years...
@dodysw It's been 3 years and we haven't received any PR for it yet. Could you prepare one?
I could invest some hours in this.
Is there an expectation for this PR? E.g. what's the input, what's the output?
Currently I have a simple implementation that generates x random datetimes between two datetimes, following a normal distribution. This list of datetimes can then be iterated as an attribute of, or input to, the actual time series data generator.
The post above mentioned [this blog](https://www.xaprb.com/blog/2014/01/24/methods-generate-realistic-time-series-data/) which seems overall a great source of ideas, but I don't have the time to implement all of them in one PR.
Don't know yet if I can help, but I'm sure interested.
One PR is certainly not enough. Referring to '[The Cathedral and the Bazaar](https://en.wikipedia.org/wiki/The_Cathedral_and_the_Bazaar)', lesson 7: **Release early. Release often. And listen to your customers**
The other lessons are great for inspiration also ...
What should the output be? Just a list of `datetime`s?
I would prefer an iterable list instead of a 'simple' list. The applications I have in mind often need to augment (or combine) the series with other data. That way, it's easy to add more data to the items in the series.
E.g. a date series that represents the number of (simulated) visitors of a museum/airport/... at a given moment.
I'm thinking of a method with this signature:
```
def time_serie(start, end, precision, distribution):
```
`distribution` will be the function generating the values. `precision` is basically 'how often you want a datapoint'.
For example:
```
>>> import random
>>> from datetime import datetime, timedelta
>>> distribution = lambda: random.uniform(0, 5)
>>> time_serie(datetime.now(), datetime.now()+timedelta(days=7), precision=timedelta(days=1), distribution=distribution)
```
The output could be an iterable of tuples, something like:
```
(
(datetime.datetime(2017, 8, 8, 17, 9, 0, 569562), 0.767017023571912),
(datetime.datetime(2017, 8, 9, 17, 9, 0, 569562), 0.550568452469854),
(datetime.datetime(2017, 8, 10, 17, 9, 0, 569562), 1.8060492750124613),
(datetime.datetime(2017, 8, 11, 17, 9, 0, 569562), 1.0195647347289332),
(datetime.datetime(2017, 8, 12, 17, 9, 0, 569562), 4.063748959116069),
(datetime.datetime(2017, 8, 13, 17, 9, 0, 569562), 1.2355522029691586),
(datetime.datetime(2017, 8, 14, 17, 9, 0, 569562), 3.9570822924224434),
)
```
That seems very workable. My idea was to have a 'class' with an initializer where you could set (most) of the parameters, and then a method to generate the series. That was more closely aligned with the idea described in the blog. But I don't object to your suggestion at all. The first thing is to have something workable, then we can fine-tune.
I've got a draft working, let me know if anything wrong jumps out at you
https://gist.github.com/fcurella/8316c3a27c322eb0e02c3d4714de15c6
--------------------
</issues> | 6edfdbf6ae90b0153309e3bf066aa3b2d16494a7 | |
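The API proposed in the discussion above reduces to a short generator. Below is a self-contained sketch of that idea, producing the `(datetime, value)` tuples shown in the example output; unlike the final `time_series` provider method, it takes datetimes and a `timedelta` directly and does no string parsing or timezone handling:

```python
import random
from datetime import datetime, timedelta

def time_series(start, end, precision, distrib=None):
    """Yield (datetime, value) tuples from start to end, one per
    `precision` step, with values drawn from `distrib`."""
    if end < start:
        raise ValueError("`end` must be greater than `start`.")
    if distrib is None:
        distrib = lambda: random.uniform(0, 5)
    point = start
    while point < end:
        point += precision
        yield (point, distrib())

start = datetime(2017, 8, 8, 17, 9)
series = list(time_series(start, start + timedelta(days=7), timedelta(days=1)))
print(len(series))  # 7
```

Note that, as a generator, the `ValueError` for a reversed range is only raised when iteration begins, which matches how the `test_time_series` assertions above wrap the generator in a list comprehension.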
joke2k__faker-561 | 561 | joke2k/faker | null | 3f6aec0b9be140cb0cdccf022cd459936a21c4e4 | 2017-07-23T09:01:29Z | diff --git a/faker/providers/person/pl_PL/__init__.py b/faker/providers/person/pl_PL/__init__.py
index aef9f9d167..94d491509a 100644
--- a/faker/providers/person/pl_PL/__init__.py
+++ b/faker/providers/person/pl_PL/__init__.py
@@ -3,6 +3,24 @@
from .. import Provider as PersonProvider
+def checksum_identity_card_number(characters):
+ """
+ Calculates and returns a control digit for given list of characters basing on Identity Card Number standards.
+ """
+ weights_for_check_digit = [7, 3, 1, 0, 7, 3, 1, 7, 3]
+ check_digit = 0
+
+ for i in range(3):
+ check_digit += weights_for_check_digit[i] * (ord(characters[i]) - 55)
+
+ for i in range(4, 9):
+ check_digit += weights_for_check_digit[i] * characters[i]
+
+ check_digit %= 10
+
+ return check_digit
+
+
class Provider(PersonProvider):
formats = (
'{{first_name}} {{last_name}}',
@@ -527,3 +545,29 @@ class Provider(PersonProvider):
@classmethod
def last_name(cls):
return cls.random_element(cls.unisex_last_names)
+
+ @classmethod
+ def identity_card_number(cls):
+ """
+ Returns 9 character Polish Identity Card Number,
+ Polish: Numer Dowodu Osobistego.
+
+ The card number consists of 3 letters followed by 6 digits (for example, ABA300000),
+ of which the first digit (at position 3) is the check digit.
+
+ https://en.wikipedia.org/wiki/Polish_identity_card
+ """
+ identity = []
+
+ for _ in range(3):
+ identity.append(cls.random_letter().upper())
+
+ # it will be overwritten by a checksum
+ identity.append(0)
+
+ for _ in range(5):
+ identity.append(cls.random_digit())
+
+ identity[3] = checksum_identity_card_number(identity)
+
+ return ''.join(str(character) for character in identity)
| diff --git a/tests/providers/person.py b/tests/providers/person.py
index 49732ae888..925fd3e7fc 100644
--- a/tests/providers/person.py
+++ b/tests/providers/person.py
@@ -2,11 +2,14 @@
from __future__ import unicode_literals
+import re
import unittest
from faker import Factory
from faker.providers.person.ne_NP import Provider as NeProvider
from faker.providers.person.sv_SE import Provider as SvSEProvider
+from faker.providers.person.pl_PL import (Provider as PlProvider,
+ checksum_identity_card_number as pl_checksum_identity_card_number)
from .. import string_types
@@ -100,3 +103,20 @@ def test_gender_first_names(self):
self.assertIn(name, SvSEProvider.first_names_male)
name = self.factory.first_name()
self.assertIn(name, SvSEProvider.first_names)
+
+
+class TestPlPL(unittest.TestCase):
+
+ def setUp(self):
+ self.factory = Factory.create('pl_PL')
+
+ def test_identity_card_number_checksum(self):
+ self.assertEqual(pl_checksum_identity_card_number(['A', 'I', 'S', 8, 5, 0, 2, 1, 4]), 8)
+ self.assertEqual(pl_checksum_identity_card_number(['A', 'U', 'L', 9, 2, 7, 2, 8, 5]), 9)
+ self.assertEqual(pl_checksum_identity_card_number(['A', 'E', 'I', 2, 5, 1, 8, 2, 4]), 2)
+ self.assertEqual(pl_checksum_identity_card_number(['A', 'H', 'F', 2, 2, 0, 6, 8, 0]), 2)
+ self.assertEqual(pl_checksum_identity_card_number(['A', 'X', 'E', 8, 2, 0, 3, 4, 0]), 8)
+
+ def test_identity_card_number(self):
+ for _ in range(100):
+ self.assertTrue(re.search(r'^[A-Z]{3}\d{6}$', PlProvider.identity_card_number()))
| [
{
"components": [
{
"doc": "Calculates and returns a control digit for given list of characters basing on Identity Card Number standards.",
"lines": [
6,
21
],
"name": "checksum_identity_card_number",
"signature": "def checksum_identity_card_numb... | [
"tests/providers/person.py::TestJaJP::test_person",
"tests/providers/person.py::TestNeNP::test_names",
"tests/providers/person.py::TestSvSE::test_gender_first_names",
"tests/providers/person.py::TestPlPL::test_identity_card_number",
"tests/providers/person.py::TestPlPL::test_identity_card_number_checksum"
] | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Adding an implementation of `identity_card_number` for pl_PL person provider
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in faker/providers/person/pl_PL/__init__.py]
(definition of checksum_identity_card_number:)
def checksum_identity_card_number(characters):
"""Calculates and returns a control digit for given list of characters basing on Identity Card Number standards."""
(definition of Provider.identity_card_number:)
def identity_card_number(cls):
"""Returns 9 character Polish Identity Card Number,
Polish: Numer Dowodu Osobistego.
The card number consists of 3 letters followed by 6 digits (for example, ABA300000),
of which the first digit (at position 3) is the check digit.
https://en.wikipedia.org/wiki/Polish_identity_card"""
[end of new definitions in faker/providers/person/pl_PL/__init__.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 6edfdbf6ae90b0153309e3bf066aa3b2d16494a7 | ||
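The checksum scheme in the patch above (weights `[7, 3, 1, 0, 7, 3, 1, 7, 3]`, letters mapped to 10..35 via `ord(c) - 55`, and position 3 reserved for the check digit itself, hence its weight of 0) can be replicated standalone; the expected values below come from the `test_identity_card_number_checksum` cases:

```python
def checksum_identity_card_number(characters):
    """Control digit for a Polish identity card number
    (a 9-item list: 3 letters, then 6 digits)."""
    weights = [7, 3, 1, 0, 7, 3, 1, 7, 3]
    # Letters 'A'..'Z' map to 10..35 via ord(c) - 55; index 3 is where
    # the check digit itself lives, so it is skipped (weight 0).
    total = sum(weights[i] * (ord(characters[i]) - 55) for i in range(3))
    total += sum(weights[i] * characters[i] for i in range(4, 9))
    return total % 10

print(checksum_identity_card_number(['A', 'I', 'S', 8, 5, 0, 2, 1, 4]))  # 8
```

For example, for `AIS850214`: letters contribute 7*10 + 3*18 + 1*28 = 152, digits at positions 4-8 contribute 7*5 + 3*0 + 1*2 + 7*1 + 3*4 = 56, and (152 + 56) % 10 = 8, matching the test case.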
scikit-learn__scikit-learn-9424 | 9,424 | scikit-learn/scikit-learn | 0.21 | 4c2eb3a0d67cbdacdb9314f585b2f73590ff0ac8 | 2017-07-20T13:05:03Z | diff --git a/doc/modules/classes.rst b/doc/modules/classes.rst
index e09ca0422d8a7..86db1d361a639 100644
--- a/doc/modules/classes.rst
+++ b/doc/modules/classes.rst
@@ -1400,6 +1400,7 @@ Low-level methods
tree.export_graphviz
tree.plot_tree
+ tree.export_text
.. _utils_ref:
diff --git a/doc/modules/tree.rst b/doc/modules/tree.rst
index affe46385d79b..bda9a5721835e 100644
--- a/doc/modules/tree.rst
+++ b/doc/modules/tree.rst
@@ -182,6 +182,29 @@ render these plots inline automatically::
:align: center
:scale: 75
+Alternatively, the tree can also be exported in textual format with the
+function :func:`export_text`. This method doesn't require the installation
+of external libraries and is more compact:
+
+ >>> from sklearn.datasets import load_iris
+ >>> from sklearn.tree import DecisionTreeClassifier
+ >>> from sklearn.tree.export import export_text
+ >>> iris = load_iris()
+ >>> X = iris['data']
+ >>> y = iris['target']
+ >>> decision_tree = DecisionTreeClassifier(random_state=0, max_depth=2)
+ >>> decision_tree = decision_tree.fit(X, y)
+ >>> r = export_text(decision_tree, feature_names=iris['feature_names'])
+ >>> print(r)
+ |--- petal width (cm) <= 0.80
+ | |--- class: 0
+ |--- petal width (cm) > 0.80
+ | |--- petal width (cm) <= 1.75
+ | | |--- class: 1
+ | |--- petal width (cm) > 1.75
+ | | |--- class: 2
+ <BLANKLINE>
+
.. topic:: Examples:
* :ref:`sphx_glr_auto_examples_tree_plot_iris.py`
diff --git a/doc/whats_new/v0.21.rst b/doc/whats_new/v0.21.rst
index d2b5596e12d38..02159ddc86bcf 100644
--- a/doc/whats_new/v0.21.rst
+++ b/doc/whats_new/v0.21.rst
@@ -199,6 +199,10 @@ Support for Python 3.4 and below has been officially dropped.
:func:`tree.plot_tree` without relying on the ``dot`` library,
removing a hard-to-install dependency. :issue:`8508` by `Andreas Müller`_.
+- |Feature| Decision Trees can now be exported in a human readable
+ textual format using :func:`tree.export.export_text`.
+ :issue:`6261` by `Giuseppe Vettigli <JustGlowing>`.
+
- |Feature| ``get_n_leaves()`` and ``get_depth()`` have been added to
:class:`tree.BaseDecisionTree` and consequently all estimators based
on it, including :class:`tree.DecisionTreeClassifier`,
diff --git a/sklearn/tree/__init__.py b/sklearn/tree/__init__.py
index b3abe30d019fa..e91540bed8c5f 100644
--- a/sklearn/tree/__init__.py
+++ b/sklearn/tree/__init__.py
@@ -7,8 +7,8 @@
from .tree import DecisionTreeRegressor
from .tree import ExtraTreeClassifier
from .tree import ExtraTreeRegressor
-from .export import export_graphviz, plot_tree
+from .export import export_graphviz, plot_tree, export_text
__all__ = ["DecisionTreeClassifier", "DecisionTreeRegressor",
"ExtraTreeClassifier", "ExtraTreeRegressor", "export_graphviz",
- "plot_tree"]
+ "plot_tree", "export_text"]
diff --git a/sklearn/tree/export.py b/sklearn/tree/export.py
index 017275cfb1c19..e45fcfdf7d71a 100644
--- a/sklearn/tree/export.py
+++ b/sklearn/tree/export.py
@@ -9,6 +9,7 @@
# Satrajit Gosh <satrajit.ghosh@gmail.com>
# Trevor Stephens <trev.stephens@gmail.com>
# Li Li <aiki.nogard@gmail.com>
+# Giuseppe Vettigli <vettigli@gmail.com>
# License: BSD 3 clause
import warnings
from io import StringIO
@@ -22,6 +23,7 @@
from . import _criterion
from . import _tree
from ._reingold_tilford import buchheim, Tree
+from . import DecisionTreeClassifier
def _color_brew(n):
@@ -778,3 +780,178 @@ def export_graphviz(decision_tree, out_file=None, max_depth=None,
finally:
if own_file:
out_file.close()
+
+
+def _compute_depth(tree, node):
+ """
+ Returns the depth of the subtree rooted in node.
+ """
+ def compute_depth_(current_node, current_depth,
+ children_left, children_right, depths):
+ depths += [current_depth]
+ left = children_left[current_node]
+ right = children_right[current_node]
+ if left != -1 and right != -1:
+ compute_depth_(left, current_depth+1,
+ children_left, children_right, depths)
+ compute_depth_(right, current_depth+1,
+ children_left, children_right, depths)
+
+ depths = []
+ compute_depth_(node, 1, tree.children_left, tree.children_right, depths)
+ return max(depths)
+
+
+def export_text(decision_tree, feature_names=None, max_depth=10,
+ spacing=3, decimals=2, show_weights=False):
+ """Build a text report showing the rules of a decision tree.
+
+ Note that backwards compatibility may not be supported.
+
+ Parameters
+ ----------
+ decision_tree : object
+ The decision tree estimator to be exported.
+ It can be an instance of
+ DecisionTreeClassifier or DecisionTreeRegressor.
+
+ feature_names : list, optional (default=None)
+ A list of length n_features containing the feature names.
+ If None generic names will be used ("feature_0", "feature_1", ...).
+
+ max_depth : int, optional (default=10)
+ Only the first max_depth levels of the tree are exported.
+ Truncated branches will be marked with "...".
+
+ spacing : int, optional (default=3)
+ Number of spaces between edges. The higher it is, the wider the result.
+
+ decimals : int, optional (default=2)
+ Number of decimal digits to display.
+
+ show_weights : bool, optional (default=False)
+ If true the classification weights will be exported on each leaf.
+ The classification weights are the number of samples each class.
+
+ Returns
+ -------
+ report : string
+ Text summary of all the rules in the decision tree.
+
+ Examples
+ -------
+
+ >>> from sklearn.datasets import load_iris
+ >>> from sklearn.tree import DecisionTreeClassifier
+ >>> from sklearn.tree.export import export_text
+ >>> iris = load_iris()
+ >>> X = iris['data']
+ >>> y = iris['target']
+ >>> decision_tree = DecisionTreeClassifier(random_state=0, max_depth=2)
+ >>> decision_tree = decision_tree.fit(X, y)
+ >>> r = export_text(decision_tree, feature_names=iris['feature_names'])
+ >>> print(r)
+ |--- petal width (cm) <= 0.80
+ | |--- class: 0
+ |--- petal width (cm) > 0.80
+ | |--- petal width (cm) <= 1.75
+ | | |--- class: 1
+ | |--- petal width (cm) > 1.75
+ | | |--- class: 2
+ ...
+ """
+ check_is_fitted(decision_tree, 'tree_')
+ tree_ = decision_tree.tree_
+ class_names = decision_tree.classes_
+ right_child_fmt = "{} {} <= {}\n"
+ left_child_fmt = "{} {} > {}\n"
+ truncation_fmt = "{} {}\n"
+
+ if max_depth < 0:
+ raise ValueError("max_depth bust be >= 0, given %d" % max_depth)
+
+ if (feature_names is not None and
+ len(feature_names) != tree_.n_features):
+ raise ValueError("feature_names must contain "
+ "%d elements, got %d" % (tree_.n_features,
+ len(feature_names)))
+
+ if spacing <= 0:
+ raise ValueError("spacing must be > 0, given %d" % spacing)
+
+ if decimals < 0:
+ raise ValueError("decimals must be >= 0, given %d" % decimals)
+
+ if isinstance(decision_tree, DecisionTreeClassifier):
+ value_fmt = "{}{} weights: {}\n"
+ if not show_weights:
+ value_fmt = "{}{}{}\n"
+ else:
+ value_fmt = "{}{} value: {}\n"
+
+ if feature_names:
+ feature_names_ = [feature_names[i] for i in tree_.feature]
+ else:
+ feature_names_ = ["feature_{}".format(i) for i in tree_.feature]
+
+ export_text.report = ""
+
+ def _add_leaf(value, class_name, indent):
+ val = ''
+ is_classification = isinstance(decision_tree,
+ DecisionTreeClassifier)
+ if show_weights or not is_classification:
+ val = ["{1:.{0}f}, ".format(decimals, v) for v in value]
+ val = '['+''.join(val)[:-2]+']'
+ if is_classification:
+ val += ' class: ' + str(class_name)
+ export_text.report += value_fmt.format(indent, '', val)
+
+ def print_tree_recurse(node, depth):
+ indent = ("|" + (" " * spacing)) * depth
+ indent = indent[:-spacing] + "-" * spacing
+
+ value = None
+ if tree_.n_outputs == 1:
+ value = tree_.value[node][0]
+ else:
+ value = tree_.value[node].T[0]
+ class_name = np.argmax(value)
+
+ if (tree_.n_classes[0] != 1 and
+ tree_.n_outputs == 1):
+ class_name = class_names[class_name]
+
+ if depth <= max_depth+1:
+ info_fmt = ""
+ info_fmt_left = info_fmt
+ info_fmt_right = info_fmt
+
+ if tree_.feature[node] != _tree.TREE_UNDEFINED:
+ name = feature_names_[node]
+ threshold = tree_.threshold[node]
+ threshold = "{1:.{0}f}".format(decimals, threshold)
+ export_text.report += right_child_fmt.format(indent,
+ name,
+ threshold)
+ export_text.report += info_fmt_left
+ print_tree_recurse(tree_.children_left[node], depth+1)
+
+ export_text.report += left_child_fmt.format(indent,
+ name,
+ threshold)
+ export_text.report += info_fmt_right
+ print_tree_recurse(tree_.children_right[node], depth+1)
+ else: # leaf
+ _add_leaf(value, class_name, indent)
+ else:
+ subtree_depth = _compute_depth(tree_, node)
+ if subtree_depth == 1:
+ _add_leaf(value, class_name, indent)
+ else:
+ trunc_report = 'truncated branch of depth %d' % subtree_depth
+ export_text.report += truncation_fmt.format(indent,
+ trunc_report)
+
+ print_tree_recurse(0, 1)
+ return export_text.report
| diff --git a/sklearn/tree/tests/test_export.py b/sklearn/tree/tests/test_export.py
index 6c765675faf76..65b0a201be369 100644
--- a/sklearn/tree/tests/test_export.py
+++ b/sklearn/tree/tests/test_export.py
@@ -4,13 +4,14 @@
import pytest
from re import finditer, search
+from textwrap import dedent
from numpy.random import RandomState
from sklearn.base import is_classifier
from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor
from sklearn.ensemble import GradientBoostingClassifier
-from sklearn.tree import export_graphviz, plot_tree
+from sklearn.tree import export_graphviz, plot_tree, export_text
from io import StringIO
from sklearn.utils.testing import (assert_in, assert_equal, assert_raises,
assert_less_equal, assert_raises_regex,
@@ -311,6 +312,93 @@ def test_precision():
precision + 1)
+def test_export_text_errors():
+ clf = DecisionTreeClassifier(max_depth=2, random_state=0)
+ clf.fit(X, y)
+
+ assert_raise_message(ValueError,
+ "max_depth bust be >= 0, given -1",
+ export_text, clf, max_depth=-1)
+ assert_raise_message(ValueError,
+ "feature_names must contain 2 elements, got 1",
+ export_text, clf, feature_names=['a'])
+ assert_raise_message(ValueError,
+ "decimals must be >= 0, given -1",
+ export_text, clf, decimals=-1)
+ assert_raise_message(ValueError,
+ "spacing must be > 0, given 0",
+ export_text, clf, spacing=0)
+
+
+def test_export_text():
+ clf = DecisionTreeClassifier(max_depth=2, random_state=0)
+ clf.fit(X, y)
+
+ expected_report = dedent("""
+ |--- feature_1 <= 0.00
+ | |--- class: -1
+ |--- feature_1 > 0.00
+ | |--- class: 1
+ """).lstrip()
+
+ assert export_text(clf) == expected_report
+ # testing that leaves at level 1 are not truncated
+ assert export_text(clf, max_depth=0) == expected_report
+ # testing that the rest of the tree is truncated
+ assert export_text(clf, max_depth=10) == expected_report
+
+ expected_report = dedent("""
+ |--- b <= 0.00
+ | |--- class: -1
+ |--- b > 0.00
+ | |--- class: 1
+ """).lstrip()
+ assert export_text(clf, feature_names=['a', 'b']) == expected_report
+
+ expected_report = dedent("""
+ |--- feature_1 <= 0.00
+ | |--- weights: [3.00, 0.00] class: -1
+ |--- feature_1 > 0.00
+ | |--- weights: [0.00, 3.00] class: 1
+ """).lstrip()
+ assert export_text(clf, show_weights=True) == expected_report
+
+ expected_report = dedent("""
+ |- feature_1 <= 0.00
+ | |- class: -1
+ |- feature_1 > 0.00
+ | |- class: 1
+ """).lstrip()
+ assert export_text(clf, spacing=1) == expected_report
+
+ X_l = [[-2, -1], [-1, -1], [-1, -2], [1, 1], [1, 2], [2, 1], [-1, 1]]
+ y_l = [-1, -1, -1, 1, 1, 1, 2]
+ clf = DecisionTreeClassifier(max_depth=4, random_state=0)
+ clf.fit(X_l, y_l)
+ expected_report = dedent("""
+ |--- feature_1 <= 0.00
+ | |--- class: -1
+ |--- feature_1 > 0.00
+ | |--- truncated branch of depth 2
+ """).lstrip()
+ assert export_text(clf, max_depth=0) == expected_report
+
+ X_mo = [[-2, -1], [-1, -1], [-1, -2], [1, 1], [1, 2], [2, 1]]
+ y_mo = [[-1, -1], [-1, -1], [-1, -1], [1, 1], [1, 1], [1, 1]]
+
+ reg = DecisionTreeRegressor(max_depth=2, random_state=0)
+ reg.fit(X_mo, y_mo)
+
+ expected_report = dedent("""
+ |--- feature_1 <= 0.0
+ | |--- value: [-1.0, -1.0]
+ |--- feature_1 > 0.0
+ | |--- value: [1.0, 1.0]
+ """).lstrip()
+ assert export_text(reg, decimals=1) == expected_report
+ assert export_text(reg, decimals=1, show_weights=True) == expected_report
+
+
def test_plot_tree():
# mostly smoke tests
pytest.importorskip("matplotlib.pyplot")
| diff --git a/doc/modules/classes.rst b/doc/modules/classes.rst
index e09ca0422d8a7..86db1d361a639 100644
--- a/doc/modules/classes.rst
+++ b/doc/modules/classes.rst
@@ -1400,6 +1400,7 @@ Low-level methods
tree.export_graphviz
tree.plot_tree
+ tree.export_text
.. _utils_ref:
diff --git a/doc/modules/tree.rst b/doc/modules/tree.rst
index affe46385d79b..bda9a5721835e 100644
--- a/doc/modules/tree.rst
+++ b/doc/modules/tree.rst
@@ -182,6 +182,29 @@ render these plots inline automatically::
:align: center
:scale: 75
+Alternatively, the tree can also be exported in textual format with the
+function :func:`export_text`. This method doesn't require the installation
+of external libraries and is more compact:
+
+ >>> from sklearn.datasets import load_iris
+ >>> from sklearn.tree import DecisionTreeClassifier
+ >>> from sklearn.tree.export import export_text
+ >>> iris = load_iris()
+ >>> X = iris['data']
+ >>> y = iris['target']
+ >>> decision_tree = DecisionTreeClassifier(random_state=0, max_depth=2)
+ >>> decision_tree = decision_tree.fit(X, y)
+ >>> r = export_text(decision_tree, feature_names=iris['feature_names'])
+ >>> print(r)
+ |--- petal width (cm) <= 0.80
+ | |--- class: 0
+ |--- petal width (cm) > 0.80
+ | |--- petal width (cm) <= 1.75
+ | | |--- class: 1
+ | |--- petal width (cm) > 1.75
+ | | |--- class: 2
+ <BLANKLINE>
+
.. topic:: Examples:
* :ref:`sphx_glr_auto_examples_tree_plot_iris.py`
diff --git a/doc/whats_new/v0.21.rst b/doc/whats_new/v0.21.rst
index d2b5596e12d38..02159ddc86bcf 100644
--- a/doc/whats_new/v0.21.rst
+++ b/doc/whats_new/v0.21.rst
@@ -199,6 +199,10 @@ Support for Python 3.4 and below has been officially dropped.
:func:`tree.plot_tree` without relying on the ``dot`` library,
removing a hard-to-install dependency. :issue:`8508` by `Andreas Müller`_.
+- |Feature| Decision Trees can now be exported in a human readable
+ textual format using :func:`tree.export.export_text`.
+ :issue:`6261` by `Giuseppe Vettigli <JustGlowing>`.
+
- |Feature| ``get_n_leaves()`` and ``get_depth()`` have been added to
:class:`tree.BaseDecisionTree` and consequently all estimators based
on it, including :class:`tree.DecisionTreeClassifier`,
| [
{
"components": [
{
"doc": "Returns the depth of the subtree rooted in node.",
"lines": [
785,
802
],
"name": "_compute_depth",
"signature": "def _compute_depth(tree, node):",
"type": "function"
},
{
"doc": "",
... | [
"sklearn/tree/tests/test_export.py::test_graphviz_toy",
"sklearn/tree/tests/test_export.py::test_graphviz_errors",
"sklearn/tree/tests/test_export.py::test_friedman_mse_in_graphviz",
"sklearn/tree/tests/test_export.py::test_precision",
"sklearn/tree/tests/test_export.py::test_export_text_errors",
"sklearn... | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
[MRG] Print Decision Trees in ASCII format
#### What does this implement/fix? Explain your changes.
I noticed that the only way to visually check the rules of a Decision Tree was to export the tree in dot format and then use Graphviz. This process is a bit tedious and may be impossible on some machines. So, I decided to implement a little method to print the tree in ASCII:
```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier
from sklearn.tree.export import export_ascii
iris = load_iris()
X = iris['data']
y = iris['target']
decision_tree = DecisionTreeClassifier(random_state=0, max_depth=2)
decision_tree.fit(X, y)
print(export_ascii(decision_tree, feature_names=iris['feature_names'],
class_names=iris['target_names'], show_class=True))
```
```
|---petal width (cm) <= 0.80
| | (class: setosa)
| |---* value: [ 50. 0. 0.]
| | | (class: setosa)
|---petal width (cm) > 0.80
| | (class: setosa)
| |---petal width (cm) <= 1.75
| | | (class: versicolor)
| | |---* value: [ 0. 49. 5.]
| | | | (class: versicolor)
| |---petal width (cm) > 1.75
| | | (class: versicolor)
| | |---* value: [ 0. 1. 45.]
| | | | (class: virginica)
```
Notes:
- The format was inspired by a similar functionality implemented in Weka: http://facweb.cs.depaul.edu/mobasher/classes/ect584/WEKA/classify/figure24-b.gif
- The code has been inspired by this post: https://stackoverflow.com/questions/20224526/how-to-extract-the-decision-rules-from-scikit-learn-decision-tree
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sklearn/tree/export.py]
(definition of _compute_depth:)
def _compute_depth(tree, node):
"""Returns the depth of the subtree rooted in node."""
(definition of _compute_depth.compute_depth_:)
def compute_depth_(current_node, current_depth, children_left, children_right, depths):
(definition of export_text:)
def export_text(decision_tree, feature_names=None, max_depth=10, spacing=3, decimals=2, show_weights=False):
"""Build a text report showing the rules of a decision tree.
Note that backwards compatibility may not be supported.
Parameters
----------
decision_tree : object
The decision tree estimator to be exported.
It can be an instance of
DecisionTreeClassifier or DecisionTreeRegressor.
feature_names : list, optional (default=None)
A list of length n_features containing the feature names.
If None generic names will be used ("feature_0", "feature_1", ...).
max_depth : int, optional (default=10)
Only the first max_depth levels of the tree are exported.
Truncated branches will be marked with "...".
spacing : int, optional (default=3)
Number of spaces between edges. The higher it is, the wider the result.
decimals : int, optional (default=2)
Number of decimal digits to display.
show_weights : bool, optional (default=False)
If true the classification weights will be exported on each leaf.
The classification weights are the number of samples each class.
Returns
-------
report : string
Text summary of all the rules in the decision tree.
Examples
-------
>>> from sklearn.datasets import load_iris
>>> from sklearn.tree import DecisionTreeClassifier
>>> from sklearn.tree.export import export_text
>>> iris = load_iris()
>>> X = iris['data']
>>> y = iris['target']
>>> decision_tree = DecisionTreeClassifier(random_state=0, max_depth=2)
>>> decision_tree = decision_tree.fit(X, y)
>>> r = export_text(decision_tree, feature_names=iris['feature_names'])
>>> print(r)
|--- petal width (cm) <= 0.80
| |--- class: 0
|--- petal width (cm) > 0.80
| |--- petal width (cm) <= 1.75
| | |--- class: 1
| |--- petal width (cm) > 1.75
| | |--- class: 2
..."""
(definition of export_text._add_leaf:)
def _add_leaf(value, class_name, indent):
(definition of export_text.print_tree_recurse:)
def print_tree_recurse(node, depth):
[end of new definitions in sklearn/tree/export.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 66cc1c7342f7f0cc0dc57fb6d56053fc46c8e5f0 | |
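The "|"-bar-plus-"---"-edge indentation scheme that `export_text` produces (visible in the expected reports above) can be sketched standalone. The nested-tuple tree below is a hypothetical stand-in for sklearn's `tree_` arrays — a leaf is a class label, an internal node is `(feature_name, threshold, left, right)` — so this is an illustration of the layout, not the library's implementation:

```python
# Minimal sketch of the "|---"-style rule rendering used by export_text.
# Leaves are plain labels; internal nodes are (name, threshold, left, right).
def render_rules(node, depth=1, spacing=3):
    indent = ("|" + " " * spacing) * depth
    indent = indent[:-spacing] + "-" * spacing      # turn last run of spaces into edges
    if not isinstance(node, tuple):                 # leaf: emit the class label
        return "{} class: {}\n".format(indent, node)
    name, threshold, left, right = node
    out = "{} {} <= {:.2f}\n".format(indent, name, threshold)
    out += render_rules(left, depth + 1, spacing)
    out += "{} {} > {:.2f}\n".format(indent, name, threshold)
    out += render_rules(right, depth + 1, spacing)
    return out

print(render_rules(("feature_1", 0.0, -1, 1)))
```

With the depth-1 tree above this reproduces the four-line report asserted in `test_export_text`.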
joke2k__faker-558 | 558 | joke2k/faker | null | 987337ac680019ad473f47c7687bd2eb41761d83 | 2017-07-17T22:33:47Z | diff --git a/faker/providers/company/pl_PL/__init__.py b/faker/providers/company/pl_PL/__init__.py
index a8d81a4078..719d7e9b7c 100644
--- a/faker/providers/company/pl_PL/__init__.py
+++ b/faker/providers/company/pl_PL/__init__.py
@@ -38,6 +38,21 @@ def local_regon_checksum(digits):
return check_digit
+def company_vat_checksum(digits):
+ """
+ Calculates and returns a control digit for given list of digits basing on NIP standard.
+ """
+ weights_for_check_digit = [6, 5, 7, 2, 3, 4, 5, 6, 7]
+ check_digit = 0
+
+ for i in range(0, 9):
+ check_digit += weights_for_check_digit[i] * digits[i]
+
+ check_digit %= 11
+
+ return check_digit
+
+
class Provider(CompanyProvider):
@classmethod
@@ -74,3 +89,29 @@ def local_regon(cls):
regon_digits.append(local_regon_checksum(regon_digits))
return ''.join(str(digit) for digit in regon_digits)
+
+ @classmethod
+ def company_vat(cls):
+ """
+ Returns 10 character tax identification number,
+ Polish: Numer identyfikacji podatkowej.
+
+ https://pl.wikipedia.org/wiki/NIP
+ """
+ vat_digits = []
+
+ for _ in range(3):
+ vat_digits.append(cls.random_digit_not_null())
+
+ for _ in range(6):
+ vat_digits.append(cls.random_digit())
+
+ check_digit = company_vat_checksum(vat_digits)
+
+ # in this case we must generate a tax number again, because check_digit cannot be 10
+ if check_digit == 10:
+ return cls.company_vat()
+
+ vat_digits.append(check_digit)
+
+ return ''.join(str(digit) for digit in vat_digits)
| diff --git a/tests/providers/company.py b/tests/providers/company.py
index 62c864d4f0..06450c0a11 100644
--- a/tests/providers/company.py
+++ b/tests/providers/company.py
@@ -9,7 +9,8 @@
from faker.providers.company.hu_HU import Provider as HuProvider
from faker.providers.company.ja_JP import Provider as JaProvider
from faker.providers.company.pt_BR import Provider as PtProvider, company_id_checksum
-from faker.providers.company.pl_PL import Provider as PlProvider, regon_checksum, local_regon_checksum
+from faker.providers.company.pl_PL import (Provider as PlProvider, regon_checksum, local_regon_checksum,
+ company_vat_checksum)
from .. import string_types
@@ -87,13 +88,23 @@ def test_regon(self):
self.assertTrue(re.search(r'^\d{9}$', PlProvider.regon()))
def test_local_regon_checksum(self):
- self.assertEquals(local_regon_checksum([1, 2, 3, 4, 5, 6, 7, 8, 5, 1, 2, 3, 4]), 7)
- self.assertEquals(local_regon_checksum([6, 1, 1, 9, 4, 8, 8, 3, 2, 7, 5, 8, 0]), 3)
- self.assertEquals(local_regon_checksum([8, 9, 2, 0, 0, 3, 6, 6, 0, 7, 0, 3, 2]), 3)
- self.assertEquals(local_regon_checksum([3, 5, 7, 7, 1, 0, 2, 2, 2, 5, 4, 3, 3]), 0)
- self.assertEquals(local_regon_checksum([9, 3, 5, 3, 1, 1, 0, 1, 2, 4, 8, 8, 2]), 1)
+ self.assertEqual(local_regon_checksum([1, 2, 3, 4, 5, 6, 7, 8, 5, 1, 2, 3, 4]), 7)
+ self.assertEqual(local_regon_checksum([6, 1, 1, 9, 4, 8, 8, 3, 2, 7, 5, 8, 0]), 3)
+ self.assertEqual(local_regon_checksum([8, 9, 2, 0, 0, 3, 6, 6, 0, 7, 0, 3, 2]), 3)
+ self.assertEqual(local_regon_checksum([3, 5, 7, 7, 1, 0, 2, 2, 2, 5, 4, 3, 3]), 0)
+ self.assertEqual(local_regon_checksum([9, 3, 5, 3, 1, 1, 0, 1, 2, 4, 8, 8, 2]), 1)
def test_local_regon(self):
for _ in range(100):
self.assertTrue(re.search(r'^\d{14}$', PlProvider.local_regon()))
+ def test_company_vat_checksum(self):
+ self.assertEqual(company_vat_checksum([7, 7, 5, 7, 7, 7, 6, 0, 5]), 9)
+ self.assertEqual(company_vat_checksum([1, 8, 6, 5, 4, 9, 9, 6, 4]), 2)
+ self.assertEqual(company_vat_checksum([7, 1, 2, 8, 9, 2, 4, 9, 9]), 7)
+ self.assertEqual(company_vat_checksum([3, 5, 4, 6, 1, 0, 6, 5, 8]), 4)
+ self.assertEqual(company_vat_checksum([3, 1, 9, 5, 5, 7, 0, 4, 5]), 0)
+
+ def test_company_vat(self):
+ for _ in range(100):
+ self.assertTrue(re.search(r'^\d{10}$', PlProvider.company_vat()))
| [
{
"components": [
{
"doc": "Calculates and returns a control digit for given list of digits basing on NIP standard.",
"lines": [
41,
53
],
"name": "company_vat_checksum",
"signature": "def company_vat_checksum(digits):",
"type": "function... | [
"tests/providers/company.py::TestJaJP::test_company",
"tests/providers/company.py::TestPtBR::test_pt_BR_cnpj",
"tests/providers/company.py::TestPtBR::test_pt_BR_company_id",
"tests/providers/company.py::TestPtBR::test_pt_BR_company_id_checksum",
"tests/providers/company.py::TestHuHU::test_company",
"tests... | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Adding an implementation of `company_vat` for pl_PL company provider.
NIP is a tax identification number in Poland.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in faker/providers/company/pl_PL/__init__.py]
(definition of company_vat_checksum:)
def company_vat_checksum(digits):
"""Calculates and returns a control digit for given list of digits basing on NIP standard."""
(definition of Provider.company_vat:)
def company_vat(cls):
"""Returns 10 character tax identification number,
Polish: Numer identyfikacji podatkowej.
https://pl.wikipedia.org/wiki/NIP"""
[end of new definitions in faker/providers/company/pl_PL/__init__.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 6edfdbf6ae90b0153309e3bf066aa3b2d16494a7 | ||
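The NIP generation scheme in the patch above — nine random digits (first three non-zero) plus a weighted mod-11 check digit, with the whole number regenerated whenever the checksum comes out as 10 — can be sketched standalone:

```python
import random

# Weighted mod-11 check digit for a Polish NIP, as in the patch above.
NIP_WEIGHTS = [6, 5, 7, 2, 3, 4, 5, 6, 7]

def nip_checksum(digits):
    return sum(w * d for w, d in zip(NIP_WEIGHTS, digits)) % 11

def fake_nip(rng=random):
    while True:
        digits = [rng.randint(1, 9) for _ in range(3)]   # leading digits non-zero
        digits += [rng.randint(0, 9) for _ in range(6)]
        check = nip_checksum(digits)
        if check != 10:                                  # 10 is not a valid check digit
            return "".join(map(str, digits + [check]))
```

The patch handles a check digit of 10 by recursing; the sketch uses a loop for the same effect. The two checksum assertions in the test patch hold for this function as well.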
joke2k__faker-554 | 554 | joke2k/faker | null | 76667a5695927cb1c44f753c1bf671df3937af73 | 2017-07-12T08:45:33Z | diff --git a/faker/providers/company/pl_PL/__init__.py b/faker/providers/company/pl_PL/__init__.py
index af99b27b14..a8d81a4078 100644
--- a/faker/providers/company/pl_PL/__init__.py
+++ b/faker/providers/company/pl_PL/__init__.py
@@ -20,6 +20,24 @@ def regon_checksum(digits):
return check_digit
+def local_regon_checksum(digits):
+ """
+ Calculates and returns a control digit for given list of digits basing on local REGON standard.
+ """
+ weights_for_check_digit = [2, 4, 8, 5, 0, 9, 7, 3, 6, 1, 2, 4, 8]
+ check_digit = 0
+
+ for i in range(0, 13):
+ check_digit += weights_for_check_digit[i] * digits[i]
+
+ check_digit %= 11
+
+ if check_digit == 10:
+ check_digit = 0
+
+ return check_digit
+
+
class Provider(CompanyProvider):
@classmethod
@@ -39,3 +57,20 @@ def regon(cls):
regon_digits.append(regon_checksum(regon_digits))
return ''.join(str(digit) for digit in regon_digits)
+
+ @classmethod
+ def local_regon(cls):
+ """
+ Returns 14 character Polish National Business Registry Number,
+ local entity number.
+
+ https://pl.wikipedia.org/wiki/REGON
+ """
+ regon_digits = [int(digit) for digit in list(cls.regon())]
+
+ for _ in range(4):
+ regon_digits.append(cls.random_digit())
+
+ regon_digits.append(local_regon_checksum(regon_digits))
+
+ return ''.join(str(digit) for digit in regon_digits)
| diff --git a/tests/providers/company.py b/tests/providers/company.py
index 9ec0ca2934..62c864d4f0 100644
--- a/tests/providers/company.py
+++ b/tests/providers/company.py
@@ -9,7 +9,7 @@
from faker.providers.company.hu_HU import Provider as HuProvider
from faker.providers.company.ja_JP import Provider as JaProvider
from faker.providers.company.pt_BR import Provider as PtProvider, company_id_checksum
-from faker.providers.company.pl_PL import Provider as PlProvider, regon_checksum
+from faker.providers.company.pl_PL import Provider as PlProvider, regon_checksum, local_regon_checksum
from .. import string_types
@@ -85,3 +85,15 @@ def test_regon_checksum(self):
def test_regon(self):
for _ in range(100):
self.assertTrue(re.search(r'^\d{9}$', PlProvider.regon()))
+
+ def test_local_regon_checksum(self):
+ self.assertEquals(local_regon_checksum([1, 2, 3, 4, 5, 6, 7, 8, 5, 1, 2, 3, 4]), 7)
+ self.assertEquals(local_regon_checksum([6, 1, 1, 9, 4, 8, 8, 3, 2, 7, 5, 8, 0]), 3)
+ self.assertEquals(local_regon_checksum([8, 9, 2, 0, 0, 3, 6, 6, 0, 7, 0, 3, 2]), 3)
+ self.assertEquals(local_regon_checksum([3, 5, 7, 7, 1, 0, 2, 2, 2, 5, 4, 3, 3]), 0)
+ self.assertEquals(local_regon_checksum([9, 3, 5, 3, 1, 1, 0, 1, 2, 4, 8, 8, 2]), 1)
+
+ def test_local_regon(self):
+ for _ in range(100):
+ self.assertTrue(re.search(r'^\d{14}$', PlProvider.local_regon()))
+
| [
{
"components": [
{
"doc": "Calculates and returns a control digit for given list of digits basing on local REGON standard.",
"lines": [
23,
38
],
"name": "local_regon_checksum",
"signature": "def local_regon_checksum(digits):",
"type": "... | [
"tests/providers/company.py::TestJaJP::test_company",
"tests/providers/company.py::TestPtBR::test_pt_BR_cnpj",
"tests/providers/company.py::TestPtBR::test_pt_BR_company_id",
"tests/providers/company.py::TestPtBR::test_pt_BR_company_id_checksum",
"tests/providers/company.py::TestHuHU::test_company",
"tests... | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Adding an implementation of `local_regon` for pl_PL company provider.
An extension of REGON.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in faker/providers/company/pl_PL/__init__.py]
(definition of local_regon_checksum:)
def local_regon_checksum(digits):
"""Calculates and returns a control digit for given list of digits basing on local REGON standard."""
(definition of Provider.local_regon:)
def local_regon(cls):
"""Returns 14 character Polish National Business Registry Number,
local entity number.
https://pl.wikipedia.org/wiki/REGON"""
[end of new definitions in faker/providers/company/pl_PL/__init__.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 6edfdbf6ae90b0153309e3bf066aa3b2d16494a7 | ||
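The control-digit computation for the 14-digit local REGON can be reproduced standalone and checked against the vectors in the test patch: a weighted sum of the first 13 digits, taken mod 11, with a result of 10 mapped to 0.

```python
# Standalone sketch of local_regon_checksum from the patch above.
LOCAL_REGON_WEIGHTS = [2, 4, 8, 5, 0, 9, 7, 3, 6, 1, 2, 4, 8]

def local_regon_checksum(digits):
    check = sum(w * d for w, d in zip(LOCAL_REGON_WEIGHTS, digits)) % 11
    return 0 if check == 10 else check

# Vectors taken from the test patch:
assert local_regon_checksum([1, 2, 3, 4, 5, 6, 7, 8, 5, 1, 2, 3, 4]) == 7
assert local_regon_checksum([6, 1, 1, 9, 4, 8, 8, 3, 2, 7, 5, 8, 0]) == 3
```

The full `local_regon()` then appends four extra random digits to a valid 9-digit REGON and closes with this check digit.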
joke2k__faker-552 | 552 | joke2k/faker | null | 76667a5695927cb1c44f753c1bf671df3937af73 | 2017-07-10T20:57:35Z | diff --git a/faker/providers/automotive/__init__.py b/faker/providers/automotive/__init__.py
new file mode 100644
index 0000000000..692d59db1c
--- /dev/null
+++ b/faker/providers/automotive/__init__.py
@@ -0,0 +1,17 @@
+# coding=utf-8
+
+localized = True
+
+from .. import BaseProvider
+from string import ascii_uppercase
+import re
+
+class Provider(BaseProvider):
+ license_formats = ()
+
+ @classmethod
+ def license_plate(cls):
+ temp = re.sub(r'\?',
+ lambda x: cls.random_element(ascii_uppercase),
+ cls.random_element(cls.license_formats))
+ return cls.numerify(temp)
diff --git a/faker/providers/automotive/en_CA/__init__.py b/faker/providers/automotive/en_CA/__init__.py
new file mode 100644
index 0000000000..473a7a429a
--- /dev/null
+++ b/faker/providers/automotive/en_CA/__init__.py
@@ -0,0 +1,42 @@
+# coding=utf-8
+
+from __future__ import unicode_literals
+from .. import Provider as AutomotiveProvider
+
+
+class Provider(AutomotiveProvider):
+ # from https://www.revolvy.com/main/index.php?s=Canadian%20licence%20plate%20designs%20and%20serial%20formats
+ license_formats = (
+ # Alberta
+ '???-####',
+ # BC
+ '??# ##?',
+ '?? ####',
+ # Manitoba
+ '??? ###',
+ # New Brunswick
+ '??? ###',
+ # Newfoundland and Labrador
+ '??? ###',
+ # NWT
+ '######',
+ # Nova Scotia
+ '??? ###',
+ # Nunavut
+ '### ###',
+ # Ontario
+ '### ???',
+ '???? ###',
+ '??# ###',
+ '### #??',
+ '?? ####',
+ 'GV??-###',
+ # PEI
+ '## ##??',
+ # Quebec
+ '?## ???',
+ # Saskatchewan
+ '### ???',
+ # Yukon
+ '???##'
+ )
\ No newline at end of file
diff --git a/faker/providers/automotive/en_GB/__init__.py b/faker/providers/automotive/en_GB/__init__.py
new file mode 100644
index 0000000000..b84c1960cb
--- /dev/null
+++ b/faker/providers/automotive/en_GB/__init__.py
@@ -0,0 +1,12 @@
+# coding=utf-8
+
+from __future__ import unicode_literals
+from .. import Provider as AutomotiveProvider
+
+
+class Provider(AutomotiveProvider):
+ # from https://en.wikipedia.org/wiki/Vehicle_registration_plates_of_the_United_Kingdom
+ license_formats = (
+ '??## ???',
+ '??##???'
+ )
\ No newline at end of file
diff --git a/faker/providers/automotive/en_US/__init__.py b/faker/providers/automotive/en_US/__init__.py
new file mode 100644
index 0000000000..8d2c9ea37e
--- /dev/null
+++ b/faker/providers/automotive/en_US/__init__.py
@@ -0,0 +1,164 @@
+# coding=utf-8
+
+from __future__ import unicode_literals
+from .. import Provider as AutomotiveProvider
+
+class Provider(AutomotiveProvider):
+ # from https://en.wikipedia.org/wiki/United_States_license_plate_designs_and_serial_formats#Current_standard-issue_passenger_plate_designs_and_serial_formats
+ license_formats = (
+ # Alabama
+ '#??####',
+ '##??###',
+ # Alaska
+ '### ???',
+ # American Samoa
+ '####',
+ # Arizona
+ '???####',
+ # Arkansas
+ '### ???',
+ '###???',
+ # California
+ '#???###',
+ # Colarado
+ '###-???',
+ '???-###',
+ # Conneticut
+ '###-???',
+ # Delaware
+ '######',
+ # DC
+ '??-####',
+ # Florda
+ '??? ?##',
+ '### ???',
+ '?## #??',
+ '### #??',
+ # Georgia
+ '???####',
+ # Guam
+ '?? ####',
+ # Hawaii
+ '??? ###',
+ 'H?? ###',
+ 'Z?? ###',
+ 'K?? ###',
+ 'L?? ###',
+ 'M?? ###',
+ # Idaho
+ '? ######',
+ '#? #####',
+ '#? ?####',
+ '#? ??###',
+ '#? #?#???',
+ '#? ####?',
+ '##? ####',
+ # Illinois
+ '?? #####',
+ '??# ####',
+ # Indiana
+ '###?',
+ '###??',
+ '###???',
+ # Iowa
+ '??? ###',
+ # Kansas
+ '### ???',
+ # Kentucky
+ '### ???',
+ # Louisiana
+ '### ???',
+ # Maine
+ '#### ??',
+ # Maryland
+ '#??####',
+ # Massachusetts
+ '#??? ##',
+ '#?? ###',
+ '### ??#',
+ '##? ?##',
+ # Michigan
+ '### ???',
+ '#?? ?##',
+ # Minnesota
+ '###-???',
+ # Mississippi
+ '??? ###',
+ # Missouri
+ '??# ?#?',
+ # Montana
+ '#-#####?',
+ '##-####?',
+ # Nebraska
+ '??? ###',
+ '#-?####',
+ '##-?###',
+ '##-??##',
+ # Nevada
+ '##?•###',
+ # New Hampshire
+ '### ####',
+ # New Jersey
+ '?##-???',
+ # New Mexico
+ '###-???',
+ '???-###',
+ # New York
+ '???-####',
+ # North Carolina
+ '###-????',
+ # North Dakota
+ '### ???',
+ # Nothern Mariana Islands
+ '??? ###',
+ # Ohio
+ '??? ####',
+ # Oklahoma
+ '???-###',
+ # Oregon
+ '### ???',
+ # Pennsylvania
+ '???-####',
+ # Peurto Rico
+ '???-###',
+ # Rhode Island
+ '###-###',
+ # South Carolina
+ '### #??',
+ # South Dakota
+ '#?? ###',
+ '#?? ?##',
+ '##? ###',
+ '##? ?##',
+ '##? ??#',
+ # Tennessee
+ '?##-##?',
+ # Texas
+ '???-####',
+ # Utah
+ '?## #??',
+ '?## #??',
+ # Vermont
+ '??? ###',
+ '##??#',
+ '#??##',
+ '###?#',
+ '#?###',
+ # US Virgin Islands
+ '??? ###',
+ # Virginia
+ '???-####',
+ # Washington
+ '???####',
+ '###-???'
+ # West Virginia
+ '#?? ###',
+ '??? ###',
+ # Wisconsin
+ '???-####',
+ '###-???'
+ # Wyoming
+ '#-#####',
+ '#-####?',
+ '##-#####'
+ )
\ No newline at end of file
| diff --git a/tests/utils/__init__.py b/tests/utils/__init__.py
index cd84dead62..cfdfe9850c 100644
--- a/tests/utils/__init__.py
+++ b/tests/utils/__init__.py
@@ -60,6 +60,7 @@ def test_find_available_providers(self):
expected_providers = list(map(str, [
'faker.providers.address',
+ 'faker.providers.automotive',
'faker.providers.barcode',
'faker.providers.color',
'faker.providers.company',
| [
{
"components": [
{
"doc": "",
"lines": [
9,
17
],
"name": "Provider",
"signature": "class Provider(BaseProvider):",
"type": "class"
},
{
"doc": "",
"lines": [
13,
17
],
"nam... | [
"tests/utils/__init__.py::UtilsTestCase::test_find_available_providers"
] | [
"tests/utils/__init__.py::UtilsTestCase::test_add_dicts",
"tests/utils/__init__.py::UtilsTestCase::test_choice_distribution",
"tests/utils/__init__.py::UtilsTestCase::test_find_available_locales"
] | This is a feature request that requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
New Provider: Automotive
This PR adds license plate generators for the following locales: `en_US`, `en_CA` and `en_GB`.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in faker/providers/automotive/__init__.py]
(definition of Provider:)
class Provider(BaseProvider):
(definition of Provider.license_plate:)
def license_plate(cls):
[end of new definitions in faker/providers/automotive/__init__.py]
[start of new definitions in faker/providers/automotive/en_CA/__init__.py]
(definition of Provider:)
class Provider(AutomotiveProvider):
[end of new definitions in faker/providers/automotive/en_CA/__init__.py]
[start of new definitions in faker/providers/automotive/en_GB/__init__.py]
(definition of Provider:)
class Provider(AutomotiveProvider):
[end of new definitions in faker/providers/automotive/en_GB/__init__.py]
[start of new definitions in faker/providers/automotive/en_US/__init__.py]
(definition of Provider:)
class Provider(AutomotiveProvider):
[end of new definitions in faker/providers/automotive/en_US/__init__.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 6edfdbf6ae90b0153309e3bf066aa3b2d16494a7 | ||
falconry__falcon-1079 | 1,079 | falconry/falcon | null | 72c634e8f752f83183791ee5a6edff070b701d4f | 2017-07-04T00:24:24Z | diff --git a/falcon/routing/converters.py b/falcon/routing/converters.py
index 51c6b271c..e6934cb04 100644
--- a/falcon/routing/converters.py
+++ b/falcon/routing/converters.py
@@ -12,6 +12,12 @@
# See the License for the specific language governing permissions and
# limitations under the License.
+from datetime import datetime
+
+
+# PERF(kgriffs): Avoid an extra namespace lookup when using this function
+strptime = datetime.strptime
+
class IntConverter(object):
"""Converts a field value to an int.
@@ -59,6 +65,28 @@ def convert(self, fragment):
return value
+class DateTimeConverter(object):
+ """Converts a field value to a datetime.
+
+ Keyword Args:
+ format_string (str): String used to parse the param value
+ into a datetime. Any format recognized by strptime() is
+ supported (default ``'%Y-%m-%dT%H:%M:%SZ'``).
+ """
+
+ __slots__ = ('_format_string',)
+
+ def __init__(self, format_string='%Y-%m-%dT%H:%M:%SZ'):
+ self._format_string = format_string
+
+ def convert(self, fragment):
+ try:
+ return strptime(fragment, self._format_string)
+ except ValueError:
+ return None
+
+
BUILTIN = (
('int', IntConverter),
+ ('dt', DateTimeConverter),
)
| diff --git a/tests/test_uri_converters.py b/tests/test_uri_converters.py
index 66a6f712d..536512a7b 100644
--- a/tests/test_uri_converters.py
+++ b/tests/test_uri_converters.py
@@ -1,3 +1,4 @@
+from datetime import datetime
import string
import pytest
@@ -5,7 +6,7 @@
from falcon.routing import converters
-@pytest.mark.parametrize('segment,num_digits,min,max,expected', [
+@pytest.mark.parametrize('fragment, num_digits, min, max, expected', [
('123', None, None, None, 123),
('01', None, None, None, 1),
('001', None, None, None, 1),
@@ -33,22 +34,43 @@
('12', 2, 13, 12, None),
('12', 2, 13, 13, None),
])
-def test_int_filter(segment, num_digits, min, max, expected):
+def test_int_converter(fragment, num_digits, min, max, expected):
c = converters.IntConverter(num_digits, min, max)
- assert c.convert(segment) == expected
+ assert c.convert(fragment) == expected
-@pytest.mark.parametrize('segment', (
+@pytest.mark.parametrize('fragment', (
['0x0F', 'something', '', ' '] +
['123' + w for w in string.whitespace] +
[w + '123' for w in string.whitespace]
))
-def test_int_filter_malformed(segment):
+def test_int_converter_malformed(fragment):
c = converters.IntConverter()
- assert c.convert(segment) is None
+ assert c.convert(fragment) is None
@pytest.mark.parametrize('num_digits', [0, -1, -10])
-def test_int_filter_invalid_config(num_digits):
+def test_int_converter_invalid_config(num_digits):
with pytest.raises(ValueError):
converters.IntConverter(num_digits)
+
+
+@pytest.mark.parametrize('fragment, format_string, expected', [
+ ('07-03-17', '%m-%d-%y', datetime(2017, 7, 3)),
+ ('07-03-17 ', '%m-%d-%y ', datetime(2017, 7, 3)),
+ ('2017-07-03T14:30:01Z', '%Y-%m-%dT%H:%M:%SZ', datetime(2017, 7, 3, 14, 30, 1)),
+ ('2017-07-03T14:30:01', '%Y-%m-%dT%H:%M:%S', datetime(2017, 7, 3, 14, 30, 1)),
+ ('2017-07-03T14:30:01', '%Y-%m-%dT%H:%M:%SZ', None),
+ ('2017_19', '%Y_%H', datetime(2017, 1, 1, 19, 0)),
+ ('07-03-17 ', '%m-%d-%y', None),
+ (' 07-03-17', '%m-%d-%y', None),
+ ('07 -03-17', '%m-%d-%y', None),
+])
+def test_datetime_converter(fragment, format_string, expected):
+ c = converters.DateTimeConverter(format_string)
+ assert c.convert(fragment) == expected
+
+
+def test_datetime_converter_default_format():
+ c = converters.DateTimeConverter()
+ assert c.convert('2017-07-03T14:30:01Z') == datetime(2017, 7, 3, 14, 30, 1)
diff --git a/tests/test_uri_templates.py b/tests/test_uri_templates.py
index f18c2297d..e330898be 100644
--- a/tests/test_uri_templates.py
+++ b/tests/test_uri_templates.py
@@ -5,6 +5,8 @@
path via simulate_get(), vs. probing the router directly.
"""
+from datetime import datetime
+
import pytest
import six
@@ -137,7 +139,7 @@ def test_single(client, resource, field_name):
'/{id:int(min=123)}',
'/{id:int(min=123, max=123)}',
])
-def test_converter(client, uri_template):
+def test_int_converter(client, uri_template):
resource1 = IDResource()
client.app.add_route(uri_template, resource1)
@@ -154,7 +156,7 @@ def test_converter(client, uri_template):
'/{id:int(min=124)}',
'/{id:int(num_digits=3, max=100)}',
])
-def test_converter_rejections(client, uri_template):
+def test_int_converter_rejections(client, uri_template):
resource1 = IDResource()
client.app.add_route(uri_template, resource1)
@@ -164,6 +166,43 @@ def test_converter_rejections(client, uri_template):
assert not resource1.called
+@pytest.mark.parametrize('uri_template, path, dt_expected', [
+ (
+ '/{start_year:int}-to-{timestamp:dt}',
+ '/1961-to-1969-07-21T02:56:00Z',
+ datetime(1969, 7, 21, 2, 56, 0)
+ ),
+ (
+ '/{start_year:int}-to-{timestamp:dt("%Y-%m-%d")}',
+ '/1961-to-1969-07-21',
+ datetime(1969, 7, 21)
+ ),
+ (
+ '/{start_year:int}/{timestamp:dt("%Y-%m-%d %H:%M")}',
+ '/1961/1969-07-21 14:30',
+ datetime(1969, 7, 21, 14, 30)
+ ),
+ (
+ '/{start_year:int}-to-{timestamp:dt("%Y-%m")}',
+ '/1961-to-1969-07-21',
+ None
+ ),
+])
+def test_datetime_converter(client, resource, uri_template, path, dt_expected):
+ client.app.add_route(uri_template, resource)
+
+ result = client.simulate_get(path)
+
+ if dt_expected is None:
+ assert result.status_code == 404
+ assert not resource.called
+ else:
+ assert result.status_code == 200
+ assert resource.called
+ assert resource.captured_kwargs['start_year'] == 1961
+ assert resource.captured_kwargs['timestamp'] == dt_expected
+
+
def test_converter_custom(client, resource):
class SpamConverter(object):
def convert(self, fragment):
| [
{
"components": [
{
"doc": "Converts a field value to a datetime.\n\nKeyword Args:\n format_string (str): String used to parse the param value\n into a datetime. Any format recognized by strptime() is\n supported (default ``'%Y-%m-%dT%H:%M:%SZ'``).",
"lines": [
6... | [
"tests/test_uri_converters.py::test_datetime_converter[07-03-17-%m-%d-%y-expected0]",
"tests/test_uri_converters.py::test_datetime_converter[07-03-17",
"tests/test_uri_converters.py::test_datetime_converter[2017-07-03T14:30:01Z-%Y-%m-%dT%H:%M:%SZ-expected2]",
"tests/test_uri_converters.py::test_datetime_conve... | [
"tests/test_uri_converters.py::test_int_converter[123-None-None-None-123]",
"tests/test_uri_converters.py::test_int_converter[01-None-None-None-1]",
"tests/test_uri_converters.py::test_int_converter[001-None-None-None-1]",
"tests/test_uri_converters.py::test_int_converter[0-None-None-None-0]",
"tests/test_u... | This is a feature request that requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
feat(routing): Add DateTimeConverter
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in falcon/routing/converters.py]
(definition of DateTimeConverter:)
class DateTimeConverter(object):
"""Converts a field value to a datetime.
Keyword Args:
format_string (str): String used to parse the param value
into a datetime. Any format recognized by strptime() is
supported (default ``'%Y-%m-%dT%H:%M:%SZ'``)."""
(definition of DateTimeConverter.__init__:)
def __init__(self, format_string='%Y-%m-%dT%H:%M:%SZ'):
(definition of DateTimeConverter.convert:)
def convert(self, fragment):
[end of new definitions in falcon/routing/converters.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
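Based on the patch above, the converter's parsing behaviour can be exercised on its own; this sketch simply restates the added class outside of Falcon:

```python
from datetime import datetime

class DateTimeConverter:
    """Converts a URI field value to a datetime, returning None on mismatch."""

    def __init__(self, format_string='%Y-%m-%dT%H:%M:%SZ'):
        self._format_string = format_string

    def convert(self, fragment):
        try:
            return datetime.strptime(fragment, self._format_string)
        except ValueError:
            # A non-matching fragment means the route does not match.
            return None
```

In a route template this appears as e.g. `/{timestamp:dt("%Y-%m-%d")}`, per the tests above.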
<<END>> | 77d5e6394a88ead151c9469494749f95f06b24bf | ||
scikit-learn__scikit-learn-9270 | 9,270 | scikit-learn/scikit-learn | 0.20 | 4d4116097deee37d9ea38c447401c29456000e78 | 2017-07-03T12:08:47Z | diff --git a/sklearn/base.py b/sklearn/base.py
index 6f59cea3c7ab7..f62c0308fb566 100644
--- a/sklearn/base.py
+++ b/sklearn/base.py
@@ -524,6 +524,29 @@ def score(self, X, y=None):
pass
+class OutlierMixin(object):
+ """Mixin class for all outlier detection estimators in scikit-learn."""
+ _estimator_type = "outlier_detector"
+
+ def fit_predict(self, X, y=None):
+ """Performs outlier detection on X.
+
+ Returns -1 for outliers and 1 for inliers.
+
+ Parameters
+ ----------
+ X : ndarray, shape (n_samples, n_features)
+ Input data.
+
+ Returns
+ -------
+ y : ndarray, shape (n_samples,)
+ 1 for inliers, -1 for outliers.
+ """
+        # override for transductive outlier detectors like LocalOutlierFactor
+ return self.fit(X).predict(X)
+
+
###############################################################################
class MetaEstimatorMixin(object):
"""Mixin class for all meta estimators in scikit-learn."""
@@ -562,3 +585,19 @@ def is_regressor(estimator):
True if estimator is a regressor and False otherwise.
"""
return getattr(estimator, "_estimator_type", None) == "regressor"
+
+
+def is_outlier_detector(estimator):
+ """Returns True if the given estimator is (probably) an outlier detector.
+
+ Parameters
+ ----------
+ estimator : object
+ Estimator object to test.
+
+ Returns
+ -------
+ out : bool
+ True if estimator is an outlier detector and False otherwise.
+ """
+ return getattr(estimator, "_estimator_type", None) == "outlier_detector"
diff --git a/sklearn/covariance/elliptic_envelope.py b/sklearn/covariance/elliptic_envelope.py
index 1d712207f0665..633f102fc006c 100644
--- a/sklearn/covariance/elliptic_envelope.py
+++ b/sklearn/covariance/elliptic_envelope.py
@@ -8,9 +8,10 @@
from . import MinCovDet
from ..utils.validation import check_is_fitted, check_array
from ..metrics import accuracy_score
+from ..base import OutlierMixin
-class EllipticEnvelope(MinCovDet):
+class EllipticEnvelope(MinCovDet, OutlierMixin):
"""An object for detecting outliers in a Gaussian distributed dataset.
Read more in the :ref:`User Guide <outlier_detection>`.
diff --git a/sklearn/ensemble/iforest.py b/sklearn/ensemble/iforest.py
index 4c59e48efc64c..37da6aa986ff8 100644
--- a/sklearn/ensemble/iforest.py
+++ b/sklearn/ensemble/iforest.py
@@ -17,6 +17,7 @@
from ..tree import ExtraTreeRegressor
from ..utils import check_random_state, check_array
from ..utils.validation import check_is_fitted
+from ..base import OutlierMixin
from .bagging import BaseBagging
@@ -25,7 +26,7 @@
INTEGER_TYPES = (numbers.Integral, np.integer)
-class IsolationForest(BaseBagging):
+class IsolationForest(BaseBagging, OutlierMixin):
"""Isolation Forest Algorithm
Return the anomaly score of each sample using the IsolationForest algorithm
diff --git a/sklearn/neighbors/lof.py b/sklearn/neighbors/lof.py
index 41e156b42ea81..f7f1a16ebeb28 100644
--- a/sklearn/neighbors/lof.py
+++ b/sklearn/neighbors/lof.py
@@ -9,6 +9,7 @@
from .base import NeighborsBase
from .base import KNeighborsMixin
from .base import UnsupervisedMixin
+from ..base import OutlierMixin
from ..utils.validation import check_is_fitted
from ..utils import check_array
@@ -16,7 +17,8 @@
__all__ = ["LocalOutlierFactor"]
-class LocalOutlierFactor(NeighborsBase, KNeighborsMixin, UnsupervisedMixin):
+class LocalOutlierFactor(NeighborsBase, KNeighborsMixin, UnsupervisedMixin,
+ OutlierMixin):
"""Unsupervised Outlier Detection using Local Outlier Factor (LOF)
The anomaly score of each sample is called Local Outlier Factor.
diff --git a/sklearn/svm/classes.py b/sklearn/svm/classes.py
index 2f2bd49be2598..7a7955d30d29e 100644
--- a/sklearn/svm/classes.py
+++ b/sklearn/svm/classes.py
@@ -2,7 +2,7 @@
import numpy as np
from .base import _fit_liblinear, BaseSVC, BaseLibSVM
-from ..base import BaseEstimator, RegressorMixin
+from ..base import BaseEstimator, RegressorMixin, OutlierMixin
from ..linear_model.base import LinearClassifierMixin, SparseCoefMixin, \
LinearModel
from ..utils import check_X_y
@@ -975,7 +975,7 @@ def __init__(self, nu=0.5, C=1.0, kernel='rbf', degree=3,
verbose=verbose, max_iter=max_iter, random_state=None)
-class OneClassSVM(BaseLibSVM):
+class OneClassSVM(BaseLibSVM, OutlierMixin):
"""Unsupervised Outlier Detection.
Estimate the support of a high-dimensional distribution.
| diff --git a/sklearn/utils/estimator_checks.py b/sklearn/utils/estimator_checks.py
index b079c37f7bea2..15aa360f6146b 100644
--- a/sklearn/utils/estimator_checks.py
+++ b/sklearn/utils/estimator_checks.py
@@ -18,6 +18,7 @@
from sklearn.utils.testing import assert_raise_message
from sklearn.utils.testing import assert_equal
from sklearn.utils.testing import assert_not_equal
+from sklearn.utils.testing import assert_almost_equal
from sklearn.utils.testing import assert_true
from sklearn.utils.testing import assert_false
from sklearn.utils.testing import assert_in
@@ -36,7 +37,8 @@
from sklearn.base import (clone, TransformerMixin, ClusterMixin,
- BaseEstimator, is_classifier, is_regressor)
+ BaseEstimator, is_classifier, is_regressor,
+ is_outlier_detector)
from sklearn.metrics import accuracy_score, adjusted_rand_score, f1_score
@@ -210,6 +212,20 @@ def _yield_clustering_checks(name, clusterer):
yield check_non_transformer_estimators_n_iter
+def _yield_outliers_checks(name, estimator):
+
+ # checks for all outlier detectors
+ yield check_outliers_fit_predict
+
+ # checks for estimators that can be used on a test set
+ if hasattr(estimator, 'predict'):
+ yield check_outliers_train
+ # test outlier detectors can handle non-array data
+ yield check_classifier_data_not_an_array
+ # test if NotFittedError is raised
+ yield check_estimators_unfitted
+
+
def _yield_all_checks(name, estimator):
for check in _yield_non_meta_checks(name, estimator):
yield check
@@ -225,6 +241,9 @@ def _yield_all_checks(name, estimator):
if isinstance(estimator, ClusterMixin):
for check in _yield_clustering_checks(name, estimator):
yield check
+ if is_outlier_detector(estimator):
+ for check in _yield_outliers_checks(name, estimator):
+ yield check
yield check_fit2d_predict1d
yield check_methods_subset_invariance
if name != 'GaussianProcess': # FIXME
@@ -1360,6 +1379,67 @@ def check_classifiers_train(name, classifier_orig):
assert_array_equal(np.argsort(y_log_prob), np.argsort(y_prob))
+def check_outliers_train(name, estimator_orig):
+ X, _ = make_blobs(n_samples=300, random_state=0)
+ X = shuffle(X, random_state=7)
+ n_samples, n_features = X.shape
+ estimator = clone(estimator_orig)
+ set_random_state(estimator)
+
+ # fit
+ estimator.fit(X)
+ # with lists
+ estimator.fit(X.tolist())
+
+ y_pred = estimator.predict(X)
+ assert y_pred.shape == (n_samples,)
+ assert y_pred.dtype.kind == 'i'
+ assert_array_equal(np.unique(y_pred), np.array([-1, 1]))
+
+ decision = estimator.decision_function(X)
+ assert decision.dtype == np.dtype('float')
+
+ score = estimator.score_samples(X)
+ assert score.dtype == np.dtype('float')
+
+ # raises error on malformed input for predict
+ assert_raises(ValueError, estimator.predict, X.T)
+
+ # decision_function agrees with predict
+ decision = estimator.decision_function(X)
+ assert decision.shape == (n_samples,)
+ dec_pred = (decision >= 0).astype(np.int)
+ dec_pred[dec_pred == 0] = -1
+ assert_array_equal(dec_pred, y_pred)
+
+ # raises error on malformed input for decision_function
+ assert_raises(ValueError, estimator.decision_function, X.T)
+
+ # decision_function is a translation of score_samples
+ y_scores = estimator.score_samples(X)
+ assert y_scores.shape == (n_samples,)
+ y_dec = y_scores - estimator.offset_
+ assert_array_equal(y_dec, decision)
+
+ # raises error on malformed input for score_samples
+ assert_raises(ValueError, estimator.score_samples, X.T)
+
+ # contamination parameter (not for OneClassSVM which has the nu parameter)
+ if hasattr(estimator, "contamination"):
+ # proportion of outliers equal to contamination parameter when not
+ # set to 'auto'
+ contamination = 0.1
+ estimator.set_params(contamination=contamination)
+ estimator.fit(X)
+ y_pred = estimator.predict(X)
+ assert_almost_equal(np.mean(y_pred != 1), contamination)
+
+ # raises error when contamination is a scalar and not in [0,1]
+ for contamination in [-0.5, 2.3]:
+ estimator.set_params(contamination=contamination)
+ assert_raises(ValueError, estimator.fit, X)
+
+
@ignore_warnings(category=(DeprecationWarning, FutureWarning))
def check_estimators_fit_returns_self(name, estimator_orig):
"""Check if self is returned when calling fit"""
@@ -1388,7 +1468,7 @@ def check_estimators_unfitted(name, estimator_orig):
therefore be adequately raised for that purpose.
"""
- # Common test for Regressors as well as Classifiers
+ # Common test for Regressors, Classifiers and Outlier detection estimators
X, y = _boston_subset()
est = clone(estimator_orig)
@@ -1997,3 +2077,37 @@ def check_decision_proba_consistency(name, estimator_orig):
a = estimator.predict_proba(X_test)[:, 1]
b = estimator.decision_function(X_test)
assert_array_equal(rankdata(a), rankdata(b))
+
+
+def check_outliers_fit_predict(name, estimator_orig):
+ # Check fit_predict for outlier detectors.
+
+ X, _ = make_blobs(n_samples=300, random_state=0)
+ X = shuffle(X, random_state=7)
+ n_samples, n_features = X.shape
+ estimator = clone(estimator_orig)
+
+ set_random_state(estimator)
+
+ y_pred = estimator.fit_predict(X)
+ assert y_pred.shape == (n_samples,)
+ assert y_pred.dtype.kind == 'i'
+ assert_array_equal(np.unique(y_pred), np.array([-1, 1]))
+
+ # check fit_predict = fit.predict when possible
+ if hasattr(estimator, 'predict'):
+ y_pred_2 = estimator.fit(X).predict(X)
+ assert_array_equal(y_pred, y_pred_2)
+
+ if hasattr(estimator, "contamination"):
+ # proportion of outliers equal to contamination parameter when not
+ # set to 'auto'
+ contamination = 0.1
+ estimator.set_params(contamination=contamination)
+ y_pred = estimator.fit_predict(X)
+ assert_almost_equal(np.mean(y_pred != 1), contamination)
+
+ # raises error when contamination is a scalar and not in [0,1]
+ for contamination in [-0.5, 2.3]:
+ estimator.set_params(contamination=contamination)
+ assert_raises(ValueError, estimator.fit_predict, X)
diff --git a/sklearn/utils/testing.py b/sklearn/utils/testing.py
index 6e2d9d5902add..be0d6620d741e 100644
--- a/sklearn/utils/testing.py
+++ b/sklearn/utils/testing.py
@@ -525,8 +525,8 @@ def uninstall_mldata_mock():
OTHER = ["Pipeline", "FeatureUnion", "GridSearchCV", "RandomizedSearchCV",
"SelectFromModel"]
-# some trange ones
-DONT_TEST = ['SparseCoder', 'EllipticEnvelope', 'DictVectorizer',
+# some strange ones
+DONT_TEST = ['SparseCoder', 'DictVectorizer',
'LabelBinarizer', 'LabelEncoder',
'MultiLabelBinarizer', 'TfidfTransformer',
'TfidfVectorizer', 'IsotonicRegression',
| [
{
"components": [
{
"doc": "Mixin class for all outlier detection estimators in scikit-learn.",
"lines": [
527,
547
],
"name": "OutlierMixin",
"signature": "class OutlierMixin(object):",
"type": "class"
},
{
"doc": "Pe... | [
"sklearn/utils/testing.py::sklearn.utils.testing.ignore_warnings"
] | [] | This is a feature request that requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
[MRG+1] Common tests for outlier detection estimators
<!--
Thanks for contributing a pull request! Please ensure you have taken a look at
the contribution guidelines: https://github.com/scikit-learn/scikit-learn/blob/master/CONTRIBUTING.md#Contributing-Pull-Requests
-->
#### Reference Issue
Addresses #8677
#### What does this implement/fix? Explain your changes.
This implements common tests for outlier detection estimators: `OneClassSVM`, `EllipticEnvelope`, `LocalOutlierFactor` and `IsolationForest`.
#### Any other comments?
`LocalOutlierFactor` is only meant for outlier detection (and not novelty detection) as it cannot (yet) be applied to unseen data. Therefore common tests for outlier detection estimators are split into two categories:
- tests for outlier detection, which basically consist of testing `fit_predict` (to be run on all estimators). Note that all estimators can be applied in the outlier detection context, and I therefore created an `OutlierMixin` implementing `fit_predict` for all outlier detection estimators.
- tests for novelty detection (to be run on all estimators except `LocalOutlierFactor`)
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sklearn/base.py]
(definition of OutlierMixin:)
class OutlierMixin(object):
"""Mixin class for all outlier detection estimators in scikit-learn."""
(definition of OutlierMixin.fit_predict:)
def fit_predict(self, X, y=None):
"""Performs outlier detection on X.
Returns -1 for outliers and 1 for inliers.
Parameters
----------
X : ndarray, shape (n_samples, n_features)
Input data.
Returns
-------
y : ndarray, shape (n_samples,)
1 for inliers, -1 for outliers."""
(definition of is_outlier_detector:)
def is_outlier_detector(estimator):
"""Returns True if the given estimator is (probably) an outlier detector.
Parameters
----------
estimator : object
Estimator object to test.
Returns
-------
out : bool
True if estimator is an outlier detector and False otherwise."""
[end of new definitions in sklearn/base.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 51407623e4f491f00e3b465626dd5c4b55860bd0 | ||
sympy__sympy-12827 | 12,827 | sympy/sympy | 1.1 | d2c3800fd3aaa226c0d37da84086530dd3e5abaf | 2017-06-28T14:49:46Z | diff --git a/sympy/combinatorics/fp_groups.py b/sympy/combinatorics/fp_groups.py
index da43d335afc0..882bc10d6307 100644
--- a/sympy/combinatorics/fp_groups.py
+++ b/sympy/combinatorics/fp_groups.py
@@ -7,7 +7,8 @@
from sympy.printing.defaults import DefaultPrinting
from sympy.utilities import public
from sympy.utilities.iterables import flatten
-from sympy.combinatorics.free_groups import FreeGroupElement, free_group, zero_mul_simp
+from sympy.combinatorics.free_groups import (FreeGroup, FreeGroupElement,
+ free_group, zero_mul_simp)
from sympy.combinatorics.coset_table import (CosetTable,
coset_enumeration_r,
coset_enumeration_c)
@@ -53,30 +54,35 @@ class FpGroup(DefaultPrinting):
is_FpGroup = True
is_PermutationGroup = False
- def __new__(cls, fr_grp, relators):
+ def __init__(self, fr_grp, relators):
relators = _parse_relators(relators)
# return the corresponding FreeGroup if no relators are specified
if not relators:
return fr_grp
- obj = object.__new__(cls)
- obj.free_group = fr_grp
- obj.relators = relators
- obj.generators = obj._generators()
- obj.dtype = type("FpGroupElement", (FpGroupElement,), {"group": obj})
+ self.free_group = fr_grp
+ self.relators = relators
+ self.generators = self._generators()
+ self.dtype = type("FpGroupElement", (FpGroupElement,), {"group": self})
# CosetTable instance on identity subgroup
- obj._coset_table = None
+ self._coset_table = None
# returns whether coset table on identity subgroup
# has been standardized
- obj._is_standardized = False
+ self._is_standardized = False
- obj._order = None
- obj._center = None
- return obj
+ self._order = None
+ self._center = None
def _generators(self):
return self.free_group.generators
+ @property
+ def identity(self):
+ return self.free_group.identity
+
+ def __contains__(self, g):
+ return g in self.free_group
+
def subgroup(self, gens, C=None):
'''
Return the subgroup generated by `gens` using the
@@ -279,6 +285,87 @@ def __str__(self):
__repr__ = __str__
+class FpSubgroup(DefaultPrinting):
+ '''
+ The class implementing a subgroup of an FpGroup or a FreeGroup
+ (only finite index subgroups are supported at this point). This
+ is to be used if one wishes to check if an element of the original
+ group belongs to the subgroup
+
+ '''
+ def __init__(self, G, gens):
+ super(FpSubgroup,self).__init__()
+ self.parent = G
+ self.generators = list(set([g for g in gens if g != G.identity]))
+ self._min_words = None #for use in __contains__
+ self.C = None
+
+ def __contains__(self, g):
+ if self._min_words is None:
+ gens = self.generators[:]
+ gens.extend([e**-1 for e in gens])
+ for w1 in gens:
+ for w2 in gens:
+ if w2**-1 == w1:
+ continue
+ if w2[len(w2)-1]**-1 == w1[0] and w2*w1 not in gens:
+ gens.append(w2*w1)
+ if w2[0]**-1 == w1[len(w1)-1] and w1*w2 not in gens:
+ gens.append(w1*w2)
+ self._min_words = gens
+
+ min_words = self._min_words
+ known = {} #to keep track of words
+
+ def _word_break(w):
+ if len(w) == 0:
+ return True
+ i = 0
+ while i < len(w):
+ i += 1
+ prefix = w.subword(0, i)
+ if prefix not in min_words:
+ continue
+ rest = w.subword(i, len(w))
+ if rest not in known:
+ known[rest] = _word_break(rest)
+ if known[rest]:
+ return True
+ return False
+
+ if _word_break(g):
+ return True
+ elif isinstance(self.parent, FreeGroup):
+ return False
+ else:
+ if self.C is None:
+ C = self.parent.coset_enumeration(self.generators)
+ self.C = C
+ i = 0
+ C = self.C
+ for j in range(len(g)):
+ i = C.table[i][C.A_dict[g[j]]]
+ return i == 0
+
+ def order(self):
+ if not self.generators:
+ return 1
+ if isinstance(self.parent, FreeGroup):
+ return S.Infinity
+ if self.C is None:
+ C = self.parent.coset_enumeration(self.generators)
+ self.C = C
+ # This is valid because `len(self.C.table)` (the index of the subgroup)
+ # will always be finite - otherwise coset enumeration doesn't terminate
+ return self.parent.order()/len(self.C.table)
+
+ def to_FpGroup(self):
+ if isinstance(self.parent, FreeGroup):
+ gen_syms = [('x_%d'%i) for i in range(len(self.generators))]
+ return free_group(', '.join(gen_syms))[0]
+ return self.parent.subgroup(C=self.C)
+
+
###############################################################################
# LOW INDEX SUBGROUPS #
###############################################################################
diff --git a/sympy/combinatorics/free_groups.py b/sympy/combinatorics/free_groups.py
index f5ce6a3eba15..d4af2025840a 100644
--- a/sympy/combinatorics/free_groups.py
+++ b/sympy/combinatorics/free_groups.py
@@ -135,6 +135,7 @@ class FreeGroup(DefaultPrinting):
is_group = True
is_FreeGroup = True
is_PermutationGroup = False
+ relators = tuple()
def __new__(cls, symbols):
symbols = tuple(_parse_symbols(symbols))
diff --git a/sympy/combinatorics/homomorphisms.py b/sympy/combinatorics/homomorphisms.py
new file mode 100644
index 000000000000..6501c835c4eb
--- /dev/null
+++ b/sympy/combinatorics/homomorphisms.py
@@ -0,0 +1,227 @@
+from __future__ import print_function, division
+
+from sympy.combinatorics.fp_groups import FpGroup, FpSubgroup
+from sympy.combinatorics.free_groups import FreeGroup, FreeGroupElement
+from sympy.combinatorics.perm_groups import PermutationGroup
+
+class GroupHomomorphism(object):
+ '''
+ A class representing group homomorphisms. Instantiate using `homomorphism()`.
+
+ References
+ ==========
+ [1] Holt, D., Eick, B. and O'Brien, E. (2005). Handbook of computational group theory.
+
+ '''
+
+ def __init__(self, domain, codomain, images):
+ self.domain = domain
+ self.codomain = codomain
+ self.images = images
+ self._inverses = None
+ self._kernel = None
+ self._image = None
+
+ def _invs(self):
+ '''
+ Return a dictionary with `{gen: inverse}` where `gen` is a rewriting
+ generator of `codomain` (e.g. strong generator for permutation groups)
+ and `inverse` is an element of its preimage
+
+ '''
+ image = self.image()
+ inverses = {}
+ for k in list(self.images.keys()):
+ v = self.images[k]
+ if not (v in inverses
+ or v.is_identity):
+ inverses[v] = k
+ gens = image.strong_gens
+ for g in gens:
+ if g in inverses or g.is_identity:
+ continue
+ w = self.domain.identity
+ for s in image._strong_gens_slp[g]:
+ if s in inverses:
+ w = inverses[s]*w
+ else:
+ w = inverses[s**-1]**-1*w
+ inverses[g] = w
+ return inverses
+
+ def invert(self, g):
+ '''
+ Return an element of the preimage of `g`
+
+ '''
+ if not isinstance(self.codomain, PermutationGroup):
+ raise NotImplementedError(
+ "Only elements of PermutationGroups can be inverted")
+ if self._inverses is None:
+ self._inverses = self._invs()
+ image = self.image()
+ w = self.domain.identity
+ for g in image.generator_product(g):
+ if g.is_identity:
+ continue
+ w = self._inverses[g]*w
+ return w
+
+ def kernel(self):
+ '''
+ Compute the kernel of `self`.
+
+ '''
+ if self._kernel is None:
+ self._kernel = self._compute_kernel()
+ return self._kernel
+
+ def _compute_kernel(self):
+ from sympy import S
+ G = self.domain
+ G_order = G.order()
+ if G_order == S.Infinity:
+ raise NotImplementedError(
+ "Kernel computation is not implemented for infinite groups")
+ gens = []
+ K = FpSubgroup(G, gens)
+ i = self.image().order()
+ while K.order()*i != G_order:
+ r = G.random_element()
+ k = r*self.invert(self(r))
+ if not k in K:
+ gens.append(k)
+ K = FpSubgroup(G, gens)
+ return K
+
+ def image(self):
+ '''
+ Compute the image of `self`.
+
+ '''
+ if self._image is None:
+ values = list(set(self.images.values()))
+ if isinstance(self.codomain, PermutationGroup):
+ self._image = self.codomain.subgroup(values)
+ elif isinstance(self.codomain, FpGroup):
+ self._image = FpSubgroup(self.codomain, values)
+ else:
+ self._image = FreeSubgroup(self.codomain, values)
+ return self._image
+
+ def _apply(self, elem):
+ '''
+ Apply `self` to `elem`.
+
+ '''
+ if not elem in self.domain.free_group:
+ raise ValueError("The supplied element doesn't belong to the domain")
+ if elem.is_identity:
+ return self.codomain.identity
+ else:
+ p = elem.array_form[0][1]
+ if p < 0:
+ g = elem[0]**-1
+ else:
+ g = elem[0]
+ return self.images[g]**p*self._apply(elem.subword(abs(p), len(elem)))
+
+ def __call__(self, elem):
+ return self._apply(elem)
+
+ def is_injective(self):
+ '''
+ Check if the homomorphism is injective
+
+ '''
+ return self.kernel().order() == 1
+
+ def is_surjective(self):
+ '''
+ Check if the homomorphism is surjective
+
+ '''
+ from sympy import S
+ im = self.image().order()
+ oth = self.codomain.order()
+ if im == S.Infinity and oth == S.Infinity:
+ return None
+ else:
+ return im == oth
+
+ def is_isomorphism(self):
+ '''
+ Check if `self` is an isomorphism.
+
+ '''
+ return self.is_injective() and self.is_surjective()
+
+ def is_trivial(self):
+ '''
+ Check is `self` is a trivial homomorphism, i.e. all elements
+ are mapped to the identity.
+
+ '''
+ return self.image().order() == 1
+
+def homomorphism(domain, codomain, gens, images=[]):
+ '''
+ Create (if possible) a group homomorphism from the group `domain`
+ to the group `codomain` defined by the images of the domain's
+ generators `gens`. `gens` and `images` can be either lists or tuples
+ of equal sizes. If `gens` is a proper subset of the group's generators,
+ the unspecified generators will be mapped to the identity. If the
+ images are not specified, a trivial homomorphism will be created.
+
+ If the given images of the generators do not define a homomorphism,
+ an exception is raised.
+
+ '''
+ if isinstance(domain, PermutationGroup):
+ raise NotImplementedError("Homomorphisms from permutation groups are not currently implemented")
+ elif not isinstance(domain, (FpGroup, FreeGroup)):
+ raise TypeError("The domain must be a group")
+ if not isinstance(codomain, (PermutationGroup, FpGroup, FreeGroup)):
+ raise TypeError("The codomain must be a group")
+
+ generators = domain.generators
+ if any([g not in generators for g in gens]):
+ raise ValueError("The supplied generators must be a subset of the domain's generators")
+ if any([g not in codomain for g in images]):
+ raise ValueError("The images must be elements of the codomain")
+
+ if images and len(images) != len(gens):
+ raise ValueError("The number of images must be equal to the number of generators")
+
+ gens = list(gens)
+ images = list(images)
+ images.extend([codomain.identity]*(len(generators)-len(images)))
+ gens.extend([g for g in generators if g not in gens])
+ images = dict(zip(gens,images))
+
+ if not _check_homomorphism(domain, images, codomain.identity):
+ raise ValueError("The given images do not define a homomorphism")
+ return GroupHomomorphism(domain, codomain, images)
+
+def _check_homomorphism(domain, images, identity):
+ rels = domain.relators
+ def _image(r):
+ if r.is_identity:
+ return identity
+ else:
+ w = identity
+ r_arr = r.array_form
+ i = 0
+ while i < len(r):
+ power = r_arr[i][1]
+ if r[i] in images:
+ w = w*images[r[i]]**power
+ else:
+ w = w*images[r[i]**-1]**power
+ i += abs(power)
+ return w
+
+ if any([not _image(r).is_identity for r in rels]):
+ return False
+ else:
+ return True
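The relator check performed by `_check_homomorphism` above can be illustrated with a small standalone sketch (hypothetical helper names, permutations written as tuples of images — this is not the sympy implementation): a candidate map on the generators extends to a homomorphism exactly when the image of every relator is the identity.

```python
# Minimal sketch of the relator check: for <a | a**3> mapped into S_3
# via a -> the 3-cycle (0 1 2), the image of the relator a**3 must be
# the identity permutation for the map to define a homomorphism.

def compose(p, q):
    # apply q first, then p; permutations as tuples of images
    return tuple(p[q[k]] for k in range(len(p)))

identity = (0, 1, 2)
a_image = (1, 2, 0)            # the 3-cycle (0 1 2)

w = identity
for _ in range(3):             # build the image of the relator a**3
    w = compose(a_image, w)

print(w == identity)           # True: the map extends to a homomorphism
```

Mapping `a` to a permutation of order not dividing 3 would make the final comparison fail, which is exactly the condition under which `homomorphism()` raises `ValueError`.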
diff --git a/sympy/combinatorics/perm_groups.py b/sympy/combinatorics/perm_groups.py
index 8952a28852d2..295f137d6869 100644
--- a/sympy/combinatorics/perm_groups.py
+++ b/sympy/combinatorics/perm_groups.py
@@ -158,8 +158,10 @@ def __new__(cls, *args, **kwargs):
# these attributes are assigned after running schreier_sims
obj._base = []
obj._strong_gens = []
+ obj._strong_gens_slp = []
obj._basic_orbits = []
obj._transversals = []
+ obj._transversal_slp = []
# these attributes are assigned after running _random_pr_init
obj._random_gens = []
@@ -1090,6 +1092,22 @@ def coset_factor(self, g, factor_index=False):
factors = [tr[i][factors[i]] for i in range(len(base))]
return factors
+ def generator_product(self, g):
+ '''
+ Return a list of strong generators `[s1, ..., sn]`
+ s.t `g = sn*...*s1`.
+
+ '''
+ if g in self.strong_gens:
+ return [g]
+ f = self.coset_factor(g, True)
+ product = []
+ for i, j in enumerate(f):
+ slp = self._transversal_slp[i][j]
+ for s in slp:
+ product.append(self.strong_gens[s])
+ return product
+
def coset_rank(self, g):
"""rank using Schreier-Sims representation
@@ -1192,6 +1210,14 @@ def degree(self):
"""
return self._degree
+ @property
+ def identity(self):
+ '''
+ Return the identity element of the permutation group.
+
+ '''
+ return _af_new(list(range(self.degree)))
+
@property
def elements(self):
"""Returns all the elements of the permutation group as a set
@@ -2628,10 +2654,18 @@ def _schreier_sims(self, base=None):
return
strong_gens_distr = _distribute_gens_by_base(base, strong_gens)
- basic_orbits, transversals = _orbits_transversals_from_bsgs(base,\
- strong_gens_distr)
+ basic_orbits, transversals, slps = _orbits_transversals_from_bsgs(base,\
+ strong_gens_distr, slp=True)
+
+ # rewrite the indices stored in slps in terms of strong_gens
+ for i, slp in enumerate(slps):
+ gens = strong_gens_distr[i]
+ for k in slp:
+ slp[k] = [strong_gens.index(gens[s]) for s in slp[k]]
+
self._transversals = transversals
self._basic_orbits = [sorted(x) for x in basic_orbits]
+ self._transversal_slp = slps
def schreier_sims_incremental(self, base=None, gens=None):
"""Extend a sequence of points and generating set to a base and strong
@@ -2696,6 +2730,7 @@ def schreier_sims_incremental(self, base=None, gens=None):
id_af = list(range(degree))
# handle the trivial group
if len(gens) == 1 and gens[0].is_Identity:
+ self._strong_gens_slp = {gens[0]: [gens[0]]}
return base, gens
# prevent side effects
_base, _gens = base[:], gens[:]
@@ -2712,13 +2747,16 @@ def schreier_sims_incremental(self, base=None, gens=None):
_base.append(new)
# distribute generators according to basic stabilizers
strong_gens_distr = _distribute_gens_by_base(_base, _gens)
+ strong_gens_slp = {}
# initialize the basic stabilizers, basic orbits and basic transversals
orbs = {}
transversals = {}
+ slps = {}
base_len = len(_base)
for i in range(base_len):
- transversals[i] = dict(_orbit_transversal(degree, strong_gens_distr[i],
- _base[i], pairs=True, af=True))
+ transversals[i], slps[i] = _orbit_transversal(degree, strong_gens_distr[i],
+ _base[i], pairs=True, af=True, slp=True)
+ transversals[i] = dict(transversals[i])
orbs[i] = list(transversals[i].keys())
# main loop: amend the stabilizer chain until we have generators
# for all stabilizers
@@ -2730,10 +2768,12 @@ def schreier_sims_incremental(self, base=None, gens=None):
# test the generators for being a strong generating set
db = {}
for beta, u_beta in list(transversals[i].items()):
- for gen in strong_gens_distr[i]:
+ for j, gen in enumerate(strong_gens_distr[i]):
gb = gen._array_form[beta]
u1 = transversals[i][gb]
g1 = _af_rmul(gen._array_form, u_beta)
+ slp = [(i, g) for g in slps[i][beta]]
+ slp = [(i, j)] + slp
if g1 != u1:
# test if the schreier generator is in the i+1-th
# would-be basic stabilizer
@@ -2743,7 +2783,11 @@ def schreier_sims_incremental(self, base=None, gens=None):
except KeyError:
u1_inv = db[gb] = _af_invert(u1)
schreier_gen = _af_rmul(u1_inv, g1)
- h, j = _strip_af(schreier_gen, _base, orbs, transversals, i)
+ u1_inv_slp = slps[i][gb][:]
+ u1_inv_slp.reverse()
+ u1_inv_slp = [(i, (g,)) for g in u1_inv_slp]
+ slp = u1_inv_slp + slp
+ h, j, slp = _strip_af(schreier_gen, _base, orbs, transversals, i, slp=slp, slps=slps)
if j <= base_len:
# new strong generator h at level j
y = False
@@ -2760,11 +2804,13 @@ def schreier_sims_incremental(self, base=None, gens=None):
# if a new strong generator is found, update the
# data structures and start over
h = _af_new(h)
+ strong_gens_slp[h] = slp
for l in range(i + 1, j):
strong_gens_distr[l].append(h)
- transversals[l] =\
- dict(_orbit_transversal(degree, strong_gens_distr[l],
- _base[l], pairs=True, af=True))
+ transversals[l], slps[l] =\
+ _orbit_transversal(degree, strong_gens_distr[l],
+ _base[l], pairs=True, af=True, slp=True)
+ transversals[l] = dict(transversals[l])
orbs[l] = list(transversals[l].keys())
i = j - 1
# continue main loop using the flag
@@ -2778,6 +2824,22 @@ def schreier_sims_incremental(self, base=None, gens=None):
i -= 1
# build the strong generating set
strong_gens = list(uniq(i for gens in strong_gens_distr for i in gens))
+
+ # rewrite the indices of strong_gens_slp in terms of the elements
+ # of strong_gens
+ for k in strong_gens_slp:
+ slp = strong_gens_slp[k]
+ for i in range(len(slp)):
+ s = slp[i]
+ if isinstance(s[1], tuple):
+ slp[i] = strong_gens_distr[s[0]][s[1][0]]**-1
+ else:
+ slp[i] = strong_gens_distr[s[0]][s[1]]
+ strong_gens_slp[k] = slp
+ # add the original generators
+ for g in _gens:
+ strong_gens_slp[g] = [strong_gens.index(g)]
+ self._strong_gens_slp = strong_gens_slp
return _base, strong_gens
def schreier_sims_random(self, base=None, gens=None, consec_succ=10,
@@ -3433,7 +3495,7 @@ def _orbits(degree, generators):
sorted_I = [i for i in sorted_I if i not in orb]
return orbs
-def _orbit_transversal(degree, generators, alpha, pairs, af=False):
+def _orbit_transversal(degree, generators, alpha, pairs, af=False, slp=False):
r"""Computes a transversal for the orbit of ``alpha`` as a set.
generators generators of the group ``G``
@@ -3448,6 +3510,11 @@ def _orbit_transversal(degree, generators, alpha, pairs, af=False):
if ``af`` is ``True``, the transversal elements are given in
array form.
+ If `slp` is `True`, a dictionary `{beta: slp_beta}` is returned
+ for `\beta \in Orb` where `slp_beta` is a list of indices of the
+ generators in `generators` s.t. if `slp_beta = [i_1 ... i_n]`
+ `g_\beta = generators[i_n]*...*generators[i_1]`.
+
Examples
========
@@ -3461,24 +3528,35 @@ def _orbit_transversal(degree, generators, alpha, pairs, af=False):
"""
tr = [(alpha, list(range(degree)))]
+ slp_dict = {alpha: []}
used = [False]*degree
used[alpha] = True
gens = [x._array_form for x in generators]
for x, px in tr:
+ px_slp = slp_dict[x]
for gen in gens:
temp = gen[x]
if used[temp] == False:
+ slp_dict[temp] = [gens.index(gen)] + px_slp
tr.append((temp, _af_rmul(gen, px)))
used[temp] = True
if pairs:
if not af:
tr = [(x, _af_new(y)) for x, y in tr]
- return tr
+ if not slp:
+ return tr
+ return tr, slp_dict
if af:
- return [y for _, y in tr]
+ tr = [y for _, y in tr]
+ if not slp:
+ return tr
+ return tr, slp_dict
- return [_af_new(y) for _, y in tr]
+ tr = [_af_new(y) for _, y in tr]
+ if not slp:
+ return tr
+ return tr, slp_dict
def _stabilizer(degree, generators, alpha):
r"""Return the stabilizer subgroup of ``alpha``.
diff --git a/sympy/combinatorics/permutations.py b/sympy/combinatorics/permutations.py
index af08ac7e0031..f24c4181c3e0 100644
--- a/sympy/combinatorics/permutations.py
+++ b/sympy/combinatorics/permutations.py
@@ -1924,6 +1924,10 @@ def is_Empty(self):
"""
return self.size == 0
+ @property
+ def is_identity(self):
+ return self.is_Identity
+
@property
def is_Identity(self):
"""
diff --git a/sympy/combinatorics/util.py b/sympy/combinatorics/util.py
index dbd93a94e399..a784879a3167 100644
--- a/sympy/combinatorics/util.py
+++ b/sympy/combinatorics/util.py
@@ -246,7 +246,7 @@ def _handle_precomputed_bsgs(base, strong_gens, transversals=None,
def _orbits_transversals_from_bsgs(base, strong_gens_distr,
- transversals_only=False):
+ transversals_only=False, slp=False):
"""
Compute basic orbits and transversals from a base and strong generating set.
@@ -262,6 +262,10 @@ def _orbits_transversals_from_bsgs(base, strong_gens_distr,
stabilizers
``transversals_only`` - a flag switching between returning only the
transversals/ both orbits and transversals
+ ``slp`` - if ``True``, return a list of dictionaries containing the
+ generator presentations of the elements of the transversals,
+ i.e. the list of indices of generators from `strong_gens_distr[i]`
+ such that their product is the relevant transversal element
Examples
========
@@ -288,17 +292,21 @@ def _orbits_transversals_from_bsgs(base, strong_gens_distr,
base_len = len(base)
degree = strong_gens_distr[0][0].size
transversals = [None]*base_len
+ slps = [None]*base_len
if transversals_only is False:
basic_orbits = [None]*base_len
for i in range(base_len):
- transversals[i] = dict(_orbit_transversal(degree, strong_gens_distr[i],
- base[i], pairs=True))
+ transversals[i], slps[i] = _orbit_transversal(degree, strong_gens_distr[i],
+ base[i], pairs=True, slp=True)
+ transversals[i] = dict(transversals[i])
if transversals_only is False:
basic_orbits[i] = list(transversals[i].keys())
if transversals_only:
return transversals
else:
- return basic_orbits, transversals
+ if not slp:
+ return basic_orbits, transversals
+ return basic_orbits, transversals, slps
def _remove_gens(base, strong_gens, basic_orbits=None, strong_gens_distr=None):
@@ -453,7 +461,7 @@ def _strip(g, base, orbits, transversals):
return _af_new(h), base_len + 1
-def _strip_af(h, base, orbits, transversals, j):
+def _strip_af(h, base, orbits, transversals, j, slp=[], slps={}):
"""
optimized _strip, with h, transversals and result in array form
if the stripped elements is the identity, it returns False, base_len + 1
@@ -466,12 +474,23 @@ def _strip_af(h, base, orbits, transversals, j):
if beta == base[i]:
continue
if beta not in orbits[i]:
- return h, i + 1
+ if not slp:
+ return h, i + 1
+ return h, i + 1, slp
u = transversals[i][beta]
if h == u:
- return False, base_len + 1
+ if not slp:
+ return False, base_len + 1
+ return False, base_len + 1, slp
h = _af_rmul(_af_invert(u), h)
- return h, base_len + 1
+ if slp:
+ u_slp = slps[i][beta][:]
+ u_slp.reverse()
+ u_slp = [(i, (g,)) for g in u_slp]
+ slp = u_slp + slp
+ if not slp:
+ return h, base_len + 1
+ return h, base_len + 1, slp
def _strong_gens_from_distr(strong_gens_distr):
| diff --git a/sympy/combinatorics/tests/test_homomorphisms.py b/sympy/combinatorics/tests/test_homomorphisms.py
new file mode 100644
index 000000000000..f422ee6a75c7
--- /dev/null
+++ b/sympy/combinatorics/tests/test_homomorphisms.py
@@ -0,0 +1,32 @@
+from sympy.combinatorics import Permutation
+from sympy.combinatorics.perm_groups import PermutationGroup
+from sympy.combinatorics.homomorphisms import homomorphism
+from sympy.combinatorics.free_groups import free_group
+from sympy.combinatorics.fp_groups import FpGroup
+from sympy.combinatorics.named_groups import AlternatingGroup
+
+def test_homomorphism():
+ F, a, b = free_group("a, b")
+ G = FpGroup(F, [a**3, b**3, (a*b)**2])
+
+ c = Permutation(3)(0, 1, 2)
+ d = Permutation(3)(1, 2, 3)
+ A = AlternatingGroup(4)
+ T = homomorphism(G, A, [a, b], [c, d])
+ assert T(a*b**2*a**-1) == c*d**2*c**-1
+ assert T.is_isomorphism()
+ assert T(T.invert(Permutation(3)(0, 2, 3))) == Permutation(3)(0, 2, 3)
+
+ T = homomorphism(G, AlternatingGroup(4), G.generators)
+ assert T.is_trivial()
+ assert T.kernel().order() == G.order()
+
+ F, a = free_group("a")
+ G = FpGroup(F, [a**8])
+ P = PermutationGroup([Permutation(0, 1, 2, 3), Permutation(0, 2)])
+ T = homomorphism(G, P, [a], [Permutation(0, 1, 2, 3)])
+ assert T.image().order() == 4
+ assert T(T.invert(Permutation(0, 2)(1, 3))) == Permutation(0, 2)(1, 3)
+
+ T = homomorphism(F, AlternatingGroup(4), F.generators, [c])
+ assert T.invert(c**2) == a**2
diff --git a/sympy/combinatorics/tests/test_perm_groups.py b/sympy/combinatorics/tests/test_perm_groups.py
index e1b31e73a1a3..16d6e1c77536 100644
--- a/sympy/combinatorics/tests/test_perm_groups.py
+++ b/sympy/combinatorics/tests/test_perm_groups.py
@@ -1,5 +1,6 @@
from sympy.core.compatibility import range
-from sympy.combinatorics.perm_groups import PermutationGroup
+from sympy.combinatorics.perm_groups import (PermutationGroup,
+ _orbit_transversal)
from sympy.combinatorics.named_groups import SymmetricGroup, CyclicGroup,\
DihedralGroup, AlternatingGroup, AbelianGroup, RubikGroup
from sympy.combinatorics.permutations import Permutation
@@ -240,6 +241,15 @@ def test_orbits():
[(0, Permutation([0, 1, 2])), (2, Permutation([2, 0, 1])),
(1, Permutation([1, 2, 0]))]
+ G = DihedralGroup(6)
+ transversal, slps = _orbit_transversal(G.degree, G.generators, 0, True, slp=True)
+ for i, t in transversal:
+ slp = slps[i]
+ w = G.identity
+ for s in slp:
+ w = G.generators[s]*w
+ assert w == t
+
a = Permutation(list(range(1, 100)) + [0])
G = PermutationGroup([a])
assert [min(o) for o in G.orbits()] == [0]
@@ -752,3 +762,13 @@ def test_subgroup():
G = PermutationGroup(Permutation(0,1,2), Permutation(0,2,3))
H = G.subgroup([Permutation(0,1,3)])
assert H.is_subgroup(G)
+
+def test_generator_product():
+ G = SymmetricGroup(5)
+ p = Permutation(0, 2, 3)(1, 4)
+ gens = G.generator_product(p)
+ assert all(g in G.strong_gens for g in gens)
+ w = G.identity
+ for g in gens:
+ w = g*w
+ assert w == p
| [
{
"components": [
{
"doc": "",
"lines": [
57,
74
],
"name": "FpGroup.__init__",
"signature": "def __init__(self, fr_grp, relators):",
"type": "function"
},
{
"doc": "",
"lines": [
80,
81
... | [
"test_orbits"
] | [
"test_has",
"test_generate",
"test_order",
"test_equality",
"test_stabilizer",
"test_center",
"test_centralizer",
"test_coset_rank",
"test_coset_factor",
"test_is_normal",
"test_eq",
"test_derived_subgroup",
"test_is_solvable",
"test_rubik1",
"test_direct_product",
"test_orbit_rep",
... | This is a feature request that requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
Implemented Group Homomorphisms
This PR implements group homomorphisms from `FpGroup`s to `PermutationGroup`s (with plans to extend it to all group combinations in the future).
`GroupHomomorphism` is a new class to be instantiated with the function `homomorphism(domain, codomain, gens, images)` which creates a homomorphism from `domain` to `codomain` defined by sending the elements of `gens` (generators of `domain`) to the elements of `images` (elements of `codomain`). The unspecified generators are mapped to the identity and if no images are given, a trivial homomorphism is created. If the definition doesn't extend to a homomorphism on the whole of `domain`, an exception is raised.
A new class `FpSubgroup` in `sympy.combinatorics.fp_groups` is implemented for handling subgroups of `FpGroup`s in a way that allows the use of words in the same generators as the original group. This is mainly for testing whether an element of the parent group is contained in the subgroup; the subgroup's order can also be calculated. The method `to_FpGroup()` returns an isomorphic `FpGroup` on different generators (the same as would be returned by `FpGroup`'s `subgroup()` method).
There is also a new method `generator_product()` for permutation groups which, given an element `g` of the group, returns a list of strong generators `[s_1, ..., s_n]` s.t. `g = s_n*...*s_1`. This is made possible by keeping track of the generators making up the elements of `basic_transversals` and storing their indices in a new attribute `_transversal_slp`.
----------
</request>
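The `_transversal_slp` bookkeeping described in the request can be sketched standalone (hypothetical names, permutations as tuples of images; the composition convention here may differ from sympy's): while growing an orbit, record for each orbit point the generator indices whose composition reproduces its transversal element.

```python
# Sketch: orbit transversal with straight-line programs (slp).  For each
# orbit point beta we store indices [i_1, ..., i_n] of generators whose
# composition gives the transversal element mapping alpha to beta.

def compose(p, q):
    # apply q first, then p; permutations as tuples of images
    return tuple(p[q[k]] for k in range(len(p)))

def orbit_transversal_slp(degree, gens, alpha):
    identity = tuple(range(degree))
    tr, slp, queue = {alpha: identity}, {alpha: []}, [alpha]
    while queue:
        x = queue.pop(0)
        for i, g in enumerate(gens):
            y = g[x]
            if y not in tr:
                tr[y] = compose(g, tr[x])   # new transversal element
                slp[y] = [i] + slp[x]       # remember how it was built
                queue.append(y)
    return tr, slp

gens = [(1, 2, 3, 0), (3, 2, 1, 0)]         # a 4-cycle and a reflection
tr, slp = orbit_transversal_slp(4, gens, 0)

for beta, word in slp.items():
    w = tuple(range(4))
    for i in word:                          # rebuild tr[beta] from its slp
        w = compose(w, gens[i])
    assert w == tr[beta] and tr[beta][0] == beta
```

This mirrors the invariant the test suite checks for `_orbit_transversal(..., slp=True)`: every transversal element factors as the recorded product of generators.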
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sympy/combinatorics/fp_groups.py]
(definition of FpGroup.__init__:)
def __init__(self, fr_grp, relators):
(definition of FpGroup.identity:)
def identity(self):
(definition of FpGroup.__contains__:)
def __contains__(self, g):
(definition of FpSubgroup:)
class FpSubgroup(DefaultPrinting):
"""The class implementing a subgroup of an FpGroup or a FreeGroup
(only finite index subgroups are supported at this point). This
is to be used if one wishes to check if an element of the original
group belongs to the subgroup"""
(definition of FpSubgroup.__init__:)
def __init__(self, G, gens):
(definition of FpSubgroup.__contains__:)
def __contains__(self, g):
(definition of FpSubgroup.__contains__._word_break:)
def _word_break(w):
(definition of FpSubgroup.order:)
def order(self):
(definition of FpSubgroup.to_FpGroup:)
def to_FpGroup(self):
[end of new definitions in sympy/combinatorics/fp_groups.py]
[start of new definitions in sympy/combinatorics/homomorphisms.py]
(definition of GroupHomomorphism:)
class GroupHomomorphism(object):
"""A class representing group homomorphisms. Instantiate using `homomorphism()`.
References
==========
[1] Holt, D., Eick, B. and O'Brien, E. (2005). Handbook of computational group theory."""
(definition of GroupHomomorphism.__init__:)
def __init__(self, domain, codomain, images):
(definition of GroupHomomorphism._invs:)
def _invs(self):
"""Return a dictionary with `{gen: inverse}` where `gen` is a rewriting
generator of `codomain` (e.g. strong generator for permutation groups)
and `inverse` is an element of its preimage"""
(definition of GroupHomomorphism.invert:)
def invert(self, g):
"""Return an element of the preimage of `g`"""
(definition of GroupHomomorphism.kernel:)
def kernel(self):
"""Compute the kernel of `self`."""
(definition of GroupHomomorphism._compute_kernel:)
def _compute_kernel(self):
(definition of GroupHomomorphism.image:)
def image(self):
"""Compute the image of `self`."""
(definition of GroupHomomorphism._apply:)
def _apply(self, elem):
"""Apply `self` to `elem`."""
(definition of GroupHomomorphism.__call__:)
def __call__(self, elem):
(definition of GroupHomomorphism.is_injective:)
def is_injective(self):
"""Check if the homomorphism is injective"""
(definition of GroupHomomorphism.is_surjective:)
def is_surjective(self):
"""Check if the homomorphism is surjective"""
(definition of GroupHomomorphism.is_isomorphism:)
def is_isomorphism(self):
"""Check if `self` is an isomorphism."""
(definition of GroupHomomorphism.is_trivial:)
def is_trivial(self):
"""Check if `self` is a trivial homomorphism, i.e. all elements
are mapped to the identity."""
(definition of homomorphism:)
def homomorphism(domain, codomain, gens, images=[]):
"""Create (if possible) a group homomorphism from the group `domain`
to the group `codomain` defined by the images of the domain's
generators `gens`. `gens` and `images` can be either lists or tuples
of equal sizes. If `gens` is a proper subset of the group's generators,
the unspecified generators will be mapped to the identity. If the
images are not specified, a trivial homomorphism will be created.
If the given images of the generators do not define a homomorphism,
an exception is raised."""
(definition of _check_homomorphism:)
def _check_homomorphism(domain, images, identity):
(definition of _check_homomorphism._image:)
def _image(r):
[end of new definitions in sympy/combinatorics/homomorphisms.py]
[start of new definitions in sympy/combinatorics/perm_groups.py]
(definition of PermutationGroup.generator_product:)
def generator_product(self, g):
"""Return a list of strong generators `[s1, ..., sn]`
s.t `g = sn*...*s1`."""
(definition of PermutationGroup.identity:)
def identity(self):
"""Return the identity element of the permutation group."""
[end of new definitions in sympy/combinatorics/perm_groups.py]
[start of new definitions in sympy/combinatorics/permutations.py]
(definition of Permutation.is_identity:)
def is_identity(self):
[end of new definitions in sympy/combinatorics/permutations.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | ec9e3c0436fbff934fa84e22bf07f1b3ef5bfac3 | ||
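As a standalone sanity check of the kernel/image behaviour described in the definitions above (hypothetical helpers, not sympy's implementation), the `a**8` example from the tests can be brute-forced: with `a` mapped to a 4-cycle, `a**k` lies in the kernel exactly when `k` is a multiple of 4, so the image has order 4 and the kernel has order 2.

```python
# Brute-force image and kernel sizes for <a | a**8> with
# a -> the 4-cycle (0 1 2 3).

def compose(p, q):
    # apply q first, then p; permutations as tuples of images
    return tuple(p[q[k]] for k in range(len(p)))

cycle = (1, 2, 3, 0)                      # the 4-cycle (0 1 2 3)
identity = (0, 1, 2, 3)

def power(p, k):
    w = identity
    for _ in range(k):
        w = compose(p, w)
    return w

images = [power(cycle, k) for k in range(8)]   # images of a**0 .. a**7
kernel = [k for k in range(8) if images[k] == identity]

print(len(set(images)))   # 4 -> image has order 4
print(kernel)             # [0, 4] -> kernel is {e, a**4}, order 2
```

This matches `T.image().order() == 4` in the test patch, and the general relation kernel order × image order = domain order (8 = 2 × 4).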
joke2k__faker-538 | 538 | joke2k/faker | null | 64b83d83f9e24154a73e4d292b6e1a1a53b83474 | 2017-06-18T19:03:51Z | diff --git a/faker/providers/ssn/pl_PL/__init__.py b/faker/providers/ssn/pl_PL/__init__.py
new file mode 100644
index 0000000000..378519e9da
--- /dev/null
+++ b/faker/providers/ssn/pl_PL/__init__.py
@@ -0,0 +1,66 @@
+# coding=utf-8
+
+from __future__ import unicode_literals
+from .. import Provider as SsnProvider
+from faker.providers.date_time import Provider as DateTimeProvider
+
+
+def checksum(digits):
+ """
+ Calculates and returns a control digit for a given list of digits based on the PESEL standard.
+ """
+ weights_for_check_digit = [9, 7, 3, 1, 9, 7, 3, 1, 9, 7]
+ check_digit = 0
+
+ for i in range(0, 10):
+ check_digit += weights_for_check_digit[i] * digits[i]
+
+ check_digit %= 10
+
+ return check_digit
+
+
+def calculate_month(birth_date):
+ """
+ Calculates and returns a month number based on the PESEL standard.
+ """
+ year = int(birth_date.strftime('%Y'))
+ month = int(birth_date.strftime('%m')) + ((int(year / 100) - 14) % 5) * 20
+
+ return month
+
+
+class Provider(SsnProvider):
+
+ @classmethod
+ def ssn(cls):
+ """
+ Returns an 11-character Polish national identity code (Public Electronic Census System,
+ Polish: Powszechny Elektroniczny System Ewidencji Ludności - PESEL).
+
+ It has the form YYMMDDZZZXQ, where YYMMDD is the date of birth (with century
+ encoded in month field), ZZZ is the personal identification number, X denotes sex
+ (even for females, odd for males) and Q is a parity number.
+
+ https://en.wikipedia.org/wiki/National_identification_number#Poland
+ """
+ birth_date = DateTimeProvider.date_time()
+
+ year_without_century = int(birth_date.strftime('%y'))
+ month = calculate_month(birth_date)
+ day = int(birth_date.strftime('%d'))
+
+ pesel_digits = [
+ int(year_without_century / 10),
+ year_without_century % 10,
+ int(month / 10),
+ month % 10,
+ int(day / 10), day % 10
+ ]
+
+ for _ in range(4):
+ pesel_digits.append(cls.random_digit())
+
+ pesel_digits.append(checksum(pesel_digits))
+
+ return ''.join(str(digit) for digit in pesel_digits)
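The checksum and century-in-month rules implemented above can be re-derived in a few standalone lines (a sketch mirroring the patch, not the faker API itself); the expected values below come from the accompanying test patch.

```python
# PESEL control digit: weighted sum of the first ten digits, mod 10.
def pesel_checksum(digits):
    weights = [9, 7, 3, 1, 9, 7, 3, 1, 9, 7]
    return sum(w * d for w, d in zip(weights, digits)) % 10

# The century is folded into the month field: +0 for the 1900s,
# +20 for the 2000s, +40 for the 2100s, +60 for the 2200s
# (and +80 for the 1800s).
def pesel_month(year, month):
    return month + ((year // 100 - 14) % 5) * 20

print(pesel_checksum([0, 5, 2, 6, 2, 8, 1, 2, 3, 6]))  # 5
print(pesel_month(2000, 12))                           # 32
print(pesel_month(1900, 1))                            # 1
```

A full 11-digit PESEL is valid when its last digit equals `pesel_checksum` of the first ten.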
| diff --git a/tests/providers/ssn.py b/tests/providers/ssn.py
index fe559113d1..9e9e66277b 100644
--- a/tests/providers/ssn.py
+++ b/tests/providers/ssn.py
@@ -4,11 +4,14 @@
import unittest
import re
+from datetime import datetime
from faker import Factory
from faker.providers.ssn.et_EE import Provider as EtProvider, checksum as et_checksum
from faker.providers.ssn.hr_HR import Provider as HrProvider, checksum as hr_checksum
from faker.providers.ssn.pt_BR import Provider as PtProvider, checksum as pt_checksum
+from faker.providers.ssn.pl_PL import (Provider as PlProvider, checksum as pl_checksum,
+ calculate_month as pl_calculate_mouth)
class TestEtEE(unittest.TestCase):
@@ -48,7 +51,6 @@ def test_ssn(self):
class TestPtBR(unittest.TestCase):
-
def setUp(self):
self.factory = Factory.create('pt_BR')
@@ -64,3 +66,37 @@ def test_pt_BR_cpf(self):
for _ in range(100):
self.assertTrue(re.search(r'\d{3}\.\d{3}\.\d{3}\-\d{2}', PtProvider.cpf()))
+
+class TestPlPL(unittest.TestCase):
+ """ Tests SSN in the pl_PL locale """
+
+ def setUp(self):
+ self.factory = Factory.create('pl_PL')
+
+ def test_ssn_checksum(self):
+ self.assertEqual(pl_checksum([0, 5, 2, 6, 2, 8, 1, 2, 3, 6]), 5)
+ self.assertEqual(pl_checksum([8, 5, 0, 5, 0, 8, 1, 5, 5, 8]), 7)
+ self.assertEqual(pl_checksum([4, 5, 1, 1, 1, 0, 0, 2, 4, 3]), 3)
+ self.assertEqual(pl_checksum([9, 1, 0, 7, 2, 6, 1, 4, 8, 7]), 3)
+ self.assertEqual(pl_checksum([8, 1, 1, 2, 1, 4, 1, 1, 8, 7]), 6)
+
+ def test_calculate_month(self):
+ self.assertEqual(pl_calculate_mouth(datetime.strptime('1 1 1900', '%m %d %Y')), 1)
+ self.assertEqual(pl_calculate_mouth(datetime.strptime('12 1 1900', '%m %d %Y')), 12)
+ self.assertEqual(pl_calculate_mouth(datetime.strptime('1 1 1999', '%m %d %Y')), 1)
+
+ self.assertEqual(pl_calculate_mouth(datetime.strptime('1 1 2000', '%m %d %Y')), 21)
+ self.assertEqual(pl_calculate_mouth(datetime.strptime('12 1 2000', '%m %d %Y')), 32)
+ self.assertEqual(pl_calculate_mouth(datetime.strptime('1 1 2099', '%m %d %Y')), 21)
+
+ self.assertEqual(pl_calculate_mouth(datetime.strptime('1 1 2100', '%m %d %Y')), 41)
+ self.assertEqual(pl_calculate_mouth(datetime.strptime('12 1 2100', '%m %d %Y')), 52)
+ self.assertEqual(pl_calculate_mouth(datetime.strptime('1 1 2199', '%m %d %Y')), 41)
+
+ self.assertEqual(pl_calculate_mouth(datetime.strptime('1 1 2200', '%m %d %Y')), 61)
+ self.assertEqual(pl_calculate_mouth(datetime.strptime('12 1 2200', '%m %d %Y')), 72)
+ self.assertEqual(pl_calculate_mouth(datetime.strptime('1 1 2299', '%m %d %Y')), 61)
+
+ def test_ssn(self):
+ for i in range(100):
+ self.assertTrue(re.search(r'^\d{11}$', PlProvider.ssn()))
| [
{
"components": [
{
"doc": "Calculates and returns a control digit for a given list of digits based on the PESEL standard.",
"lines": [
8,
20
],
"name": "checksum",
"signature": "def checksum(digits):",
"type": "function"
},
{
... | [
"tests/providers/ssn.py::TestEtEE::test_ssn",
"tests/providers/ssn.py::TestEtEE::test_ssn_checksum",
"tests/providers/ssn.py::TestHrHR::test_ssn",
"tests/providers/ssn.py::TestHrHR::test_ssn_checksum",
"tests/providers/ssn.py::TestPtBR::test_pt_BR_cpf",
"tests/providers/ssn.py::TestPtBR::test_pt_BR_ssn",
... | [] | This is a feature request that requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
Adding an implementation of the `ssn` provider for the pl_PL locale.
Reference: https://en.wikipedia.org/wiki/National_identification_number#Poland
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in faker/providers/ssn/pl_PL/__init__.py]
(definition of checksum:)
def checksum(digits):
"""Calculates and returns a control digit for a given list of digits based on the PESEL standard."""
(definition of calculate_month:)
def calculate_month(birth_date):
"""Calculates and returns a month number based on the PESEL standard."""
(definition of Provider:)
class Provider(SsnProvider): @classmethod
(definition of Provider.ssn:)
def ssn(cls):
"""Returns an 11-character Polish national identity code (Public Electronic Census System,
Polish: Powszechny Elektroniczny System Ewidencji Ludności - PESEL).
It has the form YYMMDDZZZXQ, where YYMMDD is the date of birth (with century
encoded in month field), ZZZ is the personal identification number, X denotes sex
(even for females, odd for males) and Q is a parity number.
https://en.wikipedia.org/wiki/National_identification_number#Poland"""
[end of new definitions in faker/providers/ssn/pl_PL/__init__.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 6edfdbf6ae90b0153309e3bf066aa3b2d16494a7 | ||
sympy__sympy-12767 | 12,767 | sympy/sympy | 1.5 | 5b589f797d5e8a8ed7c9878f71af29256da79651 | 2017-06-18T00:10:58Z | diff --git a/sympy/physics/wigner.py b/sympy/physics/wigner.py
index 4c41075c4a67..01d31aa4a320 100644
--- a/sympy/physics/wigner.py
+++ b/sympy/physics/wigner.py
@@ -1,3 +1,4 @@
+# -*- coding: utf-8 -*-
r"""
Wigner, Clebsch-Gordan, Racah, and Gaunt coefficients
@@ -12,9 +13,19 @@
References
~~~~~~~~~~
+.. [Regge58] 'Symmetry Properties of Clebsch-Gordan Coefficients',
+ T. Regge, Nuovo Cimento, Volume 10, pp. 544 (1958)
+.. [Regge59] 'Symmetry Properties of Racah Coefficients',
+ T. Regge, Nuovo Cimento, Volume 11, pp. 116 (1959)
+.. [Edmonds74] A. R. Edmonds. Angular momentum in quantum mechanics.
+ Investigations in physics, 4.; Investigations in physics, no. 4.
+ Princeton, N.J., Princeton University Press, 1957.
.. [Rasch03] J. Rasch and A. C. H. Yu, 'Efficient Storage Scheme for
Pre-calculated Wigner 3j, 6j and Gaunt Coefficients', SIAM
J. Sci. Comput. Volume 25, Issue 4, pp. 1416-1428 (2003)
+.. [Liberatodebrito82] 'FORTRAN program for the integral of three
+ spherical harmonics', A. Liberato de Brito,
+ Comput. Phys. Commun., Volume 25, pp. 81-85 (1982)
Credits and Copyright
~~~~~~~~~~~~~~~~~~~~~
@@ -29,12 +40,16 @@
- Jens Rasch (2009-05-31): updated to sage-4.0
+- Oscar Gerardo Lazo Arjona (2017-06-18): added Wigner D matrices
+
Copyright (C) 2008 Jens Rasch <jyr2000@gmail.com>
+
"""
from __future__ import print_function, division
-from sympy import (Integer, pi, sqrt, sympify, Dummy, S, Sum, Ynm,
- Function)
+from sympy import (Integer, pi, sqrt, sympify, Dummy, S, Sum, Ynm, zeros,
+ Function, sin, cos, exp, I, factorial, binomial,
+ Add, ImmutableMatrix)
from sympy.core.compatibility import range
# This list of precomputed factorials is needed to massively
@@ -146,14 +161,6 @@ def wigner_3j(j_1, j_2, j_3, m_1, m_2, m_3):
for finite precision arithmetic and only useful for a computer
algebra system [Rasch03]_.
- REFERENCES:
-
- .. [Regge58] 'Symmetry Properties of Clebsch-Gordan Coefficients',
- T. Regge, Nuovo Cimento, Volume 10, pp. 544 (1958)
-
- .. [Edmonds74] 'Angular Momentum in Quantum Mechanics',
- A. R. Edmonds, Princeton University Press (1974)
-
AUTHORS:
- Jens Rasch (2009-03-24): initial version
@@ -479,10 +486,6 @@ def wigner_6j(j_1, j_2, j_3, j_4, j_5, j_6, prec=None):
for finite precision arithmetic and only useful for a computer
algebra system [Rasch03]_.
- REFERENCES:
-
- .. [Regge59] 'Symmetry Properties of Racah Coefficients',
- T. Regge, Nuovo Cimento, Volume 11, pp. 116 (1959)
"""
res = (-1) ** int(j_1 + j_2 + j_4 + j_5) * \
racah(j_1, j_2, j_5, j_4, j_3, j_6, prec)
@@ -638,12 +641,6 @@ def gaunt(l_1, l_2, l_3, m_1, m_2, m_3, prec=None):
therefore unsuitable for finite precision arithmetic and only
useful for a computer algebra system [Rasch03]_.
- REFERENCES:
-
- .. [Liberatodebrito82] 'FORTRAN program for the integral of three
- spherical harmonics', A. Liberato de Brito,
- Comput. Phys. Commun., Volume 25, pp. 81-85 (1982)
-
AUTHORS:
- Jens Rasch (2009-03-24): initial version for Sage
@@ -753,3 +750,196 @@ def alpha(l,m,j,p,k):
return (-S(1))**(m+p) * Sum(Ynm(k, m+p, theta, phi) * alpha(l,m,j,p,k) / 2 \
*(k**2-j**2-l**2+k-j-l), (k, abs(l-j), l+j))
+
+
+def wigner_d_small(J, beta):
+ u"""Return the small Wigner d matrix for angular momentum J.
+
+ INPUT:
+
+ - ``J`` - An integer, half-integer, or sympy symbol for the total angular
+ momentum of the angular momentum space being rotated.
+
+ - ``beta`` - A real number representing the Euler angle of rotation about
+ the so-called line of nodes. See [Edmonds74]_.
+
+ OUTPUT:
+
+ A matrix representing the corresponding Euler angle rotation (in the basis
+ of eigenvectors of `J_z`).
+
+ .. math ::
+ \\mathcal{d}_{\\beta} = \\exp\\big( \\frac{i\\beta}{\\hbar} J_y\\big)
+
+ The components are calculated using the general form [Edmonds74]_,
+ equation 4.1.15.
+
+ Examples
+ ========
+
+ >>> from sympy import Integer, symbols, pi, pprint
+ >>> from sympy.physics.wigner import wigner_d_small
+ >>> half = 1/Integer(2)
+ >>> beta = symbols("beta", real=True)
+ >>> pprint(wigner_d_small(half, beta), use_unicode=True)
+ ⎡ ⎛β⎞ ⎛β⎞⎤
+ ⎢cos⎜─⎟ sin⎜─⎟⎥
+ ⎢ ⎝2⎠ ⎝2⎠⎥
+ ⎢ ⎥
+ ⎢ ⎛β⎞ ⎛β⎞⎥
+ ⎢-sin⎜─⎟ cos⎜─⎟⎥
+ ⎣ ⎝2⎠ ⎝2⎠⎦
+
+ >>> pprint(wigner_d_small(2*half, beta), use_unicode=True)
+ ⎡ 2⎛β⎞ ⎛β⎞ ⎛β⎞ 2⎛β⎞ ⎤
+ ⎢ cos ⎜─⎟ √2⋅sin⎜─⎟⋅cos⎜─⎟ sin ⎜─⎟ ⎥
+ ⎢ ⎝2⎠ ⎝2⎠ ⎝2⎠ ⎝2⎠ ⎥
+ ⎢ ⎥
+ ⎢ ⎛β⎞ ⎛β⎞ 2⎛β⎞ 2⎛β⎞ ⎛β⎞ ⎛β⎞⎥
+ ⎢-√2⋅sin⎜─⎟⋅cos⎜─⎟ - sin ⎜─⎟ + cos ⎜─⎟ √2⋅sin⎜─⎟⋅cos⎜─⎟⎥
+ ⎢ ⎝2⎠ ⎝2⎠ ⎝2⎠ ⎝2⎠ ⎝2⎠ ⎝2⎠⎥
+ ⎢ ⎥
+ ⎢ 2⎛β⎞ ⎛β⎞ ⎛β⎞ 2⎛β⎞ ⎥
+ ⎢ sin ⎜─⎟ -√2⋅sin⎜─⎟⋅cos⎜─⎟ cos ⎜─⎟ ⎥
+ ⎣ ⎝2⎠ ⎝2⎠ ⎝2⎠ ⎝2⎠ ⎦
+
+ From table 4 in [Edmonds74]_
+
+ >>> pprint(wigner_d_small(half, beta).subs({beta:pi/2}), use_unicode=True)
+ ⎡ √2 √2⎤
+ ⎢ ── ──⎥
+ ⎢ 2 2 ⎥
+ ⎢ ⎥
+ ⎢-√2 √2⎥
+ ⎢──── ──⎥
+ ⎣ 2 2 ⎦
+
+ >>> pprint(wigner_d_small(2*half, beta).subs({beta:pi/2}),
+ ... use_unicode=True)
+ ⎡ √2 ⎤
+ ⎢1/2 ── 1/2⎥
+ ⎢ 2 ⎥
+ ⎢ ⎥
+ ⎢-√2 √2 ⎥
+ ⎢──── 0 ── ⎥
+ ⎢ 2 2 ⎥
+ ⎢ ⎥
+ ⎢ -√2 ⎥
+ ⎢1/2 ──── 1/2⎥
+ ⎣ 2 ⎦
+
+ >>> pprint(wigner_d_small(3*half, beta).subs({beta:pi/2}),
+ ... use_unicode=True)
+ ⎡ √2 √6 √6 √2⎤
+ ⎢ ── ── ── ──⎥
+ ⎢ 4 4 4 4 ⎥
+ ⎢ ⎥
+ ⎢-√6 -√2 √2 √6⎥
+ ⎢──── ──── ── ──⎥
+ ⎢ 4 4 4 4 ⎥
+ ⎢ ⎥
+ ⎢ √6 -√2 -√2 √6⎥
+ ⎢ ── ──── ──── ──⎥
+ ⎢ 4 4 4 4 ⎥
+ ⎢ ⎥
+ ⎢-√2 √6 -√6 √2⎥
+ ⎢──── ── ──── ──⎥
+ ⎣ 4 4 4 4 ⎦
+
+ >>> pprint(wigner_d_small(4*half, beta).subs({beta:pi/2}),
+ ... use_unicode=True)
+ ⎡ √6 ⎤
+ ⎢1/4 1/2 ── 1/2 1/4⎥
+ ⎢ 4 ⎥
+ ⎢ ⎥
+ ⎢-1/2 -1/2 0 1/2 1/2⎥
+ ⎢ ⎥
+ ⎢ √6 √6 ⎥
+ ⎢ ── 0 -1/2 0 ── ⎥
+ ⎢ 4 4 ⎥
+ ⎢ ⎥
+ ⎢-1/2 1/2 0 -1/2 1/2⎥
+ ⎢ ⎥
+ ⎢ √6 ⎥
+ ⎢1/4 -1/2 ── -1/2 1/4⎥
+ ⎣ 4 ⎦
+
+ """
+ M = [J-i for i in range(2*J+1)]
+ d = zeros(2*J+1)
+ for i, Mi in enumerate(M):
+ for j, Mj in enumerate(M):
+
+ # We get the maximum and minimum value of sigma.
+ sigmamax = max([-Mi-Mj, J-Mj])
+ sigmamin = min([0, J-Mi])
+
+ dij = sqrt(factorial(J+Mi)*factorial(J-Mi) /
+ factorial(J+Mj)/factorial(J-Mj))
+ terms = [(-1)**(J-Mi-s) *
+ binomial(J+Mj, J-Mi-s) *
+ binomial(J-Mj, s) *
+ cos(beta/2)**(2*s+Mi+Mj) *
+ sin(beta/2)**(2*J-2*s-Mj-Mi)
+ for s in range(sigmamin, sigmamax+1)]
+
+ d[i, j] = dij*Add(*terms)
+
+ return ImmutableMatrix(d)
+
+
+def wigner_d(J, alpha, beta, gamma):
+ u"""Return the Wigner D matrix for angular momentum J.
+
+ INPUT:
+
+ - ``J`` - An integer, half-integer, or sympy symbol for the total angular
+ momentum of the angular momentum space being rotated.
+
+ - ``alpha``, ``beta``, ``gamma`` - Real numbers representing the Euler
+ angles of rotation about the so-called vertical, line of nodes, and
+ figure axes. See [Edmonds74]_.
+
+ OUTPUT:
+
+    A matrix representing the corresponding Euler angle rotation (in the basis
+ of eigenvectors of `J_z`).
+
+ .. math ::
+ \\mathcal{D}_{\\alpha \\beta \\gamma} =
+ \\exp\\big( \\frac{i\\alpha}{\\hbar} J_z\\big)
+ \\exp\\big( \\frac{i\\beta}{\\hbar} J_y\\big)
+ \\exp\\big( \\frac{i\\gamma}{\\hbar} J_z\\big)
+
+ The components are calculated using the general form [Edmonds74]_,
+ equation 4.1.12.
+
+ Examples
+ ========
+
+ The simplest possible example:
+
+ >>> from sympy.physics.wigner import wigner_d
+ >>> from sympy import Integer, symbols, pprint
+ >>> from sympy.physics.wigner import wigner_d_small
+ >>> half = 1/Integer(2)
+ >>> alpha, beta, gamma = symbols("alpha, beta, gamma", real=True)
+ >>> pprint(wigner_d(half, alpha, beta, gamma), use_unicode=True)
+ ⎡ ⅈ⋅α ⅈ⋅γ ⅈ⋅α -ⅈ⋅γ ⎤
+ ⎢ ─── ─── ─── ───── ⎥
+ ⎢ 2 2 ⎛β⎞ 2 2 ⎛β⎞ ⎥
+ ⎢ ℯ ⋅ℯ ⋅cos⎜─⎟ ℯ ⋅ℯ ⋅sin⎜─⎟ ⎥
+ ⎢ ⎝2⎠ ⎝2⎠ ⎥
+ ⎢ ⎥
+ ⎢ -ⅈ⋅α ⅈ⋅γ -ⅈ⋅α -ⅈ⋅γ ⎥
+ ⎢ ───── ─── ───── ───── ⎥
+ ⎢ 2 2 ⎛β⎞ 2 2 ⎛β⎞⎥
+ ⎢-ℯ ⋅ℯ ⋅sin⎜─⎟ ℯ ⋅ℯ ⋅cos⎜─⎟⎥
+ ⎣ ⎝2⎠ ⎝2⎠⎦
+
+ """
+ d = wigner_d_small(J, beta)
+ M = [J-i for i in range(2*J+1)]
+ D = [[exp(I*Mi*alpha)*d[i, j]*exp(I*Mj*gamma)
+ for j, Mj in enumerate(M)] for i, Mi in enumerate(M)]
+ return ImmutableMatrix(D)
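The summation implemented in `wigner_d_small` above (Edmonds eq. 4.1.15) can be cross-checked numerically without sympy. The sketch below is an independent stdlib re-derivation, not the patch's code: it doubles `J` into an integer `j2` so half-integer momenta stay exact, and it sums only over the support of the two binomials instead of relying on out-of-range binomials evaluating to zero.

```python
import math

def small_d(j2, beta):
    """Small Wigner d matrix, evaluated numerically; j2 = 2*J (doubled so
    half-integer J stays exact). Rows/columns run over M = J, J-1, ..., -J."""
    ms = [j2 - 2*i for i in range(j2 + 1)]        # doubled M values
    c, s = math.cos(beta / 2), math.sin(beta / 2)
    mat = []
    for a in ms:                                  # a = 2*Mi (row)
        row = []
        for b in ms:                              # b = 2*Mj (column)
            pref = math.sqrt(
                math.factorial((j2 + a) // 2) * math.factorial((j2 - a) // 2)
                / math.factorial((j2 + b) // 2) / math.factorial((j2 - b) // 2))
            lo = max(0, -(a + b) // 2)            # binomial support only
            hi = min((j2 - b) // 2, (j2 - a) // 2)
            total = 0.0
            for k in range(lo, hi + 1):
                total += ((-1) ** ((j2 - a) // 2 - k)
                          * math.comb((j2 + b) // 2, (j2 - a) // 2 - k)
                          * math.comb((j2 - b) // 2, k)
                          * c ** (2 * k + (a + b) // 2)
                          * s ** (j2 - 2 * k - (a + b) // 2))
            row.append(pref * total)
        mat.append(row)
    return mat
```

For `j2 = 1` (i.e. `J = 1/2`) at `beta = pi/2` this reproduces the `[[√2/2, √2/2], [-√2/2, √2/2]]` matrix shown in the doctests above.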
| diff --git a/sympy/physics/tests/test_clebsch_gordan.py b/sympy/physics/tests/test_clebsch_gordan.py
index 4b8eae51ea9b..6f5d9d379902 100644
--- a/sympy/physics/tests/test_clebsch_gordan.py
+++ b/sympy/physics/tests/test_clebsch_gordan.py
@@ -1,6 +1,7 @@
-from sympy import S, sqrt, pi, Dummy, Sum, Ynm, symbols
+from sympy import (S, sqrt, pi, Dummy, Sum, Ynm, symbols, exp, sin, cos, I,
+ Matrix)
from sympy.physics.wigner import (clebsch_gordan, wigner_9j, wigner_6j, gaunt,
- racah, dot_rot_grad_Ynm, Wigner3j, wigner_3j)
+ racah, dot_rot_grad_Ynm, Wigner3j, wigner_3j, wigner_d_small, wigner_d)
from sympy.core.numbers import Rational
# for test cases, refer : https://en.wikipedia.org/wiki/Table_of_Clebsch%E2%80%93Gordan_coefficients
@@ -316,3 +317,17 @@ def test_dot_rota_grad_SH():
assert dot_rot_grad_Ynm(3, 2, 3, 2, theta, phi).doit().expand() == \
-sqrt(70)*Ynm(4, 4, theta, phi)/(11*sqrt(pi)) + \
45*sqrt(182)*Ynm(6, 4, theta, phi)/(143*sqrt(pi))
+
+
+def test_wigner_d():
+ half = S(1)/2
+ alpha, beta, gamma = symbols("alpha, beta, gamma", real=True)
+ d = wigner_d_small(half, beta).subs({beta: pi/2})
+ d_ = Matrix([[1, 1], [-1, 1]])/sqrt(2)
+ assert d == d_
+
+ D = wigner_d(half, alpha, beta, gamma)
+ assert D[0, 0] == exp(I*alpha/2)*exp(I*gamma/2)*cos(beta/2)
+ assert D[0, 1] == exp(I*alpha/2)*exp(-I*gamma/2)*sin(beta/2)
+ assert D[1, 0] == -exp(-I*alpha/2)*exp(I*gamma/2)*sin(beta/2)
+ assert D[1, 1] == exp(-I*alpha/2)*exp(-I*gamma/2)*cos(beta/2)
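The relation the new tests assert — each entry of `D` is the small-d entry wrapped in the phases `exp(I*Mi*alpha)` and `exp(I*Mj*gamma)` — is easy to check numerically for `J = 1/2`, where the small d matrix is just `[[cos(β/2), sin(β/2)], [-sin(β/2), cos(β/2)]]`. A minimal stdlib sketch, independent of sympy:

```python
import cmath
import math

def wigner_D_half(alpha, beta, gamma):
    """Full D matrix for J = 1/2: phase factors wrapped around the small d."""
    c, s = math.cos(beta / 2), math.sin(beta / 2)
    d = [[c, s], [-s, c]]          # small Wigner d for J = 1/2
    m = [0.5, -0.5]                # M values, highest first
    return [[cmath.exp(1j * m[i] * alpha) * d[i][j] * cmath.exp(1j * m[j] * gamma)
             for j in range(2)] for i in range(2)]
```

Since the phases have unit modulus, each row keeps unit norm and the rotation stays unitary for any real Euler angles.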
| [
{
"components": [
{
"doc": "Return the small Wigner d matrix for angular momentum J.\n\nINPUT:\n\n- ``J`` - An integer, half-integer, or sympy symbol for the total angular\n momentum of the angular momentum space being rotated.\n\n- ``beta`` - A real number representing the Euler angle of rot... | [
"test_clebsch_gordan_docs",
"test_clebsch_gordan1",
"test_clebsch_gordan2",
"test_clebsch_gordan3",
"test_clebsch_gordan4",
"test_clebsch_gordan5",
"test_wigner",
"test_gaunt",
"test_racah",
"test_dot_rota_grad_SH"
] | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Added Wigner D matrices
Fixes #12766
Adding functions `wigner_d_small` and `wigner_d` to the module `physics.wigner`. Doctests are included.
<!-- BEGIN RELEASE NOTES -->
* physics.wigner
* added functions `wigner_d_small` and `wigner_d`.
<!-- END RELEASE NOTES -->
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in sympy/physics/wigner.py]
(definition of wigner_d_small:)
def wigner_d_small(J, beta):
"""Return the small Wigner d matrix for angular momentum J.
INPUT:
- ``J`` - An integer, half-integer, or sympy symbol for the total angular
momentum of the angular momentum space being rotated.
- ``beta`` - A real number representing the Euler angle of rotation about
the so-called line of nodes. See [Edmonds74]_.
OUTPUT:
A matrix representing the corresponding Euler angle rotation (in the basis
of eigenvectors of `J_z`).
.. math ::
\mathcal{d}_{\beta} = \exp\big( \frac{i\beta}{\hbar} J_y\big)
The components are calculated using the general form [Edmonds74]_,
equation 4.1.15.
Examples
========
>>> from sympy import Integer, symbols, pi, pprint
>>> from sympy.physics.wigner import wigner_d_small
>>> half = 1/Integer(2)
>>> beta = symbols("beta", real=True)
>>> pprint(wigner_d_small(half, beta), use_unicode=True)
⎡ ⎛β⎞ ⎛β⎞⎤
⎢cos⎜─⎟ sin⎜─⎟⎥
⎢ ⎝2⎠ ⎝2⎠⎥
⎢ ⎥
⎢ ⎛β⎞ ⎛β⎞⎥
⎢-sin⎜─⎟ cos⎜─⎟⎥
⎣ ⎝2⎠ ⎝2⎠⎦
>>> pprint(wigner_d_small(2*half, beta), use_unicode=True)
⎡ 2⎛β⎞ ⎛β⎞ ⎛β⎞ 2⎛β⎞ ⎤
⎢ cos ⎜─⎟ √2⋅sin⎜─⎟⋅cos⎜─⎟ sin ⎜─⎟ ⎥
⎢ ⎝2⎠ ⎝2⎠ ⎝2⎠ ⎝2⎠ ⎥
⎢ ⎥
⎢ ⎛β⎞ ⎛β⎞ 2⎛β⎞ 2⎛β⎞ ⎛β⎞ ⎛β⎞⎥
⎢-√2⋅sin⎜─⎟⋅cos⎜─⎟ - sin ⎜─⎟ + cos ⎜─⎟ √2⋅sin⎜─⎟⋅cos⎜─⎟⎥
⎢ ⎝2⎠ ⎝2⎠ ⎝2⎠ ⎝2⎠ ⎝2⎠ ⎝2⎠⎥
⎢ ⎥
⎢ 2⎛β⎞ ⎛β⎞ ⎛β⎞ 2⎛β⎞ ⎥
⎢ sin ⎜─⎟ -√2⋅sin⎜─⎟⋅cos⎜─⎟ cos ⎜─⎟ ⎥
⎣ ⎝2⎠ ⎝2⎠ ⎝2⎠ ⎝2⎠ ⎦
From table 4 in [Edmonds74]_
>>> pprint(wigner_d_small(half, beta).subs({beta:pi/2}), use_unicode=True)
⎡ √2 √2⎤
⎢ ── ──⎥
⎢ 2 2 ⎥
⎢ ⎥
⎢-√2 √2⎥
⎢──── ──⎥
⎣ 2 2 ⎦
>>> pprint(wigner_d_small(2*half, beta).subs({beta:pi/2}),
... use_unicode=True)
⎡ √2 ⎤
⎢1/2 ── 1/2⎥
⎢ 2 ⎥
⎢ ⎥
⎢-√2 √2 ⎥
⎢──── 0 ── ⎥
⎢ 2 2 ⎥
⎢ ⎥
⎢ -√2 ⎥
⎢1/2 ──── 1/2⎥
⎣ 2 ⎦
>>> pprint(wigner_d_small(3*half, beta).subs({beta:pi/2}),
... use_unicode=True)
⎡ √2 √6 √6 √2⎤
⎢ ── ── ── ──⎥
⎢ 4 4 4 4 ⎥
⎢ ⎥
⎢-√6 -√2 √2 √6⎥
⎢──── ──── ── ──⎥
⎢ 4 4 4 4 ⎥
⎢ ⎥
⎢ √6 -√2 -√2 √6⎥
⎢ ── ──── ──── ──⎥
⎢ 4 4 4 4 ⎥
⎢ ⎥
⎢-√2 √6 -√6 √2⎥
⎢──── ── ──── ──⎥
⎣ 4 4 4 4 ⎦
>>> pprint(wigner_d_small(4*half, beta).subs({beta:pi/2}),
... use_unicode=True)
⎡ √6 ⎤
⎢1/4 1/2 ── 1/2 1/4⎥
⎢ 4 ⎥
⎢ ⎥
⎢-1/2 -1/2 0 1/2 1/2⎥
⎢ ⎥
⎢ √6 √6 ⎥
⎢ ── 0 -1/2 0 ── ⎥
⎢ 4 4 ⎥
⎢ ⎥
⎢-1/2 1/2 0 -1/2 1/2⎥
⎢ ⎥
⎢ √6 ⎥
⎢1/4 -1/2 ── -1/2 1/4⎥
⎣ 4 ⎦"""
(definition of wigner_d:)
def wigner_d(J, alpha, beta, gamma):
"""Return the Wigner D matrix for angular momentum J.
INPUT:
- ``J`` - An integer, half-integer, or sympy symbol for the total angular
momentum of the angular momentum space being rotated.
- ``alpha``, ``beta``, ``gamma`` - Real numbers representing the Euler
angles of rotation about the so-called vertical, line of nodes, and
figure axes. See [Edmonds74]_.
OUTPUT:
A matrix representing the corresponding Euler angle rotation (in the basis
of eigenvectors of `J_z`).
.. math ::
\mathcal{D}_{\alpha \beta \gamma} =
\exp\big( \frac{i\alpha}{\hbar} J_z\big)
\exp\big( \frac{i\beta}{\hbar} J_y\big)
\exp\big( \frac{i\gamma}{\hbar} J_z\big)
The components are calculated using the general form [Edmonds74]_,
equation 4.1.12.
Examples
========
The simplest possible example:
>>> from sympy.physics.wigner import wigner_d
>>> from sympy import Integer, symbols, pprint
>>> from sympy.physics.wigner import wigner_d_small
>>> half = 1/Integer(2)
>>> alpha, beta, gamma = symbols("alpha, beta, gamma", real=True)
>>> pprint(wigner_d(half, alpha, beta, gamma), use_unicode=True)
⎡ ⅈ⋅α ⅈ⋅γ ⅈ⋅α -ⅈ⋅γ ⎤
⎢ ─── ─── ─── ───── ⎥
⎢ 2 2 ⎛β⎞ 2 2 ⎛β⎞ ⎥
⎢ ℯ ⋅ℯ ⋅cos⎜─⎟ ℯ ⋅ℯ ⋅sin⎜─⎟ ⎥
⎢ ⎝2⎠ ⎝2⎠ ⎥
⎢ ⎥
⎢ -ⅈ⋅α ⅈ⋅γ -ⅈ⋅α -ⅈ⋅γ ⎥
⎢ ───── ─── ───── ───── ⎥
⎢ 2 2 ⎛β⎞ 2 2 ⎛β⎞⎥
⎢-ℯ ⋅ℯ ⋅sin⎜─⎟ ℯ ⋅ℯ ⋅cos⎜─⎟⎥
⎣ ⎝2⎠ ⎝2⎠⎦"""
[end of new definitions in sympy/physics/wigner.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | Here is the discussion in the issues of the pull request.
<issues>
Add Wigner D matrices
We should have [Wigner D matrices](https://en.wikipedia.org/wiki/Wigner_D-matrix) in sympy. They fit nicely in the physics.wigner module. I have functions `wigner_d_small` and `wigner_d` as defined in [1] that I would like to add.
[1] [A. R. Edmonds. Angular momentum in quantum mechanics](http://press.princeton.edu/titles/478.html). Investigations
in physics, 4.; Investigations in physics, no. 4. Princeton, N.J.,
Princeton University Press, 1957.
----------
--------------------
</issues> | c72f122f67553e1af930bac6c35732d2a0bbb776 | |
joke2k__faker-526 | 526 | joke2k/faker | null | 68c30528959451af2bc27b76a51636dfaf266b41 | 2017-06-02T19:25:35Z | diff --git a/faker/providers/date_time/__init__.py b/faker/providers/date_time/__init__.py
index a1e6fff3e6..f8d344ae1b 100644
--- a/faker/providers/date_time/__init__.py
+++ b/faker/providers/date_time/__init__.py
@@ -361,6 +361,62 @@ def date_time_between(cls, start_date='-30y', end_date='now', tzinfo=None):
timestamp = random.randint(start_date, end_date)
return datetime.fromtimestamp(timestamp, tzinfo)
+ @classmethod
+ def future_datetime(cls, end_date='+30d', tzinfo=None):
+ """
+ Get a DateTime object based on a random date between now and a given date.
+ Accepts date strings that can be recognized by strtotime().
+
+ :param end_date Defaults to "+30d"
+ :param tzinfo: timezone, instance of datetime.tzinfo subclass
+ :example DateTime('1999-02-02 11:42:52')
+ :return DateTime
+ """
+ return cls.date_time_between(
+ start_date=timedelta(seconds=1), end_date=end_date, tzinfo=tzinfo,
+ )
+
+ @classmethod
+ def future_date(cls, end_date='+30d', tzinfo=None):
+ """
+ Get a Date object based on a random date between now and a given date.
+ Accepts date strings that can be recognized by strtotime().
+
+ :param end_date Defaults to "+30d"
+ :param tzinfo: timezone, instance of datetime.tzinfo subclass
+        :example Date('1999-02-02')
+        :return Date
+ """
+ return cls.future_datetime(end_date=end_date, tzinfo=tzinfo).date()
+
+ @classmethod
+ def past_datetime(cls, start_date='-30d', tzinfo=None):
+ """
+ Get a DateTime object based on a random date between a given date and now.
+ Accepts date strings that can be recognized by strtotime().
+
+ :param start_date Defaults to "-30d"
+ :param tzinfo: timezone, instance of datetime.tzinfo subclass
+ :example DateTime('1999-02-02 11:42:52')
+ :return DateTime
+ """
+ return cls.date_time_between(
+ start_date=start_date, end_date=timedelta(seconds=-1), tzinfo=tzinfo,
+ )
+
+ @classmethod
+ def past_date(cls, start_date='-30d', tzinfo=None):
+ """
+ Get a Date object based on a random date between a given date and now.
+ Accepts date strings that can be recognized by strtotime().
+
+ :param start_date Defaults to "-30d"
+ :param tzinfo: timezone, instance of datetime.tzinfo subclass
+        :example Date('1999-02-02')
+        :return Date
+ """
+ return cls.past_datetime(start_date=start_date, tzinfo=tzinfo).date()
+
@classmethod
def date_time_between_dates(cls, datetime_start=None, datetime_end=None, tzinfo=None):
"""
| diff --git a/tests/providers/date_time.py b/tests/providers/date_time.py
index 20079207d1..f31e15a5b5 100644
--- a/tests/providers/date_time.py
+++ b/tests/providers/date_time.py
@@ -1,17 +1,18 @@
# coding: utf-8
-
from __future__ import unicode_literals
+from datetime import date, datetime
import unittest
+
from faker import Factory
+from faker.providers.date_time import Provider as DatetimeProvider
from .. import string_types
-class TestHuHU(unittest.TestCase):
- """ Tests date_time in hu_HU locale. """
+class TestDateTime(unittest.TestCase):
def setUp(self):
- self.factory = Factory.create('hu_HU')
+ self.factory = Factory.create()
def test_day(self):
day = self.factory.day_of_week()
@@ -21,6 +22,27 @@ def test_month(self):
month = self.factory.month()
assert isinstance(month, string_types)
+ def test_past_datetime(self):
+ past_datetime = self.factory.past_datetime()
+ self.assertTrue(past_datetime < datetime.now())
+
+ def test_past_date(self):
+ past_date = self.factory.past_date()
+ self.assertTrue(past_date < date.today())
+
+ def test_future_datetime(self):
+ future_datetime, now = self.factory.future_datetime(), datetime.now()
+ self.assertTrue(future_datetime > now)
+
+ def test_future_date(self):
+ future_date = self.factory.future_date()
+ self.assertTrue(future_date > date.today())
+
+ def test_parse_date_time(self):
+ timestamp = DatetimeProvider._parse_date_time('+30d')
+ now = DatetimeProvider._parse_date_time('now')
+ self.assertTrue(timestamp > now)
+
class TestPlPL(unittest.TestCase):
| [
{
"components": [
{
"doc": "Get a DateTime object based on a random date between now and a given date.\nAccepts date strings that can be recognized by strtotime().\n\n:param end_date Defaults to \"+30d\"\n:param tzinfo: timezone, instance of datetime.tzinfo subclass\n:example DateTime('1999-02-02 ... | [
"tests/providers/date_time.py::TestDateTime::test_future_date",
"tests/providers/date_time.py::TestDateTime::test_future_datetime",
"tests/providers/date_time.py::TestDateTime::test_past_date",
"tests/providers/date_time.py::TestDateTime::test_past_datetime"
] | [
"tests/providers/date_time.py::TestDateTime::test_day",
"tests/providers/date_time.py::TestDateTime::test_month",
"tests/providers/date_time.py::TestDateTime::test_parse_date_time",
"tests/providers/date_time.py::TestPlPL::test_day",
"tests/providers/date_time.py::TestPlPL::test_month"
] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Add `future_date`, `future_datetime`, `past_date` and `past_datetime`
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in faker/providers/date_time/__init__.py]
(definition of Provider.future_datetime:)
def future_datetime(cls, end_date='+30d', tzinfo=None):
"""Get a DateTime object based on a random date between now and a given date.
Accepts date strings that can be recognized by strtotime().
:param end_date Defaults to "+30d"
:param tzinfo: timezone, instance of datetime.tzinfo subclass
:example DateTime('1999-02-02 11:42:52')
:return DateTime"""
(definition of Provider.future_date:)
def future_date(cls, end_date='+30d', tzinfo=None):
"""Get a Date object based on a random date between now and a given date.
Accepts date strings that can be recognized by strtotime().
:param end_date Defaults to "+30d"
:param tzinfo: timezone, instance of datetime.tzinfo subclass
:example Date('1999-02-02')
:return Date"""
(definition of Provider.past_datetime:)
def past_datetime(cls, start_date='-30d', tzinfo=None):
"""Get a DateTime object based on a random date between a given date and now.
Accepts date strings that can be recognized by strtotime().
:param start_date Defaults to "-30d"
:param tzinfo: timezone, instance of datetime.tzinfo subclass
:example DateTime('1999-02-02 11:42:52')
:return DateTime"""
(definition of Provider.past_date:)
def past_date(cls, start_date='-30d', tzinfo=None):
"""Get a Date object based on a random date between a given date and now.
Accepts date strings that can be recognized by strtotime().
:param start_date Defaults to "-30d"
:param tzinfo: timezone, instance of datetime.tzinfo subclass
:example Date('1999-02-02')
:return Date"""
[end of new definitions in faker/providers/date_time/__init__.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 6edfdbf6ae90b0153309e3bf066aa3b2d16494a7 | ||
joke2k__faker-520 | 520 | joke2k/faker | null | 0ee26313099392f8b347c323d3c8eb294505ab58 | 2017-05-29T07:27:56Z | diff --git a/faker/providers/date_time/pl_PL/__init__.py b/faker/providers/date_time/pl_PL/__init__.py
new file mode 100644
index 0000000000..5d71a5c459
--- /dev/null
+++ b/faker/providers/date_time/pl_PL/__init__.py
@@ -0,0 +1,41 @@
+# coding: utf-8
+
+from __future__ import unicode_literals
+
+from .. import Provider as DateTimeProvider
+
+
+class Provider(DateTimeProvider):
+
+ @classmethod
+ def day_of_week(cls):
+ day = cls.date('%w')
+ DAY_NAMES = {
+ '0': 'poniedziałek',
+ '1': 'wtorek',
+ '2': 'środa',
+ '3': 'czwartek',
+ '4': 'piątek',
+ '5': 'sobota',
+ '6': 'niedziela',
+ }
+ return DAY_NAMES[day]
+
+ @classmethod
+ def month_name(cls):
+ month = cls.month()
+ MONTH_NAMES = {
+ '01': 'styczeń',
+ '02': 'luty',
+ '03': 'marzec',
+ '04': 'kwiecień',
+ '05': 'maj',
+ '06': 'czerwiec',
+ '07': 'lipiec',
+ '08': 'sierpień',
+ '09': 'wrzesień',
+ '10': 'październik',
+ '11': 'listopad',
+ '12': 'grudzień'
+ }
+ return MONTH_NAMES[month]
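The `cls.date('%w')` call above yields a `strftime`-style weekday digit, and in that convention `'0'` is Sunday, not Monday — so a mapping that places Monday at `'0'` labels every day one off. A minimal stdlib check of the Sunday-first convention, using the same Polish names:

```python
from datetime import date

# strftime('%w'): '0' = Sunday ... '6' = Saturday
DAY_NAMES_PL = {
    '0': 'niedziela', '1': 'poniedziałek', '2': 'wtorek', '3': 'środa',
    '4': 'czwartek', '5': 'piątek', '6': 'sobota',
}

def polish_day_name(d):
    """Polish weekday name via the %w digit."""
    return DAY_NAMES_PL[d.strftime('%w')]
```

2017-05-29 (this PR's creation date) was a Monday, so `%w` gives `'1'` for it and `'0'` for the Sunday before.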
| diff --git a/tests/providers/date_time.py b/tests/providers/date_time.py
index 99e02ea46f..c4910c2d00 100644
--- a/tests/providers/date_time.py
+++ b/tests/providers/date_time.py
@@ -6,6 +6,7 @@
from faker import Factory
from .. import string_types
+
class TestHuHU(unittest.TestCase):
""" Tests date_time in hu_HU locale. """
@@ -20,3 +21,41 @@ def test_month(self):
month = self.factory.month()
assert isinstance(month, string_types)
+
+class TestPlPL(unittest.TestCase):
+
+ DAY_NAMES = (
+ 'poniedziałek',
+ 'wtorek',
+ 'środa',
+ 'czwartek',
+ 'piątek',
+ 'sobota',
+ 'niedziela',
+ )
+
+ MONTH_NAMES = (
+ 'styczeń',
+ 'luty',
+ 'marzec',
+ 'kwiecień',
+ 'maj',
+ 'czerwiec',
+ 'lipiec',
+ 'sierpień',
+ 'wrzesień',
+ 'październik',
+ 'listopad',
+ 'grudzień'
+ )
+
+ def setUp(self):
+ self.factory = Factory.create('pl_PL')
+
+ def test_day(self):
+ day = self.factory.day_of_week()
+ assert day in self.DAY_NAMES
+
+ def test_month(self):
+ month = self.factory.month_name()
+ assert month in self.MONTH_NAMES
| [
{
"components": [
{
"doc": "",
"lines": [
8,
41
],
"name": "Provider",
"signature": "class Provider(DateTimeProvider): @classmethod",
"type": "class"
},
{
"doc": "",
"lines": [
11,
22
... | [
"tests/providers/date_time.py::TestPlPL::test_day",
"tests/providers/date_time.py::TestPlPL::test_month"
] | [
"tests/providers/date_time.py::TestHuHU::test_day",
"tests/providers/date_time.py::TestHuHU::test_month"
] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
DateTime provider for pl_PL
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in faker/providers/date_time/pl_PL/__init__.py]
(definition of Provider:)
class Provider(DateTimeProvider): @classmethod
(definition of Provider.day_of_week:)
def day_of_week(cls):
(definition of Provider.month_name:)
def month_name(cls):
[end of new definitions in faker/providers/date_time/pl_PL/__init__.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 6edfdbf6ae90b0153309e3bf066aa3b2d16494a7 | ||
joke2k__faker-518 | 518 | joke2k/faker | null | 0ee26313099392f8b347c323d3c8eb294505ab58 | 2017-05-28T16:03:29Z | diff --git a/README.rst b/README.rst
index 32de05e492..ada799580f 100644
--- a/README.rst
+++ b/README.rst
@@ -136,6 +136,7 @@ Included localized providers:
- `en\_US <https://faker.readthedocs.io/en/master/locales/en_US.html>`__ - English (United States)
- `es\_ES <https://faker.readthedocs.io/en/master/locales/es_ES.html>`__ - Spanish (Spain)
- `es\_MX <https://faker.readthedocs.io/en/master/locales/es_MX.html>`__ - Spanish (Mexico)
+- `et\_EE <https://faker.readthedocs.io/en/master/locales/et_EE.html>`__ - Estonian
- `fa\_IR <https://faker.readthedocs.io/en/master/locales/fa_IR.html>`__ - Persian (Iran)
- `fi\_FI <https://faker.readthedocs.io/en/master/locales/fi_FI.html>`__ - Finnish
- `fr\_FR <https://faker.readthedocs.io/en/master/locales/fr_FR.html>`__ - French
diff --git a/docs/index.rst b/docs/index.rst
index b1d8c0976e..44cac1aeb8 100644
--- a/docs/index.rst
+++ b/docs/index.rst
@@ -141,6 +141,7 @@ Included localized providers:
- `en\_US <https://faker.readthedocs.io/en/master/locales/en_US.html>`__ - English (United States)
- `es\_ES <https://faker.readthedocs.io/en/master/locales/es_ES.html>`__ - Spanish (Spain)
- `es\_MX <https://faker.readthedocs.io/en/master/locales/es_MX.html>`__ - Spanish (Mexico)
+- `et\_EE <https://faker.readthedocs.io/en/master/locales/et_EE.html>`__ - Estonian
- `fa\_IR <https://faker.readthedocs.io/en/master/locales/fa_IR.html>`__ - Persian (Iran)
- `fi\_FI <https://faker.readthedocs.io/en/master/locales/fi_FI.html>`__ - Finnish
- `fr\_FR <https://faker.readthedocs.io/en/master/locales/fr_FR.html>`__ - French
diff --git a/faker/providers/person/et_EE/__init__.py b/faker/providers/person/et_EE/__init__.py
new file mode 100644
index 0000000000..538897c4fc
--- /dev/null
+++ b/faker/providers/person/et_EE/__init__.py
@@ -0,0 +1,190 @@
+# coding=utf-8
+from __future__ import unicode_literals
+from .. import Provider as PersonProvider
+
+
+class Provider(PersonProvider):
+ # https://en.wikipedia.org/wiki/Demographics_of_Estonia#Ethnic_groups
+ # Main population groups in Estonia are Estonians and ethnic Russians:
+ # About 70% of the population are Estonians and about 25% are Russians
+ est_rat = 0.7
+ rus_rat = 1.0 - est_rat
+ formats = {'{{first_name_est}} {{last_name_est}}': est_rat,
+ '{{first_name_rus}} {{last_name_rus}}': rus_rat}
+
+ formats_male = {'{{first_name_male_est}} {{last_name_est}}': est_rat,
+ '{{first_name_male_rus}} {{last_name_rus}}': rus_rat}
+ formats_female = {'{{first_name_female_est}} {{last_name_est}}': est_rat,
+ '{{first_name_female_rus}} {{last_name_rus}}': rus_rat}
+
+ prefixes_neutral = ('doktor', 'dr', 'prof')
+ prefixes_male = ('härra', 'hr') + prefixes_neutral
+ prefixes_female = ('proua', 'pr') + prefixes_neutral
+ prefixes = set(prefixes_male + prefixes_female)
+
+ suffixes = ('PhD', 'MSc', 'BSc')
+
+ # source: http://www.stat.ee/public/apps/nimed/TOP
+ # TOP 50 male names in 2017 according to the Statistics Estonia
+ first_names_male_est = ('Aivar', 'Aleksander', 'Alexander', 'Andres',
+ 'Andrus', 'Ants', 'Indrek', 'Jaan', 'Jaanus',
+ 'Jüri', 'Kristjan', 'Marek', 'Margus', 'Marko',
+ 'Martin', 'Mati', 'Meelis', 'Mihkel', 'Peeter',
+ 'Priit', 'Raivo', 'Rein', 'Sander', 'Siim', 'Tarmo',
+ 'Tiit', 'Toomas', 'Tõnu', 'Urmas', 'Vello')
+
+ first_names_female_est = ('Aino', 'Anna', 'Anne', 'Anneli', 'Anu', 'Diana',
+ 'Ene', 'Eve', 'Kadri', 'Katrin', 'Kristi',
+ 'Kristiina', 'Kristina', 'Laura', 'Linda', 'Maie',
+ 'Malle', 'Mare', 'Maria', 'Marika', 'Merike',
+ 'Niina', 'Piret', 'Reet', 'Riina', 'Sirje',
+ 'Tiina', 'Tiiu', 'Triin', 'Ülle')
+
+ first_names_est = first_names_male_est + first_names_female_est
+
+ first_names_male_rus = ('Aleksander', 'Aleksandr', 'Aleksei', 'Alexander',
+ 'Andrei', 'Artur', 'Dmitri', 'Igor', 'Ivan',
+ 'Jevgeni', 'Juri', 'Maksim', 'Mihhail', 'Nikolai',
+ 'Oleg', 'Pavel', 'Roman', 'Sergei', 'Sergey',
+ 'Valeri', 'Viktor', 'Vladimir')
+
+ first_names_female_rus = ('Aleksandra', 'Anna', 'Diana', 'Elena', 'Galina',
+ 'Irina', 'Jekaterina', 'Jelena', 'Julia',
+ 'Kristina', 'Ljubov', 'Ljudmila', 'Maria',
+ 'Marina', 'Nadežda', 'Natalia', 'Natalja', 'Nina',
+ 'Olga', 'Svetlana', 'Tamara', 'Tatiana',
+ 'Tatjana', 'Valentina', 'Viktoria')
+
+ first_names_rus = first_names_male_rus + first_names_female_rus
+
+ first_names_male = set(first_names_male_est + first_names_male_rus)
+ first_names_female = set(first_names_female_est + first_names_female_rus)
+ first_names = first_names_male | first_names_female
+
+ # http://ekspress.delfi.ee/kuum/\
+ # top-500-eesti-koige-levinumad-perekonnanimed?id=27677149
+ last_names_est = ('Aas', 'Aasa', 'Aasmäe', 'Aavik', 'Abel', 'Adamson',
+ 'Ader', 'Alas', 'Allas', 'Allik', 'Anderson', 'Annus',
+ 'Anton', 'Arro', 'Aru', 'Arula', 'Aun', 'Aus', 'Eller',
+ 'Erik', 'Erm', 'Ernits', 'Gross', 'Hallik', 'Hansen',
+ 'Hanson', 'Hein', 'Heinsalu', 'Heinsoo', 'Holm', 'Hunt',
+                      'Härm', 'Ilves', 'Ivask', 'Jaakson', 'Jaanson', 'Jaanus',
+ 'Jakobson', 'Jalakas', 'Johanson', 'Juhanson', 'Juhkam',
+ 'Jänes', 'Järv', 'Järve', 'Jõe', 'Jõesaar', 'Jõgi',
+ 'Jürgens', 'Jürgenson', 'Jürisson', 'Kaasik', 'Kadak',
+ 'Kala', 'Kalamees', 'Kalda', 'Kaljula', 'Kaljurand',
+ 'Kaljuste', 'Kaljuvee', 'Kallas', 'Kallaste', 'Kalm',
+ 'Kalmus', 'Kangro', 'Kangur', 'Kapp', 'Karro', 'Karu',
+ 'Kasak', 'Kase', 'Kasemaa', 'Kasemets', 'Kask', 'Kass',
+ 'Kattai', 'Kaur', 'Kelder', 'Kesküla', 'Kiik', 'Kiil',
+ 'Kiis', 'Kiisk', 'Kikas', 'Kikkas', 'Kilk', 'Kink',
+ 'Kirs', 'Kirsipuu', 'Kirss', 'Kivi', 'Kivilo', 'Kivimäe',
+ 'Kivistik', 'Klaas', 'Klein', 'Koger', 'Kohv', 'Koit',
+ 'Koitla', 'Kokk', 'Kolk', 'Kont', 'Kool', 'Koort',
+ 'Koppel', 'Korol', 'Kotkas', 'Kotov', 'Koval', 'Kozlov',
+ 'Kriisa', 'Kroon', 'Krõlov', 'Kudrjavtsev', 'Kulikov',
+ 'Kuningas', 'Kurg', 'Kurm', 'Kurvits', 'Kutsar', 'Kuus',
+ 'Kuuse', 'Kuusik', 'Kuusk', 'Kärner', 'Käsper', 'Käär',
+ 'Käärik', 'Kõiv', 'Kütt', 'Laan', 'Laane', 'Laanemets',
+ 'Laas', 'Laht', 'Laine', 'Laks', 'Lang', 'Lass', 'Laur',
+ 'Lauri', 'Lehiste', 'Leht', 'Lehtla', 'Lehtmets', 'Leis',
+ 'Lember', 'Lepik', 'Lepp', 'Leppik', 'Liblik', 'Liiv',
+ 'Liiva', 'Liivak', 'Liivamägi', 'Lill', 'Lillemets',
+ 'Lind', 'Link', 'Lipp', 'Lokk', 'Lomp', 'Loorits', 'Luht',
+ 'Luik', 'Lukin', 'Lukk', 'Lumi', 'Lumiste', 'Luts',
+ 'Lätt', 'Lääne', 'Lääts', 'Lõhmus', 'Maasik', 'Madisson',
+ 'Maidla', 'Mandel', 'Maripuu', 'Mark', 'Markus', 'Martin',
+ 'Martinson', 'Meier', 'Meister', 'Melnik', 'Merila',
+ 'Mets', 'Michelson', 'Mikk', 'Miller', 'Mitt', 'Moor',
+ 'Muru', 'Must', 'Mäe', 'Mäeots', 'Mäesalu', 'Mägi',
+ 'Mänd', 'Mändla', 'Männik', 'Männiste', 'Mõttus',
+ 'Mölder', 'Mürk', 'Müür', 'Müürsepp', 'Niit', 'Nurk',
+ 'Nurm', 'Nuut', 'Nõmm', 'Nõmme', 'Nõmmik', 'Oja', 'Ojala',
+ 'Ojaste', 'Oks', 'Olesk', 'Oras', 'Orav', 'Org', 'Ots',
+ 'Ott', 'Paal', 'Paap', 'Paas', 'Paju', 'Pajula', 'Palm',
+ 'Palu', 'Parts', 'Pent', 'Peterson', 'Pettai', 'Pihelgas',
+ 'Pihlak', 'Piho', 'Piir', 'Piirsalu', 'Pikk', 'Ploom',
+ 'Poom', 'Post', 'Pruul', 'Pukk', 'Pulk', 'Puusepp',
+ 'Pärn', 'Pärna', 'Pärnpuu', 'Pärtel', 'Põder', 'Põdra',
+ 'Põld', 'Põldma', 'Põldmaa', 'Põllu', 'Püvi', 'Raadik',
+ 'Raag', 'Raamat', 'Raid', 'Raidma', 'Raja', 'Rand',
+ 'Randmaa', 'Randoja', 'Raud', 'Raudsepp', 'Rebane',
+ 'Reimann', 'Reinsalu', 'Remmel', 'Rohtla', 'Roos',
+ 'Roosileht', 'Roots', 'Rosenberg', 'Rosin', 'Ruus',
+ 'Rätsep', 'Rüütel', 'Saar', 'Saare', 'Saks', 'Salu',
+ 'Salumets', 'Salumäe', 'Sander', 'Sarap', 'Sarapuu',
+ 'Sarv', 'Saul', 'Schmidt', 'Sepp', 'Sibul', 'Siim',
+ 'Sikk', 'Sild', 'Sillaots', 'Sillaste', 'Silm', 'Simson',
+ 'Sirel', 'Sisask', 'Sokk', 'Soo', 'Soon', 'Soosaar',
+ 'Soosalu', 'Soots', 'Suits', 'Sulg', 'Susi', 'Sutt',
+ 'Suur', 'Suvi', 'Säde', 'Sööt', 'Taal', 'Tali', 'Talts',
+ 'Tamberg', 'Tamm', 'Tamme', 'Tammik', 'Teder', 'Teearu',
+ 'Teesalu', 'Teras', 'Tiik', 'Tiits', 'Tilk', 'Tomingas',
+ 'Tomson', 'Toom', 'Toome', 'Tooming', 'Toomsalu', 'Toots',
+ 'Trei', 'Treial', 'Treier', 'Truu', 'Tuisk', 'Tuul',
+ 'Tuulik', 'Täht', 'Tõnisson', 'Uibo', 'Unt', 'Urb', 'Uus',
+ 'Uustalu', 'Vaher', 'Vaht', 'Vahter', 'Vahtra', 'Vain',
+ 'Vaino', 'Valge', 'Valk', 'Vares', 'Varik', 'Veski',
+ 'Viik', 'Viira', 'Viks', 'Vill', 'Villemson', 'Visnapuu',
+ 'Vähi', 'Väli', 'Võsu', 'Õispuu', 'Õun', 'Õunapuu')
+
+ last_names_rus = ('Abramov', 'Afanasjev', 'Aleksandrov', 'Alekseev',
+ 'Andreev', 'Anissimov', 'Antonov', 'Baranov', 'Beljajev',
+ 'Belov', 'Bogdanov', 'Bondarenko', 'Borissov', 'Bõstrov',
+ 'Danilov', 'Davõdov', 'Denissov', 'Dmitriev', 'Drozdov',
+ 'Egorov', 'Fedorov', 'Fedotov', 'Filatov', 'Filippov',
+ 'Fjodorov', 'Fomin', 'Frolov', 'Gavrilov', 'Gerassimov',
+ 'Golubev', 'Gontšarov', 'Gorbunov', 'Grigoriev', 'Gromov',
+ 'Gusev', 'Ignatjev', 'Iljin', 'Ivanov', 'Jakovlev',
+ 'Jefimov', 'Jegorov', 'Jermakov', 'Jeršov', 'Kalinin',
+ 'Karpov', 'Karpov', 'Kazakov', 'Kirillov', 'Kisseljov',
+ 'Klimov', 'Kolesnik', 'Komarov', 'Kondratjev',
+ 'Konovalov', 'Konstantinov', 'Korol', 'Kostin', 'Kotov',
+ 'Koval', 'Kozlov', 'Kruglov', 'Krõlov', 'Kudrjavtsev',
+ 'Kulikov', 'Kuzmin', 'Kuznetsov', 'Lebedev', 'Loginov',
+ 'Lukin', 'Makarov', 'Maksimov', 'Malõšev', 'Maslov',
+ 'Matvejev', 'Medvedev', 'Melnik', 'Mihhailov', 'Miller',
+ 'Mironov', 'Moroz', 'Naumov', 'Nazarov', 'Nikiforov',
+ 'Nikitin', 'Nikolaev', 'Novikov', 'Orlov', 'Ossipov',
+ 'Panov', 'Pavlov', 'Petrov', 'Poljakov', 'Popov',
+ 'Romanov', 'Rosenberg', 'Rumjantsev', 'Safronov',
+ 'Saveljev', 'Semenov', 'Sergejev', 'Sidorov', 'Smirnov',
+ 'Sobolev', 'Sokolov', 'Solovjov', 'Sorokin', 'Stepanov',
+ 'Suvorov', 'Tarassov', 'Tihhomirov', 'Timofejev', 'Titov',
+ 'Trofimov', 'Tsvetkov', 'Vasiliev', 'Vinogradov',
+ 'Vlassov', 'Volkov', 'Vorobjov', 'Voronin', 'Zahharov',
+ 'Zaitsev', 'Zujev', 'Ševtšenko', 'Štšerbakov',
+ 'Štšerbakov', 'Žukov', 'Žuravljov')
+ last_names = set(last_names_est + last_names_rus)
+
+ @classmethod
+ def first_name_male_est(cls):
+ return cls.random_element(cls.first_names_male_est)
+
+ @classmethod
+ def first_name_female_est(cls):
+ return cls.random_element(cls.first_names_female_est)
+
+ @classmethod
+ def first_name_male_rus(cls):
+ return cls.random_element(cls.first_names_male_rus)
+
+ @classmethod
+ def first_name_female_rus(cls):
+ return cls.random_element(cls.first_names_female_rus)
+
+ @classmethod
+ def first_name_est(cls):
+ return cls.random_element(cls.first_names_est)
+
+ @classmethod
+ def first_name_rus(cls):
+ return cls.random_element(cls.first_names_rus)
+
+ @classmethod
+ def last_name_est(cls):
+ return cls.random_element(cls.last_names_est)
+
+ @classmethod
+ def last_name_rus(cls):
+ return cls.random_element(cls.last_names_rus)
diff --git a/faker/providers/ssn/et_EE/__init__.py b/faker/providers/ssn/et_EE/__init__.py
new file mode 100644
index 0000000000..f1181b5485
--- /dev/null
+++ b/faker/providers/ssn/et_EE/__init__.py
@@ -0,0 +1,67 @@
+# coding=utf-8
+
+from __future__ import unicode_literals
+from .. import Provider as SsnProvider
+from faker.generator import random
+import datetime
+import operator
+
+
+def checksum(digits):
+ """Calculate checksum of Estonian personal identity code.
+
+ Checksum is calculated with "Modulo 11" method using level I or II scale:
+ Level I scale: 1 2 3 4 5 6 7 8 9 1
+ Level II scale: 3 4 5 6 7 8 9 1 2 3
+
+ The digits of the personal code are multiplied by level I scale and summed;
+ if remainder of modulo 11 of the sum is less than 10, checksum is the
+ remainder.
+ If remainder is 10, then level II scale is used; checksum is remainder if
+ remainder < 10 or 0 if remainder is 10.
+
+ See also https://et.wikipedia.org/wiki/Isikukood
+ """
+ sum_mod11 = sum(map(operator.mul, digits, Provider.scale1)) % 11
+ if sum_mod11 < 10:
+ return sum_mod11
+ sum_mod11 = sum(map(operator.mul, digits, Provider.scale2)) % 11
+ return 0 if sum_mod11 == 10 else sum_mod11
+
+
+class Provider(SsnProvider):
+ min_age = 16 * 365
+ max_age = 90 * 365
+ scale1 = (1, 2, 3, 4, 5, 6, 7, 8, 9, 1)
+ scale2 = (3, 4, 5, 6, 7, 8, 9, 1, 2, 3)
+
+ @classmethod
+ def ssn(cls):
+ """
+ Returns 11 character Estonian personal identity code (isikukood, IK).
+
+ Age of person is between 16 and 90 years, based on local computer date.
+ This function assigns random sex to person.
+ An Estonian Personal identification code consists of 11 digits,
+ generally given without any whitespace or other delimiters.
+ The form is GYYMMDDSSSC, where G shows sex and century of birth (odd
+ number male, even number female, 1-2 19th century, 3-4 20th century,
+ 5-6 21st century), SSS is a serial number separating persons born on
+ the same date and C a checksum.
+
+ https://en.wikipedia.org/wiki/National_identification_number#Estonia
+ """
+ age = datetime.timedelta(days=random.randrange(Provider.min_age,
+ Provider.max_age))
+ birthday = datetime.date.today() - age
+ if birthday.year < 2000:
+ ik = random.choice(('3', '4'))
+ elif birthday.year < 2100:
+ ik = random.choice(('5', '6'))
+ else:
+ ik = random.choice(('7', '8'))
+
+ ik += "%02d%02d%02d" % ((birthday.year % 100), birthday.month,
+ birthday.day)
+ ik += str(random.randrange(0, 999)).zfill(3)
+ return ik + str(checksum([int(ch) for ch in ik]))
| diff --git a/tests/providers/ssn.py b/tests/providers/ssn.py
index 37369aac9f..fe559113d1 100644
--- a/tests/providers/ssn.py
+++ b/tests/providers/ssn.py
@@ -6,10 +6,28 @@
import re
from faker import Factory
+from faker.providers.ssn.et_EE import Provider as EtProvider, checksum as et_checksum
from faker.providers.ssn.hr_HR import Provider as HrProvider, checksum as hr_checksum
from faker.providers.ssn.pt_BR import Provider as PtProvider, checksum as pt_checksum
+class TestEtEE(unittest.TestCase):
+ """ Tests SSN in the et_EE locale """
+
+ def setUp(self):
+ self.factory = Factory.create('et_EE')
+
+ def test_ssn_checksum(self):
+ self.assertEqual(et_checksum([4, 4, 1, 1, 1, 3, 0, 4, 9, 2]), 3)
+ self.assertEqual(et_checksum([3, 6, 7, 0, 1, 1, 6, 6, 2, 7]), 8)
+ self.assertEqual(et_checksum([4, 7, 0, 0, 4, 2, 1, 5, 0, 1]), 2)
+ self.assertEqual(et_checksum([3, 9, 7, 0, 3, 0, 4, 3, 3, 6]), 0)
+
+ def test_ssn(self):
+ for i in range(100):
+ self.assertTrue(re.search(r'^\d{11}$', EtProvider.ssn()))
+
+
class TestHrHR(unittest.TestCase):
""" Tests SSN in the hr_HR locale """
| diff --git a/README.rst b/README.rst
index 32de05e492..ada799580f 100644
--- a/README.rst
+++ b/README.rst
@@ -136,6 +136,7 @@ Included localized providers:
- `en\_US <https://faker.readthedocs.io/en/master/locales/en_US.html>`__ - English (United States)
- `es\_ES <https://faker.readthedocs.io/en/master/locales/es_ES.html>`__ - Spanish (Spain)
- `es\_MX <https://faker.readthedocs.io/en/master/locales/es_MX.html>`__ - Spanish (Mexico)
+- `et\_EE <https://faker.readthedocs.io/en/master/locales/et_EE.html>`__ - Estonian
- `fa\_IR <https://faker.readthedocs.io/en/master/locales/fa_IR.html>`__ - Persian (Iran)
- `fi\_FI <https://faker.readthedocs.io/en/master/locales/fi_FI.html>`__ - Finnish
- `fr\_FR <https://faker.readthedocs.io/en/master/locales/fr_FR.html>`__ - French
diff --git a/docs/index.rst b/docs/index.rst
index b1d8c0976e..44cac1aeb8 100644
--- a/docs/index.rst
+++ b/docs/index.rst
@@ -141,6 +141,7 @@ Included localized providers:
- `en\_US <https://faker.readthedocs.io/en/master/locales/en_US.html>`__ - English (United States)
- `es\_ES <https://faker.readthedocs.io/en/master/locales/es_ES.html>`__ - Spanish (Spain)
- `es\_MX <https://faker.readthedocs.io/en/master/locales/es_MX.html>`__ - Spanish (Mexico)
+- `et\_EE <https://faker.readthedocs.io/en/master/locales/et_EE.html>`__ - Estonian
- `fa\_IR <https://faker.readthedocs.io/en/master/locales/fa_IR.html>`__ - Persian (Iran)
- `fi\_FI <https://faker.readthedocs.io/en/master/locales/fi_FI.html>`__ - Finnish
- `fr\_FR <https://faker.readthedocs.io/en/master/locales/fr_FR.html>`__ - French
| [
{
"components": [
{
"doc": "",
"lines": [
6,
190
],
"name": "Provider",
"signature": "class Provider(PersonProvider):",
"type": "class"
},
{
"doc": "",
"lines": [
161,
162
],
... | [
"tests/providers/ssn.py::TestEtEE::test_ssn",
"tests/providers/ssn.py::TestEtEE::test_ssn_checksum",
"tests/providers/ssn.py::TestHrHR::test_ssn",
"tests/providers/ssn.py::TestHrHR::test_ssn_checksum",
"tests/providers/ssn.py::TestPtBR::test_pt_BR_cpf",
"tests/providers/ssn.py::TestPtBR::test_pt_BR_ssn",
... | [] | This is a feature request which requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
Add et_EE (Estonian) provider: names and ssn
Created the provider for Estonian language (et_EE).
Providers implemented for names and ssn.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in faker/providers/person/et_EE/__init__.py]
(definition of Provider:)
class Provider(PersonProvider):
(definition of Provider.first_name_male_est:)
def first_name_male_est(cls):
(definition of Provider.first_name_female_est:)
def first_name_female_est(cls):
(definition of Provider.first_name_male_rus:)
def first_name_male_rus(cls):
(definition of Provider.first_name_female_rus:)
def first_name_female_rus(cls):
(definition of Provider.first_name_est:)
def first_name_est(cls):
(definition of Provider.first_name_rus:)
def first_name_rus(cls):
(definition of Provider.last_name_est:)
def last_name_est(cls):
(definition of Provider.last_name_rus:)
def last_name_rus(cls):
[end of new definitions in faker/providers/person/et_EE/__init__.py]
[start of new definitions in faker/providers/ssn/et_EE/__init__.py]
(definition of checksum:)
def checksum(digits):
"""Calculate checksum of Estonian personal identity code.
Checksum is calculated with "Modulo 11" method using level I or II scale:
Level I scale: 1 2 3 4 5 6 7 8 9 1
Level II scale: 3 4 5 6 7 8 9 1 2 3
The digits of the personal code are multiplied by level I scale and summed;
if remainder of modulo 11 of the sum is less than 10, checksum is the
remainder.
If remainder is 10, then level II scale is used; checksum is remainder if
remainder < 10 or 0 if remainder is 10.
See also https://et.wikipedia.org/wiki/Isikukood"""
(definition of Provider:)
class Provider(SsnProvider):
(definition of Provider.ssn:)
def ssn(cls):
"""Returns 11 character Estonian personal identity code (isikukood, IK).
Age of person is between 16 and 90 years, based on local computer date.
This function assigns random sex to person.
An Estonian Personal identification code consists of 11 digits,
generally given without any whitespace or other delimiters.
The form is GYYMMDDSSSC, where G shows sex and century of birth (odd
number male, even number female, 1-2 19th century, 3-4 20th century,
5-6 21st century), SSS is a serial number separating persons born on
the same date and C a checksum.
https://en.wikipedia.org/wiki/National_identification_number#Estonia"""
[end of new definitions in faker/providers/ssn/et_EE/__init__.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 6edfdbf6ae90b0153309e3bf066aa3b2d16494a7 | |
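The Modulo 11 scheme and the GYYMMDDSSSC layout described in the definitions above can be sketched as standalone functions. This is an illustrative sketch, not the faker provider API: the names `ik_checksum` and `build_ik` are invented here, and `build_ik` only covers 20th- and 21st-century birthdays.

```python
import datetime

# Illustrative sketch of the Estonian IK rules described above
# (not the faker API; ik_checksum and build_ik are invented names).
SCALE1 = (1, 2, 3, 4, 5, 6, 7, 8, 9, 1)
SCALE2 = (3, 4, 5, 6, 7, 8, 9, 1, 2, 3)


def ik_checksum(digits):
    """Return the "Modulo 11" check digit for the first 10 IK digits."""
    remainder = sum(d * s for d, s in zip(digits, SCALE1)) % 11
    if remainder < 10:
        return remainder
    # Level I remainder was 10: retry with the level II scale;
    # a level II remainder of 10 maps to 0.
    remainder = sum(d * s for d, s in zip(digits, SCALE2)) % 11
    return 0 if remainder == 10 else remainder


def build_ik(sex, birthday, serial):
    """Assemble GYYMMDDSSSC for a 20th- or 21st-century birthday."""
    if birthday.year < 2000:
        century = '3' if sex == 'male' else '4'
    else:
        century = '5' if sex == 'male' else '6'
    body = '%s%02d%02d%02d%03d' % (
        century, birthday.year % 100, birthday.month, birthday.day, serial)
    return body + str(ik_checksum([int(ch) for ch in body]))
```

For instance, `build_ik('female', datetime.date(1988, 3, 4), 433)` yields `'48803044336'`, whose final digit equals `ik_checksum` over the first ten.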
tornadoweb__tornado-2062 | 2,062 | tornadoweb/tornado | null | c2c0f383f491215eaf8129c8ebc6625c441761bf | 2017-05-27T23:18:17Z | diff --git a/tornado/ioloop.py b/tornado/ioloop.py
index 5997cef667..f4d4642077 100644
--- a/tornado/ioloop.py
+++ b/tornado/ioloop.py
@@ -249,15 +249,7 @@ def configurable_base(cls):
@classmethod
def configurable_default(cls):
- if hasattr(select, "epoll"):
- from tornado.platform.epoll import EPollIOLoop
- return EPollIOLoop
- if hasattr(select, "kqueue"):
- # Python 2.6+ on BSD or Mac
- from tornado.platform.kqueue import KQueueIOLoop
- return KQueueIOLoop
- from tornado.platform.select import SelectIOLoop
- return SelectIOLoop
+ return PollIOLoop
def initialize(self, make_current=None):
if make_current is None:
@@ -722,6 +714,22 @@ def initialize(self, impl, time_func=None, **kwargs):
lambda fd, events: self._waker.consume(),
self.READ)
+ @classmethod
+ def configurable_base(cls):
+ return PollIOLoop
+
+ @classmethod
+ def configurable_default(cls):
+ if hasattr(select, "epoll"):
+ from tornado.platform.epoll import EPollIOLoop
+ return EPollIOLoop
+ if hasattr(select, "kqueue"):
+ # Python 2.6+ on BSD or Mac
+ from tornado.platform.kqueue import KQueueIOLoop
+ return KQueueIOLoop
+ from tornado.platform.select import SelectIOLoop
+ return SelectIOLoop
+
def close(self, all_fds=False):
self._closing = True
self.remove_handler(self._waker.fileno())
diff --git a/tornado/platform/twisted.py b/tornado/platform/twisted.py
index 79608e5fb9..7c337e8a59 100644
--- a/tornado/platform/twisted.py
+++ b/tornado/platform/twisted.py
@@ -382,6 +382,8 @@ def connectionLost(self, reason):
self.handler(self.fileobj, tornado.ioloop.IOLoop.ERROR)
self.lost = True
+ writeConnectionLost = readConnectionLost = connectionLost
+
def logPrefix(self):
return ''
diff --git a/tornado/util.py b/tornado/util.py
index 981b94c8ea..bfd80beb6f 100644
--- a/tornado/util.py
+++ b/tornado/util.py
@@ -286,6 +286,9 @@ def __new__(cls, *args, **kwargs):
else:
impl = cls
init_kwargs.update(kwargs)
+ if impl.configurable_base() is not base:
+ # The impl class is itself configurable, so recurse.
+ return impl(*args, **init_kwargs)
instance = super(Configurable, cls).__new__(impl)
# initialize vs __init__ chosen for compatibility with AsyncHTTPClient
# singleton magic. If we get rid of that we can switch to __init__
@@ -343,7 +346,10 @@ def configured_class(cls):
# type: () -> type
"""Returns the currently configured class."""
base = cls.configurable_base()
- if cls.__impl_class is None:
+ # Manually mangle the private name to see whether this base
+ # has been configured (and not another base higher in the
+ # hierarchy).
+ if base.__dict__.get('_Configurable__impl_class') is None:
base.__impl_class = cls.configurable_default()
return base.__impl_class
| diff --git a/tornado/test/twisted_test.py b/tornado/test/twisted_test.py
index 10afebb7bc..4b88eca862 100644
--- a/tornado/test/twisted_test.py
+++ b/tornado/test/twisted_test.py
@@ -32,9 +32,8 @@
from tornado import gen
from tornado.httpclient import AsyncHTTPClient
from tornado.httpserver import HTTPServer
-from tornado.ioloop import IOLoop
+from tornado.ioloop import IOLoop, PollIOLoop
from tornado.platform.auto import set_close_exec
-from tornado.platform.select import SelectIOLoop
from tornado.testing import bind_unused_port
from tornado.test.util import unittest
from tornado.util import import_object, PY3
@@ -690,7 +689,7 @@ def unbuildReactor(self, reactor):
if have_twisted:
class LayeredTwistedIOLoop(TwistedIOLoop):
- """Layers a TwistedIOLoop on top of a TornadoReactor on a SelectIOLoop.
+ """Layers a TwistedIOLoop on top of a TornadoReactor on a PollIOLoop.
This is of course silly, but is useful for testing purposes to make
sure we're implementing both sides of the various interfaces
@@ -698,10 +697,7 @@ class LayeredTwistedIOLoop(TwistedIOLoop):
of the whole stack.
"""
def initialize(self, **kwargs):
- # When configured to use LayeredTwistedIOLoop we can't easily
- # get the next-best IOLoop implementation, so use the lowest common
- # denominator.
- self.real_io_loop = SelectIOLoop(make_current=False) # type: ignore
+ self.real_io_loop = PollIOLoop(make_current=False) # type: ignore
reactor = self.real_io_loop.run_sync(gen.coroutine(TornadoReactor))
super(LayeredTwistedIOLoop, self).initialize(reactor=reactor, **kwargs)
self.add_callback(self.make_current)
diff --git a/tornado/test/util_test.py b/tornado/test/util_test.py
index 459cb9c327..48924d44be 100644
--- a/tornado/test/util_test.py
+++ b/tornado/test/util_test.py
@@ -58,12 +58,35 @@ def initialize(self, pos_arg=None, b=None):
self.pos_arg = pos_arg
+class TestConfig3(TestConfigurable):
+ # TestConfig3 is a configuration option that is itself configurable.
+ @classmethod
+ def configurable_base(cls):
+ return TestConfig3
+
+ @classmethod
+ def configurable_default(cls):
+ return TestConfig3A
+
+
+class TestConfig3A(TestConfig3):
+ def initialize(self, a=None):
+ self.a = a
+
+
+class TestConfig3B(TestConfig3):
+ def initialize(self, b=None):
+ self.b = b
+
+
class ConfigurableTest(unittest.TestCase):
def setUp(self):
self.saved = TestConfigurable._save_configuration()
+ self.saved3 = TestConfig3._save_configuration()
def tearDown(self):
TestConfigurable._restore_configuration(self.saved)
+ TestConfig3._restore_configuration(self.saved3)
def checkSubclasses(self):
# no matter how the class is configured, it should always be
@@ -131,6 +154,39 @@ def test_config_class_args(self):
obj = TestConfig2()
self.assertIs(obj.b, None)
+ def test_config_multi_level(self):
+ TestConfigurable.configure(TestConfig3, a=1)
+ obj = TestConfigurable()
+ self.assertIsInstance(obj, TestConfig3A)
+ self.assertEqual(obj.a, 1)
+
+ TestConfigurable.configure(TestConfig3)
+ TestConfig3.configure(TestConfig3B, b=2)
+ obj = TestConfigurable()
+ self.assertIsInstance(obj, TestConfig3B)
+ self.assertEqual(obj.b, 2)
+
+ def test_config_inner_level(self):
+ # The inner level can be used even when the outer level
+ # doesn't point to it.
+ obj = TestConfig3()
+ self.assertIsInstance(obj, TestConfig3A)
+
+ TestConfig3.configure(TestConfig3B)
+ obj = TestConfig3()
+ self.assertIsInstance(obj, TestConfig3B)
+
+ # Configuring the base doesn't configure the inner.
+ obj = TestConfigurable()
+ self.assertIsInstance(obj, TestConfig1)
+ TestConfigurable.configure(TestConfig2)
+
+ obj = TestConfigurable()
+ self.assertIsInstance(obj, TestConfig2)
+
+ obj = TestConfig3()
+ self.assertIsInstance(obj, TestConfig3B)
+
class UnicodeLiteralTest(unittest.TestCase):
def test_unicode_escapes(self):
| [
{
"components": [
{
"doc": "",
"lines": [
718,
719
],
"name": "PollIOLoop.configurable_base",
"signature": "def configurable_base(cls):",
"type": "function"
},
{
"doc": "",
"lines": [
722,
7... | [
"tornado/test/util_test.py::ConfigurableTest::test_config_multi_level"
] | [
"tornado/test/util_test.py::RaiseExcInfoTest::test_two_arg_exception",
"tornado/test/util_test.py::ConfigurableTest::test_config_args",
"tornado/test/util_test.py::ConfigurableTest::test_config_class",
"tornado/test/util_test.py::ConfigurableTest::test_config_class_args",
"tornado/test/util_test.py::Configu... | This is a feature request which requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
ioloop: Make PollIOLoop separately configurable
This makes it possible to construct a PollIOLoop even when the default
IOLoop is configured to something else.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in tornado/ioloop.py]
(definition of PollIOLoop.configurable_base:)
def configurable_base(cls):
(definition of PollIOLoop.configurable_default:)
def configurable_default(cls):
[end of new definitions in tornado/ioloop.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | b5dad636aaba94f86a3c00ca6ec49c79ff4313b2 | ||
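The util.py hunk above is the heart of this change: `__new__` recurses when the resolved implementation is itself a configurable base, and `configured_class` consults only that base's own `__dict__`. The same two-level pattern can be sketched in a minimal, self-contained form (the names `Outer`/`Inner` are invented; this is not the tornado source):

```python
class Configurable(object):
    """Minimal sketch of the two-level Configurable pattern above."""

    def __new__(cls, *args, **kwargs):
        base = cls.configurable_base()
        impl = base.configured_class() if cls is base else cls
        if impl.configurable_base() is not base:
            # The impl class is itself configurable, so recurse.
            return impl(*args, **kwargs)
        return super(Configurable, cls).__new__(impl)

    @classmethod
    def configured_class(cls):
        base = cls.configurable_base()
        # Consult only this base's own __dict__, so an inner base's
        # configuration is independent of the outer base's.
        return base.__dict__.get('_impl') or cls.configurable_default()

    @classmethod
    def configure(cls, impl):
        cls.configurable_base()._impl = impl


class Outer(Configurable):
    @classmethod
    def configurable_base(cls):
        return Outer

    @classmethod
    def configurable_default(cls):
        return Inner  # resolved at call time


class Inner(Outer):
    # An Outer option that is itself a configurable base, playing the
    # role PollIOLoop plays relative to IOLoop in the patch above.
    @classmethod
    def configurable_base(cls):
        return Inner

    @classmethod
    def configurable_default(cls):
        return InnerA


class InnerA(Inner):
    pass


class InnerB(Inner):
    pass


assert isinstance(Outer(), InnerA)  # Outer -> Inner -> default InnerA
Inner.configure(InnerB)             # configure the inner level only
assert isinstance(Outer(), InnerB)
```

Constructing `Outer()` resolves to `Inner`, notices `Inner` declares a different base, and recurses into `Inner`'s own configuration — mirroring how `IOLoop()` can now resolve to `PollIOLoop` and then to an epoll/kqueue/select implementation.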
falconry__falcon-1067 | 1,067 | falconry/falcon | null | fe1dfd0fad0685296db9a2dea8ccd380316bde64 | 2017-05-26T20:27:19Z | diff --git a/docs/api/api.rst b/docs/api/api.rst
index c27887205..0d2e4c25a 100644
--- a/docs/api/api.rst
+++ b/docs/api/api.rst
@@ -21,3 +21,5 @@ standard-compliant WSGI server.
.. autoclass:: falcon.ResponseOptions
:members:
+.. autoclass:: falcon.routing.CompiledRouterOptions
+ :noindex:
diff --git a/docs/api/routing.rst b/docs/api/routing.rst
index c01cb2ea8..57096fa1b 100644
--- a/docs/api/routing.rst
+++ b/docs/api/routing.rst
@@ -63,4 +63,5 @@ A custom routing engine may be specified when instantiating
api = API(router=fancy)
.. automodule:: falcon.routing
- :members: create_http_method_map, compile_uri_template, CompiledRouter
+ :members: create_http_method_map, compile_uri_template,
+ CompiledRouter, CompiledRouterOptions
diff --git a/falcon/api.py b/falcon/api.py
index 837e0c73c..7a589916a 100644
--- a/falcon/api.py
+++ b/falcon/api.py
@@ -113,6 +113,11 @@ def process_response(self, req, resp, resource, req_succeeded)
requests. See also: :py:class:`~.RequestOptions`
resp_options: A set of behavioral options related to outgoing
responses. See also: :py:class:`~.ResponseOptions`
+ router_options: Configuration options for the router. If a
+ custom router is in use, and it does not expose any
+ configurable options, referencing this attribute will raise
+ an instance of ``AttributeError``. See also:
+ :py:class:`falcon.routing.CompiledRouterOptions`.
"""
# PERF(kgriffs): Reference via self since that is faster than
@@ -282,6 +287,10 @@ def __call__(self, env, start_response): # noqa: C901
start_response(resp_status, headers)
return body
+ @property
+ def router_options(self):
+ return self._router.options
+
def add_route(self, uri_template, resource, *args, **kwargs):
"""Associates a templatized URI path with a resource.
diff --git a/falcon/routing/__init__.py b/falcon/routing/__init__.py
index 228ce7684..24e442fc9 100644
--- a/falcon/routing/__init__.py
+++ b/falcon/routing/__init__.py
@@ -19,7 +19,7 @@
routers.
"""
-from falcon.routing.compiled import CompiledRouter
+from falcon.routing.compiled import CompiledRouter, CompiledRouterOptions # NOQA
from falcon.routing.util import create_http_method_map # NOQA
from falcon.routing.util import compile_uri_template # NOQA
diff --git a/falcon/routing/compiled.py b/falcon/routing/compiled.py
index 2c6a53fbe..f11c057ae 100644
--- a/falcon/routing/compiled.py
+++ b/falcon/routing/compiled.py
@@ -16,10 +16,18 @@
import keyword
import re
+import textwrap
+
+from six.moves import UserDict
+
+from falcon.routing import converters
-_FIELD_PATTERN = re.compile('{([^}]*)}')
_TAB_STR = ' ' * 4
+_FIELD_PATTERN = re.compile(
+ '{((?P<fname>[^}:]*)((?P<cname_sep>:(?P<cname>[^}\(]*))(\((?P<argstr>.*)\))?)?)}'
+)
+_IDENTIFIER_PATTERN = re.compile('[A-Za-z_][A-Za-z0-9_]*$')
class CompiledRouter(object):
@@ -37,19 +45,38 @@ class CompiledRouter(object):
__slots__ = (
'_ast',
+ '_converter_map',
+ '_converters',
'_find',
'_finder_src',
+ '_options',
'_patterns',
'_return_values',
'_roots',
)
def __init__(self):
- self._roots = []
- self._find = self._compile()
+ self._ast = None
+ self._converters = None
self._finder_src = None
+
+ self._options = CompiledRouterOptions()
+
+ # PERF(kgriffs): This is usually an anti-pattern, but we do it
+ # here to reduce lookup time.
+ self._converter_map = self._options.converters.data
+
self._patterns = None
self._return_values = None
+ self._roots = []
+
+ # NOTE(kgriffs): Call _compile() last since it depends on
+ # the variables above.
+ self._find = self._compile()
+
+ @property
+ def options(self):
+ return self._options
@property
def finder_src(self):
@@ -66,34 +93,17 @@ def add_route(self, uri_template, method_map, resource):
the URI template.
"""
- if re.search('\s', uri_template):
+ # NOTE(kgriffs): Fields may have whitespace in them, so sub
+ # those before checking the rest of the URI template.
+ if re.search('\s', _FIELD_PATTERN.sub('{FIELD}', uri_template)):
raise ValueError('URI templates may not include whitespace.')
- # NOTE(kgriffs): Ensure fields are valid Python identifiers,
- # since they will be passed as kwargs to responders. Also
- # ensure there are no duplicate names, since that causes the
- # following problems:
- #
- # 1. For simple nodes, values from deeper nodes overwrite
- # values from more shallow nodes.
- # 2. For complex nodes, re.compile() raises a nasty error
- #
- fields = _FIELD_PATTERN.findall(uri_template)
- used_names = set()
- for name in fields:
- is_identifier = re.match('[A-Za-z_][A-Za-z0-9_]*$', name)
- if not is_identifier or name in keyword.kwlist:
- raise ValueError('Field names must be valid identifiers '
- "('{}' is not valid).".format(name))
-
- if name in used_names:
- raise ValueError('Field names may not be duplicated '
- "('{}' was used more than once)".format(name))
-
- used_names.add(name)
-
path = uri_template.strip('/').split('/')
+ used_names = set()
+ for segment in path:
+ self._validate_template_segment(segment, used_names)
+
def insert(nodes, path_index=0):
for node in nodes:
segment = path[path_index]
@@ -110,16 +120,13 @@ def insert(nodes, path_index=0):
return
if node.conflicts_with(segment):
- msg = (
- 'The URI template for this route conflicts with another'
- "route's template. This is usually caused by using "
- 'different field names at the same level in the path. '
- 'For example, given the route paths '
- "'/parents/{id}' and '/parents/{parent_id}/children', "
- 'the conflict can be resolved by renaming one of the '
- 'fields to match the other, i.e.: '
- "'/parents/{parent_id}' and '/parents/{parent_id}/children'."
- )
+ msg = textwrap.dedent("""
+ The URI template for this route is inconsistent or conflicts with another
+ route's template. This is usually caused by configuring a field converter
+ differently for the same field in two different routes, or by using
+ different field names at the same level in the path (e.g.,
+ '/parents/{id}' and '/parents/{parent_id}/children')
+ """).strip().replace('\n', ' ')
raise ValueError(msg)
# NOTE(richardolsson): If we got this far, the node doesn't already
@@ -157,7 +164,8 @@ def find(self, uri, req=None):
path = uri.lstrip('/').split('/')
params = {}
- node = self._find(path, self._return_values, self._patterns, params)
+ node = self._find(path, self._return_values, self._patterns,
+ self._converters, params)
if node is not None:
return node.resource, node.method_map, params, node.uri_template
@@ -168,6 +176,48 @@ def find(self, uri, req=None):
# Private
# -----------------------------------------------------------------
+ def _validate_template_segment(self, segment, used_names):
+ """Validates a single path segment of a URI template.
+
+ 1. Ensure field names are valid Python identifiers, since they
+ will be passed as kwargs to responders.
+ 2. Check that there are no duplicate names, since that causes
+ (at least) the following problems:
+
+ a. For simple nodes, values from deeper nodes overwrite
+ values from more shallow nodes.
+ b. For complex nodes, re.compile() raises a nasty error
+ 3. Check that when the converter syntax is used, the named
+ converter exists.
+ """
+
+ for field in _FIELD_PATTERN.finditer(segment):
+ name = field.group('fname')
+
+ is_identifier = _IDENTIFIER_PATTERN.match(name)
+ if not is_identifier or name in keyword.kwlist:
+ msg_template = ('Field names must be valid identifiers '
+ '("{0}" is not valid)')
+ msg = msg_template.format(name)
+ raise ValueError(msg)
+
+ if name in used_names:
+ msg_template = ('Field names may not be duplicated '
+ '("{0}" was used more than once)')
+ msg = msg_template.format(name)
+ raise ValueError(msg)
+
+ used_names.add(name)
+
+ if field.group('cname_sep') == ':':
+ msg = 'Missing converter for field "{0}"'.format(name)
+ raise ValueError(msg)
+
+ name = field.group('cname')
+ if name and name not in self._converter_map:
+ msg = 'Unknown converter: "{0}"'.format(name)
+ raise ValueError(msg)
+
def _generate_ast(self, nodes, parent, return_values, patterns, level=0, fast_return=True):
"""Generates a coarse AST for the router."""
@@ -176,7 +226,7 @@ def _generate_ast(self, nodes, parent, return_values, patterns, level=0, fast_re
return
outer_parent = _CxIfPathLength('>', level)
- parent.append(outer_parent)
+ parent.append_child(outer_parent)
parent = outer_parent
found_simple = False
@@ -212,13 +262,45 @@ def _generate_ast(self, nodes, parent, return_values, patterns, level=0, fast_re
construct = _CxIfPathSegmentPattern(level, pattern_idx,
node.var_pattern.pattern)
- parent.append(construct)
+ parent.append_child(construct)
parent = construct
+ if node.var_converter_map:
+ parent.append_child(_CxPrefetchGroupsFromPatternMatch())
+ parent = self._generate_conversion_ast(parent, node)
+
+ else:
+ parent.append_child(_CxSetParamsFromPatternMatch())
+
else:
# NOTE(kgriffs): Simple nodes just capture the entire path
# segment as the value for the param.
- parent.append(_CxSetParam(node.var_name, level))
+
+ if node.var_converter_map:
+ assert len(node.var_converter_map) == 1
+
+ parent.append_child(_CxSetFragmentFromPath(level))
+
+ field_name = node.var_name
+ __, converter_name, converter_argstr = node.var_converter_map[0]
+ converter_class = self._converter_map[converter_name]
+
+ converter_obj = self._instantiate_converter(
+ converter_class,
+ converter_argstr
+ )
+ converter_idx = len(self._converters)
+ self._converters.append(converter_obj)
+
+ construct = _CxIfConverterField(
+ field_name,
+ converter_idx,
+ )
+
+ parent.append_child(construct)
+ parent = construct
+ else:
+ parent.append_child(_CxSetParam(node.var_name, level))
# NOTE(kgriffs): We don't allow multiple simple var nodes
# to exist at the same level, e.g.:
@@ -233,7 +315,7 @@ def _generate_ast(self, nodes, parent, return_values, patterns, level=0, fast_re
else:
# NOTE(kgriffs): Not a param, so must match exactly
construct = _CxIfPathSegmentLiteral(level, node.raw_segment)
- parent.append(construct)
+ parent.append_child(construct)
parent = construct
if node.resource is not None:
@@ -253,22 +335,52 @@ def _generate_ast(self, nodes, parent, return_values, patterns, level=0, fast_re
if node.resource is None:
if fast_return:
- parent.append(_CxReturnNone())
+ parent.append_child(_CxReturnNone())
else:
# NOTE(kgriffs): Make sure that we have consumed all of
# the segments for the requested route; otherwise we could
# mistakenly match "/foo/23/bar" against "/foo/{id}".
construct = _CxIfPathLength('==', level + 1)
- construct.append(_CxReturnValue(resource_idx))
- parent.append(construct)
+ construct.append_child(_CxReturnValue(resource_idx))
+ parent.append_child(construct)
if fast_return:
- parent.append(_CxReturnNone())
+ parent.append_child(_CxReturnNone())
parent = outer_parent
if not found_simple and fast_return:
- parent.append(_CxReturnNone())
+ parent.append_child(_CxReturnNone())
+
+ def _generate_conversion_ast(self, parent, node):
+ # NOTE(kgriffs): Unroll the converter loop into
+ # a series of nested "if" constructs.
+ for field_name, converter_name, converter_argstr in node.var_converter_map:
+ converter_class = self._converter_map[converter_name]
+
+ converter_obj = self._instantiate_converter(
+ converter_class,
+ converter_argstr
+ )
+ converter_idx = len(self._converters)
+ self._converters.append(converter_obj)
+
+ parent.append_child(_CxSetFragmentFromField(field_name))
+
+ construct = _CxIfConverterField(
+ field_name,
+ converter_idx,
+ )
+
+ parent.append_child(construct)
+ parent = construct
+
+ # NOTE(kgriffs): Add remaining fields that were not
+ # converted, if any.
+ if node.num_fields > len(node.var_converter_map):
+ parent.append_child(_CxSetParamsFromPatternMatchPrefetched())
+
+ return parent
def _compile(self):
"""Generates Python code for the entire routing tree.
@@ -278,12 +390,13 @@ def _compile(self):
"""
src_lines = [
- 'def find(path, return_values, patterns, params):',
+ 'def find(path, return_values, patterns, converters, params):',
_TAB_STR + 'path_len = len(path)',
]
self._return_values = []
self._patterns = []
+ self._converters = []
self._ast = _CxParent()
self._generate_ast(
@@ -307,6 +420,14 @@ def _compile(self):
return scope['find']
+ def _instantiate_converter(self, klass, argstr=None):
+ if argstr is None:
+ return klass()
+
+ # NOTE(kgriffs): Don't try this at home. ;)
+ src = '{0}({1})'.format(klass.__name__, argstr)
+ return eval(src, {klass.__name__: klass})
+
class CompiledRouterNode(object):
"""Represents a single URI segment in a URI."""
@@ -322,8 +443,13 @@ def __init__(self, raw_segment,
self.is_var = False
self.is_complex = False
+ self.num_fields = 0
+
+ # TODO(kgriffs): Rename these since the docs talk about "fields"
+ # or "field expressions", not "vars" or "variables".
self.var_name = None
self.var_pattern = None
+ self.var_converter_map = []
# NOTE(kgriffs): CompiledRouter.add_route validates field names,
# so here we can just assume they are OK and use the simple
@@ -334,14 +460,34 @@ def __init__(self, raw_segment,
self.is_var = False
else:
self.is_var = True
+ self.num_fields = len(matches)
+
+ for field in matches:
+ # NOTE(kgriffs): We already validated the field
+ # expression to disallow blank converter names, or names
+ # that don't match a known converter, so if a name is
+ # given, we can just go ahead and use it.
+ if field.group('cname'):
+ self.var_converter_map.append(
+ (
+ field.group('fname'),
+ field.group('cname'),
+ field.group('argstr'),
+ )
+ )
+
+ if matches[0].span() == (0, len(raw_segment)):
+ # NOTE(kgriffs): Single field, spans entire segment
+ assert len(matches) == 1
- if len(matches) == 1 and matches[0].span() == (0, len(raw_segment)):
- # NOTE(richardolsson): if there is a single variable and
- # it spans the entire segment, the segment is not
- # complex and the variable name is simply the string
- # contained within curly braces.
+ # TODO(kgriffs): It is not "complex" because it only
+ # contains a single field. Rename this variable to make
+ # it more descriptive.
self.is_complex = False
- self.var_name = raw_segment[1:-1]
+
+ field = matches[0]
+ self.var_name = field.group('fname')
+
else:
# NOTE(richardolsson): Complex segments need to be
# converted into regular expressions in order to match
@@ -364,12 +510,15 @@ def __init__(self, raw_segment,
# trick the parser into doing the right thing.
escaped_segment = re.sub(r'[\.\(\)\[\]\?\$\*\+\^\|]', r'\\\g<0>', raw_segment)
- pattern_text = _FIELD_PATTERN.sub(r'(?P<\1>.+)', escaped_segment)
+ pattern_text = _FIELD_PATTERN.sub(r'(?P<\2>.+)', escaped_segment)
pattern_text = '^' + pattern_text + '$'
self.is_complex = True
self.var_pattern = re.compile(pattern_text)
+ if self.is_complex:
+ assert self.is_var
+
def matches(self, segment):
"""Returns True if this node matches the supplied template segment."""
@@ -431,6 +580,111 @@ def conflicts_with(self, segment):
return False
+class ConverterDict(UserDict):
+ """A dict-like class for storing field converters."""
+
+ def update(self, other):
+ try:
+ # NOTE(kgriffs): If it is a mapping type, it should
+ # implement keys().
+ names = other.keys()
+ except AttributeError:
+ # NOTE(kgriffs): Not a mapping type, so assume it is an
+ # iterable of 2-item iterables. But we need to make it
+ # re-iterable if it is a generator, for when we pass
+ # it on to the parent's update().
+ other = list(other)
+ names = [n for n, __ in other]
+
+ for n in names:
+ self._validate(n)
+
+ UserDict.update(self, other)
+
+ def __setitem__(self, name, converter):
+ self._validate(name)
+ UserDict.__setitem__(self, name, converter)
+
+ def _validate(self, name):
+ if not _IDENTIFIER_PATTERN.match(name):
+ raise ValueError(
+ 'Invalid converter name. Names may not be blank, and may '
+                'only use ASCII letters, digits, and underscores. Names '
+ 'must begin with a letter or underscore.'
+ )
+
+
+class CompiledRouterOptions(object):
+ """Defines a set of configurable router options.
+
+ An instance of this class is exposed via :any:`API.router_options`
+ for configuring certain :py:class:`~.CompiledRouter` behaviors.
+
+ Attributes:
+ converters: Represents the collection of named
+ converters that may be referenced in URI template field
+ expressions. Adding additional converters is simply a
+ matter of mapping a name to a converter class::
+
+ api.router_options.converters['myconverter'] = MyConverter
+
+ Note:
+
+ Converter names may only contain ASCII letters, digits,
+ and underscores, and must start with either a letter or
+ an underscore.
+
+ A converter is any class that implements the following
+ method::
+
+ def convert(self, fragment):
+ # TODO: Convert the matched URI path fragment and
+ # return the result, or None to reject the fragment
+ # if it is not in the expected format or otherwise
+ # can not be converted.
+ pass
+
+ Converters are instantiated with the argument specification
+ given in the field expression. These specifications follow
+ the standard Python syntax for passing arguments. For
+ example, the comments in the following code show how a
+ converter would be instantiated given different
+ argument specifications in the URI template::
+
+ # MyConverter()
+ api.add_route(
+ '/a/{some_field:myconverter}',
+ some_resource
+ )
+
+ # MyConverter(True)
+ api.add_route(
+ '/b/{some_field:myconverter(True)}',
+ some_resource
+ )
+
+ # MyConverter(True, some_kwarg=10)
+ api.add_route(
+ '/c/{some_field:myconverter(True, some_kwarg=10)}',
+ some_resource
+ )
+
+ Warning:
+
+ Converter instances are shared between requests.
+ Therefore, in threaded deployments, care must be taken
+ to implement custom converters in a thread-safe
+ manner.
+ """
+
+ __slots__ = ('converters',)
+
+ def __init__(self):
+ self.converters = ConverterDict(
+ (name, converter) for name, converter in converters.BUILTIN
+ )
+
+
# --------------------------------------------------------------------
# AST Constructs
#
@@ -448,7 +702,7 @@ class _CxParent(object):
def __init__(self):
self._children = []
- def append(self, construct):
+ def append_child(self, construct):
self._children.append(construct)
def src(self, indentation):
@@ -503,27 +757,86 @@ def __init__(self, segment_idx, pattern_idx, pattern_text):
self._pattern_text = pattern_text
def src(self, indentation):
- lines = []
-
- lines.append(
+ lines = [
'{0}match = patterns[{1}].match(path[{2}]) # {3}'.format(
_TAB_STR * indentation,
self._pattern_idx,
self._segment_idx,
self._pattern_text,
- )
- )
+ ),
+ '{0}if match is not None:'.format(_TAB_STR * indentation),
+ self._children_src(indentation + 1),
+ ]
+
+ return '\n'.join(lines)
- lines.append('{0}if match is not None:'.format(_TAB_STR * indentation))
- lines.append('{0}params.update(match.groupdict())'.format(
- _TAB_STR * (indentation + 1)
- ))
- lines.append(self._children_src(indentation + 1))
+class _CxIfConverterField(_CxParent):
+ def __init__(self, field_name, converter_idx):
+ super(_CxIfConverterField, self).__init__()
+ self._field_name = field_name
+ self._converter_idx = converter_idx
+
+ def src(self, indentation):
+ lines = [
+ '{0}field_value = converters[{1}].convert(fragment)'.format(
+ _TAB_STR * indentation,
+ self._converter_idx,
+ ),
+ '{0}if field_value is not None:'.format(_TAB_STR * indentation),
+ "{0}params['{1}'] = field_value".format(
+ _TAB_STR * (indentation + 1),
+ self._field_name,
+ ),
+ self._children_src(indentation + 1),
+ ]
return '\n'.join(lines)
+class _CxSetFragmentFromField(object):
+ def __init__(self, field_name):
+ self._field_name = field_name
+
+ def src(self, indentation):
+ return "{0}fragment = groups.pop('{1}')".format(
+ _TAB_STR * indentation,
+ self._field_name,
+ )
+
+
+class _CxSetFragmentFromPath(object):
+ def __init__(self, segment_idx):
+ self._segment_idx = segment_idx
+
+ def src(self, indentation):
+ return '{0}fragment = path[{1}]'.format(
+ _TAB_STR * indentation,
+ self._segment_idx,
+ )
+
+
+class _CxSetParamsFromPatternMatch(object):
+ def src(self, indentation):
+ return '{0}params.update(match.groupdict())'.format(
+ _TAB_STR * indentation
+ )
+
+
+class _CxSetParamsFromPatternMatchPrefetched(object):
+ def src(self, indentation):
+ return '{0}params.update(groups)'.format(
+ _TAB_STR * indentation
+ )
+
+
+class _CxPrefetchGroupsFromPatternMatch(object):
+ def src(self, indentation):
+ return '{0}groups = match.groupdict()'.format(
+ _TAB_STR * indentation
+ )
+
+
class _CxReturnNone(object):
def src(self, indentation):
return '{0}return None'.format(_TAB_STR * indentation)
diff --git a/falcon/routing/converters.py b/falcon/routing/converters.py
new file mode 100644
index 000000000..51c6b271c
--- /dev/null
+++ b/falcon/routing/converters.py
@@ -0,0 +1,64 @@
+# Copyright 2017 by Rackspace Hosting, Inc.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+class IntConverter(object):
+ """Converts a field value to an int.
+
+ Keyword Args:
+ num_digits (int): Require the value to have the given
+ number of digits.
+ min (int): Reject the value if it is less than this value.
+ max (int): Reject the value if it is greater than this value.
+ """
+
+ __slots__ = ('_num_digits', '_min', '_max')
+
+ def __init__(self, num_digits=None, min=None, max=None):
+ if num_digits is not None and num_digits < 1:
+ raise ValueError('num_digits must be at least 1')
+
+ self._num_digits = num_digits
+ self._min = min
+ self._max = max
+
+ def convert(self, fragment):
+ if self._num_digits is not None and len(fragment) != self._num_digits:
+ return None
+
+ # NOTE(kgriffs): int() will accept numbers with preceding or
+ # trailing whitespace, so we need to do our own check. Using
+ # strip() is faster than either a regex or a series of or'd
+ # membership checks via "in", esp. as the length of contiguous
+ # numbers in the fragment grows.
+ if fragment.strip() != fragment:
+ return None
+
+ try:
+ value = int(fragment)
+ except ValueError:
+ return None
+
+ if self._min is not None and value < self._min:
+ return None
+
+ if self._max is not None and value > self._max:
+ return None
+
+ return value
+
+
+BUILTIN = (
+ ('int', IntConverter),
+)
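For reference, the converter added by this patch can be exercised standalone. The class body below is copied from the new `converters.py` above; only the demo calls at the end are added here for illustration.

```python
# IntConverter as introduced in falcon/routing/converters.py, plus a demo.
class IntConverter(object):
    """Converts a field value to an int."""

    __slots__ = ('_num_digits', '_min', '_max')

    def __init__(self, num_digits=None, min=None, max=None):
        if num_digits is not None and num_digits < 1:
            raise ValueError('num_digits must be at least 1')

        self._num_digits = num_digits
        self._min = min
        self._max = max

    def convert(self, fragment):
        if self._num_digits is not None and len(fragment) != self._num_digits:
            return None

        # int() tolerates surrounding whitespace, so reject it explicitly.
        if fragment.strip() != fragment:
            return None

        try:
            value = int(fragment)
        except ValueError:
            return None

        if self._min is not None and value < self._min:
            return None

        if self._max is not None and value > self._max:
            return None

        return value


print(IntConverter().convert('007'))              # 7
print(IntConverter(num_digits=2).convert('007'))  # None (wrong digit count)
print(IntConverter(min=10).convert('7'))          # None (below min)
```

Returning ``None`` (rather than raising) is what signals the router to reject the fragment and fall through to other candidate routes.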
| diff --git a/tests/conftest.py b/tests/conftest.py
index b6b17f247..b7d67747a 100644
--- a/tests/conftest.py
+++ b/tests/conftest.py
@@ -2,7 +2,6 @@
import pytest
-
import falcon
diff --git a/tests/test_default_router.py b/tests/test_default_router.py
index e62ca7950..062e4e829 100644
--- a/tests/test_default_router.py
+++ b/tests/test_default_router.py
@@ -88,6 +88,29 @@ def router():
router.add_route('/item/{q}', {}, ResourceWithId(28))
+ # ----------------------------------------------------------------
+ # Routes with field converters
+ # ----------------------------------------------------------------
+
+ router.add_route(
+ '/cvt/teams/{id:int(min=7)}', {}, ResourceWithId(29))
+ router.add_route(
+ '/cvt/teams/{id:int(min=7)}/members', {}, ResourceWithId(30))
+ router.add_route(
+ '/cvt/teams/default', {}, ResourceWithId(31))
+ router.add_route(
+ '/cvt/teams/default/members/{id:int}-{tenure:int}', {}, ResourceWithId(32))
+
+ router.add_route(
+ '/cvt/repos/{org}/{repo}/compare/{usr0}:{branch0:int}...{usr1}:{branch1:int}/part',
+ {}, ResourceWithId(33))
+ router.add_route(
+ '/cvt/repos/{org}/{repo}/compare/{usr0}:{branch0:int}',
+ {}, ResourceWithId(34))
+ router.add_route(
+ '/cvt/repos/{org}/{repo}/compare/{usr0}:{branch0:int}/full',
+ {}, ResourceWithId(35))
+
return router
@@ -102,6 +125,19 @@ def on_get(self, req, resp):
resp.body = self.resource_id
+class SpamConverter(object):
+ def __init__(self, times, eggs=False):
+ self._times = times
+ self._eggs = eggs
+
+ def convert(self, fragment):
+ item = fragment
+ if self._eggs:
+ item += '&eggs'
+
+ return ', '.join(item for i in range(self._times))
+
+
# =====================================================================
# Regression tests for use cases reported by users
# =====================================================================
@@ -233,7 +269,7 @@ def test_root_path():
assert resource.resource_id == 42
expected_src = textwrap.dedent("""
- def find(path, return_values, patterns, params):
+ def find(path, return_values, patterns, converters, params):
path_len = len(path)
if path_len > 0:
if path[0] == '':
@@ -275,11 +311,12 @@ def test_match_entire_path(uri_template, path):
@pytest.mark.parametrize('uri_template', [
- '/teams/{collision}', # simple vs simple
+ '/teams/{conflict}', # simple vs simple
'/emojis/signs/{id_too}', # another simple vs simple
- '/repos/{org}/{repo}/compare/{complex}:{vs}...{complex2}:{collision}',
+ '/repos/{org}/{repo}/compare/{complex}:{vs}...{complex2}:{conflict}',
+ '/teams/{id:int}/settings', # converted vs. non-converted
])
-def test_collision(router, uri_template):
+def test_conflict(router, uri_template):
with pytest.raises(ValueError):
router.add_route(uri_template, {}, ResourceWithId(-1))
@@ -289,7 +326,7 @@ def test_collision(router, uri_template):
'/repos/{complex}.{vs}.{simple}',
'/repos/{org}/{repo}/compare/{complex}:{vs}...{complex2}/full',
])
-def test_non_collision(router, uri_template):
+def test_non_conflict(router, uri_template):
router.add_route(uri_template, {}, ResourceWithId(-1))
@@ -365,23 +402,23 @@ def test_literal_segment(router):
assert route is None
-@pytest.mark.parametrize('uri_template', [
+@pytest.mark.parametrize('path', [
'/teams',
'/emojis/signs',
'/gists',
'/gists/42',
])
-def test_dead_segment(router, uri_template):
- route = router.find(uri_template)
+def test_dead_segment(router, path):
+ route = router.find(path)
assert route is None
-@pytest.mark.parametrize('uri_template', [
+@pytest.mark.parametrize('path', [
'/repos/racker/falcon/compare/foo',
'/repos/racker/falcon/compare/foo/full',
])
-def test_malformed_pattern(router, uri_template):
- route = router.find(uri_template)
+def test_malformed_pattern(router, path):
+ route = router.find(path)
assert route is None
@@ -390,6 +427,58 @@ def test_literal(router):
assert resource.resource_id == 8
+@pytest.mark.parametrize('path,expected_params', [
+ ('/cvt/teams/007', {'id': 7}),
+ ('/cvt/teams/1234/members', {'id': 1234}),
+ ('/cvt/teams/default/members/700-5', {'id': 700, 'tenure': 5}),
+ (
+ '/cvt/repos/org/repo/compare/xkcd:353',
+ {'org': 'org', 'repo': 'repo', 'usr0': 'xkcd', 'branch0': 353},
+ ),
+ (
+ '/cvt/repos/org/repo/compare/gunmachan:1234...kumamon:5678/part',
+ {
+ 'org': 'org',
+ 'repo': 'repo',
+ 'usr0': 'gunmachan',
+ 'branch0': 1234,
+ 'usr1': 'kumamon',
+ 'branch1': 5678,
+ }
+ ),
+ (
+ '/cvt/repos/xkcd/353/compare/susan:0001/full',
+ {'org': 'xkcd', 'repo': '353', 'usr0': 'susan', 'branch0': 1},
+ )
+])
+def test_converters(router, path, expected_params):
+ __, __, params, __ = router.find(path)
+ assert params == expected_params
+
+
+@pytest.mark.parametrize('uri_template', [
+ '/foo/{bar:int(0)}',
+ '/foo/{bar:int(num_digits=0)}',
+ '/foo/{bar:int(-1)}/baz',
+ '/foo/{bar:int(num_digits=-1)}/baz',
+])
+def test_converters_with_invalid_options(router, uri_template):
+ # NOTE(kgriffs): Sanity-check that errors are properly bubbled up
+ # when calling add_route(). Additional checks can be found
+ # in test_uri_converters.py
+ with pytest.raises(ValueError):
+ router.add_route(uri_template, {}, ResourceWithId(1))
+
+
+@pytest.mark.parametrize('uri_template', [
+ '/foo/{bar:}',
+ '/foo/{bar:unknown}/baz',
+])
+def test_converters_malformed_specification(router, uri_template):
+ with pytest.raises(ValueError):
+ router.add_route(uri_template, {}, ResourceWithId(1))
+
+
def test_variable(router):
resource, __, params, __ = router.find('/teams/42')
assert resource.resource_id == 6
@@ -413,8 +502,10 @@ def test_single_character_field_name(router):
@pytest.mark.parametrize('path,expected_id', [
('/teams/default', 19),
('/teams/default/members', 7),
- ('/teams/foo', 6),
- ('/teams/foo/members', 7),
+ ('/cvt/teams/default', 31),
+ ('/cvt/teams/default/members/1234-10', 32),
+ ('/teams/1234', 6),
+ ('/teams/1234/members', 7),
('/gists/first', 20),
('/gists/first/raw', 18),
('/gists/first/pdf', 21),
@@ -446,6 +537,11 @@ def test_literal_vs_variable(router, path, expected_id):
'/teams/default/undefined',
'/teams/default/undefined/segments',
+ # Literal vs. variable (converters)
+ '/cvt/teams/default/members', # 'default' can't be converted to an int
+ '/cvt/teams/NaN',
+ '/cvt/teams/default/members/NaN',
+
# Literal vs variable (emojis)
'/emojis/signs',
'/emojis/signs/0/small',
@@ -512,3 +608,51 @@ def test_complex_alt(router, url_postfix, resource_id, expected_template):
'branch0': 'master',
})
assert uri_template == expected_template
+
+
+def test_options_converters_set(router):
+ router.options.converters['spam'] = SpamConverter
+
+ router.add_route('/{food:spam(3, eggs=True)}', {}, ResourceWithId(1))
+ resource, __, params, __ = router.find('/spam')
+
+ assert params == {'food': 'spam&eggs, spam&eggs, spam&eggs'}
+
+
+@pytest.mark.parametrize('converter_name', [
+ 'spam',
+ 'spam_2'
+])
+def test_options_converters_update(router, converter_name):
+ router.options.converters.update({
+ 'spam': SpamConverter,
+ 'spam_2': SpamConverter,
+ })
+
+ template = '/{food:' + converter_name + '(3, eggs=True)}'
+ router.add_route(template, {}, ResourceWithId(1))
+ resource, __, params, __ = router.find('/spam')
+
+ assert params == {'food': 'spam&eggs, spam&eggs, spam&eggs'}
+
+
+@pytest.mark.parametrize('name', [
+ 'has whitespace',
+ 'whitespace ',
+ ' whitespace ',
+ ' whitespace',
+ 'funky$character',
+ '42istheanswer',
+ 'with-hyphen',
+])
+def test_options_converters_invalid_name(router, name):
+ with pytest.raises(ValueError):
+ router.options.converters[name] = object
+
+
+def test_options_converters_invalid_name_on_update(router):
+ with pytest.raises(ValueError):
+ router.options.converters.update({
+ 'valid_name': SpamConverter,
+ '7eleven': SpamConverter,
+ })
diff --git a/tests/test_uri_converters.py b/tests/test_uri_converters.py
new file mode 100644
index 000000000..66a6f712d
--- /dev/null
+++ b/tests/test_uri_converters.py
@@ -0,0 +1,54 @@
+import string
+
+import pytest
+
+from falcon.routing import converters
+
+
+@pytest.mark.parametrize('segment,num_digits,min,max,expected', [
+ ('123', None, None, None, 123),
+ ('01', None, None, None, 1),
+ ('001', None, None, None, 1),
+ ('0', None, None, None, 0),
+ ('00', None, None, None, 00),
+
+ ('1', 1, None, None, 1),
+ ('12', 1, None, None, None),
+ ('12', 2, None, None, 12),
+
+ ('1', 1, 1, 1, 1),
+ ('1', 1, 1, None, 1),
+ ('1', 1, 1, 2, 1),
+ ('1', 1, 2, None, None),
+ ('1', 1, 2, 1, None),
+ ('2', 1, 1, 2, 2),
+ ('2', 1, 2, 2, 2),
+ ('3', 1, 1, 2, None),
+
+ ('12', 1, None, None, None),
+ ('12', 1, 1, 12, None),
+ ('12', 2, None, None, 12),
+ ('12', 2, 1, 12, 12),
+ ('12', 2, 12, 12, 12),
+ ('12', 2, 13, 12, None),
+ ('12', 2, 13, 13, None),
+])
+def test_int_filter(segment, num_digits, min, max, expected):
+ c = converters.IntConverter(num_digits, min, max)
+ assert c.convert(segment) == expected
+
+
+@pytest.mark.parametrize('segment', (
+ ['0x0F', 'something', '', ' '] +
+ ['123' + w for w in string.whitespace] +
+ [w + '123' for w in string.whitespace]
+))
+def test_int_filter_malformed(segment):
+ c = converters.IntConverter()
+ assert c.convert(segment) is None
+
+
+@pytest.mark.parametrize('num_digits', [0, -1, -10])
+def test_int_filter_invalid_config(num_digits):
+ with pytest.raises(ValueError):
+ converters.IntConverter(num_digits)
diff --git a/tests/test_uri_templates.py b/tests/test_uri_templates.py
index 484f296d3..f18c2297d 100644
--- a/tests/test_uri_templates.py
+++ b/tests/test_uri_templates.py
@@ -131,6 +131,54 @@ def test_single(client, resource, field_name):
assert resource.captured_kwargs[field_name] == '123'
+@pytest.mark.parametrize('uri_template,', [
+ '/{id:int}',
+ '/{id:int(3)}',
+ '/{id:int(min=123)}',
+ '/{id:int(min=123, max=123)}',
+])
+def test_converter(client, uri_template):
+ resource1 = IDResource()
+ client.app.add_route(uri_template, resource1)
+
+ result = client.simulate_get('/123')
+
+ assert result.status_code == 200
+ assert resource1.called
+ assert resource1.id == 123
+ assert resource1.req.path == '/123'
+
+
+@pytest.mark.parametrize('uri_template,', [
+ '/{id:int(2)}',
+ '/{id:int(min=124)}',
+ '/{id:int(num_digits=3, max=100)}',
+])
+def test_converter_rejections(client, uri_template):
+ resource1 = IDResource()
+ client.app.add_route(uri_template, resource1)
+
+ result = client.simulate_get('/123')
+
+ assert result.status_code == 404
+ assert not resource1.called
+
+
+def test_converter_custom(client, resource):
+ class SpamConverter(object):
+ def convert(self, fragment):
+ return 'spam!'
+
+ client.app.router_options.converters['spam'] = SpamConverter
+ client.app.add_route('/{food:spam}', resource)
+
+ result = client.simulate_get('/something')
+
+ assert result.status_code == 200
+ assert resource.called
+ assert resource.captured_kwargs['food'] == 'spam!'
+
+
def test_single_trailing_slash(client):
resource1 = IDResource()
client.app.add_route('/1/{id}/', resource1)
| diff --git a/docs/api/api.rst b/docs/api/api.rst
index c27887205..0d2e4c25a 100644
--- a/docs/api/api.rst
+++ b/docs/api/api.rst
@@ -21,3 +21,5 @@ standard-compliant WSGI server.
.. autoclass:: falcon.ResponseOptions
:members:
+.. autoclass:: falcon.routing.CompiledRouterOptions
+ :noindex:
diff --git a/docs/api/routing.rst b/docs/api/routing.rst
index c01cb2ea8..57096fa1b 100644
--- a/docs/api/routing.rst
+++ b/docs/api/routing.rst
@@ -63,4 +63,5 @@ A custom routing engine may be specified when instantiating
api = API(router=fancy)
.. automodule:: falcon.routing
- :members: create_http_method_map, compile_uri_template, CompiledRouter
+ :members: create_http_method_map, compile_uri_template,
+ CompiledRouter, CompiledRouterOptions
| [
{
"components": [
{
"doc": "",
"lines": [
291,
292
],
"name": "API.router_options",
"signature": "def router_options(self):",
"type": "function"
}
],
"file": "falcon/api.py"
},
{
"components": [
{
"doc"... | [
"tests/test_default_router.py::test_user_regression_versioned_url",
"tests/test_default_router.py::test_user_regression_recipes",
"tests/test_default_router.py::test_user_regression_special_chars[/serviceRoot/People|{field}-/serviceRoot/People|susie-expected_params0]",
"tests/test_default_router.py::test_user... | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
feat(API): Foundational support for URI template field converters
Provides an initial converter (int) along with relevant plumbing and tests. Additional converters and docs to come in future PRs.
Partially implements: #423
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in falcon/api.py]
(definition of API.router_options:)
def router_options(self):
[end of new definitions in falcon/api.py]
[start of new definitions in falcon/routing/compiled.py]
(definition of CompiledRouter.options:)
def options(self):
(definition of CompiledRouter._validate_template_segment:)
def _validate_template_segment(self, segment, used_names):
"""Validates a single path segment of a URI template.
1. Ensure field names are valid Python identifiers, since they
will be passed as kwargs to responders.
2. Check that there are no duplicate names, since that causes
(at least) the following problems:
a. For simple nodes, values from deeper nodes overwrite
values from more shallow nodes.
b. For complex nodes, re.compile() raises a nasty error
3. Check that when the converter syntax is used, the named
converter exists."""
(definition of CompiledRouter._generate_conversion_ast:)
def _generate_conversion_ast(self, parent, node):
(definition of CompiledRouter._instantiate_converter:)
def _instantiate_converter(self, klass, argstr=None):
(definition of ConverterDict:)
class ConverterDict(UserDict):
"""A dict-like class for storing field converters."""
(definition of ConverterDict.update:)
def update(self, other):
(definition of ConverterDict.__setitem__:)
def __setitem__(self, name, converter):
(definition of ConverterDict._validate:)
def _validate(self, name):
(definition of CompiledRouterOptions:)
class CompiledRouterOptions(object):
"""Defines a set of configurable router options.
An instance of this class is exposed via :any:`API.router_options`
for configuring certain :py:class:`~.CompiledRouter` behaviors.
Attributes:
converters: Represents the collection of named
converters that may be referenced in URI template field
expressions. Adding additional converters is simply a
matter of mapping a name to a converter class::
api.router_options.converters['myconverter'] = MyConverter
Note:
Converter names may only contain ASCII letters, digits,
and underscores, and must start with either a letter or
an underscore.
A converter is any class that implements the following
method::
def convert(self, fragment):
# TODO: Convert the matched URI path fragment and
# return the result, or None to reject the fragment
# if it is not in the expected format or otherwise
# can not be converted.
pass
Converters are instantiated with the argument specification
given in the field expression. These specifications follow
the standard Python syntax for passing arguments. For
example, the comments in the following code show how a
converter would be instantiated given different
argument specifications in the URI template::
# MyConverter()
api.add_route(
'/a/{some_field:myconverter}',
some_resource
)
# MyConverter(True)
api.add_route(
'/b/{some_field:myconverter(True)}',
some_resource
)
# MyConverter(True, some_kwarg=10)
api.add_route(
'/c/{some_field:myconverter(True, some_kwarg=10)}',
some_resource
)
Warning:
Converter instances are shared between requests.
Therefore, in threaded deployments, care must be taken
to implement custom converters in a thread-safe
manner."""
(definition of CompiledRouterOptions.__init__:)
def __init__(self):
(definition of _CxParent.append_child:)
def append_child(self, construct):
(definition of _CxIfConverterField:)
class _CxIfConverterField(_CxParent):
(definition of _CxIfConverterField.__init__:)
def __init__(self, field_name, converter_idx):
(definition of _CxIfConverterField.src:)
def src(self, indentation):
(definition of _CxSetFragmentFromField:)
class _CxSetFragmentFromField(object):
(definition of _CxSetFragmentFromField.__init__:)
def __init__(self, field_name):
(definition of _CxSetFragmentFromField.src:)
def src(self, indentation):
(definition of _CxSetFragmentFromPath:)
class _CxSetFragmentFromPath(object):
(definition of _CxSetFragmentFromPath.__init__:)
def __init__(self, segment_idx):
(definition of _CxSetFragmentFromPath.src:)
def src(self, indentation):
(definition of _CxSetParamsFromPatternMatch:)
class _CxSetParamsFromPatternMatch(object):
(definition of _CxSetParamsFromPatternMatch.src:)
def src(self, indentation):
(definition of _CxSetParamsFromPatternMatchPrefetched:)
class _CxSetParamsFromPatternMatchPrefetched(object):
(definition of _CxSetParamsFromPatternMatchPrefetched.src:)
def src(self, indentation):
(definition of _CxPrefetchGroupsFromPatternMatch:)
class _CxPrefetchGroupsFromPatternMatch(object):
(definition of _CxPrefetchGroupsFromPatternMatch.src:)
def src(self, indentation):
[end of new definitions in falcon/routing/compiled.py]
[start of new definitions in falcon/routing/converters.py]
(definition of IntConverter:)
class IntConverter(object):
"""Converts a field value to an int.
Keyword Args:
num_digits (int): Require the value to have the given
number of digits.
min (int): Reject the value if it is less than this value.
max (int): Reject the value if it is greater than this value."""
(definition of IntConverter.__init__:)
def __init__(self, num_digits=None, min=None, max=None):
(definition of IntConverter.convert:)
def convert(self, fragment):
[end of new definitions in falcon/routing/converters.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 77d5e6394a88ead151c9469494749f95f06b24bf | |
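The `ConverterDict` behavior described in the definitions above (validate names on both `__setitem__` and `update`, including iterable-of-pairs input) can be sketched standalone. The regex below is an assumption that implements the documented rule — ASCII letters, digits, and underscores, starting with a letter or underscore; the real `_IDENTIFIER_PATTERN` is defined in `falcon/routing/compiled.py`.

```python
import re
from collections import UserDict

# Assumed pattern implementing the documented converter-name rule.
_IDENTIFIER_PATTERN = re.compile(r'\A[A-Za-z_][A-Za-z0-9_]*\Z')


class ConverterDict(UserDict):
    """Dict-like store that validates converter names on insertion."""

    def update(self, other):
        try:
            # Mapping types implement keys().
            names = other.keys()
        except AttributeError:
            # Assume an iterable of 2-item iterables; make it
            # re-iterable in case it is a generator.
            other = list(other)
            names = [n for n, __ in other]

        for n in names:
            self._validate(n)

        UserDict.update(self, other)

    def __setitem__(self, name, converter):
        self._validate(name)
        UserDict.__setitem__(self, name, converter)

    def _validate(self, name):
        if not _IDENTIFIER_PATTERN.match(name):
            raise ValueError(
                'Invalid converter name. Names may not be blank, and may '
                'only use ASCII letters, digits, and underscores. Names '
                'must begin with a letter or underscore.'
            )


converters = ConverterDict()
converters['spam'] = object  # accepted
try:
    converters['42istheanswer'] = object
except ValueError as ex:
    print('rejected:', ex)
```

Validating eagerly at registration time means a bad converter name fails fast, instead of surfacing later as a confusing `add_route()` error.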
falconry__falcon-1066 | 1,066 | falconry/falcon | null | ef03f4c8803dda415e8ed1e60aa8698b6a7d9956 | 2017-05-26T15:43:04Z | diff --git a/falcon/errors.py b/falcon/errors.py
index 4457827d1..73a2c97a0 100644
--- a/falcon/errors.py
+++ b/falcon/errors.py
@@ -1250,6 +1250,57 @@ def __init__(self, title=None, description=None, **kwargs):
description, **kwargs)
+class HTTPNotImplemented(HTTPError):
+ """501 Not Implemented.
+
+ The 501 (Not Implemented) status code indicates that the server does
+ not support the functionality required to fulfill the request. This
+ is the appropriate response when the server does not recognize the
+ request method and is not capable of supporting it for any resource.
+
+ A 501 response is cacheable by default; i.e., unless otherwise
+ indicated by the method definition or explicit cache controls (see
+ Section 4.2.2 of [RFC7234]).
+
+ (See also: RFC 7231, Section 6.6.2)
+
+ Keyword Args:
+        title (str): Error title (default '501 Not Implemented').
+ description (str): Human-friendly description of the error, along with
+ a helpful suggestion or two.
+ headers (dict or list): A ``dict`` of header names and values
+ to set, or a ``list`` of (*name*, *value*) tuples. Both *name* and
+ *value* must be of type ``str`` or ``StringType``, and only
+ character values 0x00 through 0xFF may be used on platforms that
+ use wide characters.
+
+ Note:
+ The Content-Type header, if present, will be overridden. If
+ you wish to return custom error messages, you can create
+ your own HTTP error class, and install an error handler
+ to convert it into an appropriate HTTP response for the
+ client
+
+ Note:
+ Falcon can process a list of ``tuple`` slightly faster
+ than a ``dict``.
+
+ href (str): A URL someone can visit to find out more information
+ (default ``None``). Unicode characters are percent-encoded.
+ href_text (str): If href is given, use this as the friendly
+ title/description for the link (default 'API documentation
+ for this error').
+ code (int): An internal code that customers can reference in their
+ support request or to help them when searching for knowledge
+ base articles related to this error (default ``None``).
+
+ """
+
+ def __init__(self, title=None, description=None, **kwargs):
+ super(HTTPNotImplemented, self).__init__(status.HTTP_501, title,
+ description, **kwargs)
+
+
class HTTPBadGateway(HTTPError):
"""502 Bad Gateway.
@@ -1365,6 +1416,103 @@ def __init__(self, title=None, description=None, retry_after=None, **kwargs):
**kwargs)
+class HTTPGatewayTimeout(HTTPError):
+ """504 Gateway Timeout.
+
+ The 504 (Gateway Timeout) status code indicates that the server,
+ while acting as a gateway or proxy, did not receive a timely response
+ from an upstream server it needed to access in order to complete the
+ request.
+
+ (See also: RFC 7231, Section 6.6.5)
+
+ Keyword Args:
+        title (str): Error title (default '504 Gateway Timeout').
+ description (str): Human-friendly description of the error, along with
+ a helpful suggestion or two.
+ headers (dict or list): A ``dict`` of header names and values
+ to set, or a ``list`` of (*name*, *value*) tuples. Both *name* and
+ *value* must be of type ``str`` or ``StringType``, and only
+ character values 0x00 through 0xFF may be used on platforms that
+ use wide characters.
+
+ Note:
+ The Content-Type header, if present, will be overridden. If
+ you wish to return custom error messages, you can create
+ your own HTTP error class, and install an error handler
+ to convert it into an appropriate HTTP response for the
+ client
+
+ Note:
+ Falcon can process a list of ``tuple`` slightly faster
+ than a ``dict``.
+
+ href (str): A URL someone can visit to find out more information
+ (default ``None``). Unicode characters are percent-encoded.
+ href_text (str): If href is given, use this as the friendly
+ title/description for the link (default 'API documentation
+ for this error').
+ code (int): An internal code that customers can reference in their
+ support request or to help them when searching for knowledge
+ base articles related to this error (default ``None``).
+ """
+
+ def __init__(self, title=None, description=None, **kwargs):
+ super(HTTPGatewayTimeout, self).__init__(status.HTTP_504, title,
+ description, **kwargs)
+
+
+class HTTPVersionNotSupported(HTTPError):
+    """505 HTTP Version Not Supported.
+
+ The 505 (HTTP Version Not Supported) status code indicates that the
+ server does not support, or refuses to support, the major version of
+ HTTP that was used in the request message. The server is indicating
+ that it is unable or unwilling to complete the request using the same
+ major version as the client, as described in Section 2.6 of
+ [RFC7230], other than with this error message. The server SHOULD
+ generate a representation for the 505 response that describes why
+ that version is not supported and what other protocols are supported
+ by that server.
+
+ (See also: RFC 7231, Section 6.6.6)
+
+ Keyword Args:
+        title (str): Error title (default '505 HTTP Version Not Supported').
+ description (str): Human-friendly description of the error, along with
+ a helpful suggestion or two.
+ headers (dict or list): A ``dict`` of header names and values
+ to set, or a ``list`` of (*name*, *value*) tuples. Both *name* and
+ *value* must be of type ``str`` or ``StringType``, and only
+ character values 0x00 through 0xFF may be used on platforms that
+ use wide characters.
+
+ Note:
+ The Content-Type header, if present, will be overridden. If
+ you wish to return custom error messages, you can create
+ your own HTTP error class, and install an error handler
+ to convert it into an appropriate HTTP response for the
+ client
+
+ Note:
+ Falcon can process a list of ``tuple`` slightly faster
+ than a ``dict``.
+
+ href (str): A URL someone can visit to find out more information
+ (default ``None``). Unicode characters are percent-encoded.
+ href_text (str): If href is given, use this as the friendly
+ title/description for the link (default 'API documentation
+ for this error').
+ code (int): An internal code that customers can reference in their
+ support request or to help them when searching for knowledge
+ base articles related to this error (default ``None``).
+ """
+
+ def __init__(self, title=None, description=None, **kwargs):
+ super(HTTPVersionNotSupported, self).__init__(status.HTTP_505, title,
+ description, **kwargs)
+
+
class HTTPInsufficientStorage(HTTPError):
"""507 Insufficient Storage.
diff --git a/falcon/status_codes.py b/falcon/status_codes.py
index 4cb6113fc..676e550e9 100644
--- a/falcon/status_codes.py
+++ b/falcon/status_codes.py
@@ -122,9 +122,9 @@
HTTP_BAD_GATEWAY = HTTP_502
HTTP_503 = '503 Service Unavailable'
HTTP_SERVICE_UNAVAILABLE = HTTP_503
-HTTP_504 = '504 Gateway Time-out'
+HTTP_504 = '504 Gateway Timeout'
HTTP_GATEWAY_TIMEOUT = HTTP_504
-HTTP_505 = '505 HTTP Version not supported'
+HTTP_505 = '505 HTTP Version Not Supported'
HTTP_HTTP_VERSION_NOT_SUPPORTED = HTTP_505
HTTP_507 = '507 Insufficient Storage'
HTTP_INSUFFICIENT_STORAGE = HTTP_507
| diff --git a/tests/test_error.py b/tests/test_error.py
index 2d398fb99..73aefa9f2 100644
--- a/tests/test_error.py
+++ b/tests/test_error.py
@@ -20,8 +20,11 @@
(falcon.HTTPRequestHeaderFieldsTooLarge, status.HTTP_431),
(falcon.HTTPUnavailableForLegalReasons, status.HTTP_451),
(falcon.HTTPInternalServerError, status.HTTP_500),
+ (falcon.HTTPNotImplemented, status.HTTP_501),
(falcon.HTTPBadGateway, status.HTTP_502),
(falcon.HTTPServiceUnavailable, status.HTTP_503),
+ (falcon.HTTPGatewayTimeout, status.HTTP_504),
+ (falcon.HTTPVersionNotSupported, status.HTTP_505),
(falcon.HTTPInsufficientStorage, status.HTTP_507),
(falcon.HTTPLoopDetected, status.HTTP_508),
(falcon.HTTPNetworkAuthenticationRequired, status.HTTP_511),
@@ -49,9 +52,13 @@ def test_with_default_title_and_desc(err, title):
falcon.HTTPLocked,
falcon.HTTPFailedDependency,
falcon.HTTPRequestHeaderFieldsTooLarge,
- falcon.HTTPInternalServerError,
falcon.HTTPUnavailableForLegalReasons,
+ falcon.HTTPInternalServerError,
+ falcon.HTTPNotImplemented,
falcon.HTTPBadGateway,
+ falcon.HTTPServiceUnavailable,
+ falcon.HTTPGatewayTimeout,
+ falcon.HTTPVersionNotSupported,
falcon.HTTPInsufficientStorage,
falcon.HTTPLoopDetected,
falcon.HTTPNetworkAuthenticationRequired,
| [
{
"components": [
{
"doc": "501 Not Implemented.\n\nThe 501 (Not Implemented) status code indicates that the server does\nnot support the functionality required to fulfill the request. This\nis the appropriate response when the server does not recognize the\nrequest method and is not capable of s... | [
"tests/test_error.py::test_with_default_title_and_desc[HTTPBadRequest-400",
"tests/test_error.py::test_with_default_title_and_desc[HTTPForbidden-403",
"tests/test_error.py::test_with_default_title_and_desc[HTTPConflict-409",
"tests/test_error.py::test_with_default_title_and_desc[HTTPLengthRequired-411",
"te... | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
feat: Add 5xx error classes
Adds following error classes, together with tests:
- [x] 501 Not Implemented
- [x] 504 Gateway Timeout
- [x] 505 HTTP Version Not Supported
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in falcon/errors.py]
(definition of HTTPNotImplemented:)
class HTTPNotImplemented(HTTPError):
"""501 Not Implemented.
The 501 (Not Implemented) status code indicates that the server does
not support the functionality required to fulfill the request. This
is the appropriate response when the server does not recognize the
request method and is not capable of supporting it for any resource.
A 501 response is cacheable by default; i.e., unless otherwise
indicated by the method definition or explicit cache controls (see
Section 4.2.2 of [RFC7234]).
(See also: RFC 7231, Section 6.6.2)
Keyword Args:
title (str): Error title (default '501 Not Implemented').
description (str): Human-friendly description of the error, along with
a helpful suggestion or two.
headers (dict or list): A ``dict`` of header names and values
to set, or a ``list`` of (*name*, *value*) tuples. Both *name* and
*value* must be of type ``str`` or ``StringType``, and only
character values 0x00 through 0xFF may be used on platforms that
use wide characters.
Note:
The Content-Type header, if present, will be overridden. If
you wish to return custom error messages, you can create
your own HTTP error class, and install an error handler
to convert it into an appropriate HTTP response for the
client
Note:
Falcon can process a list of ``tuple`` slightly faster
than a ``dict``.
href (str): A URL someone can visit to find out more information
(default ``None``). Unicode characters are percent-encoded.
href_text (str): If href is given, use this as the friendly
title/description for the link (default 'API documentation
for this error').
code (int): An internal code that customers can reference in their
support request or to help them when searching for knowledge
base articles related to this error (default ``None``)."""
(definition of HTTPNotImplemented.__init__:)
def __init__(self, title=None, description=None, **kwargs):
(definition of HTTPGatewayTimeout:)
class HTTPGatewayTimeout(HTTPError):
"""504 Gateway Timeout.
The 504 (Gateway Timeout) status code indicates that the server,
while acting as a gateway or proxy, did not receive a timely response
from an upstream server it needed to access in order to complete the
request.
(See also: RFC 7231, Section 6.6.5)
Keyword Args:
title (str): Error title (default '504 Gateway Timeout').
description (str): Human-friendly description of the error, along with
a helpful suggestion or two.
headers (dict or list): A ``dict`` of header names and values
to set, or a ``list`` of (*name*, *value*) tuples. Both *name* and
*value* must be of type ``str`` or ``StringType``, and only
character values 0x00 through 0xFF may be used on platforms that
use wide characters.
Note:
The Content-Type header, if present, will be overridden. If
you wish to return custom error messages, you can create
your own HTTP error class, and install an error handler
to convert it into an appropriate HTTP response for the
client
Note:
Falcon can process a list of ``tuple`` slightly faster
than a ``dict``.
href (str): A URL someone can visit to find out more information
(default ``None``). Unicode characters are percent-encoded.
href_text (str): If href is given, use this as the friendly
title/description for the link (default 'API documentation
for this error').
code (int): An internal code that customers can reference in their
support request or to help them when searching for knowledge
base articles related to this error (default ``None``)."""
(definition of HTTPGatewayTimeout.__init__:)
def __init__(self, title=None, description=None, **kwargs):
(definition of HTTPVersionNotSupported:)
class HTTPVersionNotSupported(HTTPError):
"""505 HTTP Version Not Supported
The 505 (HTTP Version Not Supported) status code indicates that the
server does not support, or refuses to support, the major version of
HTTP that was used in the request message. The server is indicating
that it is unable or unwilling to complete the request using the same
major version as the client, as described in Section 2.6 of
[RFC7230], other than with this error message. The server SHOULD
generate a representation for the 505 response that describes why
that version is not supported and what other protocols are supported
by that server.
(See also: RFC 7231, Section 6.6.6)
Keyword Args:
title (str): Error title (default '505 HTTP Version Not Supported').
description (str): Human-friendly description of the error, along with
a helpful suggestion or two.
headers (dict or list): A ``dict`` of header names and values
to set, or a ``list`` of (*name*, *value*) tuples. Both *name* and
*value* must be of type ``str`` or ``StringType``, and only
character values 0x00 through 0xFF may be used on platforms that
use wide characters.
Note:
The Content-Type header, if present, will be overridden. If
you wish to return custom error messages, you can create
your own HTTP error class, and install an error handler
to convert it into an appropriate HTTP response for the
client
Note:
Falcon can process a list of ``tuple`` slightly faster
than a ``dict``.
href (str): A URL someone can visit to find out more information
(default ``None``). Unicode characters are percent-encoded.
href_text (str): If href is given, use this as the friendly
title/description for the link (default 'API documentation
for this error').
code (int): An internal code that customers can reference in their
support request or to help them when searching for knowledge
base articles related to this error (default ``None``)."""
(definition of HTTPVersionNotSupported.__init__:)
def __init__(self, title=None, description=None, **kwargs):
[end of new definitions in falcon/errors.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
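The delegation pattern shared by these classes can be sketched in isolation. The `HTTPError` base class and status constants below are simplified stand-ins (assumptions for illustration) for falcon's real `falcon.HTTPError` and `falcon.status_codes`; only the subclassing pattern mirrors the definitions above.

```python
# Simplified stand-ins for falcon.status_codes constants and the
# falcon.HTTPError base class -- assumptions for illustration only.
HTTP_504 = '504 Gateway Timeout'
HTTP_505 = '505 HTTP Version Not Supported'


class HTTPError(Exception):
    """Stub base class: stores the status line and an optional title."""

    def __init__(self, status, title=None, description=None, **kwargs):
        self.status = status
        self.title = title or status
        self.description = description


class HTTPGatewayTimeout(HTTPError):
    def __init__(self, title=None, description=None, **kwargs):
        super(HTTPGatewayTimeout, self).__init__(
            HTTP_504, title, description, **kwargs)


class HTTPVersionNotSupported(HTTPError):
    def __init__(self, title=None, description=None, **kwargs):
        super(HTTPVersionNotSupported, self).__init__(
            HTTP_505, title, description, **kwargs)
```

Instantiating, e.g., `HTTPVersionNotSupported()` with no arguments yields `status == '505 HTTP Version Not Supported'`, and the status line doubles as the default title.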
<<END>> | 77d5e6394a88ead151c9469494749f95f06b24bf | ||
falconry__falcon-1053 | 1,053 | falconry/falcon | null | 975565ba53355d8a563b698e1ca15f416f260f72 | 2017-05-21T06:40:53Z | diff --git a/falcon/request.py b/falcon/request.py
index 8b8f972a7..40ed7e49f 100644
--- a/falcon/request.py
+++ b/falcon/request.py
@@ -1181,17 +1181,17 @@ def get_param_as_list(self, name,
raise errors.HTTPMissingParam(name)
- def get_param_as_date(self, name, format_string='%Y-%m-%d',
- required=False, store=None):
- """Return the value of a query string parameter as a date.
+ def get_param_as_datetime(self, name, format_string='%Y-%m-%dT%H:%M:%SZ',
+ required=False, store=None):
+ """Return the value of a query string parameter as a datetime.
Args:
name (str): Parameter name, case-sensitive (e.g., 'ids').
Keyword Args:
format_string (str): String used to parse the param value
- into a date. Any format recognized by strptime() is
- supported (default ``"%Y-%m-%d"``).
+ into a datetime. Any format recognized by strptime() is
+ supported (default ``'%Y-%m-%dT%H:%M:%SZ'``).
required (bool): Set to ``True`` to raise
``HTTPBadRequest`` instead of returning ``None`` when the
parameter is not found (default ``False``).
@@ -1199,8 +1199,8 @@ def get_param_as_date(self, name, format_string='%Y-%m-%d',
the value of the param, but only if the param is found (default
``None``).
Returns:
- datetime.date: The value of the param if it is found and can be
- converted to a ``date`` according to the supplied format
+ datetime.datetime: The value of the param if it is found and can be
+ converted to a ``datetime`` according to the supplied format
string. If the param is not found, returns ``None`` unless
required is ``True``.
@@ -1216,11 +1216,51 @@ def get_param_as_date(self, name, format_string='%Y-%m-%d',
return None
try:
- date = strptime(param_value, format_string).date()
+ date_time = strptime(param_value, format_string)
except ValueError:
msg = 'The date value does not match the required format.'
raise errors.HTTPInvalidParam(msg, name)
+ if store is not None:
+ store[name] = date_time
+
+ return date_time
+
+ def get_param_as_date(self, name, format_string='%Y-%m-%d',
+ required=False, store=None):
+ """Return the value of a query string parameter as a date.
+
+ Args:
+ name (str): Parameter name, case-sensitive (e.g., 'ids').
+
+ Keyword Args:
+ format_string (str): String used to parse the param value
+ into a date. Any format recognized by strptime() is
+ supported (default ``"%Y-%m-%d"``).
+ required (bool): Set to ``True`` to raise
+ ``HTTPBadRequest`` instead of returning ``None`` when the
+ parameter is not found (default ``False``).
+ store (dict): A ``dict``-like object in which to place
+ the value of the param, but only if the param is found (default
+ ``None``).
+ Returns:
+ datetime.date: The value of the param if it is found and can be
+ converted to a ``date`` according to the supplied format
+ string. If the param is not found, returns ``None`` unless
+ required is ``True``.
+
+ Raises:
+ HTTPBadRequest: A required param is missing from the request.
+ HTTPInvalidParam: A transform function raised an instance of
+ ``ValueError``.
+ """
+
+ date_time = self.get_param_as_datetime(name, format_string, required)
+ if date_time:
+ date = date_time.date()
+ else:
+ return None
+
if store is not None:
store[name] = date
| diff --git a/tests/test_query_params.py b/tests/test_query_params.py
index 4be4674b3..e90567d4b 100644
--- a/tests/test_query_params.py
+++ b/tests/test_query_params.py
@@ -1,4 +1,5 @@
-from datetime import date
+from datetime import date, datetime
+
try:
import ujson as json
except ImportError:
@@ -578,6 +579,52 @@ def test_get_date_invalid(self, simulate_request, client, resource):
with pytest.raises(HTTPInvalidParam):
req.get_param_as_date('thedate', format_string=format_string)
+ def test_get_datetime_valid(self, simulate_request, client, resource):
+ client.app.add_route('/', resource)
+ date_value = '2015-04-20T10:10:10Z'
+ query_string = 'thedate={0}'.format(date_value)
+ simulate_request(client=client, path='/', query_string=query_string)
+ req = resource.captured_req
+ assert req.get_param_as_datetime('thedate') == datetime(2015, 4, 20, 10, 10, 10)
+
+ def test_get_datetime_missing_param(self, simulate_request, client, resource):
+ client.app.add_route('/', resource)
+ query_string = 'notthedate=2015-04-20T10:10:10Z'
+ simulate_request(client=client, path='/', query_string=query_string)
+ req = resource.captured_req
+ assert req.get_param_as_datetime('thedate') is None
+
+ def test_get_datetime_valid_with_format(self, simulate_request, client, resource):
+ client.app.add_route('/', resource)
+ date_value = '20150420 10:10:10'
+ query_string = 'thedate={0}'.format(date_value)
+ format_string = '%Y%m%d %H:%M:%S'
+ simulate_request(client=client, path='/', query_string=query_string)
+ req = resource.captured_req
+ assert req.get_param_as_datetime(
+ 'thedate', format_string=format_string) == datetime(2015, 4, 20, 10, 10, 10)
+
+ def test_get_datetime_store(self, simulate_request, client, resource):
+ client.app.add_route('/', resource)
+ datetime_value = '2015-04-20T10:10:10Z'
+ query_string = 'thedate={0}'.format(datetime_value)
+ simulate_request(client=client, path='/', query_string=query_string)
+ req = resource.captured_req
+ store = {}
+ req.get_param_as_datetime('thedate', store=store)
+ assert len(store) != 0
+ assert store.get('thedate') == datetime(2015, 4, 20, 10, 10, 10)
+
+ def test_get_datetime_invalid(self, simulate_request, client, resource):
+ client.app.add_route('/', resource)
+ date_value = 'notarealvalue'
+ query_string = 'thedate={0}'.format(date_value)
+ format_string = '%Y%m%dT%H:%M:%S'
+ simulate_request(client=client, path='/', query_string=query_string)
+ req = resource.captured_req
+ with pytest.raises(HTTPInvalidParam):
+ req.get_param_as_datetime('thedate', format_string=format_string)
+
def test_get_dict_valid(self, simulate_request, client, resource):
client.app.add_route('/', resource)
payload_dict = {'foo': 'bar'}
| [
{
"components": [
{
"doc": "Return the value of a query string parameter as a datetime.\n\nArgs:\n name (str): Parameter name, case-sensitive (e.g., 'ids').\n\nKeyword Args:\n format_string (str): String used to parse the param value\n into a datetime. Any format recognized by strptim... | [
"tests/test_query_params.py::TestQueryParams::test_get_datetime_valid[simulate_request_get_query_params]",
"tests/test_query_params.py::TestQueryParams::test_get_datetime_valid[simulate_request_post_query_params]",
"tests/test_query_params.py::TestQueryParams::test_get_datetime_missing_param[simulate_request_ge... | [
"tests/test_query_params.py::TestQueryParams::test_none[simulate_request_get_query_params]",
"tests/test_query_params.py::TestQueryParams::test_none[simulate_request_post_query_params]",
"tests/test_query_params.py::TestQueryParams::test_blank[simulate_request_get_query_params]",
"tests/test_query_params.py::... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Added a get_param_as_datetime() method to fetch query param as datetime
Solution to #1007.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in falcon/request.py]
(definition of Request.get_param_as_datetime:)
def get_param_as_datetime(self, name, format_string='%Y-%m-%dT%H:%M:%SZ', required=False, store=None):
"""Return the value of a query string parameter as a datetime.
Args:
name (str): Parameter name, case-sensitive (e.g., 'ids').
Keyword Args:
format_string (str): String used to parse the param value
into a datetime. Any format recognized by strptime() is
supported (default ``'%Y-%m-%dT%H:%M:%SZ'``).
required (bool): Set to ``True`` to raise
``HTTPBadRequest`` instead of returning ``None`` when the
parameter is not found (default ``False``).
store (dict): A ``dict``-like object in which to place
the value of the param, but only if the param is found (default
``None``).
Returns:
datetime.datetime: The value of the param if it is found and can be
converted to a ``datetime`` according to the supplied format
string. If the param is not found, returns ``None`` unless
required is ``True``.
Raises:
HTTPBadRequest: A required param is missing from the request.
HTTPInvalidParam: A transform function raised an instance of
``ValueError``."""
[end of new definitions in falcon/request.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
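The parsing behavior described in the docstring reduces to a `strptime()` call with the default format string. A minimal standalone sketch (the helper name is hypothetical; the real method also handles `required` and `store`):

```python
from datetime import datetime


def parse_param_as_datetime(param_value, format_string='%Y-%m-%dT%H:%M:%SZ'):
    """Parse a query-string value into a datetime.

    Raises ValueError when the value does not match format_string,
    which the real method translates into HTTPInvalidParam.
    """
    return datetime.strptime(param_value, format_string)


print(parse_param_as_datetime('2015-04-20T10:10:10Z'))
# -> 2015-04-20 10:10:10
```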
<<END>> | 77d5e6394a88ead151c9469494749f95f06b24bf | ||
falconry__falcon-1040 | 1,040 | falconry/falcon | null | e0abfcfc642b3b255f7803aff026f8ad60afbc3a | 2017-05-03T18:08:08Z | diff --git a/falcon/routing/compiled.py b/falcon/routing/compiled.py
index 01488932f..2c6a53fbe 100644
--- a/falcon/routing/compiled.py
+++ b/falcon/routing/compiled.py
@@ -18,7 +18,7 @@
import re
-_FIELD_REGEX = re.compile('{([^}]*)}')
+_FIELD_PATTERN = re.compile('{([^}]*)}')
_TAB_STR = ' ' * 4
@@ -35,14 +35,26 @@ class CompiledRouter(object):
processing quite fast.
"""
+ __slots__ = (
+ '_ast',
+ '_find',
+ '_finder_src',
+ '_patterns',
+ '_return_values',
+ '_roots',
+ )
+
def __init__(self):
self._roots = []
self._find = self._compile()
- self._code_lines = None
- self._src = None
- self._expressions = None
+ self._finder_src = None
+ self._patterns = None
self._return_values = None
+ @property
+ def finder_src(self):
+ return self._finder_src
+
def add_route(self, uri_template, method_map, resource):
"""Adds a route between a URI path template and a resource.
@@ -66,7 +78,7 @@ def add_route(self, uri_template, method_map, resource):
# values from more shallow nodes.
# 2. For complex nodes, re.compile() raises a nasty error
#
- fields = _FIELD_REGEX.findall(uri_template)
+ fields = _FIELD_PATTERN.findall(uri_template)
used_names = set()
for name in fields:
is_identifier = re.match('[A-Za-z_][A-Za-z0-9_]*$', name)
@@ -145,28 +157,28 @@ def find(self, uri, req=None):
path = uri.lstrip('/').split('/')
params = {}
- node = self._find(path, self._return_values, self._expressions, params)
+ node = self._find(path, self._return_values, self._patterns, params)
if node is not None:
return node.resource, node.method_map, params, node.uri_template
else:
return None
- def _compile_tree(self, nodes, indent=1, level=0, fast_return=True):
- """Generates Python code for a routing tree or subtree."""
+ # -----------------------------------------------------------------
+ # Private
+ # -----------------------------------------------------------------
- def line(text, indent_offset=0):
- pad = _TAB_STR * (indent + indent_offset)
- self._code_lines.append(pad + text)
+ def _generate_ast(self, nodes, parent, return_values, patterns, level=0, fast_return=True):
+ """Generates a coarse AST for the router."""
# NOTE(kgriffs): Base case
if not nodes:
return
- line('if path_len > %d:' % level)
- indent += 1
+ outer_parent = _CxIfPathLength('>', level)
+ parent.append(outer_parent)
+ parent = outer_parent
- level_indent = indent
found_simple = False
# NOTE(kgriffs & philiptzou): Sort nodes in this sequence:
@@ -195,20 +207,18 @@ def line(text, indent_offset=0):
# contain anything more than a single literal or variable,
# and they need to be checked using a pre-compiled regular
# expression.
- expression_idx = len(self._expressions)
- self._expressions.append(node.var_regex)
-
- line('match = expressions[%d].match(path[%d]) # %s' % (
- expression_idx, level, node.var_regex.pattern))
+ pattern_idx = len(patterns)
+ patterns.append(node.var_pattern)
- line('if match is not None:')
- indent += 1
- line('params.update(match.groupdict())')
+ construct = _CxIfPathSegmentPattern(level, pattern_idx,
+ node.var_pattern.pattern)
+ parent.append(construct)
+ parent = construct
else:
# NOTE(kgriffs): Simple nodes just capture the entire path
# segment as the value for the param.
- line('params["%s"] = path[%d]' % (node.var_name, level))
+ parent.append(_CxSetParam(node.var_name, level))
# NOTE(kgriffs): We don't allow multiple simple var nodes
# to exist at the same level, e.g.:
@@ -222,59 +232,78 @@ def line(text, indent_offset=0):
else:
# NOTE(kgriffs): Not a param, so must match exactly
- line('if path[%d] == "%s":' % (level, node.raw_segment))
- indent += 1
+ construct = _CxIfPathSegmentLiteral(level, node.raw_segment)
+ parent.append(construct)
+ parent = construct
if node.resource is not None:
# NOTE(kgriffs): This is a valid route, so we will want to
# return the relevant information.
- resource_idx = len(self._return_values)
- self._return_values.append(node)
-
- self._compile_tree(node.children, indent, level + 1, fast_return)
+ resource_idx = len(return_values)
+ return_values.append(node)
+
+ self._generate_ast(
+ node.children,
+ parent,
+ return_values,
+ patterns,
+ level + 1,
+ fast_return
+ )
if node.resource is None:
if fast_return:
- line('return None')
+ parent.append(_CxReturnNone())
else:
# NOTE(kgriffs): Make sure that we have consumed all of
# the segments for the requested route; otherwise we could
# mistakenly match "/foo/23/bar" against "/foo/{id}".
- line('if path_len == %d:' % (level + 1))
- line('return return_values[%d]' % resource_idx, 1)
+ construct = _CxIfPathLength('==', level + 1)
+ construct.append(_CxReturnValue(resource_idx))
+ parent.append(construct)
if fast_return:
- line('return None')
+ parent.append(_CxReturnNone())
- indent = level_indent
+ parent = outer_parent
if not found_simple and fast_return:
- line('return None')
+ parent.append(_CxReturnNone())
def _compile(self):
- """Generates Python code for entire routing tree.
+ """Generates Python code for the entire routing tree.
- The generated code is compiled and the resulting Python method is
- returned.
+ The generated code is compiled and the resulting Python method
+ is returned.
"""
- self._return_values = []
- self._expressions = []
- self._code_lines = [
- 'def find(path, return_values, expressions, params):',
+
+ src_lines = [
+ 'def find(path, return_values, patterns, params):',
_TAB_STR + 'path_len = len(path)',
]
- self._compile_tree(self._roots)
+ self._return_values = []
+ self._patterns = []
+
+ self._ast = _CxParent()
+ self._generate_ast(
+ self._roots,
+ self._ast,
+ self._return_values,
+ self._patterns
+ )
+
+ src_lines.append(self._ast.src(0))
- self._code_lines.append(
+ src_lines.append(
# PERF(kgriffs): Explicit return of None is faster than implicit
_TAB_STR + 'return None'
)
- self._src = '\n'.join(self._code_lines)
+ self._finder_src = '\n'.join(src_lines)
scope = {}
- exec(compile(self._src, '<string>', 'exec'), scope)
+ exec(compile(self._finder_src, '<string>', 'exec'), scope)
return scope['find']
@@ -294,11 +323,12 @@ def __init__(self, raw_segment,
self.is_var = False
self.is_complex = False
self.var_name = None
+ self.var_pattern = None
# NOTE(kgriffs): CompiledRouter.add_route validates field names,
# so here we can just assume they are OK and use the simple
- # _FIELD_REGEX to match them.
- matches = list(_FIELD_REGEX.finditer(raw_segment))
+ # _FIELD_PATTERN to match them.
+ matches = list(_FIELD_PATTERN.finditer(raw_segment))
if not matches:
self.is_var = False
@@ -334,11 +364,11 @@ def __init__(self, raw_segment,
# trick the parser into doing the right thing.
escaped_segment = re.sub(r'[\.\(\)\[\]\?\$\*\+\^\|]', r'\\\g<0>', raw_segment)
- seg_pattern = _FIELD_REGEX.sub(r'(?P<\1>.+)', escaped_segment)
- seg_pattern = '^' + seg_pattern + '$'
+ pattern_text = _FIELD_PATTERN.sub(r'(?P<\1>.+)', escaped_segment)
+ pattern_text = '^' + pattern_text + '$'
self.is_complex = True
- self.var_regex = re.compile(seg_pattern)
+ self.var_pattern = re.compile(pattern_text)
def matches(self, segment):
"""Returns True if this node matches the supplied template segment."""
@@ -360,7 +390,7 @@ def conflicts_with(self, segment):
# simple, complex ==> False
# simple, string ==> False
# complex, simple ==> False
- # complex, complex ==> (Depend)
+ # complex, complex ==> (Maybe)
# complex, string ==> False
# string, simple ==> False
# string, complex ==> False
@@ -389,8 +419,8 @@ def conflicts_with(self, segment):
#
if self.is_complex:
if other.is_complex:
- return (_FIELD_REGEX.sub('v', self.raw_segment) ==
- _FIELD_REGEX.sub('v', segment))
+ return (_FIELD_PATTERN.sub('v', self.raw_segment) ==
+ _FIELD_PATTERN.sub('v', segment))
return False
else:
@@ -399,3 +429,125 @@ def conflicts_with(self, segment):
# NOTE(kgriffs): If self is a static string match, then all the cases
# for other are False, so no need to check.
return False
+
+
+# --------------------------------------------------------------------
+# AST Constructs
+#
+# NOTE(kgriffs): These constructs are used to create a very coarse
+# AST that can then be used to generate Python source code for the
+# router. Using an AST like this makes it easier to reason about
+# the compilation process, and affords syntactical transformations
+# that would otherwise be at best confusing and at worst extremely
+# tedious and error-prone if they were to be attempted directly
+# against the Python source code.
+# --------------------------------------------------------------------
+
+
+class _CxParent(object):
+ def __init__(self):
+ self._children = []
+
+ def append(self, construct):
+ self._children.append(construct)
+
+ def src(self, indentation):
+ return self._children_src(indentation + 1)
+
+ def _children_src(self, indentation):
+ src_lines = [
+ child.src(indentation)
+ for child in self._children
+ ]
+
+ return '\n'.join(src_lines)
+
+
+class _CxIfPathLength(_CxParent):
+ def __init__(self, comparison, length):
+ super(_CxIfPathLength, self).__init__()
+ self._comparison = comparison
+ self._length = length
+
+ def src(self, indentation):
+ template = '{0}if path_len {1} {2}:\n{3}'
+ return template.format(
+ _TAB_STR * indentation,
+ self._comparison,
+ self._length,
+ self._children_src(indentation + 1)
+ )
+
+
+class _CxIfPathSegmentLiteral(_CxParent):
+ def __init__(self, segment_idx, literal):
+ super(_CxIfPathSegmentLiteral, self).__init__()
+ self._segment_idx = segment_idx
+ self._literal = literal
+
+ def src(self, indentation):
+ template = "{0}if path[{1}] == '{2}':\n{3}"
+ return template.format(
+ _TAB_STR * indentation,
+ self._segment_idx,
+ self._literal,
+ self._children_src(indentation + 1)
+ )
+
+
+class _CxIfPathSegmentPattern(_CxParent):
+ def __init__(self, segment_idx, pattern_idx, pattern_text):
+ super(_CxIfPathSegmentPattern, self).__init__()
+ self._segment_idx = segment_idx
+ self._pattern_idx = pattern_idx
+ self._pattern_text = pattern_text
+
+ def src(self, indentation):
+ lines = []
+
+ lines.append(
+ '{0}match = patterns[{1}].match(path[{2}]) # {3}'.format(
+ _TAB_STR * indentation,
+ self._pattern_idx,
+ self._segment_idx,
+ self._pattern_text,
+ )
+ )
+
+ lines.append('{0}if match is not None:'.format(_TAB_STR * indentation))
+ lines.append('{0}params.update(match.groupdict())'.format(
+ _TAB_STR * (indentation + 1)
+ ))
+
+ lines.append(self._children_src(indentation + 1))
+
+ return '\n'.join(lines)
+
+
+class _CxReturnNone(object):
+ def src(self, indentation):
+ return '{0}return None'.format(_TAB_STR * indentation)
+
+
+class _CxReturnValue(object):
+ def __init__(self, value_idx):
+ self._value_idx = value_idx
+
+ def src(self, indentation):
+ return '{0}return return_values[{1}]'.format(
+ _TAB_STR * indentation,
+ self._value_idx
+ )
+
+
+class _CxSetParam(object):
+ def __init__(self, param_name, segment_idx):
+ self._param_name = param_name
+ self._segment_idx = segment_idx
+
+ def src(self, indentation):
+ return "{0}params['{1}'] = path[{2}]".format(
+ _TAB_STR * indentation,
+ self._param_name,
+ self._segment_idx,
+ )
| diff --git a/tests/test_custom_router.py b/tests/test_custom_router.py
index 29b43a6ce..b0610aa12 100644
--- a/tests/test_custom_router.py
+++ b/tests/test_custom_router.py
@@ -5,7 +5,6 @@
class TestCustomRouter(testing.TestBase):
def test_custom_router_add_route_should_be_used(self):
-
check = []
class CustomRouter(object):
diff --git a/tests/test_default_router.py b/tests/test_default_router.py
index 39fdd07d2..60fe9affe 100644
--- a/tests/test_default_router.py
+++ b/tests/test_default_router.py
@@ -1,3 +1,5 @@
+import textwrap
+
import pytest
import falcon
@@ -224,6 +226,20 @@ def test_root_path():
resource, __, __, __ = router.find('/')
assert resource.resource_id == 42
+ expected_src = textwrap.dedent("""
+ def find(path, return_values, patterns, params):
+ path_len = len(path)
+ if path_len > 0:
+ if path[0] == '':
+ if path_len == 1:
+ return return_values[0]
+ return None
+ return None
+ return None
+ """).strip()
+
+ assert router.finder_src == expected_src
+
@pytest.mark.parametrize('uri_template', [
'/{field}{field}',
@@ -309,8 +325,14 @@ def test_invalid_field_name(router, uri_template):
router.add_route(uri_template, {}, ResourceWithId(-1))
-def test_dump(router):
- print(router._src)
+def test_print_src(router):
+ """Diagnostic test that simply prints the router's find() source code.
+
+ Example:
+
+ $ tox -e py27_debug -- -k test_print_src -s
+ """
+ print('\n\n' + router.finder_src + '\n')
def test_override(router):
| [
{
"components": [
{
"doc": "",
"lines": [
55,
56
],
"name": "CompiledRouter.finder_src",
"signature": "def finder_src(self):",
"type": "function"
},
{
"doc": "Generates a coarse AST for the router.",
"lines": [... | [
"tests/test_default_router.py::test_root_path",
"tests/test_default_router.py::test_print_src"
] | [
"tests/test_custom_router.py::TestCustomRouter::test_can_pass_additional_params_to_add_route",
"tests/test_custom_router.py::TestCustomRouter::test_custom_router_add_route_should_be_used",
"tests/test_custom_router.py::TestCustomRouter::test_custom_router_find_should_be_used",
"tests/test_custom_router.py::Te... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
feat(CompiledRouter): Add an intermediate AST step to the compiler
Rather than compiling the routing tree directly to Python code, first generate an AST and then use it to produce the code. This provides several benefits, including:
* It makes the compilation process easier to reason about.
* It makes it easier to keep track of indentation and whitespace.
* It sets us up for being able to make transformations that we will need to do to support URI template filters, etc. in the future.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in falcon/routing/compiled.py]
(definition of CompiledRouter.finder_src:)
def finder_src(self):
(definition of CompiledRouter._generate_ast:)
def _generate_ast(self, nodes, parent, return_values, patterns, level=0, fast_return=True):
"""Generates a coarse AST for the router."""
(definition of _CxParent:)
class _CxParent(object):
(definition of _CxParent.__init__:)
def __init__(self):
(definition of _CxParent.append:)
def append(self, construct):
(definition of _CxParent.src:)
def src(self, indentation):
(definition of _CxParent._children_src:)
def _children_src(self, indentation):
(definition of _CxIfPathLength:)
class _CxIfPathLength(_CxParent):
(definition of _CxIfPathLength.__init__:)
def __init__(self, comparison, length):
(definition of _CxIfPathLength.src:)
def src(self, indentation):
(definition of _CxIfPathSegmentLiteral:)
class _CxIfPathSegmentLiteral(_CxParent):
(definition of _CxIfPathSegmentLiteral.__init__:)
def __init__(self, segment_idx, literal):
(definition of _CxIfPathSegmentLiteral.src:)
def src(self, indentation):
(definition of _CxIfPathSegmentPattern:)
class _CxIfPathSegmentPattern(_CxParent):
(definition of _CxIfPathSegmentPattern.__init__:)
def __init__(self, segment_idx, pattern_idx, pattern_text):
(definition of _CxIfPathSegmentPattern.src:)
def src(self, indentation):
(definition of _CxReturnNone:)
class _CxReturnNone(object):
(definition of _CxReturnNone.src:)
def src(self, indentation):
(definition of _CxReturnValue:)
class _CxReturnValue(object):
(definition of _CxReturnValue.__init__:)
def __init__(self, value_idx):
(definition of _CxReturnValue.src:)
def src(self, indentation):
(definition of _CxSetParam:)
class _CxSetParam(object):
(definition of _CxSetParam.__init__:)
def __init__(self, param_name, segment_idx):
(definition of _CxSetParam.src:)
def src(self, indentation):
[end of new definitions in falcon/routing/compiled.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 77d5e6394a88ead151c9469494749f95f06b24bf | ||
joke2k__faker-494 | 494 | joke2k/faker | null | d06d05f415e97b15f21683c991511c24e12c8304 | 2017-04-06T18:16:03Z | diff --git a/faker/providers/file/__init__.py b/faker/providers/file/__init__.py
index b19a3c563c..4f4c14cef0 100644
--- a/faker/providers/file/__init__.py
+++ b/faker/providers/file/__init__.py
@@ -201,3 +201,16 @@ def file_extension(cls, category=None):
"""
category = category if category else cls.random_element(list(cls.file_extensions.keys()))
return cls.random_element(cls.file_extensions[category])
+
+ @classmethod
+ def file_path(cls, depth=1, category=None, extension=None):
+ """
+ :param category: audio|image|office|text|video
+ :param extension: file extension
+ :param depth: depth of the file (depth >= 0)
+ """
+ file = Provider.file_name(category, extension)
+ path = "/{0}".format(file)
+ for d in range(0, depth):
+ path = "/{0}{1}".format(WordProvider.word(), path)
+ return path
| diff --git a/tests/providers/file.py b/tests/providers/file.py
new file mode 100644
index 0000000000..1a1617bac6
--- /dev/null
+++ b/tests/providers/file.py
@@ -0,0 +1,25 @@
+from __future__ import unicode_literals
+
+import unittest
+import re
+
+from faker import Factory
+from faker.providers.file import Provider as FileProvider
+
+
+class TestFile(unittest.TestCase):
+ """ Tests file """
+
+ def setUp(self):
+ self.factory = Factory.create()
+
+ def test_file_path(self):
+ for _ in range(100):
+ file_path = FileProvider.file_path()
+ self.assertTrue(re.search(r'\/\w+\/\w+\.\w+', file_path))
+ file_path = FileProvider.file_path(depth=3)
+ self.assertTrue(re.search(r'\/\w+\/\w+\/\w+\.\w+', file_path))
+ file_path = FileProvider.file_path(extension='pdf')
+ self.assertTrue(re.search(r'\/\w+\/\w+\.pdf', file_path))
+ file_path = FileProvider.file_path(category='image')
+ self.assertTrue(re.search(r'\/\w+\/\w+\.(bmp|gif|jpeg|jpg|png|tiff)', file_path))
| [
{
"components": [
{
"doc": ":param category: audio|image|office|text|video\n:param extension: file extension\n:param depth: depth of the file (depth >= 0)",
"lines": [
206,
216
],
"name": "Provider.file_path",
"signature": "def file_path(cls, dep... | [
"tests/providers/file.py::TestFile::test_file_path"
] | [] | This is a feature request which requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
[close #493] adding_file_path_provider
Issue #493
Adds a file path provider to the file provider.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in faker/providers/file/__init__.py]
(definition of Provider.file_path:)
def file_path(cls, depth=1, category=None, extension=None):
""":param category: audio|image|office|text|video
:param extension: file extension
:param depth: depth of the file (depth >= 0)"""
[end of new definitions in faker/providers/file/__init__.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | Here is the discussion in the issues of the pull request.
<issues>
Add a file path provider
In the file providers, it would be nice to have a file_path provider that returns a path like ```/lorem/ipsum/lorem.pdf```.
----------
--------------------
</issues> | 6edfdbf6ae90b0153309e3bf066aa3b2d16494a7 | |
conan-io__conan-1165 | 1,165 | conan-io/conan | null | 5b70b92f842dc1013ee7ab2577ea607a8fac3b20 | 2017-03-30T20:36:01Z | diff --git a/conans/client/build_requires.py b/conans/client/build_requires.py
new file mode 100644
index 00000000000..a63a359d434
--- /dev/null
+++ b/conans/client/build_requires.py
@@ -0,0 +1,79 @@
+from conans.client.remote_registry import RemoteRegistry
+from conans.client.printer import Printer
+from conans.client.installer import ConanInstaller
+from conans.client.require_resolver import RequireResolver
+from conans.client.deps_builder import DepsGraphBuilder
+import fnmatch
+import copy
+
+
+def _apply_build_requires(deps_graph, conanfile):
+ requires_nodes = deps_graph.direct_requires()
+ assert len(requires_nodes) == 1
+ node = requires_nodes[0]
+ conan_ref, build_require_conanfile = node
+
+ conanfile.deps_cpp_info.update(build_require_conanfile.cpp_info, conan_ref)
+ conanfile.deps_cpp_info.update(build_require_conanfile.deps_cpp_info, conan_ref)
+
+ conanfile.deps_env_info.update(build_require_conanfile.env_info, conan_ref)
+ conanfile.deps_env_info.update(build_require_conanfile.deps_env_info, conan_ref)
+
+
+class BuildRequires(object):
+ def __init__(self, loader, remote_proxy, output, client_cache, search_manager, build_requires,
+ current_path):
+ self._remote_proxy = remote_proxy
+ self._client_cache = client_cache
+ self._output = output
+ self._current_path = current_path
+ self._loader = loader
+ self._cached_graphs = {}
+ self._search_manager = search_manager
+ self._build_requires = build_requires
+
+ def install(self, reference, conanfile):
+ build_requires = []
+ str_ref = str(reference)
+ for pattern, req_list in self._build_requires.items():
+ if fnmatch.fnmatch(str_ref, pattern):
+ build_requires.extend(req_list)
+ if not build_requires:
+ return
+ self._output.info("%s: Build requires are: [%s]"
+ % (str(reference), ", ".join(str(r) for r in build_requires)))
+
+ for build_require in build_requires:
+ cached_graph = self._cached_graphs.get(build_require)
+ if not cached_graph:
+ cached_graph = self._install(build_require)
+ self._cached_graphs[reference] = cached_graph
+
+ _apply_build_requires(cached_graph, conanfile)
+
+ def _install(self, build_require):
+ self._output.info("Installing build_require: %s" % str(build_require))
+ conanfile = self._loader.load_virtual(build_require, None) # No need current path
+
+ # FIXME: Forced update=True, build_mode, Where to define it?
+ update = False
+ build_modes = ["missing"]
+
+ local_search = None if update else self._search_manager
+ resolver = RequireResolver(self._output, local_search, self._remote_proxy)
+ graph_builder = DepsGraphBuilder(self._remote_proxy, self._output, self._loader, resolver)
+ deps_graph = graph_builder.load(conanfile)
+
+ registry = RemoteRegistry(self._client_cache.registry, self._output)
+ Printer(self._output).print_graph(deps_graph, registry)
+
+ # Make sure we recursively do not propagate the "*" pattern
+ build_requires = copy.copy(self)
+ build_requires._build_requires = self._build_requires.copy()
+ build_requires._build_requires.pop("*", None)
+
+ installer = ConanInstaller(self._client_cache, self._output, self._remote_proxy,
+ build_requires)
+ installer.install(deps_graph, build_modes, self._current_path)
+ self._output.info("Installed build_require: %s" % str(build_require))
+ return deps_graph
diff --git a/conans/client/deps_builder.py b/conans/client/deps_builder.py
index 79854b6fd86..dfe1ed8368f 100644
--- a/conans/client/deps_builder.py
+++ b/conans/client/deps_builder.py
@@ -260,9 +260,6 @@ class DepsGraphBuilder(object):
def __init__(self, retriever, output, loader, resolver):
""" param retriever: something that implements retrieve_conanfile for installed conans
:param loader: helper ConanLoader to be able to load user space conanfile
- :param initial_deps_infos: Initial cpp_infos and env_infos (usually from build_requires.
- {dest_package_name: {build_dep_reference: (cpp_info, env_info)},
- None: {build_dep_reference: (cpp_info, env_info)}}
"""
self._retriever = retriever
self._output = output
diff --git a/conans/client/installer.py b/conans/client/installer.py
index 62def16b53b..dd94d58adb7 100644
--- a/conans/client/installer.py
+++ b/conans/client/installer.py
@@ -20,11 +20,7 @@
from conans.util.tracer import log_package_built
-def _init_package_info(deps_graph, paths):
- """ Made external so it is independent of installer and can called
- in testing too
- """
- # Assign export root folders
+def _init_package_info(deps_graph, paths, current_path):
for node in deps_graph.nodes:
conan_ref, conan_file = node
if conan_ref:
@@ -34,6 +30,9 @@ def _init_package_info(deps_graph, paths):
conan_file.package_folder = package_folder
conan_file.cpp_info = CppInfo(package_folder)
conan_file.env_info = EnvInfo(package_folder)
+ else:
+ conan_file.cpp_info = CppInfo(current_path)
+ conan_file.env_info = EnvInfo(current_path)
def build_id(conanfile):
@@ -108,18 +107,19 @@ class ConanInstaller(object):
""" main responsible of retrieving binary packages or building them from source
locally in case they are not found in remotes
"""
- def __init__(self, client_cache, user_io, remote_proxy):
+ def __init__(self, client_cache, output, remote_proxy, build_requires):
self._client_cache = client_cache
- self._out = user_io.out
+ self._out = output
self._remote_proxy = remote_proxy
+ self._build_requires = build_requires
- def install(self, deps_graph, build_modes):
+ def install(self, deps_graph, build_modes, current_path):
""" given a DepsGraph object, build necessary nodes or retrieve them
"""
self._deps_graph = deps_graph # necessary for _build_package
t1 = time.time()
- _init_package_info(deps_graph, self._client_cache)
+ _init_package_info(deps_graph, self._client_cache, current_path)
# order by levels and propagate exports as download imports
nodes_by_level = deps_graph.by_levels()
logger.debug("Install-Process buildinfo %s" % (time.time() - t1))
@@ -211,6 +211,8 @@ def _build(self, nodes_by_level, skip_private_nodes, build_mode):
elif build_mode.forced(conan_ref, conan_file):
output.warn('Forced build from source')
+ self._build_requires.install(conan_ref, conan_file)
+
t1 = time.time()
# Assign to node the propagated info
self._propagate_info(conan_ref, conan_file, flat)
diff --git a/conans/client/loader.py b/conans/client/loader.py
index 6818de42325..a4c46ddb715 100644
--- a/conans/client/loader.py
+++ b/conans/client/loader.py
@@ -13,24 +13,6 @@
from conans.client.loader_parse import ConanFileTextLoader, load_conanfile_class
-def _apply_initial_deps_infos_to_conanfile(conanfile, initial_deps_infos):
- if not initial_deps_infos:
- return
-
- def apply_infos(infos):
- for build_dep_reference, info in infos.items(): # List of tuples (cpp_info, env_info)
- cpp_info, env_info = info
- conanfile.deps_cpp_info.update(cpp_info, build_dep_reference)
- conanfile.deps_env_info.update(env_info, build_dep_reference)
-
- # If there are some specific package-level deps infos apply them
- if conanfile.name and conanfile.name in initial_deps_infos.keys():
- apply_infos(initial_deps_infos[conanfile.name])
-
- # And also apply the global ones
- apply_infos(initial_deps_infos[None])
-
-
def _load_info_file(current_path, conanfile, output, error=False):
info_file_path = os.path.join(current_path, BUILD_INFO)
try:
@@ -80,7 +62,6 @@ def __init__(self, runner, settings, profile):
self._package_settings = profile.package_settings_values
self._env_values = profile.env_values
- self.initial_deps_infos = None
def load_conan(self, conanfile_path, output, consumer=False, reference=None):
""" loads a ConanFile object from the given file
@@ -112,7 +93,6 @@ def load_conan(self, conanfile_path, output, consumer=False, reference=None):
else:
result.scope = self._scopes.package_scope(result.name)
- _apply_initial_deps_infos_to_conanfile(result, self.initial_deps_infos)
return result
except Exception as e: # re-raise with file name
raise ConanException("%s: %s" % (conanfile_path, str(e)))
@@ -155,19 +135,18 @@ def _parse_conan_txt(self, contents, path, output):
conanfile._env_values.update(self._env_values)
return conanfile
- def load_virtual(self, references, path):
+ def load_virtual(self, reference, current_path):
# If user don't specify namespace in options, assume that it is
# for the reference (keep compatibility)
- conanfile = ConanFile(None, self._runner, self._settings.copy(), path)
+ conanfile = ConanFile(None, self._runner, self._settings.copy(), current_path)
# Assign environment
conanfile._env_values.update(self._env_values)
- for ref in references:
- conanfile.requires.add(str(ref)) # Convert to string necessary
- # Allows options without package namespace in conan install commands:
- # conan install zlib/1.2.8@lasote/stable -o shared=True
- self._user_options.scope_options(ref.name) # FIXME: This only scope the 1st require
+ conanfile.requires.add(str(reference)) # Convert to string necessary
+ # Allows options without package namespace in conan install commands:
+ # conan install zlib/1.2.8@lasote/stable -o shared=True
+ self._user_options.scope_options(reference.name)
conanfile.options.initialize_upstream(self._user_options)
conanfile.generators = [] # remove the default txt generator
diff --git a/conans/client/manager.py b/conans/client/manager.py
index 52f0de3560d..58f27eb7183 100644
--- a/conans/client/manager.py
+++ b/conans/client/manager.py
@@ -1,6 +1,6 @@
import os
import time
-from collections import OrderedDict, Counter, defaultdict
+from collections import OrderedDict, Counter
from conans.client import packager
from conans.client.client_cache import ClientCache
@@ -24,14 +24,13 @@
from conans.client.uploader import ConanUploader
from conans.client.userio import UserIO
from conans.errors import NotFoundException, ConanException
-from conans.model.build_info import DepsCppInfo, CppInfo
-from conans.model.env_info import EnvInfo, DepsEnvInfo
from conans.model.ref import ConanFileReference, PackageReference
from conans.paths import CONANFILE, CONANINFO, CONANFILE_TXT
from conans.tools import environment_append
from conans.util.files import save, rmdir, normalize, mkdir
from conans.util.log import logger
from conans.client.loader_parse import load_conanfile_class
+from conans.client.build_requires import BuildRequires
class ConanManager(object):
@@ -101,7 +100,7 @@ def download(self, reference, package_ids, remote=None):
def _get_conanfile_object(self, loader, reference_or_path, conanfile_filename, current_path):
if isinstance(reference_or_path, ConanFileReference):
- conanfile = loader.load_virtual([reference_or_path], current_path)
+ conanfile = loader.load_virtual(reference_or_path, current_path)
else:
output = ScopedOutput("PROJECT", self._user_io.out)
try:
@@ -112,8 +111,6 @@ def _get_conanfile_object(self, loader, reference_or_path, conanfile_filename, c
except NotFoundException: # Load conanfile.txt
conan_path = os.path.join(reference_or_path, conanfile_filename or CONANFILE_TXT)
conanfile = loader.load_conan_txt(conan_path, output)
- conanfile.cpp_info = CppInfo(current_path)
- conanfile.env_info = EnvInfo(current_path)
return conanfile
@@ -151,7 +148,7 @@ def info(self, reference, current_path, profile, remote=None,
return
if build_modes is not None:
- installer = ConanInstaller(self._client_cache, self._user_io, remote_proxy)
+ installer = ConanInstaller(self._client_cache, self._user_io.out, remote_proxy, None)
nodes = installer.nodes_to_build(deps_graph, build_modes)
counter = Counter(ref.conan.name for ref, _ in nodes)
self._user_io.out.info(", ".join((str(ref)
@@ -197,60 +194,6 @@ def read_dates(deps_graph):
remote, read_dates(deps_graph),
self._client_cache, package_filter, show_paths)
- def _install_build_requires(self, loader, remote_proxy, requires, current_path):
- self._user_io.out.info("---------- Installing build_requires -------------")
- conanfile = loader.load_virtual(requires, current_path)
- conanfile.cpp_info = CppInfo(current_path)
- conanfile.env_info = EnvInfo(current_path)
-
- # FIXME: Forced update=True, build_mode, Where to define it?
- update = True
- build_modes = ["missing"]
-
- graph_builder = self._get_graph_builder(loader, update, remote_proxy)
- deps_graph = graph_builder.load(conanfile)
-
- registry = RemoteRegistry(self._client_cache.registry, self._user_io.out)
- Printer(self._user_io.out).print_graph(deps_graph, registry)
-
- installer = ConanInstaller(self._client_cache, self._user_io, remote_proxy)
- installer.install(deps_graph, build_modes)
- self._user_io.out.info("-------------------------------------------------\n")
- return deps_graph
-
- def _install_build_requires_and_get_infos(self, profile, loader, remote_proxy, current_path):
- # Build the graph and install the build_require dependency tree
- deps_graph = self._install_build_requires(loader, remote_proxy, profile.all_requires, current_path)
- # Build a dict with
- # {"zlib": {"cmake/2.8@lasote/stable": (deps_cpp_info, deps_env_info),
- # "other_tool/3.2@lasote/stable": (deps_cpp_info, deps_env_info)}
- # "None": {"cmake/3.1@lasote/stable": (deps_cpp_info, deps_env_info)}
- # taking into account the package level requires
- refs_objects = {}
- requires_nodes = deps_graph.direct_requires()
- # We have all the cpp_info and env_info in the virtual conanfile, but we need those info objects at
- # requires level to filter later (profiles allow to filter targeting a library in the real deps tree)
- for node in requires_nodes:
- deps_cpp_info = DepsCppInfo()
- deps_cpp_info.public_deps = [] # FIXME: spaguetti
- deps_cpp_info.update(node.conanfile.cpp_info, node.conan_ref)
- deps_cpp_info.update(node.conanfile.deps_cpp_info, node.conan_ref)
-
- deps_env_info = DepsEnvInfo()
- deps_env_info.update(node.conanfile.env_info, node.conan_ref)
- deps_env_info.update(node.conanfile.deps_env_info, node.conan_ref)
-
- refs_objects[node.conan_ref] = (deps_cpp_info, deps_env_info)
-
- build_dep_infos = defaultdict(dict)
- for dest_package, references in profile.package_requires.items(): # Package level ones
- for ref in references:
- build_dep_infos[dest_package][ref] = refs_objects[ref]
- for global_reference in profile.requires:
- build_dep_infos[None][global_reference] = refs_objects[global_reference]
-
- return build_dep_infos
-
def install(self, reference, current_path, profile, remote=None,
build_modes=None, filename=None, update=False,
manifest_folder=None, manifest_verify=False, manifest_interactive=False,
@@ -277,17 +220,6 @@ def install(self, reference, current_path, profile, remote=None,
remote_proxy = ConanProxy(self._client_cache, self._user_io, self._remote_manager, remote,
update=update, check_updates=False, manifest_manager=manifest_manager)
loader = ConanFileLoader(self._runner, self._client_cache.settings, profile)
-
- # Install the build_requires and get the info objets
- build_dep_infos = None
- if profile.all_requires:
- # {"zlib": {"cmake/2.8@lasote/stable": (deps_cpp_info, deps_env_info),
- # "other_tool/3.2@lasote/stable": (deps_cpp_info, deps_env_info)}
- # "None": {"cmake/3.1@lasote/stable": (deps_cpp_info, deps_env_info)}
- build_dep_infos = self._install_build_requires_and_get_infos(profile, loader, remote_proxy, current_path)
- loader.initial_deps_infos = build_dep_infos
-
- # Build the graph for the real dependency tree
conanfile = self._get_conanfile_object(loader, reference, filename, current_path)
graph_builder = self._get_graph_builder(loader, update, remote_proxy)
deps_graph = graph_builder.load(conanfile)
@@ -306,9 +238,12 @@ def install(self, reference, current_path, profile, remote=None,
except ConanException: # Setting os doesn't exist
pass
- installer = ConanInstaller(self._client_cache, self._user_io, remote_proxy)
+ build_requires = BuildRequires(loader, remote_proxy, self._user_io.out, self._client_cache,
+ self._search_manager, profile.build_requires, current_path)
+ installer = ConanInstaller(self._client_cache, self._user_io.out, remote_proxy,
+ build_requires)
- installer.install(deps_graph, build_modes)
+ installer.install(deps_graph, build_modes, current_path)
prefix = "PROJECT" if not isinstance(reference, ConanFileReference) else str(reference)
output = ScopedOutput(prefix, self._user_io.out)
diff --git a/conans/model/profile.py b/conans/model/profile.py
index 64c1e562609..bd0de1dffd3 100644
--- a/conans/model/profile.py
+++ b/conans/model/profile.py
@@ -25,16 +25,7 @@ def __init__(self):
self.env_values = EnvValues()
self.scopes = Scopes()
self.options = OptionsValues()
- self.requires = []
- self.package_requires = defaultdict(list)
-
- @property
- def all_requires(self):
- ret = []
- for references in self.package_requires.values():
- ret.extend(references)
- ret.extend(self.requires)
- return set(ret)
+ self.build_requires = OrderedDict() # conan_ref Pattern: list of conan_ref
@property
def settings_values(self):
@@ -131,17 +122,9 @@ def get_package_name_value(item):
if doc.build_requires:
# FIXME CHECKS OF DUPLICATED?
for req in doc.build_requires.splitlines():
- if ":" in req:
- package_name, req = req.split(":", 1)
- try:
- ref = ConanFileReference.loads(req.strip())
- obj.package_requires[package_name.strip()].append(ref)
- except:
- raise ConanException("Invalid requirement reference '%s' specified in profile" % req)
-
- else:
- ref = ConanFileReference.loads(req.strip())
- obj.requires.append(ref)
+ pattern, req_list = req.split(":", 1)
+ req_list = [ConanFileReference.loads(r.strip()) for r in req_list.split(",")]
+ obj.build_requires[pattern] = req_list
if doc.scopes:
obj.scopes = Scopes.from_list(doc.scopes.splitlines())
@@ -159,12 +142,8 @@ def get_package_name_value(item):
def dumps(self):
result = ["[build_requires]"]
- for ref in sorted(self.requires):
- result.append(ref)
- for package in sorted(self.package_requires.keys()):
- refs = sorted(self.package_requires[package])
- for ref in refs:
- result.append("%s:%s" % (package, ref))
+ for pattern, req_list in self.build_requires.items():
+ result.append("%s: %s" % (pattern, ", ".join(str(r) for r in req_list)))
result.append("[settings]")
for name, value in self.settings.items():
result.append("%s=%s" % (name, value))
| diff --git a/conans/test/command/install_test.py b/conans/test/command/install_test.py
index 0ccc2a937a2..cef9396fd79 100644
--- a/conans/test/command/install_test.py
+++ b/conans/test/command/install_test.py
@@ -52,6 +52,18 @@ def install_combined_test(self):
self.assertIn("Hello1/0.1@lasote/stable: WARN: Forced build from source",
self.client.user_io.out)
+ def install_transitive_cache_test(self):
+ self._create("Hello0", "0.1")
+ self._create("Hello1", "0.1", ["Hello0/0.1@lasote/stable"])
+ self._create("Hello2", "0.1", ["Hello1/0.1@lasote/stable"])
+ self.client.run("install Hello2/0.1@lasote/stable %s --build=missing" % (self.settings))
+ self.assertIn("Hello0/0.1@lasote/stable: Generating the package",
+ self.client.user_io.out)
+ self.assertIn("Hello1/0.1@lasote/stable: Generating the package",
+ self.client.user_io.out)
+ self.assertIn("Hello2/0.1@lasote/stable: Generating the package",
+ self.client.user_io.out)
+
def partials_test(self):
self._create("Hello0", "0.1")
self._create("Hello1", "0.1", ["Hello0/0.1@lasote/stable"])
diff --git a/conans/test/integration/profile_requires_test.py b/conans/test/integration/profile_requires_test.py
index 04309abd4df..34c4f35579d 100644
--- a/conans/test/integration/profile_requires_test.py
+++ b/conans/test/integration/profile_requires_test.py
@@ -156,9 +156,10 @@ def build(self):
assert(os.environ["ENV_VAR"] == "ENV_VALUE_FROM_BUILD_REQUIRE")
assert(os.environ["ENV_VAR_REQ2"] == "ENV_VALUE_FROM_BUILD_REQUIRE2")
- tmp = "ENV_VALUE_MULTI_FROM_BUILD_REQUIRE2" + os.pathsep + "ENV_VALUE_MULTI_FROM_BUILD_REQUIRE" + os.pathsep + "ENV_VALUE_MULTI_FROM_BUILD_REQUIRE_PARENT"
+ tmp = "ENV_VALUE_MULTI_FROM_BUILD_REQUIRE" + os.pathsep + "ENV_VALUE_MULTI_FROM_BUILD_REQUIRE_PARENT" + os.pathsep + "ENV_VALUE_MULTI_FROM_BUILD_REQUIRE2"
+
assert(os.environ.get("ENV_VAR_MULTI", None) == tmp)
- assert(self.deps_cpp_info.cflags == ["A_C_FLAG_FROM_BUILD_REQUIRE_PARENT", "A_C_FLAG_FROM_BUILD_REQUIRE", "A_C_FLAG_FROM_BUILD_REQUIRE2"])
+ assert(self.deps_cpp_info.cflags == ["A_C_FLAG_FROM_BUILD_REQUIRE2", "A_C_FLAG_FROM_BUILD_REQUIRE_PARENT", "A_C_FLAG_FROM_BUILD_REQUIRE"])
assert(os.environ["FOO_VAR"] == "1")
@@ -167,8 +168,8 @@ def build(self):
profile = """
[build_requires]
-BuildRequire/0.1@lasote/stable
-MyLib2:BuildRequire2/0.1@lasote/stable
+*: BuildRequire/0.1@lasote/stable
+MyLib2/*: BuildRequire2/0.1@lasote/stable
[settings]
os=Windows
diff --git a/conans/test/model/profile_test.py b/conans/test/model/profile_test.py
index 38bfe6d6929..0bf60502ef3 100644
--- a/conans/test/model/profile_test.py
+++ b/conans/test/model/profile_test.py
@@ -31,9 +31,9 @@ def profile_test(self):
profile.scopes["p1"]["conaning"] = "1"
profile.scopes["p2"]["testing"] = "2"
- profile.requires.append("android_toolchain/1.2.8@lasote/testing")
- profile.package_requires["zlib"].append("cmake/1.0.2@lasote/stable")
- profile.package_requires["zlib"].append("autotools/1.0.3@lasote/stable")
+ profile.build_requires["*"] = ["android_toolchain/1.2.8@lasote/testing"]
+ profile.build_requires["zlib/*"] = ["cmake/1.0.2@lasote/stable",
+ "autotools/1.0.3@lasote/stable"]
dump = profile.dumps()
new_profile = Profile.loads(dump)
@@ -48,14 +48,11 @@ def profile_test(self):
self.assertEquals(dict(new_profile.scopes)["p1"]["conaning"], '1')
self.assertEquals(dict(new_profile.scopes)["p2"]["testing"], '2')
- self.assertEquals(new_profile.package_requires, {"zlib": [
- ConanFileReference.loads("autotools/1.0.3@lasote/stable"),
- ConanFileReference.loads("cmake/1.0.2@lasote/stable")]})
-
- self.assertEquals(new_profile.all_requires,
- set([ConanFileReference.loads("cmake/1.0.2@lasote/stable"),
- ConanFileReference.loads("android_toolchain/1.2.8@lasote/testing"),
- ConanFileReference.loads("autotools/1.0.3@lasote/stable")]))
+ self.assertEquals(new_profile.build_requires["zlib/*"],
+ [ConanFileReference.loads("cmake/1.0.2@lasote/stable"),
+ ConanFileReference.loads("autotools/1.0.3@lasote/stable")])
+ self.assertEquals(new_profile.build_requires["*"],
+ [ConanFileReference.loads("android_toolchain/1.2.8@lasote/testing")])
def profile_settings_update_test(self):
prof = '''[settings]
@@ -147,16 +144,19 @@ def profile_dump_order_test(self):
profile.settings["arch"] = "x86_64"
profile.settings["compiler"] = "Visual Studio"
profile.settings["compiler.version"] = "12"
- profile.requires.append("zlib/1.2.8@lasote/testing")
- profile.requires.append("aaaa/1.2.3@lasote/testing")
- profile.package_requires["zlib"].append("zlib/1.2.11@lasote/testing")
- profile.package_requires["zlib"].append("aaa/1.2.11@lasote/testing")
-
- self.assertEqual('[build_requires]\naaaa/1.2.3@lasote/testing\nzlib/1.2.8@lasote/testing\nzlib:'
- 'aaa/1.2.11@lasote/testing\nzlib:zlib/1.2.11@lasote/testing'
- '\n[settings]\narch=x86_64\ncompiler=Visual Studio\n'
- 'compiler.version=12\nzlib:compiler=gcc\n[options]\n[scopes]\n[env]\n',
- profile.dumps())
+ profile.build_requires["*"] = ["zlib/1.2.8@lasote/testing"]
+ profile.build_requires["zlib/*"] = ["aaaa/1.2.3@lasote/testing", "bb/1.2@lasote/testing"]
+ self.assertEqual("""[build_requires]
+*: zlib/1.2.8@lasote/testing
+zlib/*: aaaa/1.2.3@lasote/testing, bb/1.2@lasote/testing
+[settings]
+arch=x86_64
+compiler=Visual Studio
+compiler.version=12
+zlib:compiler=gcc
+[options]
+[scopes]
+[env]""".splitlines(), profile.dumps().splitlines())
def profile_loads_win_test(self):
prof = '''[env]
| [
{
"components": [
{
"doc": "",
"lines": [
10,
20
],
"name": "_apply_build_requires",
"signature": "def _apply_build_requires(deps_graph, conanfile):",
"type": "function"
},
{
"doc": "",
"lines": [
23,... | [
"conans/test/integration/profile_requires_test.py::ProfileRequiresTest::test_profile_requires"
] | [] | This is a feature request which requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
Feature/new build requires
- build_requires only applied when necessary to build a package, otherwise not installed
- New pattern based scheme in profile for build_requires. Using lists.
- Simplifications over the initialization of CppInfo, EnvInfo
- Test for bug in ``conan install <reference>`` with dependencies (and bug fix)
This would be the "core" change; next we probably want to analyze how to define options for build_requires (as for the cmake_installer package, in which the cmake version is defined by options), and how to update build_requires.
More testing is desirable, especially for the local commands ``build``
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in conans/client/build_requires.py]
(definition of _apply_build_requires:)
def _apply_build_requires(deps_graph, conanfile):
(definition of BuildRequires:)
class BuildRequires(object):
(definition of BuildRequires.__init__:)
def __init__(self, loader, remote_proxy, output, client_cache, search_manager, build_requires, current_path):
(definition of BuildRequires.install:)
def install(self, reference, conanfile):
(definition of BuildRequires._install:)
def _install(self, build_require):
[end of new definitions in conans/client/build_requires.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 4a5b19a75db9225316c8cb022a2dfb9705a2af34 | ||
sympy__sympy-12455 | 12,455 | sympy/sympy | 1.0 | 8800fd2ab1553cd768ad743c44b3ed00c111c368 | 2017-03-29T22:36:07Z | diff --git a/sympy/combinatorics/fp_groups.py b/sympy/combinatorics/fp_groups.py
index 7fd815fc190c..421334089bcf 100644
--- a/sympy/combinatorics/fp_groups.py
+++ b/sympy/combinatorics/fp_groups.py
@@ -842,8 +842,6 @@ def switch(self, beta, gamma):
"""
A = self.A
A_dict = self.A_dict
- A_dict_inv = self.A_dict_inv
- X = self.fp_group.generators
table = self.table
for x in A:
z = table[gamma][A_dict[x]]
diff --git a/sympy/combinatorics/perm_groups.py b/sympy/combinatorics/perm_groups.py
index 75704dcc1e90..8ad90f1efa67 100644
--- a/sympy/combinatorics/perm_groups.py
+++ b/sympy/combinatorics/perm_groups.py
@@ -642,6 +642,8 @@ def basic_stabilizers(self):
self.schreier_sims()
strong_gens = self._strong_gens
base = self._base
+ if not base: # e.g. if self is trivial
+ return []
strong_gens_distr = _distribute_gens_by_base(base, strong_gens)
basic_stabilizers = []
for gens in strong_gens_distr:
@@ -678,6 +680,137 @@ def basic_transversals(self):
self.schreier_sims()
return self._transversals
+ def coset_transversal(self, H):
+ """Return a transversal of the right cosets of self by its subgroup H
+ using the second method described in [1], Subsection 4.6.7
+
+ """
+
+ if not H.is_subgroup(self):
+ raise ValueError("The argument must be a subgroup")
+
+ if H.order() == 1:
+ return self._elements
+
+ self._schreier_sims(base=H.base) # make G.base an extension of H.base
+
+ base = self.base
+ base_ordering = _base_ordering(base, self.degree)
+ identity = Permutation(self.degree - 1)
+
+ transversals = self.basic_transversals[:]
+ # transversals is a list of dictionaries. Get rid of the keys
+ # so that it is a list of lists and sort each list in
+ # the increasing order of base[l]^x
+ for l, t in enumerate(transversals):
+ transversals[l] = sorted(t.values(),
+ key = lambda x: base_ordering[base[l]^x])
+
+ orbits = H.basic_orbits
+ h_stabs = H.basic_stabilizers
+ g_stabs = self.basic_stabilizers
+
+ indices = [x.order()//y.order() for x, y in zip(g_stabs, h_stabs)]
+
+ # T^(l) should be a right transversal of H^(l) in G^(l) for
+ # 1<=l<=len(base). While H^(l) is the trivial group, T^(l)
+ # contains all the elements of G^(l) so we might just as well
+ # start with l = len(h_stabs)-1
+ T = g_stabs[len(h_stabs)]._elements
+ t_len = len(T)
+ l = len(h_stabs)-1
+ while l > -1:
+ T_next = []
+ for u in transversals[l]:
+ if u == identity:
+ continue
+ b = base_ordering[base[l]^u]
+ for t in T:
+ p = t*u
+ if all([base_ordering[h^p] >= b for h in orbits[l]]):
+ T_next.append(p)
+ if t_len + len(T_next) == indices[l]:
+ break
+ if t_len + len(T_next) == indices[l]:
+ break
+ T += T_next
+ t_len += len(T_next)
+ l -= 1
+ return T
+
+ def _coset_representative(self, g, H):
+ """Return the representative of Hg from the transversal that
+ would be computed by `self.coset_transversal(H)`.
+
+ """
+ if H.order() == 1:
+ return g
+ # The base of self must be an extension of H.base.
+ if not(self.base[:len(H.base)] == H.base):
+ self._schreier_sims(base=H.base)
+ orbits = H.basic_orbits[:]
+ h_transversals = [list(_.values()) for _ in H.basic_transversals]
+ transversals = [list(_.values()) for _ in self.basic_transversals]
+ base = self.base
+ base_ordering = _base_ordering(base, self.degree)
+ def step(l, x):
+ gamma = sorted(orbits[l], key = lambda y: base_ordering[y^x])[0]
+ i = [base[l]^h for h in h_transversals[l]].index(gamma)
+ x = h_transversals[l][i]*x
+ if l < len(orbits)-1:
+ for u in transversals[l]:
+ if base[l]^u == base[l]^x:
+ break
+ x = step(l+1, x*u**-1)*u
+ return x
+ return step(0, g)
+
+ def coset_table(self, H):
+ """Return the standardised (right) coset table of self in H as
+ a list of lists.
+ """
+ # Maybe this should be made to return an instance of CosetTable
+ # from fp_groups.py but the class would need to be changed first
+ # to be compatible with PermutationGroups
+
+ from itertools import chain, product
+ if not H.is_subgroup(self):
+ raise ValueError("The argument must be a subgroup")
+ T = self.coset_transversal(H)
+ n = len(T)
+
+ A = list(chain.from_iterable((gen, gen**-1)
+ for gen in self.generators))
+
+ table = []
+ for i in range(n):
+ row = [self._coset_representative(T[i]*x, H) for x in A]
+ row = [T.index(r) for r in row]
+ table.append(row)
+
+
+ # standardize (this is the same as the algorithm used in fp_groups)
+ # If CosetTable is made compatible with PermutationGroups, this
+ # should be replaced by table.standardize()
+ A = range(len(A))
+ gamma = 1
+ for alpha, a in product(range(n), A):
+ beta = table[alpha][a]
+ if beta >= gamma:
+ if beta > gamma:
+ for x in A:
+ z = table[gamma][x]
+ table[gamma][x] = table[beta][x]
+ table[beta][x] = z
+ for i in range(n):
+ if table[i][x] == beta:
+ table[i][x] = gamma
+ elif table[i][x] == gamma:
+ table[i][x] = beta
+ gamma += 1
+ if gamma == n-1:
+ return table
+
def center(self):
r"""
Return the center of a permutation group.
@@ -1061,7 +1194,7 @@ def degree(self):
@property
def elements(self):
- """Returns all the elements of the permutation group in a list
+ """Returns all the elements of the permutation group as a set
Examples
========
@@ -1072,7 +1205,22 @@ def elements(self):
{(3), (2 3), (3)(1 2), (1 2 3), (1 3 2), (1 3)}
"""
- return set(list(islice(self.generate(), None)))
+ return set(self._elements)
+
+ @property
+ def _elements(self):
+ """Returns all the elements of the permutation group as a list
+
+ Examples
+ ========
+
+ >>> from sympy.combinatorics import Permutation, PermutationGroup
+ >>> p = PermutationGroup(Permutation(1, 3), Permutation(1, 2))
+ >>> p._elements
+ [(3), (3)(1 2), (1 3), (2 3), (1 2 3), (1 3 2)]
+
+ """
+ return list(islice(self.generate(), None))
def derived_series(self):
r"""Return the derived series for the group.
@@ -2467,7 +2615,11 @@ def schreier_sims(self):
"""
if self._transversals:
return
- base, strong_gens = self.schreier_sims_incremental()
+ self._schreier_sims()
+ return
+
+ def _schreier_sims(self, base=None):
+ base, strong_gens = self.schreier_sims_incremental(base=base)
self._base = base
self._strong_gens = strong_gens
if not base:
| diff --git a/sympy/combinatorics/tests/test_perm_groups.py b/sympy/combinatorics/tests/test_perm_groups.py
index 45f41a8cb822..182e062a668b 100644
--- a/sympy/combinatorics/tests/test_perm_groups.py
+++ b/sympy/combinatorics/tests/test_perm_groups.py
@@ -726,3 +726,24 @@ def test_is_group():
def test_PermutationGroup():
assert PermutationGroup() == PermutationGroup(Permutation())
+
+def test_coset_transvesal():
+ G = AlternatingGroup(5)
+ H = PermutationGroup(Permutation(0,1,2),Permutation(1,2)(3,4))
+ assert G.coset_transversal(H) == \
+ [Permutation(4), Permutation(2, 3, 4), Permutation(2, 4, 3),
+ Permutation(1, 2, 4), Permutation(4)(1, 2, 3), Permutation(1, 3)(2, 4),
+ Permutation(0, 1, 2, 3, 4), Permutation(0, 1, 2, 4, 3),
+ Permutation(0, 1, 3, 2, 4), Permutation(0, 2, 4, 1, 3)]
+
+def test_coset_table():
+ G = PermutationGroup(Permutation(0,1,2,3), Permutation(0,1,2),
+ Permutation(0,4,2,7), Permutation(5,6), Permutation(0,7));
+ H = PermutationGroup(Permutation(0,1,2,3), Permutation(0,7))
+ assert G.coset_table(H) == \
+ [[0, 0, 0, 0, 1, 2, 3, 3, 0, 0], [4, 5, 2, 5, 6, 0, 7, 7, 1, 1],
+ [5, 4, 5, 1, 0, 6, 8, 8, 6, 6], [3, 3, 3, 3, 7, 8, 0, 0, 3, 3],
+ [2, 1, 4, 4, 4, 4, 9, 9, 4, 4], [1, 2, 1, 2, 5, 5, 10, 10, 5, 5],
+ [6, 6, 6, 6, 2, 1, 11, 11, 2, 2], [9, 10, 8, 10, 11, 3, 1, 1, 7, 7],
+ [10, 9, 10, 7, 3, 11, 2, 2, 11, 11], [8, 7, 9, 9, 9, 9, 4, 4, 9, 9],
+ [7, 8, 7, 8, 10, 10, 5, 5, 10, 10], [11, 11, 11, 11, 8, 7, 6, 6, 8, 8]]
| [
{
"components": [
{
"doc": "Return a transversal of the right cosets of self by its subgroup H\nusing the second method described in [1], Subsection 4.6.7",
"lines": [
683,
739
],
"name": "PermutationGroup.coset_transversal",
"signature": "def co... | [
"test_coset_transvesal"
] | [
"test_has",
"test_generate",
"test_order",
"test_equality",
"test_stabilizer",
"test_center",
"test_centralizer",
"test_coset_rank",
"test_coset_factor",
"test_orbits",
"test_is_normal",
"test_eq",
"test_derived_subgroup",
"test_is_solvable",
"test_rubik1",
"test_direct_product",
"te... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
permutation groups: implemented coset transversal and coset table
This adds a method `G.coset_transversal(H)` that returns a transversal of the
right cosets of G by its subgroup H, and `G.coset_table(H)` which returns
a list of lists representing the standardised coset table of `G` by `H`. It makes use
of another new (internal) method, `_coset_representative` which finds the coset
representative of a given element in the transversal returned by
`coset_transversal`.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sympy/combinatorics/perm_groups.py]
(definition of PermutationGroup.coset_transversal:)
def coset_transversal(self, H):
"""Return a transversal of the right cosets of self by its subgroup H
using the second method described in [1], Subsection 4.6.7"""
(definition of PermutationGroup._coset_representative:)
def _coset_representative(self, g, H):
"""Return the representative of Hg from the transversal that
would be computed by `self.coset_transversal(H)`."""
(definition of PermutationGroup._coset_representative.step:)
def step(l, x):
(definition of PermutationGroup.coset_table:)
def coset_table(self, H):
"""Return the standardised (right) coset table of self in H as
a list of lists."""
(definition of PermutationGroup._elements:)
def _elements(self):
"""Returns all the elements of the permutation group as a list
Examples
========
>>> from sympy.combinatorics import Permutation, PermutationGroup
>>> p = PermutationGroup(Permutation(1, 3), Permutation(1, 2))
>>> p._elements
[(3), (3)(1 2), (1 3), (2 3), (1 2 3), (1 3 2)]"""
(definition of PermutationGroup._schreier_sims:)
def _schreier_sims(self, base=None):
[end of new definitions in sympy/combinatorics/perm_groups.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 820363f5b17cbe5809ef0911ea539e135c179c62 | ||
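The `coset_transversal` definition above relies on Schreier–Sims data for efficiency. As a self-contained cross-check of what a right-coset transversal is, here is a brute-force sketch over permutations represented as image tuples (naive enumeration, only feasible for tiny groups; all names are illustrative, not sympy's API):

```python
def compose(p, q):
    # (p * q)(i) = p[q[i]]: apply q first, then p
    return tuple(p[q[i]] for i in range(len(p)))

def generate_group(gens):
    # closure of the generators under composition, starting from identity
    identity = tuple(range(len(gens[0])))
    elements, frontier = {identity}, [identity]
    while frontier:
        g = frontier.pop()
        for s in gens:
            h = compose(s, g)
            if h not in elements:
                elements.add(h)
                frontier.append(h)
    return elements

def right_coset_transversal(G, H):
    # one representative per right coset H*g, by brute force
    reps, seen = [], set()
    for g in sorted(G):
        coset = frozenset(compose(h, g) for h in H)
        if coset not in seen:
            seen.add(coset)
            reps.append(g)
    return reps

S3 = generate_group([(1, 0, 2), (1, 2, 0)])   # S3 = <(0 1), (0 1 2)>
H = generate_group([(1, 0, 2)])               # H = <(0 1)>
reps = right_coset_transversal(S3, H)
print(len(S3), len(H), len(reps))             # 6 2 3
```

The representatives partition the group: the union of the cosets `H*g` over the returned `reps` recovers all of `S3`, which is the invariant the real implementation must also satisfy.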
prometheus__client_python-152 | 152 | prometheus/client_python | null | c2e4bdae72d4b48fa646a6a9438137986b9b06f3 | 2017-03-28T12:55:36Z | diff --git a/README.md b/README.md
index 430dffae..443dc7c3 100644
--- a/README.md
+++ b/README.md
@@ -206,6 +206,13 @@ other processes, for example:
ProcessCollector(namespace='mydaemon', pid=lambda: open('/var/run/daemon.pid').read())
```
+### Platform Collector
+
+The client also automatically exports some metadata about Python. If using Jython,
+metadata about the JVM in use is also included. This information is available as
+labels on the `python_info` metric. The value of the metric is 1, since it is the
+labels that carry information.
+
## Exporting
There are several options for exporting metrics.
diff --git a/prometheus_client/__init__.py b/prometheus_client/__init__.py
index 358146a4..b270a868 100644
--- a/prometheus_client/__init__.py
+++ b/prometheus_client/__init__.py
@@ -3,6 +3,7 @@
from . import core
from . import exposition
from . import process_collector
+from . import platform_collector
__all__ = ['Counter', 'Gauge', 'Summary', 'Histogram']
# http://stackoverflow.com/questions/19913653/no-unicode-in-all-for-a-packages-init
@@ -31,6 +32,9 @@
ProcessCollector = process_collector.ProcessCollector
PROCESS_COLLECTOR = process_collector.PROCESS_COLLECTOR
+PlatformCollector = platform_collector.PlatformCollector
+PLATFORM_COLLECTOR = platform_collector.PLATFORM_COLLECTOR
+
if __name__ == '__main__':
c = Counter('cc', 'A counter')
diff --git a/prometheus_client/platform_collector.py b/prometheus_client/platform_collector.py
new file mode 100644
index 00000000..f70159d6
--- /dev/null
+++ b/prometheus_client/platform_collector.py
@@ -0,0 +1,58 @@
+#!/usr/bin/env python
+# -*- coding: utf-8
+from __future__ import unicode_literals
+
+import platform as pf
+
+from . import core
+
+
+class PlatformCollector(object):
+ """Collector for python platform information"""
+
+ def __init__(self, registry=core.REGISTRY, platform=None):
+ self._platform = pf if platform is None else platform
+ info = self._info()
+ system = self._platform.system()
+ if system == "Java":
+ info.update(self._java())
+ self._metrics = [
+ self._add_metric("python_info", "Python platform information", info)
+ ]
+ if registry:
+ registry.register(self)
+
+ def collect(self):
+ return self._metrics
+
+ @staticmethod
+ def _add_metric(name, documentation, data):
+ labels = data.keys()
+ values = [data[k] for k in labels]
+ g = core.GaugeMetricFamily(name, documentation, labels=labels)
+ g.add_metric(values, 1)
+ return g
+
+ def _info(self):
+ major, minor, patchlevel = self._platform.python_version_tuple()
+ return {
+ "version": self._platform.python_version(),
+ "implementation": self._platform.python_implementation(),
+ "major": major,
+ "minor": minor,
+ "patchlevel": patchlevel
+ }
+
+ def _java(self):
+ java_version, _, vminfo, osinfo = self._platform.java_ver()
+ vm_name, vm_release, vm_vendor = vminfo
+ return {
+ "jvm_version": java_version,
+ "jvm_release": vm_release,
+ "jvm_vendor": vm_vendor,
+ "jvm_name": vm_name
+ }
+
+
+PLATFORM_COLLECTOR = PlatformCollector()
+"""PlatformCollector in default Registry REGISTRY"""
| diff --git a/tests/test_platform_collector.py b/tests/test_platform_collector.py
new file mode 100644
index 00000000..9eda5bb5
--- /dev/null
+++ b/tests/test_platform_collector.py
@@ -0,0 +1,69 @@
+from __future__ import unicode_literals
+
+import unittest
+
+from prometheus_client import CollectorRegistry, PlatformCollector
+
+
+class TestPlatformCollector(unittest.TestCase):
+ def setUp(self):
+ self.registry = CollectorRegistry()
+ self.platform = _MockPlatform()
+
+ def test_python_info(self):
+ PlatformCollector(registry=self.registry, platform=self.platform)
+ self.assertLabels("python_info", {
+ "version": "python_version",
+ "implementation": "python_implementation",
+ "major": "pvt_major",
+ "minor": "pvt_minor",
+ "patchlevel": "pvt_patchlevel"
+ })
+
+ def test_system_info_java(self):
+ self.platform._system = "Java"
+ PlatformCollector(registry=self.registry, platform=self.platform)
+ self.assertLabels("python_info", {
+ "version": "python_version",
+ "implementation": "python_implementation",
+ "major": "pvt_major",
+ "minor": "pvt_minor",
+ "patchlevel": "pvt_patchlevel",
+ "jvm_version": "jv_release",
+ "jvm_release": "vm_release",
+ "jvm_vendor": "vm_vendor",
+ "jvm_name": "vm_name"
+ })
+
+ def assertLabels(self, name, labels):
+ for metric in self.registry.collect():
+ for n, l, value in metric.samples:
+ if n == name:
+ assert l == labels
+ return
+ assert False
+
+
+class _MockPlatform(object):
+ def __init__(self):
+ self._system = "system"
+
+ def python_version_tuple(self):
+ return "pvt_major", "pvt_minor", "pvt_patchlevel"
+
+ def python_version(self):
+ return "python_version"
+
+ def python_implementation(self):
+ return "python_implementation"
+
+ def system(self):
+ return self._system
+
+ def java_ver(self):
+ return (
+ "jv_release",
+ "jv_vendor",
+ ("vm_name", "vm_release", "vm_vendor"),
+ ("os_name", "os_version", "os_arch")
+ )
| diff --git a/README.md b/README.md
index 430dffae..443dc7c3 100644
--- a/README.md
+++ b/README.md
@@ -206,6 +206,13 @@ other processes, for example:
ProcessCollector(namespace='mydaemon', pid=lambda: open('/var/run/daemon.pid').read())
```
+### Platform Collector
+
+The client also automatically exports some metadata about Python. If using Jython,
+metadata about the JVM in use is also included. This information is available as
+labels on the `python_info` metric. The value of the metric is 1, since it is the
+labels that carry information.
+
## Exporting
There are several options for exporting metrics.
| [
{
"components": [
{
"doc": "Collector for python platform information",
"lines": [
10,
53
],
"name": "PlatformCollector",
"signature": "class PlatformCollector(object):",
"type": "class"
},
{
"doc": "",
"lines"... | [
"tests/test_platform_collector.py::TestPlatformCollector::test_python_info",
"tests/test_platform_collector.py::TestPlatformCollector::test_system_info_java"
] | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Add PlatformCollector to collect information about the platform
In the same vein as https://github.com/prometheus/client_java/commit/bdc6f2507a81029c1aec6cd8b5276661bb6c7ee1 and https://github.com/siimon/prom-client/commit/ee561e28e14a62b0ef069f525174ee9a3b66d4d7, this PR adds Python version and some other metadata as labels on a small set of metrics.
This has turned out to be very useful for creating common dashboards across implementation languages, allowing the dashboard to extract metadata from the metrics exposed.
Python, machine and system information is collected once at startup, and
exposed as labels on a metric with value 1.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in prometheus_client/platform_collector.py]
(definition of PlatformCollector:)
class PlatformCollector(object):
"""Collector for python platform information"""
(definition of PlatformCollector.__init__:)
def __init__(self, registry=core.REGISTRY, platform=None):
(definition of PlatformCollector.collect:)
def collect(self):
(definition of PlatformCollector._add_metric:)
def _add_metric(name, documentation, data):
(definition of PlatformCollector._info:)
def _info(self):
(definition of PlatformCollector._java:)
def _java(self):
[end of new definitions in prometheus_client/platform_collector.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 09a5ae30602a7a81f6174dae4ba08b93ee7feed2 | |
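The `PlatformCollector._info` definition above essentially repackages the standard-library `platform` module into metric labels. A standalone sketch of the label dict it builds (the function name is illustrative; only the stdlib calls are real):

```python
import platform

def python_info():
    # labels the collector would attach to the python_info metric (sketch)
    major, minor, patchlevel = platform.python_version_tuple()
    return {
        "version": platform.python_version(),
        "implementation": platform.python_implementation(),
        "major": major,
        "minor": minor,
        "patchlevel": patchlevel,
    }

info = python_info()
print(info["implementation"], info["version"])
```

Since all the information lives in the labels, the metric value itself can always be 1, as the README addition in the patch notes.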
conan-io__conan-1152 | 1,152 | conan-io/conan | null | feff1011f7a3f51904a73747919c2683e3d6b5fd | 2017-03-27T08:52:10Z | diff --git a/conans/client/command.py b/conans/client/command.py
index c14a2284ff5..8fa06150d74 100644
--- a/conans/client/command.py
+++ b/conans/client/command.py
@@ -99,13 +99,16 @@ def new(self, *args):
help='Create a package with embedded sources in "hello" folder, '
'using "exports_sources" instead of retrieving external code with '
'the "source()" method')
+ parser.add_argument("-b", "--bare", action='store_true', default=False,
+ help='Create the minimum package recipe, without build() or package()'
+ 'methods. Useful in combination with "package_files" command')
args = parser.parse_args(*args)
log_command("new", vars(args))
root_folder = os.getcwd()
files = get_files(args.name, header=args.header, pure_c=args.pure_c, test=args.test,
- exports_sources=args.sources)
+ exports_sources=args.sources, bare=args.bare)
save_files(root_folder, files)
for f in sorted(files):
@@ -208,6 +211,40 @@ def test(self, *args):
"""
self.test_package(*args)
+ def package_files(self, *args):
+ """Creates a package binary from given precompiled artifacts in user folder, skipping
+ the package recipe build() and package() methods
+ """
+ parser = argparse.ArgumentParser(description=self.package_files.__doc__, prog="conan package_files")
+ parser.add_argument("reference",
+ help='package recipe reference e.g., MyPackage/1.2@user/channel')
+ parser.add_argument("--path", "-p",
+ help='Get binaries from this path, relative to current or absolute')
+ parser.add_argument("--profile", "-pr",
+ help='Profile for this package')
+ parser.add_argument("--options", "-o",
+ help='Options for this package. e.g., -o with_qt=true',
+ nargs=1, action=Extender)
+ parser.add_argument("--settings", "-s",
+ help='Settings for this package e.g., -s compiler=gcc',
+ nargs=1, action=Extender)
+
+ args = parser.parse_args(*args)
+ args.env = None
+ args.scope = None
+ log_command("package_files", vars(args))
+ reference = ConanFileReference.loads(args.reference)
+ current_path = os.getcwd()
+ if args.path:
+ if os.path.isabs(args.path):
+ path = args.path
+ else:
+ path = os.path.join(current_path, args.path)
+ else:
+ path = current_path
+ profile = profile_from_args(args, current_path, self._client_cache.profiles_path)
+ self._manager.package_files(reference=reference, path=path, profile=profile)
+
def install(self, *args):
"""Installs the requirements specified in a 'conanfile.py' or 'conanfile.txt'.
It can also be used to install a concrete recipe/package specified by the reference parameter.
diff --git a/conans/client/manager.py b/conans/client/manager.py
index 58f27eb7183..9baaf52bbfe 100644
--- a/conans/client/manager.py
+++ b/conans/client/manager.py
@@ -1,5 +1,6 @@
import os
import time
+import shutil
from collections import OrderedDict, Counter
from conans.client import packager
@@ -25,10 +26,11 @@
from conans.client.userio import UserIO
from conans.errors import NotFoundException, ConanException
from conans.model.ref import ConanFileReference, PackageReference
-from conans.paths import CONANFILE, CONANINFO, CONANFILE_TXT
+from conans.paths import CONANFILE, CONANINFO, CONANFILE_TXT, CONAN_MANIFEST
from conans.tools import environment_append
from conans.util.files import save, rmdir, normalize, mkdir
from conans.util.log import logger
+from conans.model.manifest import FileTreeManifest
from conans.client.loader_parse import load_conanfile_class
from conans.client.build_requires import BuildRequires
@@ -75,6 +77,35 @@ def export(self, user, conan_file_path, keep_source=False):
output = ScopedOutput(str(conan_ref), self._user_io.out)
export_conanfile(output, self._client_cache, conanfile, src_folder, conan_ref, keep_source)
+ def package_files(self, reference, path, profile):
+ """ Bundle pre-existing binaries
+ @param reference: ConanFileReference
+ """
+ conan_file_path = self._client_cache.conanfile(reference)
+ if not os.path.exists(conan_file_path):
+ raise ConanException("Package recipe '%s' does not exist" % str(reference))
+
+ current_path = path
+ remote_proxy = ConanProxy(self._client_cache, self._user_io, self._remote_manager,
+ remote_name=None, update=False, check_updates=False,
+ manifest_manager=None)
+ loader = ConanFileLoader(self._runner, self._client_cache.settings, profile)
+ conanfile = loader.load_virtual(reference, current_path)
+ graph_builder = self._get_graph_builder(loader, False, remote_proxy)
+ deps_graph = graph_builder.load(conanfile)
+
+ # this is a bit tricky, but works. The loading of a cache package makes the referenced
+ # one, the first of the first level, always existing
+ nodes = deps_graph.direct_requires()
+ _, conanfile = nodes[0]
+ packages_folder = self._client_cache.packages(reference)
+ package_folder = os.path.join(packages_folder, conanfile.info.package_id())
+ shutil.copytree(path, package_folder)
+ save(os.path.join(package_folder, CONANINFO), conanfile.info.dumps())
+ # Create the digest for the package
+ digest = FileTreeManifest.create(package_folder)
+ save(os.path.join(package_folder, CONAN_MANIFEST), str(digest))
+
def download(self, reference, package_ids, remote=None):
""" Download conanfile and specified packages to local repository
@param reference: ConanFileReference
diff --git a/conans/client/new.py b/conans/client/new.py
index 3570309ad85..22a37869c41 100644
--- a/conans/client/new.py
+++ b/conans/client/new.py
@@ -1,4 +1,3 @@
-from conans.model.ref import ConanFileReference
from conans.errors import ConanException
import re
@@ -43,6 +42,20 @@ def package_info(self):
self.cpp_info.libs = ["hello"]
"""
+conanfile_bare = """from conans import ConanFile
+
+class {package_name}Conan(ConanFile):
+ name = "{name}"
+ version = "{version}"
+ settings = "os", "compiler", "build_type", "arch"
+ description = "Package for {package_name}"
+ url = "None"
+ license = "None"
+
+ def package_info(self):
+ self.cpp_info.libs = self.collect_libs()
+"""
+
conanfile_sources = """from conans import ConanFile, CMake, tools
import os
@@ -188,9 +201,15 @@ def test(self):
"""
-def get_files(ref, header=False, pure_c=False, test=False, exports_sources=False):
+def get_files(ref, header=False, pure_c=False, test=False, exports_sources=False, bare=False):
try:
- name, version, user, channel = ConanFileReference.loads(ref)
+ tokens = ref.split("@")
+ name, version = tokens[0].split("/")
+ if len(tokens) == 2:
+ user, channel = tokens[1].split("/")
+ else:
+ user, channel = "user", "channel"
+
pattern = re.compile('[\W_]+')
package_name = pattern.sub('', name).capitalize()
except:
@@ -201,6 +220,8 @@ def get_files(ref, header=False, pure_c=False, test=False, exports_sources=False
raise ConanException("--header and --sources are incompatible options")
if pure_c and (header or exports_sources):
raise ConanException("--pure_c is incompatible with --header and --sources")
+ if bare and (header or exports_sources):
+ raise ConanException("--bare is incompatible with --header and --sources")
if header:
files = {"conanfile.py": conanfile_header.format(name=name, version=version,
@@ -211,6 +232,9 @@ def get_files(ref, header=False, pure_c=False, test=False, exports_sources=False
"hello/hello.cpp": hello_cpp,
"hello/hello.h": hello_h,
"hello/CMakeLists.txt": cmake}
+ elif bare:
+ files = {"conanfile.py": conanfile_bare.format(name=name, version=version,
+ package_name=package_name)}
else:
files = {"conanfile.py": conanfile.format(name=name, version=version,
package_name=package_name)}
| diff --git a/conans/test/command/package_files_test.py b/conans/test/command/package_files_test.py
new file mode 100644
index 00000000000..6c5427a1a5f
--- /dev/null
+++ b/conans/test/command/package_files_test.py
@@ -0,0 +1,71 @@
+import unittest
+from conans.paths import CONANFILE
+from conans.test.utils.tools import TestClient
+from conans.model.ref import ConanFileReference, PackageReference
+from conans.util.files import load
+import os
+
+
+class PackageFilesTest(unittest.TestCase):
+
+ def test_basic(self):
+ client = TestClient()
+ conanfile = """
+from conans import ConanFile
+class TestConan(ConanFile):
+ name = "Hello"
+ version = "0.1"
+ settings = "os"
+"""
+ client.save({CONANFILE: conanfile})
+ client.run("export lasote/stable")
+
+ client.save({"include/header.h": "//Windows header"}, clean_first=True)
+ client.run("package_files Hello/0.1@lasote/stable -s os=Windows")
+ conan_ref = ConanFileReference.loads("Hello/0.1@lasote/stable")
+ win_package_ref = PackageReference(conan_ref, "3475bd55b91ae904ac96fde0f106a136ab951a5e")
+ package_folder = client.client_cache.package(win_package_ref)
+ self.assertEqual(load(os.path.join(package_folder, "include/header.h")),
+ "//Windows header")
+ self._consume(client, "-s os=Windows")
+ self.assertIn("Hello/0.1@lasote/stable:3475bd55b91ae904ac96fde0f106a136ab951a5e",
+ client.user_io.out)
+
+ def _consume(self, client, install_args):
+ consumer = """
+from conans import ConanFile
+class TestConan(ConanFile):
+ requires = "Hello/0.1@lasote/stable"
+ settings = "os"
+"""
+ client.save({CONANFILE: consumer}, clean_first=True)
+ client.run("install %s" % install_args)
+ self.assertIn("Hello/0.1@lasote/stable: Already installed!", client.user_io.out)
+
+ def test_new(self):
+ client = TestClient()
+ client.run("new Hello/0.1 --bare")
+ client.run("export lasote/stable")
+ client.save({"lib/libmycoollib.a": ""}, clean_first=True)
+ settings = ('-s os=Windows -s compiler=gcc -s compiler.version=4.9 '
+ '-s compiler.libcxx=libstdc++ -s build_type=Release -s arch=x86')
+ client.run("package_files Hello/0.1@lasote/stable %s" % settings)
+ self._consume(client, settings + " -g cmake")
+
+ cmakeinfo = load(os.path.join(client.current_folder, "conanbuildinfo.cmake"))
+ self.assertIn("set(CONAN_LIBS_HELLO mycoollib)", cmakeinfo)
+ self.assertIn("set(CONAN_LIBS mycoollib ${CONAN_LIBS})", cmakeinfo)
+
+ def test_paths(self):
+ client = TestClient()
+ client.run("new Hello/0.1 --bare")
+ client.run("export lasote/stable")
+ client.save({"Release_x86/lib/libmycoollib.a": ""}, clean_first=True)
+ settings = ('-s os=Windows -s compiler=gcc -s compiler.version=4.9 '
+ '-s compiler.libcxx=libstdc++ -s build_type=Release -s arch=x86')
+ client.run("package_files Hello/0.1@lasote/stable %s --path=Release_x86" % settings)
+ self._consume(client, settings + " -g cmake")
+
+ cmakeinfo = load(os.path.join(client.current_folder, "conanbuildinfo.cmake"))
+ self.assertIn("set(CONAN_LIBS_HELLO mycoollib)", cmakeinfo)
+ self.assertIn("set(CONAN_LIBS mycoollib ${CONAN_LIBS})", cmakeinfo)
| [
{
"components": [
{
"doc": "Creates a package binary from given precompiled artifacts in user folder, skipping\nthe package recipe build() and package() methods",
"lines": [
214,
246
],
"name": "Command.package_files",
"signature": "def package_f... | [
"conans/test/command/package_files_test.py::PackageFilesTest::test_basic",
"conans/test/command/package_files_test.py::PackageFilesTest::test_new",
"conans/test/command/package_files_test.py::PackageFilesTest::test_paths"
] | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
initial proposal for package_files command
This would implement: https://github.com/conan-io/conan/issues/1112
~~It might be ready, but conflicting with ongoing work by @lasote, so it could wait until then.~~
Merged with develop, conflicts solved.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in conans/client/command.py]
(definition of Command.package_files:)
def package_files(self, *args):
"""Creates a package binary from given precompiled artifacts in user folder, skipping
the package recipe build() and package() methods"""
[end of new definitions in conans/client/command.py]
[start of new definitions in conans/client/manager.py]
(definition of ConanManager.package_files:)
def package_files(self, reference, path, profile):
"""Bundle pre-existing binaries
@param reference: ConanFileReference"""
[end of new definitions in conans/client/manager.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 4a5b19a75db9225316c8cb022a2dfb9705a2af34 | ||
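The `package_files` flow above ends by saving a `FileTreeManifest` digest of the copied binaries. A hedged stdlib sketch of such a file-tree digest (the path-to-md5 layout is illustrative, not conan's exact manifest format):

```python
import hashlib
import os
import tempfile

def file_tree_manifest(folder):
    # relative path -> md5 of contents, sorted (illustrative layout)
    entries = {}
    for root, _, files in os.walk(folder):
        for name in files:
            path = os.path.join(root, name)
            rel = os.path.relpath(path, folder).replace(os.sep, "/")
            with open(path, "rb") as fh:
                entries[rel] = hashlib.md5(fh.read()).hexdigest()
    return dict(sorted(entries.items()))

# Mimic the layout used in PackageFilesTest.test_basic
with tempfile.TemporaryDirectory() as d:
    os.makedirs(os.path.join(d, "include"))
    with open(os.path.join(d, "include", "header.h"), "w") as fh:
        fh.write("//Windows header")
    manifest = file_tree_manifest(d)

print(manifest)
```

Hashing every file this way is what lets the client later detect tampering or incomplete uploads without rebuilding the package.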
sympy__sympy-12417 | 12,417 | sympy/sympy | 1.0 | 6dd369352b0bda17c2dfef5a8faf0bfb794982ec | 2017-03-25T07:42:04Z | diff --git a/sympy/vector/__init__.py b/sympy/vector/__init__.py
index 8e484bf3a739..dfa5168cc802 100644
--- a/sympy/vector/__init__.py
+++ b/sympy/vector/__init__.py
@@ -8,7 +8,7 @@
from sympy.vector.functions import (express, matrix_to_vector,
curl, divergence, gradient,
is_conservative, is_solenoidal,
- scalar_potential,
+ scalar_potential, directional_derivative,
scalar_potential_difference)
from sympy.vector.point import Point
from sympy.vector.orienters import (AxisOrienter, BodyOrienter,
diff --git a/sympy/vector/functions.py b/sympy/vector/functions.py
index 26c98503d6a3..e0df22c05593 100644
--- a/sympy/vector/functions.py
+++ b/sympy/vector/functions.py
@@ -214,6 +214,39 @@ def gradient(scalar, coord_sys):
return coord_sys.delop(scalar).doit()
+def directional_derivative(scalar, vect):
+ """
+ Returns the directional derivative of a scalar field computed along a given vector
+ in given coordinate system.
+
+ Parameters
+ ==========
+
+ scalar : SymPy Expr
+ The scalar field to compute the gradient of
+
+ vect : Vector
+ The vector operand
+
+ coord_sys : CoordSysCartesian
+ The coordinate system to calculate the gradient in
+
+ Examples
+ ========
+
+ >>> from sympy.vector import CoordSysCartesian, directional_derivative
+ >>> R = CoordSysCartesian('R')
+ >>> f1 = R.x*R.y*R.z
+ >>> v1 = 3*R.i + 4*R.j + R.k
+ >>> directional_derivative(f1, v1)
+ R.x*R.y + 4*R.x*R.z + 3*R.y*R.z
+ >>> f2 = 5*R.x**2*R.z
+ >>> directional_derivative(f2, v1)
+ 5*R.x**2 + 30*R.x*R.z
+
+ """
+ coord_sys = vect._sys
+ return gradient(scalar, coord_sys).dot(vect).doit()
def is_conservative(field):
"""
| diff --git a/sympy/vector/tests/test_field_functions.py b/sympy/vector/tests/test_field_functions.py
index 73946517324a..2d831cbdc489 100644
--- a/sympy/vector/tests/test_field_functions.py
+++ b/sympy/vector/tests/test_field_functions.py
@@ -7,7 +7,7 @@
from sympy import sin, cos
from sympy.vector.functions import (curl, divergence, gradient,
is_conservative, is_solenoidal,
- scalar_potential,
+ scalar_potential, directional_derivative,
scalar_potential_difference)
from sympy.utilities.pytest import raises
@@ -185,6 +185,10 @@ def test_solenoidal():
assert is_solenoidal(cos(q)*i + sin(q)*j + cos(q)*P.k) is True
assert is_solenoidal(z*P.i + P.x*k) is True
+def test_directional_derivative():
+ assert directional_derivative(C.x*C.y*C.z, 3*C.i + 4*C.j + C.k) == C.x*C.y + 4*C.x*C.z + 3*C.y*C.z
+ assert directional_derivative(5*C.x**2*C.z, 3*C.i + 4*C.j + C.k) == 5*C.x**2 + 30*C.x*C.z
+ assert directional_derivative(5*C.x**2*C.z, 4*C.j) == S.Zero
def test_scalar_potential():
assert scalar_potential(Vector.zero, C) == 0
| [
{
"components": [
{
"doc": "Returns the directional derivative of a scalar field computed along a given vector\nin given coordinate system.\n\nParameters\n==========\n\nscalar : SymPy Expr\n The scalar field to compute the gradient of\n\nvect : Vector\n The vector operand\n\ncoord_sys : Coor... | [
"test_del_operator",
"test_product_rules",
"test_conservative",
"test_solenoidal",
"test_directional_derivative",
"test_scalar_potential"
] | [] | This is a feature request which requires a new feature to be added to the code repository.
<<NEW FEATURE REQUEST>>
<request>
Addition of Directional derivatives
Fixes #12416
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sympy/vector/functions.py]
(definition of directional_derivative:)
def directional_derivative(scalar, vect):
"""Returns the directional derivative of a scalar field computed along a given vector
in given coordinate system.
Parameters
==========
scalar : SymPy Expr
The scalar field to compute the gradient of
vect : Vector
The vector operand
coord_sys : CoordSysCartesian
The coordinate system to calculate the gradient in
Examples
========
>>> from sympy.vector import CoordSysCartesian, directional_derivative
>>> R = CoordSysCartesian('R')
>>> f1 = R.x*R.y*R.z
>>> v1 = 3*R.i + 4*R.j + R.k
>>> directional_derivative(f1, v1)
R.x*R.y + 4*R.x*R.z + 3*R.y*R.z
>>> f2 = 5*R.x**2*R.z
>>> directional_derivative(f2, v1)
5*R.x**2 + 30*R.x*R.z"""
[end of new definitions in sympy/vector/functions.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
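For reference, the patch computes the directional derivative as the gradient contracted with the direction vector (`gradient(scalar, coord_sys).dot(vect)`), which for the docstring's first example works out as:

```latex
D_{\vec v} f = \nabla f \cdot \vec v,
\qquad
f = xyz, \quad \vec v = 3\,\hat{\imath} + 4\,\hat{\jmath} + \hat{k}
\;\Longrightarrow\;
\nabla f = (yz,\ xz,\ xy),
\quad
D_{\vec v} f = 3yz + 4xz + xy.
```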
<<END>> | Here is the discussion from the issues linked to the pull request.
<issues>
Addition of Directional Derivative Function
Sympy currently doesn't support the directional derivative function, which is used for finding the rate of change of a scalar field along a vector.
Directional derivative functions are of importance; they are used in vector mathematics as well as in differential geometry.
I hope that after this issue has been solved we will be able to do the following:
```
>>> from sympy.vector import CoordSysCartesian, directional_derivative
>>> R = CoordSysCartesian('R')
>>> f1 = R.x*R.y*R.z
>>> v1 = 3*R.i + 4*R.j + R.k
>>> directional_derivative(f1, v1, R)
3*R.y*R.z + 4*R.x*R.z + R.x*R.y
```
----------
--------------------
</issues> | 820363f5b17cbe5809ef0911ea539e135c179c62 | |
joke2k__faker-471 | 471 | joke2k/faker | null | 63a490b33381cfb5d66690d0e25c91f5b04824cd | 2017-03-09T01:26:10Z | diff --git a/faker/providers/isbn/__init__.py b/faker/providers/isbn/__init__.py
new file mode 100644
index 0000000000..ada84898fe
--- /dev/null
+++ b/faker/providers/isbn/__init__.py
@@ -0,0 +1,72 @@
+# coding=utf-8
+
+from __future__ import unicode_literals
+from .. import BaseProvider
+from .isbn import ISBN, ISBN10, ISBN13
+from .rules import RULES
+
+
+class Provider(BaseProvider):
+ """ Generates fake ISBNs. ISBN rules vary across languages/regions
+ so this class makes no attempt at replicating all of the rules. It
+ only replicates the 978 EAN prefix for the English registration
+ groups, meaning the first 4 digits of the ISBN-13 will either be
+ 978-0 or 978-1. Since we are only replicating 978 prefixes, every
+ ISBN-13 will have a direct mapping to an ISBN-10.
+
+ See https://www.isbn-international.org/content/what-isbn for the
+ format of ISBNs.
+ See https://www.isbn-international.org/range_file_generation for the
+ list of rules pertaining to each prefix/registration group.
+ """
+
+ def _body(self):
+ """ Generate the information required to create an ISBN-10 or
+ ISBN-13.
+ """
+ ean = self.random_element(RULES.keys())
+ reg_group = self.random_element(RULES[ean].keys())
+
+ # Given the chosen ean/group, decide how long the
+ # registrant/publication string may be.
+ # We must allocate for the calculated check digit, so
+ # subtract 1
+ reg_pub_len = ISBN.MAX_LENGTH - len(ean) - len(reg_group) - 1
+
+ # Generate a registrant/publication combination
+ reg_pub = self.numerify('#' * reg_pub_len)
+
+ # Use rules to separate the registrant from the publication
+ rules = RULES[ean][reg_group]
+ registrant, publication = self._registrant_publication(reg_pub, rules)
+ return [ean, reg_group, registrant, publication]
+
+ @staticmethod
+ def _registrant_publication(reg_pub, rules):
+ """ Separate the registration from the publication in a given
+ string.
+ :param reg_pub: A string of digits representing a registration
+ and publication.
+ :param rules: A list of RegistrantRules which designate where
+ to separate the values in the string.
+ :returns: A (registrant, publication) tuple of strings.
+ """
+ for rule in rules:
+ if rule.min <= reg_pub <= rule.max:
+ reg_len = rule.registrant_length
+ break
+ else:
+ raise Exception('Registrant/Publication not found in registrant '
+ 'rule list.')
+ registrant, publication = reg_pub[:reg_len], reg_pub[reg_len:]
+ return registrant, publication
+
+ def isbn13(self, separator='-'):
+ ean, group, registrant, publication = self._body()
+ isbn = ISBN13(ean, group, registrant, publication)
+ return isbn.format(separator)
+
+ def isbn10(self, separator='-'):
+ ean, group, registrant, publication = self._body()
+ isbn = ISBN10(ean, group, registrant, publication)
+ return isbn.format(separator)
diff --git a/faker/providers/isbn/en_US/__init__.py b/faker/providers/isbn/en_US/__init__.py
new file mode 100644
index 0000000000..e91928230f
--- /dev/null
+++ b/faker/providers/isbn/en_US/__init__.py
@@ -0,0 +1,5 @@
+from .. import Provider as ISBNProvider
+
+
+class Provider(ISBNProvider):
+ pass
\ No newline at end of file
diff --git a/faker/providers/isbn/isbn.py b/faker/providers/isbn/isbn.py
new file mode 100644
index 0000000000..56145e1e24
--- /dev/null
+++ b/faker/providers/isbn/isbn.py
@@ -0,0 +1,62 @@
+# coding=utf-8
+"""
+This module is responsible for generating the check digit and formatting
+ISBN numbers.
+"""
+
+
+class ISBN(object):
+
+ MAX_LENGTH = 13
+
+ def __init__(self, ean=None, group=None, registrant=None, publication=None):
+ self.ean = ean
+ self.group = group
+ self.registrant = registrant
+ self.publication = publication
+
+
+class ISBN13(ISBN):
+
+ def __init__(self, *args, **kwargs):
+ super(ISBN13, self).__init__(*args, **kwargs)
+ self.check_digit = self._check_digit()
+
+ def _check_digit(self):
+ """ Calculate the check digit for ISBN-13.
+ See https://en.wikipedia.org/wiki/International_Standard_Book_Number
+ for calculation.
+ """
+ weights = (1 if x % 2 == 0 else 3 for x in range(12))
+ body = ''.join([self.ean, self.group, self.registrant,
+ self.publication])
+ remainder = sum(int(b) * w for b, w in zip(body, weights)) % 10
+ diff = 10 - remainder
+ check_digit = 0 if diff == 10 else diff
+ return str(check_digit)
+
+ def format(self, separator=''):
+ return separator.join([self.ean, self.group, self.registrant,
+ self.publication, self.check_digit])
+
+
+class ISBN10(ISBN):
+
+ def __init__(self, *args, **kwargs):
+ super(ISBN10, self).__init__(*args, **kwargs)
+ self.check_digit = self._check_digit()
+
+ def _check_digit(self):
+ """ Calculate the check digit for ISBN-10.
+ See https://en.wikipedia.org/wiki/International_Standard_Book_Number
+ for calculation.
+ """
+ weights = range(1, 10)
+ body = ''.join([self.group, self.registrant, self.publication])
+ remainder = sum(int(b) * w for b, w in zip(body, weights)) % 11
+ check_digit = 'X' if remainder == 10 else str(remainder)
+ return str(check_digit)
+
+ def format(self, separator=''):
+ return separator.join([self.group, self.registrant, self.publication,
+ self.check_digit])
diff --git a/faker/providers/isbn/rules.py b/faker/providers/isbn/rules.py
new file mode 100644
index 0000000000..a18ecf4ea4
--- /dev/null
+++ b/faker/providers/isbn/rules.py
@@ -0,0 +1,45 @@
+# coding=utf-8
+"""
+This module exists solely to figure how long a registrant/publication
+number may be within an ISBN. The rules change based on the prefix and
+language/region. This list of rules only encapsulates the 978 prefix
+for English books. 978 is the largest and, until recently, the only
+prefix.
+
+The complete list of prefixes and rules can be found at
+https://www.isbn-international.org/range_file_generation
+"""
+
+from collections import namedtuple
+
+RegistrantRule = namedtuple('RegistrantRule', ['min', 'max', 'registrant_length'])
+
+# Structure: RULES[`EAN Prefix`][`Registration Group`] = [Rule1, Rule2, ...]
+RULES = {
+ '978': {
+ '0': [
+ RegistrantRule('0000000', '1999999', 2),
+ RegistrantRule('2000000', '2279999', 3),
+ RegistrantRule('2280000', '2289999', 4),
+ RegistrantRule('2290000', '6479999', 3),
+ RegistrantRule('6480000', '6489999', 7),
+ RegistrantRule('6490000', '6999999', 3),
+ RegistrantRule('7000000', '8499999', 4),
+ RegistrantRule('8500000', '8999999', 5),
+ RegistrantRule('9000000', '9499999', 6),
+ RegistrantRule('9500000', '9999999', 7)
+ ],
+ '1': [
+ RegistrantRule('0000000', '0999999', 2),
+ RegistrantRule('1000000', '3999999', 3),
+ RegistrantRule('4000000', '5499999', 4),
+ RegistrantRule('5500000', '7319999', 5),
+ RegistrantRule('7320000', '7399999', 7),
+ RegistrantRule('7400000', '8697999', 5),
+ RegistrantRule('8698000', '9729999', 6),
+ RegistrantRule('9730000', '9877999', 4),
+ RegistrantRule('9878000', '9989999', 6),
+ RegistrantRule('9990000', '9999999', 7)
+ ]
+ }
+}
| diff --git a/tests/__init__.py b/tests/__init__.py
index 6026772255..3606e0fe53 100644
--- a/tests/__init__.py
+++ b/tests/__init__.py
@@ -134,6 +134,7 @@ def test_find_available_providers(self):
'faker.providers.date_time',
'faker.providers.file',
'faker.providers.internet',
+ 'faker.providers.isbn',
'faker.providers.job',
'faker.providers.lorem',
'faker.providers.misc',
@@ -142,7 +143,7 @@ def test_find_available_providers(self):
'faker.providers.profile',
'faker.providers.python',
'faker.providers.ssn',
- 'faker.providers.user_agent',
+ 'faker.providers.user_agent'
]))
self.assertEqual(providers, expected_providers)
diff --git a/tests/isbn/__init__.py b/tests/isbn/__init__.py
new file mode 100644
index 0000000000..98e18f39a9
--- /dev/null
+++ b/tests/isbn/__init__.py
@@ -0,0 +1,52 @@
+import unittest
+from faker.providers.isbn.en_US import Provider as ISBNProvider
+from faker.providers.isbn import ISBN10, ISBN13
+from faker.providers.isbn.rules import RegistrantRule
+
+
+class TestISBN10(unittest.TestCase):
+
+ def test_check_digit_is_correct(self):
+ isbn = ISBN10(group='1', registrant='4516', publication='7331')
+ assert isbn.check_digit == '0'
+ isbn = ISBN10(group='0', registrant='06', publication='230125')
+ assert isbn.check_digit == 'X'
+ isbn = ISBN10(group='1', registrant='4936', publication='8222')
+ assert isbn.check_digit == '9'
+
+ def test_format_length(self):
+ isbn = ISBN10(group='1', registrant='4516', publication='7331')
+ assert len(isbn.format()) == 10
+
+
+class TestISBN13(unittest.TestCase):
+
+ def test_check_digit_is_correct(self):
+ isbn = ISBN13(ean='978', group='1', registrant='4516', publication='7331')
+ assert isbn.check_digit == '9'
+ isbn = ISBN13(ean='978', group='1', registrant='59327', publication='599')
+ assert isbn.check_digit == '0'
+ isbn = ISBN13(ean='978', group='1', registrant='4919', publication='2757')
+ assert isbn.check_digit == '1'
+
+ def test_format_length(self):
+ isbn = ISBN13(ean='978', group='1', registrant='4516', publication='7331')
+ assert len(isbn.format()) == 13
+
+
+class TestProvider(unittest.TestCase):
+
+ def setUp(self):
+ self.prov = ISBNProvider(None)
+
+ def test_reg_pub_separation(self):
+ r1 = RegistrantRule('0000000', '0000001', 1)
+ r2 = RegistrantRule('0000002', '0000003', 2)
+ assert self.prov._registrant_publication('0000000', [r1, r2]) == ('0', '000000')
+ assert self.prov._registrant_publication('0000002', [r1, r2]) == ('00', '00002')
+
+ def test_rule_not_found(self):
+ with self.assertRaises(Exception):
+ r = RegistrantRule('0000000', '0000001', 1)
+ self.prov._registrant_publication('0000002', [r])
+
| [
{
"components": [
{
"doc": "Generates fake ISBNs. ISBN rules vary across languages/regions\nso this class makes no attempt at replicating all of the rules. It\nonly replicates the 978 EAN prefix for the English registration\ngroups, meaning the first 4 digits of the ISBN-13 will either be\n978-0 o... | [
"tests/__init__.py::UtilsTestCase::test_add_dicts",
"tests/__init__.py::UtilsTestCase::test_choice_distribution",
"tests/__init__.py::UtilsTestCase::test_find_available_locales",
"tests/__init__.py::UtilsTestCase::test_find_available_providers",
"tests/__init__.py::FactoryTestCase::test_add_provider_gives_p... | [] | This is a feature request which requires a new feature to be added to the code repository.
<<NEW FEATURE REQUEST>>
<request>
Implement ISBN
Per discussion in #469 this PR allows faker to support the generation of valid ISBNs. See [ISBN International](https://www.isbn-international.org/content/what-isbn) for an overview of the individual numbers that make up a valid ISBN.
An important thing to consider with ISBN is that the hyphenation rules are dynamic, numerous, and subject to change by ISBN International. This PR only implements the ruleset for English language books. If we are interested in getting fancy, we can always utilize other python projects that actually keep up-to-date with information like this.
Example usage:
```python
from faker import Factory
fake = Factory.create()
fake.isbn10() # '1-67423-180-6'
fake.isbn13() # '978-0-584-45740-7'
fake.isbn13(separator=' ') # '978 1 917745 78 9'
```
This is my first ever PR to an open source project, so let me know if anything is out of order.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in faker/providers/isbn/__init__.py]
(definition of Provider:)
class Provider(BaseProvider):
"""Generates fake ISBNs. ISBN rules vary across languages/regions
so this class makes no attempt at replicating all of the rules. It
only replicates the 978 EAN prefix for the English registration
groups, meaning the first 4 digits of the ISBN-13 will either be
978-0 or 978-1. Since we are only replicating 978 prefixes, every
ISBN-13 will have a direct mapping to an ISBN-10.
See https://www.isbn-international.org/content/what-isbn for the
format of ISBNs.
See https://www.isbn-international.org/range_file_generation for the
list of rules pertaining to each prefix/registration group."""
(definition of Provider._body:)
def _body(self):
"""Generate the information required to create an ISBN-10 or
ISBN-13."""
(definition of Provider._registrant_publication:)
def _registrant_publication(reg_pub, rules):
"""Separate the registration from the publication in a given
string.
:param reg_pub: A string of digits representing a registration
and publication.
:param rules: A list of RegistrantRules which designate where
to separate the values in the string.
:returns: A (registrant, publication) tuple of strings."""
(definition of Provider.isbn13:)
def isbn13(self, separator='-'):
(definition of Provider.isbn10:)
def isbn10(self, separator='-'):
[end of new definitions in faker/providers/isbn/__init__.py]
[start of new definitions in faker/providers/isbn/en_US/__init__.py]
(definition of Provider:)
class Provider(ISBNProvider):
[end of new definitions in faker/providers/isbn/en_US/__init__.py]
[start of new definitions in faker/providers/isbn/isbn.py]
(definition of ISBN:)
class ISBN(object):
(definition of ISBN.__init__:)
def __init__(self, ean=None, group=None, registrant=None, publication=None):
(definition of ISBN13:)
class ISBN13(ISBN):
(definition of ISBN13.__init__:)
def __init__(self, *args, **kwargs):
(definition of ISBN13._check_digit:)
def _check_digit(self):
"""Calculate the check digit for ISBN-13.
See https://en.wikipedia.org/wiki/International_Standard_Book_Number
for calculation."""
(definition of ISBN13.format:)
def format(self, separator=''):
(definition of ISBN10:)
class ISBN10(ISBN):
(definition of ISBN10.__init__:)
def __init__(self, *args, **kwargs):
(definition of ISBN10._check_digit:)
def _check_digit(self):
"""Calculate the check digit for ISBN-10.
See https://en.wikipedia.org/wiki/International_Standard_Book_Number
for calculation."""
(definition of ISBN10.format:)
def format(self, separator=''):
[end of new definitions in faker/providers/isbn/isbn.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
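The check-digit rule behind `ISBN13._check_digit` can be sketched standalone in plain Python (a minimal sketch distilled from the patch above, not the provider's code; the sample digit groups come from this entry's test patch):

```python
def isbn13_check_digit(ean, group, registrant, publication):
    """Weight the 12 body digits alternately by 1 and 3, sum them mod 10,
    and subtract the remainder from 10 (a difference of 10 maps to '0')."""
    body = ean + group + registrant + publication
    total = sum(int(d) * (1 if i % 2 == 0 else 3) for i, d in enumerate(body))
    diff = 10 - total % 10
    return '0' if diff == 10 else str(diff)

# Same triples as the TestISBN13 cases in the test patch:
print(isbn13_check_digit('978', '1', '4516', '7331'))   # -> '9'
print(isbn13_check_digit('978', '1', '59327', '599'))   # -> '0'
print(isbn13_check_digit('978', '1', '4919', '2757'))   # -> '1'
```

Concatenating the four groups with the check digit then yields the 13-character string that `ISBN13.format` produces.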
<<END>> | 6edfdbf6ae90b0153309e3bf066aa3b2d16494a7 | ||
sympy__sympy-12261 | 12,261 | sympy/sympy | 1.1 | ba4c04ea457de64915b925c21047bc6b067ca75d | 2017-03-06T20:57:01Z | diff --git a/sympy/vector/__init__.py b/sympy/vector/__init__.py
index 7d30954b4583..fccb4e4d14a1 100644
--- a/sympy/vector/__init__.py
+++ b/sympy/vector/__init__.py
@@ -6,8 +6,9 @@
from sympy.vector.deloperator import Del
from sympy.vector.coordsysrect import CoordSys3D, CoordSysCartesian
from sympy.vector.functions import (express, matrix_to_vector,
- is_conservative, is_solenoidal,
- scalar_potential, directional_derivative,
+ laplacian, is_conservative,
+ is_solenoidal, scalar_potential,
+ directional_derivative,
scalar_potential_difference)
from sympy.vector.point import Point
from sympy.vector.orienters import (AxisOrienter, BodyOrienter,
diff --git a/sympy/vector/functions.py b/sympy/vector/functions.py
index aa2cf7064cdf..d51dde83c2f8 100644
--- a/sympy/vector/functions.py
+++ b/sympy/vector/functions.py
@@ -1,4 +1,5 @@
from sympy.vector.coordsysrect import CoordSys3D
+from sympy.vector.deloperator import Del
from sympy.vector.scalar import BaseScalar
from sympy.vector.vector import Vector, BaseVector
from sympy.vector.operators import gradient, curl, divergence
@@ -173,6 +174,36 @@ def directional_derivative(field, direction_vector):
return S(0)
+def laplacian(expr):
+ """
+ Return the laplacian of the given field computed in terms of
+ the base scalars of the given coordinate system.
+
+ Parameters
+ ==========
+
+ expr : SymPy Expr or Vector
+ expr denotes a scalar or vector field.
+
+ Examples
+ ========
+
+ >>> from sympy.vector import CoordSys3D, laplacian
+ >>> R = CoordSys3D('R')
+ >>> f = R.x**2*R.y**5*R.z
+ >>> laplacian(f)
+ 20*R.x**2*R.y**3*R.z + 2*R.y**5*R.z
+ >>> f = R.x**2*R.i + R.y**3*R.j + R.z**4*R.k
+ >>> laplacian(f)
+ 2*R.i + 6*R.y*R.j + 12*R.z**2*R.k
+
+ """
+ delop = Del()
+ if expr.is_Vector:
+ return (gradient(divergence(expr)) - curl(curl(expr))).doit()
+ return delop.dot(delop(expr)).doit()
+
+
def is_conservative(field):
"""
Checks if a field is conservative.
| diff --git a/sympy/vector/tests/test_field_functions.py b/sympy/vector/tests/test_field_functions.py
index aad855df0fca..d8d95128c4ff 100644
--- a/sympy/vector/tests/test_field_functions.py
+++ b/sympy/vector/tests/test_field_functions.py
@@ -10,7 +10,7 @@
from sympy.vector.deloperator import Del
from sympy.vector.functions import (is_conservative, is_solenoidal,
scalar_potential, directional_derivative,
- scalar_potential_difference)
+ laplacian, scalar_potential_difference)
from sympy.utilities.pytest import raises
C = CoordSys3D('C')
@@ -99,6 +99,18 @@ def test_del_operator():
assert ((k & delop)(v)).doit() == k
assert ((v & delop)(Vector.zero)).doit() == Vector.zero
+ # Tests for laplacian on scalar fields
+ assert laplacian(x*y*z) == S.Zero
+ assert laplacian(x**2) == S(2)
+ assert laplacian(x**2*y**2*z**2) == \
+ 2*y**2*z**2 + 2*x**2*z**2 + 2*x**2*y**2
+
+ # Tests for laplacian on vector fields
+ assert laplacian(x*y*z*(i + j + k)) == Vector.zero
+ assert laplacian(x*y**2*z*(i + j + k)) == \
+ 2*x*z*i + 2*x*z*j + 2*x*z*k
+
+
def test_product_rules():
"""
| [
{
"components": [
{
"doc": "Return the laplacian of the given field computed in terms of\nthe base scalars of the given coordinate system.\n\nParameters\n==========\n\nexpr : SymPy Expr or Vector\n expr denotes a scalar or vector field.\n\nExamples\n========\n\n>>> from sympy.vector import Coor... | [
"test_del_operator",
"test_product_rules",
"test_conservative",
"test_solenoidal",
"test_directional_derivative",
"test_scalar_potential",
"test_scalar_potential_difference",
"test_differential_operators_curvilinear_system"
] | [] | This is a feature request which requires a new feature to be added to the code repository.
<<NEW FEATURE REQUEST>>
<request>
vector : Add Laplacian function
This PR adds the laplacian function for both vector and scalar fields in the vector module.
https://en.wikipedia.org/wiki/Laplace_operator
https://en.wikipedia.org/wiki/Vector_Laplacian
Example :
```
>>> from sympy.vector import laplacian
>>> from sympy.vector.coordsysrect import CoordSysCartesian
>>> C = CoordSysCartesian('C')
>>> i, j, k = C.base_vectors()
>>> x, y, z = C.base_scalars()
>>> laplacian(x*y**2*z, C)
2*C.x*C.z
>>> laplacian(x*y**2*z*(i + j + k), C)
2*C.x*C.z*C.i + 2*C.x*C.z*C.j + 2*C.x*C.z*C.k
```
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sympy/vector/functions.py]
(definition of laplacian:)
def laplacian(expr):
"""Return the laplacian of the given field computed in terms of
the base scalars of the given coordinate system.
Parameters
==========
expr : SymPy Expr or Vector
expr denotes a scalar or vector field.
Examples
========
>>> from sympy.vector import CoordSys3D, laplacian
>>> R = CoordSys3D('R')
>>> f = R.x**2*R.y**5*R.z
>>> laplacian(f)
20*R.x**2*R.y**3*R.z + 2*R.y**5*R.z
>>> f = R.x**2*R.i + R.y**3*R.j + R.z**4*R.k
>>> laplacian(f)
2*R.i + 6*R.y*R.j + 12*R.z**2*R.k"""
[end of new definitions in sympy/vector/functions.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
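Both branches of the new `laplacian` follow standard identities: the scalar case is the sum of unmixed second partials, and the vector case is the patch's `gradient(divergence(expr)) - curl(curl(expr))`. The docstring's first example works out as:

```latex
\nabla^2 f = f_{xx} + f_{yy} + f_{zz},
\qquad
\nabla^2 \vec F = \nabla(\nabla \cdot \vec F) - \nabla \times (\nabla \times \vec F)

f = x^2 y^5 z:\quad
f_{xx} = 2 y^5 z,\qquad
f_{yy} = 20 x^2 y^3 z,\qquad
f_{zz} = 0
\;\Longrightarrow\;
\nabla^2 f = 20 x^2 y^3 z + 2 y^5 z.
```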
<<END>> | ec9e3c0436fbff934fa84e22bf07f1b3ef5bfac3 | ||
scikit-learn__scikit-learn-8478 | 8,478 | scikit-learn/scikit-learn | 0.20 | 312f64053d7249a326a19a07fa635ef5b5c6ed99 | 2017-02-28T22:55:57Z | diff --git a/doc/modules/classes.rst b/doc/modules/classes.rst
index 2e7dcba82e846..243c63ab0c7e2 100644
--- a/doc/modules/classes.rst
+++ b/doc/modules/classes.rst
@@ -646,6 +646,7 @@ Kernels:
:template: class.rst
impute.SimpleImputer
+ impute.MICEImputer
.. _kernel_approximation_ref:
diff --git a/doc/modules/impute.rst b/doc/modules/impute.rst
index e806cc2fd5b4a..c4e8a3395025c 100644
--- a/doc/modules/impute.rst
+++ b/doc/modules/impute.rst
@@ -15,6 +15,10 @@ values. However, this comes at the price of losing data which may be valuable
(even though incomplete). A better strategy is to impute the missing values,
i.e., to infer them from the known part of the data.
+
+Univariate feature imputation
+=============================
+
The :class:`SimpleImputer` class provides basic strategies for imputing missing
values, either using the mean, the median or the most frequent value of
the row or column in which the missing values are located. This class
@@ -52,5 +56,34 @@ Note that, here, missing values are encoded by 0 and are thus implicitly stored
in the matrix. This format is thus suitable when there are many more missing
values than observed values.
-:class:`SimpleImputer` can be used in a Pipeline as a way to build a composite
-estimator that supports imputation. See :ref:`sphx_glr_auto_examples_plot_missing_values.py`.
+
+Multivariate feature imputation
+===============================
+
+A more sophisticated approach is to use the :class:`MICEImputer` class, which
+implements the Multivariate Imputation by Chained Equations technique. MICE
+models each feature with missing values as a function of other features, and
+uses that estimate for imputation. It does so in a round-robin fashion: at
+each step, a feature column is designated as output `y` and the other feature
+columns are treated as inputs `X`. A regressor is fit on `(X, y)` for known `y`.
+Then, the regressor is used to predict the unknown values of `y`. This is
+repeated for each feature, and then is done for a number of imputation rounds.
+Here is an example snippet::
+
+ >>> import numpy as np
+ >>> from sklearn.impute import MICEImputer
+ >>> imp = MICEImputer(n_imputations=10, random_state=0)
+ >>> imp.fit([[1, 2], [np.nan, 3], [7, np.nan]])
+ MICEImputer(imputation_order='ascending', initial_strategy='mean',
+ max_value=None, min_value=None, missing_values='NaN', n_burn_in=10,
+ n_imputations=10, n_nearest_features=None, predictor=None,
+ random_state=0, verbose=False)
+ >>> X_test = [[np.nan, 2], [6, np.nan], [np.nan, 6]]
+ >>> print(np.round(imp.transform(X_test)))
+ [[ 1. 2.]
+ [ 6. 4.]
+ [13. 6.]]
+
+Both :class:`SimpleImputer` and :class:`MICEImputer` can be used in a Pipeline
+as a way to build a composite estimator that supports imputation.
+See :ref:`sphx_glr_auto_examples_plot_missing_values.py`.
diff --git a/examples/plot_missing_values.py b/examples/plot_missing_values.py
index 9cc07b75d2542..e32c19fae0846 100644
--- a/examples/plot_missing_values.py
+++ b/examples/plot_missing_values.py
@@ -1,72 +1,127 @@
"""
-======================================================
+====================================================
Imputing missing values before building an estimator
-======================================================
-
-This example shows that imputing the missing values can give better
-results than discarding the samples containing any missing value.
-Imputing does not always improve the predictions, so please check via
-cross-validation. Sometimes dropping rows or using marker values is
-more effective.
+====================================================
Missing values can be replaced by the mean, the median or the most frequent
-value using the ``strategy`` hyper-parameter.
+value using the basic ``SimpleImputer``.
The median is a more robust estimator for data with high magnitude variables
which could dominate results (otherwise known as a 'long tail').
-Script output::
-
- Score with the entire dataset = 0.56
- Score without the samples containing missing values = 0.48
- Score after imputation of the missing values = 0.55
-
-In this case, imputing helps the classifier get close to the original score.
-
+Another option is the MICE imputer. This uses round-robin linear regression,
+treating every variable as an output in turn. The version implemented assumes
+Gaussian (output) variables. If your features are obviously non-Normal,
+consider transforming them to look more Normal so as to improve performance.
"""
+
import numpy as np
+import matplotlib.pyplot as plt
+from sklearn.datasets import load_diabetes
from sklearn.datasets import load_boston
from sklearn.ensemble import RandomForestRegressor
from sklearn.pipeline import Pipeline
-from sklearn.impute import SimpleImputer
+from sklearn.impute import SimpleImputer, MICEImputer
from sklearn.model_selection import cross_val_score
rng = np.random.RandomState(0)
-dataset = load_boston()
-X_full, y_full = dataset.data, dataset.target
-n_samples = X_full.shape[0]
-n_features = X_full.shape[1]
-
-# Estimate the score on the entire dataset, with no missing values
-estimator = RandomForestRegressor(random_state=0, n_estimators=100)
-score = cross_val_score(estimator, X_full, y_full).mean()
-print("Score with the entire dataset = %.2f" % score)
-
-# Add missing values in 75% of the lines
-missing_rate = 0.75
-n_missing_samples = int(np.floor(n_samples * missing_rate))
-missing_samples = np.hstack((np.zeros(n_samples - n_missing_samples,
- dtype=np.bool),
- np.ones(n_missing_samples,
- dtype=np.bool)))
-rng.shuffle(missing_samples)
-missing_features = rng.randint(0, n_features, n_missing_samples)
-
-# Estimate the score without the lines containing missing values
-X_filtered = X_full[~missing_samples, :]
-y_filtered = y_full[~missing_samples]
-estimator = RandomForestRegressor(random_state=0, n_estimators=100)
-score = cross_val_score(estimator, X_filtered, y_filtered).mean()
-print("Score without the samples containing missing values = %.2f" % score)
-
-# Estimate the score after imputation of the missing values
-X_missing = X_full.copy()
-X_missing[np.where(missing_samples)[0], missing_features] = 0
-y_missing = y_full.copy()
-estimator = Pipeline([("imputer", SimpleImputer(missing_values=0,
- strategy="mean")),
- ("forest", RandomForestRegressor(random_state=0,
- n_estimators=100))])
-score = cross_val_score(estimator, X_missing, y_missing).mean()
-print("Score after imputation of the missing values = %.2f" % score)
+
+def get_results(dataset):
+ X_full, y_full = dataset.data, dataset.target
+ n_samples = X_full.shape[0]
+ n_features = X_full.shape[1]
+
+ # Estimate the score on the entire dataset, with no missing values
+ estimator = RandomForestRegressor(random_state=0, n_estimators=100)
+ full_scores = cross_val_score(estimator, X_full, y_full,
+ scoring='neg_mean_squared_error')
+
+ # Add missing values in 75% of the lines
+ missing_rate = 0.75
+ n_missing_samples = int(np.floor(n_samples * missing_rate))
+ missing_samples = np.hstack((np.zeros(n_samples - n_missing_samples,
+ dtype=np.bool),
+ np.ones(n_missing_samples,
+ dtype=np.bool)))
+ rng.shuffle(missing_samples)
+ missing_features = rng.randint(0, n_features, n_missing_samples)
+
+ # Estimate the score after replacing missing values by 0
+ X_missing = X_full.copy()
+ X_missing[np.where(missing_samples)[0], missing_features] = 0
+ y_missing = y_full.copy()
+ estimator = RandomForestRegressor(random_state=0, n_estimators=100)
+ zero_impute_scores = cross_val_score(estimator, X_missing, y_missing,
+ scoring='neg_mean_squared_error')
+
+ # Estimate the score after imputation (mean strategy) of the missing values
+ X_missing = X_full.copy()
+ X_missing[np.where(missing_samples)[0], missing_features] = 0
+ y_missing = y_full.copy()
+ estimator = Pipeline([("imputer", SimpleImputer(missing_values=0,
+ strategy="mean")),
+ ("forest", RandomForestRegressor(random_state=0,
+ n_estimators=100))])
+ mean_impute_scores = cross_val_score(estimator, X_missing, y_missing,
+ scoring='neg_mean_squared_error')
+
+ # Estimate the score after imputation (MICE strategy) of the missing values
+ estimator = Pipeline([("imputer", MICEImputer(missing_values=0,
+ random_state=0)),
+ ("forest", RandomForestRegressor(random_state=0,
+ n_estimators=100))])
+ mice_impute_scores = cross_val_score(estimator, X_missing, y_missing,
+ scoring='neg_mean_squared_error')
+
+ return ((full_scores.mean(), full_scores.std()),
+ (zero_impute_scores.mean(), zero_impute_scores.std()),
+ (mean_impute_scores.mean(), mean_impute_scores.std()),
+ (mice_impute_scores.mean(), mice_impute_scores.std()))
+
+
+results_diabetes = np.array(get_results(load_diabetes()))
+mses_diabetes = results_diabetes[:, 0] * -1
+stds_diabetes = results_diabetes[:, 1]
+
+results_boston = np.array(get_results(load_boston()))
+mses_boston = results_boston[:, 0] * -1
+stds_boston = results_boston[:, 1]
+
+n_bars = len(mses_diabetes)
+xval = np.arange(n_bars)
+
+x_labels = ['Full data',
+ 'Zero imputation',
+ 'Mean Imputation',
+ 'MICE Imputation']
+colors = ['r', 'g', 'b', 'orange']
+
+# plot diabetes results
+plt.figure(figsize=(12, 6))
+ax1 = plt.subplot(121)
+for j in xval:
+ ax1.barh(j, mses_diabetes[j], xerr=stds_diabetes[j],
+ color=colors[j], alpha=0.6, align='center')
+
+ax1.set_title('Imputation Techniques with Diabetes Data')
+ax1.set_xlim(left=np.min(mses_diabetes) * 0.9,
+ right=np.max(mses_diabetes) * 1.1)
+ax1.set_yticks(xval)
+ax1.set_xlabel('MSE')
+ax1.invert_yaxis()
+ax1.set_yticklabels(x_labels)
+
+# plot boston results
+ax2 = plt.subplot(122)
+for j in xval:
+ ax2.barh(j, mses_boston[j], xerr=stds_boston[j],
+ color=colors[j], alpha=0.6, align='center')
+
+ax2.set_title('Imputation Techniques with Boston Data')
+ax2.set_yticks(xval)
+ax2.set_xlabel('MSE')
+ax2.invert_yaxis()
+ax2.set_yticklabels([''] * n_bars)
+
+plt.show()
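Editorial aside: the example negates the `neg_mean_squared_error` scores before plotting (`results[:, 0] * -1`) because scikit-learn scorers follow a higher-is-better convention. A dependency-free sketch of that conversion, using made-up fold scores:

```python
# Hypothetical fold scores as cross_val_score would return them with
# scoring='neg_mean_squared_error': negated MSEs, so higher is better.
neg_mse_scores = [-3021.8, -2987.4, -3100.2]

# Flip the sign to recover positive MSEs for plotting, mirroring
# `results[:, 0] * -1` in the example above.
mses = [-s for s in neg_mse_scores]
mean_mse = sum(mses) / len(mses)
```

The same sign flip applies to any `neg_*` scorer; the error bars in the example come from the per-fold standard deviation, which is unaffected by negation.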
diff --git a/sklearn/impute.py b/sklearn/impute.py
index 2420e02560e42..f395c634ccced 100644
--- a/sklearn/impute.py
+++ b/sklearn/impute.py
@@ -1,16 +1,23 @@
"""Transformers for missing value imputation"""
# Authors: Nicolas Tresegnie <nicolas.tresegnie@gmail.com>
+# Sergey Feldman <sergeyfeldman@gmail.com>
# License: BSD 3 clause
+from __future__ import division
+
import warnings
+from time import time
import numpy as np
import numpy.ma as ma
from scipy import sparse
from scipy import stats
+from collections import namedtuple
from .base import BaseEstimator, TransformerMixin
-from .utils import check_array
+from .base import clone
+from .preprocessing import normalize
+from .utils import check_array, check_random_state, safe_indexing
from .utils.sparsefuncs import _get_median
from .utils.validation import check_is_fitted
from .utils.validation import FLOAT_DTYPES
@@ -20,8 +27,13 @@
zip = six.moves.zip
map = six.moves.map
+MICETriplet = namedtuple('MICETriplet', ['feat_idx',
+ 'neighbor_feat_idx',
+ 'predictor'])
+
__all__ = [
'SimpleImputer',
+ 'MICEImputer',
]
@@ -321,3 +333,541 @@ def transform(self, X):
X[coordinates] = values
return X
+
+
+class MICEImputer(BaseEstimator, TransformerMixin):
+ """MICE transformer to impute missing values.
+
+    Basic implementation of the MICE (Multivariate Imputation by Chained
+    Equations) package from R. This version assumes all of the features are
+    Gaussian.
+
+ Parameters
+ ----------
+ missing_values : int or "NaN", optional (default="NaN")
+ The placeholder for the missing values. All occurrences of
+ ``missing_values`` will be imputed. For missing values encoded as
+ np.nan, use the string value "NaN".
+
+ imputation_order : str, optional (default="ascending")
+ The order in which the features will be imputed. Possible values:
+
+ "ascending"
+ From features with fewest missing values to most.
+ "descending"
+ From features with most missing values to fewest.
+ "roman"
+ Left to right.
+ "arabic"
+ Right to left.
+ "random"
+ A random order for each round.
+
+ n_imputations : int, optional (default=100)
+ Number of MICE rounds to perform, the results of which will be
+ used in the final average.
+
+ n_burn_in : int, optional (default=10)
+        Number of initial MICE rounds to perform, the results of which
+ will not be returned.
+
+ predictor : estimator object, default=BayesianRidge()
+ The predictor to use at each step of the round-robin imputation.
+ It must support ``return_std`` in its ``predict`` method.
+
+ n_nearest_features : int, optional (default=None)
+ Number of other features to use to estimate the missing values of
+        Number of other features to use to estimate the missing values of
+        each feature column. Nearness between features is measured using
+ initial imputation). Can provide significant speed-up when the number
+ of features is huge. If ``None``, all features will be used.
+
+ initial_strategy : str, optional (default="mean")
+ Which strategy to use to initialize the missing values. Same as the
+        ``strategy`` parameter in :class:`sklearn.impute.SimpleImputer`.
+ Valid values: {"mean", "median", or "most_frequent"}.
+
+ min_value : float, optional (default=None)
+ Minimum possible imputed value. Default of ``None`` will set minimum
+ to negative infinity.
+
+ max_value : float, optional (default=None)
+ Maximum possible imputed value. Default of ``None`` will set maximum
+ to positive infinity.
+
+ verbose : int, optional (default=0)
+ Verbosity flag, controls the debug messages that are issued
+ as functions are evaluated. The higher, the more verbose. Can be 0, 1,
+ or 2.
+
+ random_state : int, RandomState instance or None, optional (default=None)
+ The seed of the pseudo random number generator to use when shuffling
+ the data. If int, random_state is the seed used by the random number
+ generator; If RandomState instance, random_state is the random number
+ generator; If None, the random number generator is the RandomState
+ instance used by ``np.random``.
+
+ Attributes
+ ----------
+    initial_imputer_ : object of class :class:`sklearn.impute.SimpleImputer`
+ The imputer used to initialize the missing values.
+
+ imputation_sequence_ : list of tuples
+ Each tuple has ``(feat_idx, neighbor_feat_idx, predictor)``, where
+ ``feat_idx`` is the current feature to be imputed,
+ ``neighbor_feat_idx`` is the array of other features used to impute the
+ current feature, and ``predictor`` is the trained predictor used for
+ the imputation.
+
+ Notes
+ -----
+ The R version of MICE does not have inductive functionality, i.e. first
+ fitting on ``X_train`` and then transforming any ``X_test`` without
+ additional fitting. We do this by storing each feature's predictor during
+ the round-robin ``fit`` phase, and predicting without refitting (in order)
+ during the ``transform`` phase.
+
+ Features which contain all missing values at ``fit`` are discarded upon
+ ``transform``.
+
+ Features with missing values in transform which did not have any missing
+ values in fit will be imputed with the initial imputation method only.
+
+ References
+ ----------
+ .. [1] `Stef van Buuren, Karin Groothuis-Oudshoorn (2011). "mice:
+ Multivariate Imputation by Chained Equations in R". Journal of
+ Statistical Software 45: 1-67.
+ <https://www.jstatsoft.org/article/view/v045i03>`_
+ """
+
+ def __init__(self,
+ missing_values='NaN',
+ imputation_order='ascending',
+ n_imputations=100,
+ n_burn_in=10,
+ predictor=None,
+ n_nearest_features=None,
+ initial_strategy="mean",
+ min_value=None,
+ max_value=None,
+ verbose=False,
+ random_state=None):
+
+ self.missing_values = missing_values
+ self.imputation_order = imputation_order
+ self.n_imputations = n_imputations
+ self.n_burn_in = n_burn_in
+ self.predictor = predictor
+ self.n_nearest_features = n_nearest_features
+ self.initial_strategy = initial_strategy
+ self.min_value = min_value
+ self.max_value = max_value
+ self.verbose = verbose
+ self.random_state = random_state
+
+ def _impute_one_feature(self,
+ X_filled,
+ mask_missing_values,
+ feat_idx,
+ neighbor_feat_idx,
+ predictor=None,
+ fit_mode=True):
+ """Impute a single feature from the others provided.
+
+ This function predicts the missing values of one of the features using
+ the current estimates of all the other features. The ``predictor`` must
+ support ``return_std=True`` in its ``predict`` method for this function
+ to work.
+
+ Parameters
+ ----------
+ X_filled : ndarray
+ Input data with the most recent imputations.
+
+ mask_missing_values : ndarray
+ Input data's missing indicator matrix.
+
+ feat_idx : int
+ Index of the feature currently being imputed.
+
+ neighbor_feat_idx : ndarray
+ Indices of the features to be used in imputing ``feat_idx``.
+
+ predictor : object
+ The predictor to use at this step of the round-robin imputation.
+ It must support ``return_std`` in its ``predict`` method.
+ If None, it will be cloned from self._predictor.
+
+ fit_mode : boolean, default=True
+ Whether to fit and predict with the predictor or just predict.
+
+ Returns
+ -------
+ X_filled : ndarray
+ Input data with ``X_filled[missing_row_mask, feat_idx]`` updated.
+
+ predictor : predictor with sklearn API
+ The fitted predictor used to impute
+ ``X_filled[missing_row_mask, feat_idx]``.
+ """
+
+ # if nothing is missing, just return the default
+        # (should not happen at fit time because feat_idx would be excluded)
+ missing_row_mask = mask_missing_values[:, feat_idx]
+ if not np.any(missing_row_mask):
+ return X_filled, predictor
+
+ if predictor is None and fit_mode is False:
+ raise ValueError("If fit_mode is False, then an already-fitted "
+ "predictor should be passed in.")
+
+ if predictor is None:
+ predictor = clone(self._predictor)
+
+ if fit_mode:
+ X_train = safe_indexing(X_filled[:, neighbor_feat_idx],
+ ~missing_row_mask)
+ y_train = safe_indexing(X_filled[:, feat_idx],
+ ~missing_row_mask)
+ predictor.fit(X_train, y_train)
+
+ # get posterior samples
+ X_test = safe_indexing(X_filled[:, neighbor_feat_idx],
+ missing_row_mask)
+ mus, sigmas = predictor.predict(X_test, return_std=True)
+ good_sigmas = sigmas > 0
+ imputed_values = np.zeros(mus.shape, dtype=X_filled.dtype)
+ imputed_values[~good_sigmas] = mus[~good_sigmas]
+ imputed_values[good_sigmas] = self.random_state_.normal(
+ loc=mus[good_sigmas], scale=sigmas[good_sigmas])
+
+ # clip the values
+ imputed_values = np.clip(imputed_values,
+ self._min_value,
+ self._max_value)
+
+ # update the feature
+ X_filled[missing_row_mask, feat_idx] = imputed_values
+ return X_filled, predictor
+
+ def _get_neighbor_feat_idx(self,
+ n_features,
+ feat_idx,
+ abs_corr_mat):
+ """Get a list of other features to predict ``feat_idx``.
+
+ If self.n_nearest_features is less than or equal to the total
+ number of features, then use a probability proportional to the absolute
+ correlation between ``feat_idx`` and each other feature to randomly
+ choose a subsample of the other features (without replacement).
+
+ Parameters
+ ----------
+ n_features : int
+ Number of features in ``X``.
+
+ feat_idx : int
+ Index of the feature currently being imputed.
+
+ abs_corr_mat : ndarray, shape (n_features, n_features)
+ Absolute correlation matrix of ``X``. The diagonal has been zeroed
+ out and each feature has been normalized to sum to 1. Can be None.
+
+ Returns
+ -------
+ neighbor_feat_idx : array-like
+ The features to use to impute ``feat_idx``.
+ """
+ if (self.n_nearest_features is not None and
+ self.n_nearest_features < n_features):
+ p = abs_corr_mat[:, feat_idx]
+ neighbor_feat_idx = self.random_state_.choice(
+ np.arange(n_features), self.n_nearest_features, replace=False,
+ p=p)
+ else:
+ inds_left = np.arange(feat_idx)
+ inds_right = np.arange(feat_idx + 1, n_features)
+ neighbor_feat_idx = np.concatenate((inds_left, inds_right))
+ return neighbor_feat_idx
+
+ def _get_ordered_idx(self, mask_missing_values):
+ """Decide in what order we will update the features.
+
+ As a homage to the MICE R package, we will have 4 main options of
+ how to order the updates, and use a random order if anything else
+ is specified.
+
+ Also, this function skips features which have no missing values.
+
+ Parameters
+ ----------
+ mask_missing_values : array-like, shape (n_samples, n_features)
+ Input data's missing indicator matrix, where "n_samples" is the
+ number of samples and "n_features" is the number of features.
+
+ Returns
+ -------
+ ordered_idx : ndarray, shape (n_features,)
+ The order in which to impute the features.
+ """
+ frac_of_missing_values = mask_missing_values.mean(axis=0)
+ missing_values_idx = np.nonzero(frac_of_missing_values)[0]
+ if self.imputation_order == 'roman':
+ ordered_idx = missing_values_idx
+ elif self.imputation_order == 'arabic':
+ ordered_idx = missing_values_idx[::-1]
+        elif self.imputation_order == 'ascending':
+            n = len(frac_of_missing_values) - len(missing_values_idx)
+            ordered_idx = np.argsort(frac_of_missing_values,
+                                     kind='mergesort')[n:]
+        elif self.imputation_order == 'descending':
+            n = len(frac_of_missing_values) - len(missing_values_idx)
+            ordered_idx = np.argsort(frac_of_missing_values,
+                                     kind='mergesort')[n:][::-1]
+ elif self.imputation_order == 'random':
+ ordered_idx = missing_values_idx
+ self.random_state_.shuffle(ordered_idx)
+ else:
+ raise ValueError("Got an invalid imputation order: '{0}'. It must "
+ "be one of the following: 'roman', 'arabic', "
+ "'ascending', 'descending', or "
+ "'random'.".format(self.imputation_order))
+ return ordered_idx
+
+ def _get_abs_corr_mat(self, X_filled, tolerance=1e-6):
+ """Get absolute correlation matrix between features.
+
+ Parameters
+ ----------
+ X_filled : ndarray, shape (n_samples, n_features)
+ Input data with the most recent imputations.
+
+ tolerance : float, optional (default=1e-6)
+ ``abs_corr_mat`` can have nans, which will be replaced
+ with ``tolerance``.
+
+ Returns
+ -------
+ abs_corr_mat : ndarray, shape (n_features, n_features)
+ Absolute correlation matrix of ``X`` at the beginning of the
+ current round. The diagonal has been zeroed out and each feature's
+ absolute correlations with all others have been normalized to sum
+ to 1.
+ """
+ n_features = X_filled.shape[1]
+ if (self.n_nearest_features is None or
+ self.n_nearest_features >= n_features):
+ return None
+ abs_corr_mat = np.abs(np.corrcoef(X_filled.T))
+ # np.corrcoef is not defined for features with zero std
+ abs_corr_mat[np.isnan(abs_corr_mat)] = tolerance
+ # ensures exploration, i.e. at least some probability of sampling
+ abs_corr_mat[abs_corr_mat < tolerance] = tolerance
+ # features are not their own neighbors
+ np.fill_diagonal(abs_corr_mat, 0)
+ # needs to sum to 1 for np.random.choice sampling
+ abs_corr_mat = normalize(abs_corr_mat, norm='l1', axis=0, copy=False)
+ return abs_corr_mat
+
+ def _initial_imputation(self, X):
+ """Perform initial imputation for input X.
+
+ Parameters
+ ----------
+ X : ndarray, shape (n_samples, n_features)
+ Input data, where "n_samples" is the number of samples and
+ "n_features" is the number of features.
+
+ Returns
+ -------
+ Xt : ndarray, shape (n_samples, n_features)
+ Input data, where "n_samples" is the number of samples and
+ "n_features" is the number of features.
+
+ X_filled : ndarray, shape (n_samples, n_features)
+ Input data with the most recent imputations.
+
+ mask_missing_values : ndarray, shape (n_samples, n_features)
+ Input data's missing indicator matrix, where "n_samples" is the
+ number of samples and "n_features" is the number of features.
+ """
+ X = check_array(X, dtype=FLOAT_DTYPES, order="F",
+ force_all_finite='allow-nan'
+ if self.missing_values == 'NaN'
+ or np.isnan(self.missing_values) else True)
+
+ mask_missing_values = _get_mask(X, self.missing_values)
+ if self.initial_imputer_ is None:
+ self.initial_imputer_ = SimpleImputer(
+ missing_values=self.missing_values,
+ strategy=self.initial_strategy)
+ X_filled = self.initial_imputer_.fit_transform(X)
+ else:
+ X_filled = self.initial_imputer_.transform(X)
+
+ valid_mask = np.flatnonzero(np.logical_not(
+ np.isnan(self.initial_imputer_.statistics_)))
+ Xt = X[:, valid_mask]
+ mask_missing_values = mask_missing_values[:, valid_mask]
+
+ return Xt, X_filled, mask_missing_values
+
+ def fit_transform(self, X, y=None):
+        """Fits the imputer on X and returns the transformed X.
+
+ Parameters
+ ----------
+ X : array-like, shape (n_samples, n_features)
+ Input data, where "n_samples" is the number of samples and
+ "n_features" is the number of features.
+
+ y : ignored.
+
+ Returns
+ -------
+ Xt : array-like, shape (n_samples, n_features)
+ The imputed input data.
+ """
+ self.random_state_ = getattr(self, "random_state_",
+ check_random_state(self.random_state))
+
+ if self.predictor is None:
+ from .linear_model import BayesianRidge
+ self._predictor = BayesianRidge()
+ else:
+ self._predictor = clone(self.predictor)
+
+        self._min_value = -np.inf if self.min_value is None else self.min_value
+        self._max_value = np.inf if self.max_value is None else self.max_value
+
+ self.initial_imputer_ = None
+ X, X_filled, mask_missing_values = self._initial_imputation(X)
+
+ # edge case: in case the user specifies 0 for n_imputations,
+ # then there is no need to do burn in and the result should be
+ # just the initial imputation (before clipping)
+ if self.n_imputations < 1:
+ return X_filled
+
+ X_filled = np.clip(X_filled, self._min_value, self._max_value)
+
+ # order in which to impute
+ # note this is probably too slow for large feature data (d > 100000)
+ # and a better way would be good.
+ # see: https://goo.gl/KyCNwj and subsequent comments
+ ordered_idx = self._get_ordered_idx(mask_missing_values)
+
+ abs_corr_mat = self._get_abs_corr_mat(X_filled)
+
+ # impute data
+ n_rounds = self.n_burn_in + self.n_imputations
+ n_samples, n_features = X_filled.shape
+ Xt = np.zeros((n_samples, n_features), dtype=X.dtype)
+ self.imputation_sequence_ = []
+ if self.verbose > 0:
+ print("[MICE] Completing matrix with shape %s" % (X.shape,))
+ start_t = time()
+ for i_rnd in range(n_rounds):
+ if self.imputation_order == 'random':
+ ordered_idx = self._get_ordered_idx(mask_missing_values)
+
+ for feat_idx in ordered_idx:
+ neighbor_feat_idx = self._get_neighbor_feat_idx(n_features,
+ feat_idx,
+ abs_corr_mat)
+ X_filled, predictor = self._impute_one_feature(
+ X_filled, mask_missing_values, feat_idx, neighbor_feat_idx,
+ predictor=None, fit_mode=True)
+ predictor_triplet = MICETriplet(feat_idx,
+ neighbor_feat_idx,
+ predictor)
+ self.imputation_sequence_.append(predictor_triplet)
+
+ if i_rnd >= self.n_burn_in:
+ Xt += X_filled
+ if self.verbose > 0:
+ print('[MICE] Ending imputation round '
+ '%d/%d, elapsed time %0.2f'
+ % (i_rnd + 1, n_rounds, time() - start_t))
+
+ Xt /= self.n_imputations
+ Xt[~mask_missing_values] = X[~mask_missing_values]
+ return Xt
+
+ def transform(self, X):
+ """Imputes all missing values in X.
+
+ Note that this is stochastic, and that if random_state is not fixed,
+        repeated calls or permuted input will yield different results.
+
+ Parameters
+ ----------
+        X : array-like, shape (n_samples, n_features)
+ The input data to complete.
+
+ Returns
+ -------
+ Xt : array-like, shape (n_samples, n_features)
+ The imputed input data.
+ """
+ check_is_fitted(self, 'initial_imputer_')
+
+ X, X_filled, mask_missing_values = self._initial_imputation(X)
+
+ # edge case: in case the user specifies 0 for n_imputations,
+ # then there is no need to do burn in and the result should be
+ # just the initial imputation (before clipping)
+ if self.n_imputations < 1:
+ return X_filled
+
+ X_filled = np.clip(X_filled, self._min_value, self._max_value)
+
+ n_rounds = self.n_burn_in + self.n_imputations
+ n_imputations = len(self.imputation_sequence_)
+ imputations_per_round = n_imputations // n_rounds
+ i_rnd = 0
+ Xt = np.zeros(X.shape, dtype=X.dtype)
+ if self.verbose > 0:
+ print("[MICE] Completing matrix with shape %s" % (X.shape,))
+ start_t = time()
+ for it, predictor_triplet in enumerate(self.imputation_sequence_):
+ X_filled, _ = self._impute_one_feature(
+ X_filled,
+ mask_missing_values,
+ predictor_triplet.feat_idx,
+ predictor_triplet.neighbor_feat_idx,
+ predictor=predictor_triplet.predictor,
+ fit_mode=False
+ )
+ if not (it + 1) % imputations_per_round:
+ if i_rnd >= self.n_burn_in:
+ Xt += X_filled
+ if self.verbose > 1:
+ print('[MICE] Ending imputation round '
+ '%d/%d, elapsed time %0.2f'
+ % (i_rnd + 1, n_rounds, time() - start_t))
+ i_rnd += 1
+
+ Xt /= self.n_imputations
+ Xt[~mask_missing_values] = X[~mask_missing_values]
+ return Xt
+
+ def fit(self, X, y=None):
+        """Fits the imputer on X and returns self.
+
+ Parameters
+ ----------
+ X : array-like, shape (n_samples, n_features)
+ Input data, where "n_samples" is the number of samples and
+ "n_features" is the number of features.
+
+ y : ignored
+
+ Returns
+ -------
+ self : object
+ Returns self.
+ """
+ self.fit_transform(X)
+ return self
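Editorial aside: the `fit_transform` and `transform` loops above accumulate `X_filled` only for rounds at or past `n_burn_in` and then divide by `n_imputations`. A toy pure-Python sketch of that burn-in/averaging pattern, with made-up scalars standing in for whole imputed matrices:

```python
# One scalar per MICE round stands in for a whole X_filled matrix.
# The first n_burn_in rounds are discarded; the rest are averaged.
n_burn_in = 2
n_imputations = 3
round_results = [10.0, 20.0, 3.0, 4.0, 5.0]  # n_burn_in + n_imputations rounds

acc = 0.0
for i_rnd, filled in enumerate(round_results):
    if i_rnd >= n_burn_in:  # mirrors `if i_rnd >= self.n_burn_in`
        acc += filled
result = acc / n_imputations
```

Discarding the early rounds lets the chained-equation estimates stabilize before they contribute to the final average.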
| diff --git a/sklearn/tests/test_impute.py b/sklearn/tests/test_impute.py
index f2bf5912e2213..954a016a835bb 100644
--- a/sklearn/tests/test_impute.py
+++ b/sklearn/tests/test_impute.py
@@ -1,14 +1,19 @@
+from __future__ import division
+
+import pytest
import numpy as np
from scipy import sparse
-from sklearn.utils.testing import assert_equal
+from sklearn.utils.testing import assert_allclose
from sklearn.utils.testing import assert_array_equal
from sklearn.utils.testing import assert_array_almost_equal
from sklearn.utils.testing import assert_raises
from sklearn.utils.testing import assert_false
-from sklearn.impute import SimpleImputer
+from sklearn.impute import SimpleImputer, MICEImputer
+from sklearn.dummy import DummyRegressor
+from sklearn.linear_model import BayesianRidge, ARDRegression
from sklearn.pipeline import Pipeline
from sklearn.model_selection import GridSearchCV
from sklearn import tree
@@ -61,10 +66,14 @@ def test_imputation_shape():
for strategy in ['mean', 'median', 'most_frequent']:
imputer = SimpleImputer(strategy=strategy)
- X_imputed = imputer.fit_transform(X)
- assert_equal(X_imputed.shape, (10, 2))
X_imputed = imputer.fit_transform(sparse.csr_matrix(X))
- assert_equal(X_imputed.shape, (10, 2))
+ assert X_imputed.shape == (10, 2)
+ X_imputed = imputer.fit_transform(X)
+ assert X_imputed.shape == (10, 2)
+
+ mice_imputer = MICEImputer(initial_strategy=strategy)
+ X_imputed = mice_imputer.fit_transform(X)
+ assert X_imputed.shape == (10, 2)
def safe_median(arr, *args, **kwargs):
@@ -257,3 +266,227 @@ def test_imputation_copy():
Xt = imputer.fit(X).transform(X)
Xt.data[0] = -1
assert_false(np.all(X.data == Xt.data))
+
+ # Note: If X is sparse and if missing_values=0, then a (dense) copy of X is
+ # made, even if copy=False.
+
+
+def test_mice_rank_one():
+ rng = np.random.RandomState(0)
+ d = 100
+ A = rng.rand(d, 1)
+ B = rng.rand(1, d)
+ X = np.dot(A, B)
+ nan_mask = rng.rand(d, d) < 0.5
+ X_missing = X.copy()
+ X_missing[nan_mask] = np.nan
+
+ imputer = MICEImputer(n_imputations=5,
+ n_burn_in=5,
+ verbose=True,
+ random_state=rng)
+ X_filled = imputer.fit_transform(X_missing)
+ assert_allclose(X_filled, X, atol=0.001)
+
+
+@pytest.mark.parametrize(
+ "imputation_order",
+ ['random', 'roman', 'ascending', 'descending', 'arabic']
+)
+def test_mice_imputation_order(imputation_order):
+ rng = np.random.RandomState(0)
+ n = 100
+ d = 10
+ X = sparse_random_matrix(n, d, density=0.10, random_state=rng).toarray()
+ X[:, 0] = 1 # this column should not be discarded by MICEImputer
+
+ imputer = MICEImputer(missing_values=0,
+ n_imputations=1,
+ n_burn_in=1,
+ n_nearest_features=5,
+ min_value=0,
+ max_value=1,
+ verbose=False,
+ imputation_order=imputation_order,
+ random_state=rng)
+ imputer.fit_transform(X)
+ ordered_idx = [i.feat_idx for i in imputer.imputation_sequence_]
+ if imputation_order == 'roman':
+ assert np.all(ordered_idx[:d-1] == np.arange(1, d))
+ elif imputation_order == 'arabic':
+ assert np.all(ordered_idx[:d-1] == np.arange(d-1, 0, -1))
+ elif imputation_order == 'random':
+ ordered_idx_round_1 = ordered_idx[:d-1]
+ ordered_idx_round_2 = ordered_idx[d-1:]
+ assert ordered_idx_round_1 != ordered_idx_round_2
+ elif 'ending' in imputation_order:
+ assert len(ordered_idx) == 2 * (d - 1)
+
+
+@pytest.mark.parametrize(
+ "predictor",
+ [DummyRegressor(), BayesianRidge(), ARDRegression()]
+)
+def test_mice_predictors(predictor):
+ rng = np.random.RandomState(0)
+
+ n = 100
+ d = 10
+ X = sparse_random_matrix(n, d, density=0.10, random_state=rng).toarray()
+
+ imputer = MICEImputer(missing_values=0,
+ n_imputations=1,
+ n_burn_in=1,
+ predictor=predictor,
+ random_state=rng)
+ imputer.fit_transform(X)
+
+ # check that types are correct for predictors
+ hashes = []
+ for triplet in imputer.imputation_sequence_:
+ assert triplet.predictor
+ hashes.append(id(triplet.predictor))
+
+ # check that each predictor is unique
+ assert len(set(hashes)) == len(hashes)
+
+
+def test_mice_clip():
+ rng = np.random.RandomState(0)
+ n = 100
+ d = 10
+ X = sparse_random_matrix(n, d, density=0.10,
+ random_state=rng).toarray()
+
+ imputer = MICEImputer(missing_values=0,
+ n_imputations=1,
+ n_burn_in=1,
+ min_value=0.1,
+ max_value=0.2,
+ random_state=rng)
+
+ Xt = imputer.fit_transform(X)
+ assert_allclose(np.min(Xt[X == 0]), 0.1)
+ assert_allclose(np.max(Xt[X == 0]), 0.2)
+ assert_allclose(Xt[X != 0], X[X != 0])
+
+
+@pytest.mark.parametrize(
+ "strategy",
+ ["mean", "median", "most_frequent"]
+)
+def test_mice_missing_at_transform(strategy):
+ rng = np.random.RandomState(0)
+ n = 100
+ d = 10
+ X_train = rng.randint(low=0, high=3, size=(n, d))
+ X_test = rng.randint(low=0, high=3, size=(n, d))
+
+ X_train[:, 0] = 1 # definitely no missing values in 0th column
+ X_test[0, 0] = 0 # definitely missing value in 0th column
+
+ mice = MICEImputer(missing_values=0,
+ n_imputations=1,
+ n_burn_in=1,
+ initial_strategy=strategy,
+ random_state=rng).fit(X_train)
+ initial_imputer = SimpleImputer(missing_values=0,
+ strategy=strategy).fit(X_train)
+
+ # if there were no missing values at time of fit, then mice will
+ # only use the initial imputer for that feature at transform
+ assert np.all(mice.transform(X_test)[:, 0] ==
+ initial_imputer.transform(X_test)[:, 0])
+
+
+def test_mice_transform_stochasticity():
+ rng = np.random.RandomState(0)
+ n = 100
+ d = 10
+ X = sparse_random_matrix(n, d, density=0.10,
+ random_state=rng).toarray()
+
+ imputer = MICEImputer(missing_values=0,
+ n_imputations=1,
+ n_burn_in=1,
+ random_state=rng)
+ imputer.fit(X)
+
+ X_fitted_1 = imputer.transform(X)
+ X_fitted_2 = imputer.transform(X)
+
+ # sufficient to assert that the means are not the same
+ assert np.mean(X_fitted_1) != pytest.approx(np.mean(X_fitted_2))
+
+
+def test_mice_no_missing():
+ rng = np.random.RandomState(0)
+ X = rng.rand(100, 100)
+ X[:, 0] = np.nan
+ m1 = MICEImputer(n_imputations=10, random_state=rng)
+ m2 = MICEImputer(n_imputations=10, random_state=rng)
+ pred1 = m1.fit(X).transform(X)
+ pred2 = m2.fit_transform(X)
+ # should exclude the first column entirely
+ assert_allclose(X[:, 1:], pred1)
+ # fit and fit_transform should both be identical
+ assert_allclose(pred1, pred2)
+
+
+@pytest.mark.parametrize(
+ "rank",
+ [3, 5]
+)
+def test_mice_transform_recovery(rank):
+ rng = np.random.RandomState(0)
+ n = 100
+ d = 100
+ A = rng.rand(n, rank)
+ B = rng.rand(rank, d)
+ X_filled = np.dot(A, B)
+ # half is randomly missing
+ nan_mask = rng.rand(n, d) < 0.5
+ X_missing = X_filled.copy()
+ X_missing[nan_mask] = np.nan
+
+ # split up data in half
+ n = n // 2
+ X_train = X_missing[:n]
+ X_test_filled = X_filled[n:]
+ X_test = X_missing[n:]
+
+ imputer = MICEImputer(n_imputations=10,
+ n_burn_in=10,
+ verbose=True,
+ random_state=rng).fit(X_train)
+ X_test_est = imputer.transform(X_test)
+ assert_allclose(X_test_filled, X_test_est, rtol=1e-5, atol=0.1)
+
+
+def test_mice_additive_matrix():
+ rng = np.random.RandomState(0)
+ n = 100
+ d = 10
+ A = rng.randn(n, d)
+ B = rng.randn(n, d)
+ X_filled = np.zeros(A.shape)
+ for i in range(d):
+ for j in range(d):
+ X_filled[:, (i+j) % d] += (A[:, i] + B[:, j]) / 2
+ # a quarter is randomly missing
+ nan_mask = rng.rand(n, d) < 0.25
+ X_missing = X_filled.copy()
+ X_missing[nan_mask] = np.nan
+
+ # split up data
+ n = n // 2
+ X_train = X_missing[:n]
+ X_test_filled = X_filled[n:]
+ X_test = X_missing[n:]
+
+ imputer = MICEImputer(n_imputations=25,
+ n_burn_in=10,
+ verbose=True,
+ random_state=rng).fit(X_train)
+ X_test_est = imputer.transform(X_test)
+ assert_allclose(X_test_filled, X_test_est, atol=0.01)
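Editorial aside: `test_mice_clip` above checks that imputed entries respect `min_value`/`max_value`, which the imputer enforces with `np.clip`. A dependency-free sketch of that bounding behavior (the values and bounds here are made up):

```python
# Plain-Python equivalent of np.clip(values, min_value, max_value),
# applied only to imputed entries, as in test_mice_clip.
def clip(value, lo, hi):
    return max(lo, min(hi, value))

imputed = [0.05, 0.15, 0.90]  # hypothetical raw imputations
bounded = [clip(v, 0.1, 0.2) for v in imputed]
```

Entries that were observed (not imputed) are restored from the original data afterwards, so clipping never alters known values.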
diff --git a/sklearn/utils/estimator_checks.py b/sklearn/utils/estimator_checks.py
index 9a321e914b238..8508e166cd8f9 100644
--- a/sklearn/utils/estimator_checks.py
+++ b/sklearn/utils/estimator_checks.py
@@ -72,7 +72,7 @@
'OrthogonalMatchingPursuit', 'PLSCanonical', 'PLSRegression',
'RANSACRegressor', 'RadiusNeighborsRegressor',
'RandomForestRegressor', 'Ridge', 'RidgeCV']
-
+ALLOW_NAN = ['Imputer', 'SimpleImputer', 'MICEImputer']
def _yield_non_meta_checks(name, estimator):
yield check_estimators_dtypes
@@ -93,7 +93,7 @@ def _yield_non_meta_checks(name, estimator):
# cross-decomposition's "transform" returns X and Y
yield check_pipeline_consistency
- if name not in ['SimpleImputer', 'Imputer']:
+ if name not in ALLOW_NAN:
# Test that all estimators check their input for NaN's and infs
yield check_estimators_nan_inf
| diff --git a/doc/modules/classes.rst b/doc/modules/classes.rst
index 2e7dcba82e846..243c63ab0c7e2 100644
--- a/doc/modules/classes.rst
+++ b/doc/modules/classes.rst
@@ -646,6 +646,7 @@ Kernels:
:template: class.rst
impute.SimpleImputer
+ impute.MICEImputer
.. _kernel_approximation_ref:
diff --git a/doc/modules/impute.rst b/doc/modules/impute.rst
index e806cc2fd5b4a..c4e8a3395025c 100644
--- a/doc/modules/impute.rst
+++ b/doc/modules/impute.rst
@@ -15,6 +15,10 @@ values. However, this comes at the price of losing data which may be valuable
(even though incomplete). A better strategy is to impute the missing values,
i.e., to infer them from the known part of the data.
+
+Univariate feature imputation
+=============================
+
The :class:`SimpleImputer` class provides basic strategies for imputing missing
values, either using the mean, the median or the most frequent value of
the row or column in which the missing values are located. This class
@@ -52,5 +56,34 @@ Note that, here, missing values are encoded by 0 and are thus implicitly stored
in the matrix. This format is thus suitable when there are many more missing
values than observed values.
-:class:`SimpleImputer` can be used in a Pipeline as a way to build a composite
-estimator that supports imputation. See :ref:`sphx_glr_auto_examples_plot_missing_values.py`.
+
+Multivariate feature imputation
+===============================
+
+A more sophisticated approach is to use the :class:`MICEImputer` class, which
+implements the Multivariate Imputation by Chained Equations technique. MICE
+models each feature with missing values as a function of other features, and
+uses that estimate for imputation. It does so in a round-robin fashion: at
+each step, a feature column is designated as output `y` and the other feature
+columns are treated as inputs `X`. A regressor is fit on `(X, y)` for known `y`.
+Then, the regressor is used to predict the unknown values of `y`. This is
+repeated for each feature, and the whole cycle is run for several imputation rounds.
+Here is an example snippet::
+
+ >>> import numpy as np
+ >>> from sklearn.impute import MICEImputer
+ >>> imp = MICEImputer(n_imputations=10, random_state=0)
+ >>> imp.fit([[1, 2], [np.nan, 3], [7, np.nan]])
+ MICEImputer(imputation_order='ascending', initial_strategy='mean',
+ max_value=None, min_value=None, missing_values='NaN', n_burn_in=10,
+ n_imputations=10, n_nearest_features=None, predictor=None,
+ random_state=0, verbose=False)
+ >>> X_test = [[np.nan, 2], [6, np.nan], [np.nan, 6]]
+ >>> print(np.round(imp.transform(X_test)))
+ [[ 1. 2.]
+ [ 6. 4.]
+ [13. 6.]]
+
+Both :class:`SimpleImputer` and :class:`MICEImputer` can be used in a Pipeline
+as a way to build a composite estimator that supports imputation.
+See :ref:`sphx_glr_auto_examples_plot_missing_values.py`.
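The round-robin procedure described in the documentation diff above can be sketched with plain NumPy. This is an illustrative toy, not the scikit-learn implementation: it uses ordinary least squares in place of `BayesianRidge`, a fixed number of rounds, and no posterior sampling.

```python
import numpy as np

def round_robin_impute(X, n_rounds=5):
    """Toy MICE-style imputation: regress each incomplete feature
    on the others and refill its missing entries, repeatedly."""
    X = np.asarray(X, dtype=float)
    missing = np.isnan(X)
    # initial mean imputation, as in initial_strategy="mean"
    X_filled = np.where(missing, np.nanmean(X, axis=0), X)
    for _ in range(n_rounds):
        for j in range(X.shape[1]):
            rows = missing[:, j]
            if not rows.any():
                continue
            others = [k for k in range(X.shape[1]) if k != j]
            # fit least squares on rows where feature j is observed
            A = np.c_[np.ones((~rows).sum()), X_filled[~rows][:, others]]
            coef, *_ = np.linalg.lstsq(A, X_filled[~rows, j], rcond=None)
            # predict the unknown entries of feature j
            B = np.c_[np.ones(rows.sum()), X_filled[rows][:, others]]
            X_filled[rows, j] = B @ coef
    return X_filled

print(round_robin_impute([[1.0, 2.0], [np.nan, 3.0], [7.0, np.nan]]))
```

Observed values are left untouched; only the originally missing cells are rewritten on each pass.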
| [
{
"components": [
{
"doc": "",
"lines": [
30,
80
],
"name": "get_results",
"signature": "def get_results(dataset):",
"type": "function"
}
],
"file": "examples/plot_missing_values.py"
},
{
"components": [
{
... | [
"sklearn/tests/test_impute.py::test_imputation_shape",
"sklearn/tests/test_impute.py::test_imputation_mean_median",
"sklearn/tests/test_impute.py::test_imputation_median_special_cases",
"sklearn/tests/test_impute.py::test_imputation_most_frequent",
"sklearn/tests/test_impute.py::test_imputation_pipeline_gri... | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
[MRG+2] Basic version of MICE Imputation
Reference Issue
This is in reference to #7840, and builds on #7838.
Fixes #7840.
This code provides basic MICE imputation functionality. It currently only uses Bayesian linear regression as the prediction model. Once this is merged, I will add predictive mean matching (slower but sometimes better). See here for a reference: https://stat.ethz.ch/education/semesters/ss2012/ams/paper/mice.pdf
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in examples/plot_missing_values.py]
(definition of get_results:)
def get_results(dataset):
[end of new definitions in examples/plot_missing_values.py]
[start of new definitions in sklearn/impute.py]
(definition of MICEImputer:)
class MICEImputer(BaseEstimator, TransformerMixin):
"""MICE transformer to impute missing values.
Basic implementation of MICE (Multivariate Imputations by Chained
Equations) package from R. This version assumes all of the features are
Gaussian.
Parameters
----------
missing_values : int or "NaN", optional (default="NaN")
The placeholder for the missing values. All occurrences of
``missing_values`` will be imputed. For missing values encoded as
np.nan, use the string value "NaN".
imputation_order : str, optional (default="ascending")
The order in which the features will be imputed. Possible values:
"ascending"
From features with fewest missing values to most.
"descending"
From features with most missing values to fewest.
"roman"
Left to right.
"arabic"
Right to left.
"random"
A random order for each round.
n_imputations : int, optional (default=100)
Number of MICE rounds to perform, the results of which will be
used in the final average.
n_burn_in : int, optional (default=10)
Number of initial MICE rounds to perform, the results of which
will not be returned.
predictor : estimator object, default=BayesianRidge()
The predictor to use at each step of the round-robin imputation.
It must support ``return_std`` in its ``predict`` method.
n_nearest_features : int, optional (default=None)
Number of other features to use to estimate the missing values of
each feature column. Nearness between features is measured using
the absolute correlation coefficient between each feature pair (after
initial imputation). Can provide significant speed-up when the number
of features is huge. If ``None``, all features will be used.
initial_strategy : str, optional (default="mean")
Which strategy to use to initialize the missing values. Same as the
``strategy`` parameter in :class:`sklearn.preprocessing.Imputer`.
Valid values: {"mean", "median", or "most_frequent"}.
min_value : float, optional (default=None)
Minimum possible imputed value. Default of ``None`` will set minimum
to negative infinity.
max_value : float, optional (default=None)
Maximum possible imputed value. Default of ``None`` will set maximum
to positive infinity.
verbose : int, optional (default=0)
Verbosity flag, controls the debug messages that are issued
as functions are evaluated. The higher, the more verbose. Can be 0, 1,
or 2.
random_state : int, RandomState instance or None, optional (default=None)
The seed of the pseudo random number generator to use when shuffling
the data. If int, random_state is the seed used by the random number
generator; If RandomState instance, random_state is the random number
generator; If None, the random number generator is the RandomState
instance used by ``np.random``.
Attributes
----------
initial_imputer_ : object of class :class:`sklearn.preprocessing.Imputer`
The imputer used to initialize the missing values.
imputation_sequence_ : list of tuples
Each tuple has ``(feat_idx, neighbor_feat_idx, predictor)``, where
``feat_idx`` is the current feature to be imputed,
``neighbor_feat_idx`` is the array of other features used to impute the
current feature, and ``predictor`` is the trained predictor used for
the imputation.
Notes
-----
The R version of MICE does not have inductive functionality, i.e. first
fitting on ``X_train`` and then transforming any ``X_test`` without
additional fitting. We do this by storing each feature's predictor during
the round-robin ``fit`` phase, and predicting without refitting (in order)
during the ``transform`` phase.
Features which contain all missing values at ``fit`` are discarded upon
``transform``.
Features with missing values in transform which did not have any missing
values in fit will be imputed with the initial imputation method only.
References
----------
.. [1] `Stef van Buuren, Karin Groothuis-Oudshoorn (2011). "mice:
Multivariate Imputation by Chained Equations in R". Journal of
Statistical Software 45: 1-67.
<https://www.jstatsoft.org/article/view/v045i03>`_"""
(definition of MICEImputer.__init__:)
def __init__(self, missing_values='NaN', imputation_order='ascending', n_imputations=100, n_burn_in=10, predictor=None, n_nearest_features=None, initial_strategy="mean", min_value=None, max_value=None, verbose=False, random_state=None):
(definition of MICEImputer._impute_one_feature:)
def _impute_one_feature(self, X_filled, mask_missing_values, feat_idx, neighbor_feat_idx, predictor=None, fit_mode=True):
"""Impute a single feature from the others provided.
This function predicts the missing values of one of the features using
the current estimates of all the other features. The ``predictor`` must
support ``return_std=True`` in its ``predict`` method for this function
to work.
Parameters
----------
X_filled : ndarray
Input data with the most recent imputations.
mask_missing_values : ndarray
Input data's missing indicator matrix.
feat_idx : int
Index of the feature currently being imputed.
neighbor_feat_idx : ndarray
Indices of the features to be used in imputing ``feat_idx``.
predictor : object
The predictor to use at this step of the round-robin imputation.
It must support ``return_std`` in its ``predict`` method.
If None, it will be cloned from self._predictor.
fit_mode : boolean, default=True
Whether to fit and predict with the predictor or just predict.
Returns
-------
X_filled : ndarray
Input data with ``X_filled[missing_row_mask, feat_idx]`` updated.
predictor : predictor with sklearn API
The fitted predictor used to impute
``X_filled[missing_row_mask, feat_idx]``."""
(definition of MICEImputer._get_neighbor_feat_idx:)
def _get_neighbor_feat_idx(self, n_features, feat_idx, abs_corr_mat):
"""Get a list of other features to predict ``feat_idx``.
If self.n_nearest_features is less than or equal to the total
number of features, then use a probability proportional to the absolute
correlation between ``feat_idx`` and each other feature to randomly
choose a subsample of the other features (without replacement).
Parameters
----------
n_features : int
Number of features in ``X``.
feat_idx : int
Index of the feature currently being imputed.
abs_corr_mat : ndarray, shape (n_features, n_features)
Absolute correlation matrix of ``X``. The diagonal has been zeroed
out and each feature has been normalized to sum to 1. Can be None.
Returns
-------
neighbor_feat_idx : array-like
The features to use to impute ``feat_idx``."""
(definition of MICEImputer._get_ordered_idx:)
def _get_ordered_idx(self, mask_missing_values):
"""Decide in what order we will update the features.
As a homage to the MICE R package, we will have 4 main options of
how to order the updates, and use a random order if anything else
is specified.
Also, this function skips features which have no missing values.
Parameters
----------
mask_missing_values : array-like, shape (n_samples, n_features)
Input data's missing indicator matrix, where "n_samples" is the
number of samples and "n_features" is the number of features.
Returns
-------
ordered_idx : ndarray, shape (n_features,)
The order in which to impute the features."""
(definition of MICEImputer._get_abs_corr_mat:)
def _get_abs_corr_mat(self, X_filled, tolerance=1e-6):
"""Get absolute correlation matrix between features.
Parameters
----------
X_filled : ndarray, shape (n_samples, n_features)
Input data with the most recent imputations.
tolerance : float, optional (default=1e-6)
``abs_corr_mat`` can have nans, which will be replaced
with ``tolerance``.
Returns
-------
abs_corr_mat : ndarray, shape (n_features, n_features)
Absolute correlation matrix of ``X`` at the beginning of the
current round. The diagonal has been zeroed out and each feature's
absolute correlations with all others have been normalized to sum
to 1."""
(definition of MICEImputer._initial_imputation:)
def _initial_imputation(self, X):
"""Perform initial imputation for input X.
Parameters
----------
X : ndarray, shape (n_samples, n_features)
Input data, where "n_samples" is the number of samples and
"n_features" is the number of features.
Returns
-------
Xt : ndarray, shape (n_samples, n_features)
Input data, where "n_samples" is the number of samples and
"n_features" is the number of features.
X_filled : ndarray, shape (n_samples, n_features)
Input data with the most recent imputations.
mask_missing_values : ndarray, shape (n_samples, n_features)
Input data's missing indicator matrix, where "n_samples" is the
number of samples and "n_features" is the number of features."""
(definition of MICEImputer.fit_transform:)
def fit_transform(self, X, y=None):
"""Fits the imputer on X and return the transformed X.
Parameters
----------
X : array-like, shape (n_samples, n_features)
Input data, where "n_samples" is the number of samples and
"n_features" is the number of features.
y : ignored.
Returns
-------
Xt : array-like, shape (n_samples, n_features)
The imputed input data."""
(definition of MICEImputer.transform:)
def transform(self, X):
"""Imputes all missing values in X.
Note that this is stochastic, and that if random_state is not fixed,
repeated calls or permuted input will yield different results.
Parameters
----------
X : array-like, shape = [n_samples, n_features]
The input data to complete.
Returns
-------
Xt : array-like, shape (n_samples, n_features)
The imputed input data."""
(definition of MICEImputer.fit:)
def fit(self, X, y=None):
"""Fits the imputer on X and return self.
Parameters
----------
X : array-like, shape (n_samples, n_features)
Input data, where "n_samples" is the number of samples and
"n_features" is the number of features.
y : ignored
Returns
-------
self : object
Returns self."""
[end of new definitions in sklearn/impute.py]
</definitions>
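The neighbor-selection rule documented in `_get_neighbor_feat_idx` above — sampling other features without replacement, with probability proportional to absolute correlation — can be illustrated with NumPy's `choice`. The correlation row below is made up for the example; in the real class it comes from `_get_abs_corr_mat` with the diagonal zeroed and rows normalized to sum to 1.

```python
import numpy as np

rng = np.random.RandomState(0)
n_features, n_nearest_features, feat_idx = 4, 2, 0

# hypothetical row of abs_corr_mat for feat_idx: the diagonal entry is
# zeroed so a feature never predicts itself, and the rest sum to 1
p = np.array([0.0, 0.5, 0.3, 0.2])

neighbor_feat_idx = rng.choice(n_features, size=n_nearest_features,
                               replace=False, p=p)
print(sorted(neighbor_feat_idx))
```

With `n_nearest_features=None`, the class skips this sampling and uses all other features.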
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | Here is the discussion in the issues of the pull request.
<issues>
MICE imputer
Proceeding from https://github.com/scikit-learn/scikit-learn/pull/4844#issuecomment-253043238 is a suggestion that MICE imputation be included in scikit-learn. @sergeyf has implemented it [here](https://github.com/hammerlab/fancyimpute/blob/master/fancyimpute/mice.py).
Here we will discuss issues related to its design and inclusion.
----------
@sergeyf [wrote](https://github.com/scikit-learn/scikit-learn/pull/4844#issuecomment-259055975):
> @amueller and @jnothman do either of you have a sense of what the right structure for MICE would be? It's pretty involved, so I don't want to just add a new option to `Imputer`. What about adding a new `MICE` class to `imputation.py`?
>
> Also, the `fit` method doesn't really make sense for MICE. It can pretty much only do `fit_transform` or `transform`: you provide some `X_train` that has holes in it, and MICE fills them in. If you then have an `X_test`, it doesn't use any of the info from `X_train` to fill it in. The simplest thing to do here would be to have `fit` do nothing, have `transform` do all the work, and have `fit_transform` alias to `transform`.
Yes, please create a new `MICEImputer` class.
I think we need an API that allows people to use this in a standard supervised prediction pipeline. Two issues with your proposal then:
1. the data being transformed is often much smaller (and therefore less informative) than the training data. To solve this, we could store the training data (pre-imputed, IMO) and perform the imputation rounds on the concatenation of training and transform data. However:
2. a downstream estimator will operate best when the training transformation and testing transformation is identical. To solve this, we could instead store the model constructed for each column in each of the `n_imputations` performed at train time. (In the case of PMM, we would draw the K nearest neighbors only from the training data as imputed at that point in time.) This also ensures that each row is transformed deterministically given a fitted model.
Could you please comment on how this relates to what's possible in the R interface?
@jnothman It is certainly often true that the data being transformed is smaller than the training data. I like the idea of storing each model for each iteration that was done during the fit phase (as well as the order the imputations were done in). So this would be `fit` or `fit_transform`?
Also, I wasn't going to implement PMM for this pull request: it's much slower and did not work nearly as well in most of my experiments. If someone requests it later, I can add it on without too much trouble.
As for MICE in R: I'm not too familiar with it, but I think MICE just fills in matrices. There is no notion of training and test - that is left up to the user.
One key point is that MICE provides the last `n_last` imputations, and doesn't average them for you. The intent is to then take the multiple imputations and use them all in parallel downstream, averaging only as the final step (after, say, training and prediction of `n_last` classifiers). We could return both an averaged version, and optionally return the last `n_last` imputations. What do you think?
`fit` and `fit_transform` are meant to be functionally identical in terms of the model that is stored. Both would run the full imputation process on the training data and store the sufficient information to run the imputations on new data in `transform`. I suspect in this case little would differ between their implementations but the return value.
OK, sounds good, thanks.
I think I have a sense of what to do. I'll get started once the `return_std` is accepted.
Though you're right, we don't _need_ any ability to transform new data (just as we do not provide this in TSNE), as the technique is unsupervised. So first implement `fit_transform`, then we'll decide whether it's worth storing the model and implementing `transform`.
Well I already implemented `transform` as it was fairly straight-forward. I'll leave it in and we can trim it later if that seems like the wiser path.
--------------------
</issues> | 51407623e4f491f00e3b465626dd5c4b55860bd0 |
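The design settled on in the discussion — store the predictor fitted at each round-robin step during `fit`, then replay them in order at `transform` time — can be sketched as follows. This is a toy: plain least-squares coefficient vectors stand in for the stored predictor objects, and the tuples mirror the `imputation_sequence_` attribute.

```python
import numpy as np

def fit_phase(X, n_rounds=3):
    X = np.asarray(X, dtype=float)
    missing = np.isnan(X)
    means = np.nanmean(X, axis=0)
    X_filled = np.where(missing, means, X)
    imputation_sequence = []  # analogue of imputation_sequence_
    for _ in range(n_rounds):
        for j in np.flatnonzero(missing.any(axis=0)):
            others = [k for k in range(X.shape[1]) if k != j]
            obs = ~missing[:, j]
            A = np.c_[np.ones(obs.sum()), X_filled[obs][:, others]]
            coef, *_ = np.linalg.lstsq(A, X_filled[obs, j], rcond=None)
            B = np.c_[np.ones((~obs).sum()), X_filled[~obs][:, others]]
            X_filled[~obs, j] = B @ coef
            imputation_sequence.append((j, others, coef))  # store the "predictor"
    return means, imputation_sequence

def transform_phase(X_new, means, imputation_sequence):
    X_new = np.asarray(X_new, dtype=float)
    missing = np.isnan(X_new)
    X_filled = np.where(missing, means, X_new)
    for j, others, coef in imputation_sequence:  # replay in fit order
        rows = missing[:, j]
        if rows.any():
            B = np.c_[np.ones(rows.sum()), X_filled[rows][:, others]]
            X_filled[rows, j] = B @ coef
    return X_filled
```

`fit_phase` plays the role of `fit` and `transform_phase` of `transform`: given the stored sequence, each new row is imputed deterministically, which is the property the discussion asks for.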
conan-io__conan-1032 | 1,032 | conan-io/conan | null | b0213d832650f92c0210d1b25f98b1f459441b50 | 2017-02-26T15:56:51Z | diff --git a/conans/client/command.py b/conans/client/command.py
index 0236ca67cfd..401d61b9670 100644
--- a/conans/client/command.py
+++ b/conans/client/command.py
@@ -328,6 +328,11 @@ def info(self, *args):
parser.add_argument("--file", "-f", help="specify conanfile filename")
parser.add_argument("--only", "-n", nargs="?", const="None",
help='show fields only')
+ parser.add_argument("--paths", action='store_true', default=False,
+ help='Show package paths in local cache')
+ parser.add_argument("--package_filter", nargs='?',
+ help='print information only for packages that match the filter'
+ 'e.g., MyPackage/1.2@user/channel or MyPackage*')
parser.add_argument("--build_order", "-bo",
help='given a modified reference, return an ordered list to build (CI)',
nargs=1, action=Extender)
@@ -353,11 +358,13 @@ def info(self, *args):
remote=args.remote,
profile=profile,
info=args.only,
+ package_filter=args.package_filter,
check_updates=args.update,
filename=args.file,
build_order=args.build_order,
build_mode=args.build,
- graph_filename=args.graph)
+ graph_filename=args.graph,
+ show_paths=args.paths)
def build(self, *args):
""" Utility command to run your current project 'conanfile.py' build() method.
@@ -815,7 +822,6 @@ def run(self, *args):
return errors
-
def _check_query_parameter_and_get_reference(args):
reference = None
if args.pattern:
@@ -828,6 +834,7 @@ def _check_query_parameter_and_get_reference(args):
"MyPackage/1.2@user/channel -q \"os=Windows\"")
return reference
+
def _parse_manifests_arguments(args, reference, current_path):
if args.manifests and args.manifests_interactive:
raise ConanException("Do not specify both manifests and "
diff --git a/conans/client/manager.py b/conans/client/manager.py
index 9dfb99409b8..65c45ff9848 100644
--- a/conans/client/manager.py
+++ b/conans/client/manager.py
@@ -188,7 +188,8 @@ def _get_graph(self, reference, current_path, profile, remote, filename, update,
def info(self, reference, current_path, profile, remote=None,
info=None, filename=None, update=False, check_updates=False,
- build_order=None, build_mode=None, graph_filename=None):
+ build_order=None, build_mode=None, graph_filename=None, package_filter=None,
+ show_paths=False):
""" Fetch and build all dependencies for the given reference
@param reference: ConanFileReference or path to user space conanfile
@param current_path: where the output files will be saved
@@ -239,7 +240,8 @@ def read_dates(deps_graph):
else:
Printer(self._user_io.out).print_info(deps_graph, project_reference,
info, registry, graph_updates_info,
- remote, read_dates(deps_graph))
+ remote, read_dates(deps_graph),
+ self._client_cache, package_filter, show_paths)
def install(self, reference, current_path, profile, remote=None,
build_mode=None, filename=None, update=False, check_updates=False,
diff --git a/conans/client/printer.py b/conans/client/printer.py
index cc25427842c..b99826647d6 100644
--- a/conans/client/printer.py
+++ b/conans/client/printer.py
@@ -1,9 +1,12 @@
from collections import OrderedDict
+from conans.paths import SimplePaths
+
from conans.client.output import Color
from conans.model.ref import ConanFileReference
from conans.model.ref import PackageReference
from conans.client.installer import build_id
+import fnmatch
class Printer(object):
@@ -41,8 +44,29 @@ def print_graph(self, deps_graph, registry):
self._out.writeln(" %s" % repr(ref), Color.BRIGHT_CYAN)
self._out.writeln("")
+ def _print_paths(self, ref, conan, path_resolver, show):
+ if isinstance(ref, ConanFileReference):
+ if show("export_folder"):
+ path = path_resolver.export(ref)
+ self._out.writeln(" export_folder: %s" % path, Color.BRIGHT_GREEN)
+ if show("source_folder"):
+ path = path_resolver.source(ref, conan.short_paths)
+ self._out.writeln(" source_folder: %s" % path, Color.BRIGHT_GREEN)
+ if show("build_folder") and isinstance(path_resolver, SimplePaths):
+ # @todo: check if this is correct or if it must always be package_id()
+ bid = build_id(conan)
+ if not bid:
+ bid = conan.info.package_id()
+ path = path_resolver.build(PackageReference(ref, bid), conan.short_paths)
+ self._out.writeln(" build_folder: %s" % path, Color.BRIGHT_GREEN)
+ if show("package_folder") and isinstance(path_resolver, SimplePaths):
+ id_ = conan.info.package_id()
+ path = path_resolver.package(PackageReference(ref, id_), conan.short_paths)
+ self._out.writeln(" package_folder: %s" % path, Color.BRIGHT_GREEN)
+
def print_info(self, deps_graph, project_reference, _info, registry, graph_updates_info=None,
- remote=None, node_times=None):
+ remote=None, node_times=None, path_resolver=None, package_filter=None,
+ show_paths=False):
""" Print the dependency information for a conan file
Attributes:
@@ -73,6 +97,8 @@ def show(field):
continue
else:
ref = project_reference
+ if package_filter and not fnmatch.fnmatch(str(ref), package_filter):
+ continue
self._out.writeln("%s" % str(ref), Color.BRIGHT_CYAN)
reg_remote = registry.get_ref(ref)
# Excludes PROJECT fake reference
@@ -87,6 +113,9 @@ def show(field):
bid = build_id(conan)
self._out.writeln(" BuildID: %s" % bid, Color.BRIGHT_GREEN)
+ if show_paths:
+ self._print_paths(ref, conan, path_resolver, show)
+
if isinstance(ref, ConanFileReference) and show("remote"):
if reg_remote:
self._out.writeln(" Remote: %s=%s" % (reg_remote.name, reg_remote.url),
| diff --git a/conans/test/command/info_folders_test.py b/conans/test/command/info_folders_test.py
new file mode 100644
index 00000000000..f3d54a16490
--- /dev/null
+++ b/conans/test/command/info_folders_test.py
@@ -0,0 +1,151 @@
+import unittest
+import os
+import platform
+
+from conans import tools
+from conans.test.tools import TestClient
+from conans.test.utils.test_files import temp_folder
+from conans.paths import CONANFILE
+from conans.model.ref import ConanFileReference, PackageReference
+import re
+
+
+conanfile_py = """
+from conans import ConanFile
+
+class AConan(ConanFile):
+ name = "MyPackage"
+ version = "0.1.0"
+ short_paths=False
+"""
+
+with_deps_path_file = """
+from conans import ConanFile
+
+class BConan(ConanFile):
+ name = "MyPackage2"
+ version = "0.2.0"
+ requires = "MyPackage/0.1.0@myUser/testing"
+"""
+
+deps_txt_file = """
+[requires]
+MyPackage2/0.2.0@myUser/testing
+"""
+
+
+class InfoFoldersTest(unittest.TestCase):
+ def setUp(self):
+ self.user_channel = "myUser/testing"
+ self.conan_ref = "MyPackage/0.1.0@%s" % self.user_channel
+ self.conan_ref2 = "MyPackage2/0.2.0@%s" % self.user_channel
+
+ def _prepare_deps(self, client):
+ client.save({CONANFILE: conanfile_py})
+ client.run("export %s" % self.user_channel)
+ client.save({CONANFILE: with_deps_path_file}, clean_first=True)
+ client.run("export %s" % self.user_channel)
+ client.save({'conanfile.txt': deps_txt_file}, clean_first=True)
+
+ def test_basic(self):
+ client = TestClient()
+ client.save({CONANFILE: conanfile_py})
+ client.run("export %s" % self.user_channel)
+ client.run("info --paths %s" % (self.conan_ref))
+ base_path = os.path.join("MyPackage", "0.1.0", "myUser", "testing")
+ output = client.user_io.out
+ self.assertIn(os.path.join(base_path, "export"), output)
+ self.assertIn(os.path.join(base_path, "source"), output)
+ id_ = "5ab84d6acfe1f23c4fae0ab88f26e3a396351ac9"
+ self.assertIn(os.path.join(base_path, "build", id_), output)
+ self.assertIn(os.path.join(base_path, "package", id_), output)
+
+ def test_deps_basic(self):
+ client = TestClient()
+ self._prepare_deps(client)
+
+ for ref in [self.conan_ref2, ""]:
+ client.run("info --paths %s" % (ref))
+ output = client.user_io.out
+
+ base_path = os.path.join("MyPackage", "0.1.0", "myUser", "testing")
+ self.assertIn(os.path.join(base_path, "export"), output)
+ self.assertIn(os.path.join(base_path, "source"), output)
+
+ base_path = os.path.join("MyPackage2", "0.2.0", "myUser", "testing")
+ self.assertIn(os.path.join(base_path, "export"), output)
+ self.assertIn(os.path.join(base_path, "source"), output)
+
+ def test_deps_specific_information(self):
+ client = TestClient()
+ self._prepare_deps(client)
+ client.run("info --paths --only package_folder --package_filter MyPackage/*")
+ output = client.user_io.out
+
+ base_path = os.path.join("MyPackage", "0.1.0", "myUser", "testing")
+ self.assertIn(os.path.join(base_path, "package"), output)
+ self.assertNotIn("build", output)
+ self.assertNotIn("MyPackage2", output)
+
+ client.run("info --paths --only package_folder --package_filter MyPackage*")
+ output = client.user_io.out
+
+ base_path = os.path.join("MyPackage", "0.1.0", "myUser", "testing")
+ self.assertIn(os.path.join(base_path, "package"), output)
+ self.assertNotIn("build", output)
+
+ base_path = os.path.join("MyPackage2", "0.2.0", "myUser", "testing")
+ self.assertIn(os.path.join(base_path, "package"), output)
+
+ def test_single_field(self):
+ client = TestClient()
+ client.save({CONANFILE: conanfile_py})
+ client.run("export %s" % self.user_channel)
+ client.run("info --paths --only=build_folder %s" % (self.conan_ref))
+ base_path = os.path.join("MyPackage", "0.1.0", "myUser", "testing")
+ output = client.user_io.out
+ self.assertNotIn("export", output)
+ self.assertNotIn("source", output)
+ self.assertIn(os.path.join(base_path, "build"), output)
+ self.assertNotIn("package", output)
+
+ def test_short_paths(self):
+ if platform.system() == "Windows":
+ folder = temp_folder(False)
+ short_folder = os.path.join(folder, ".cn")
+ with tools.environment_append({"CONAN_USER_HOME_SHORT": short_folder}):
+ client = TestClient(base_folder=folder)
+ client.save({CONANFILE: conanfile_py.replace("False", "True")})
+ client.run("export %s" % self.user_channel)
+ client.run("info --paths %s" % (self.conan_ref))
+ base_path = os.path.join("MyPackage", "0.1.0", "myUser", "testing")
+ output = client.user_io.out
+ self.assertIn(os.path.join(base_path, "export"), output)
+ self.assertNotIn(os.path.join(base_path, "source"), output)
+ self.assertNotIn(os.path.join(base_path, "build"), output)
+ self.assertNotIn(os.path.join(base_path, "package"), output)
+
+ self.assertIn("source_folder: %s" % short_folder, output)
+ self.assertIn("build_folder: %s" % short_folder, output)
+ self.assertIn("package_folder: %s" % short_folder, output)
+
+ # Ensure that the inner folders are not created (that could affect
+ # pkg creation flow
+ ref = ConanFileReference.loads(self.conan_ref)
+ id_ = re.search('ID:\s*([a-z0-9]*)', str(client.user_io.out)).group(1)
+ pkg_ref = PackageReference(ref, id_)
+ for path in (client.client_cache.source(ref, True),
+ client.client_cache.build(pkg_ref, True),
+ client.client_cache.package(pkg_ref, True)):
+ self.assertFalse(os.path.exists(path))
+ self.assertTrue(os.path.exists(os.path.dirname(path)))
+
+ def test_direct_conanfile(self):
+ client = TestClient()
+ client.save({CONANFILE: conanfile_py})
+ client.run("info")
+ output = client.user_io.out
+ self.assertNotIn("export_folder", output)
+ self.assertNotIn("source_folder", output)
+ self.assertNotIn("build_folder", output)
+ self.assertNotIn("package_folder", output)
diff --git a/conans/test/command/info_test.py b/conans/test/command/info_test.py
index a320d9b474d..7f171287380 100644
--- a/conans/test/command/info_test.py
+++ b/conans/test/command/info_test.py
@@ -201,7 +201,11 @@ def clean_output(output):
return "\n".join([line for line in str(output).splitlines()
if not line.strip().startswith("Creation date") and
not line.strip().startswith("ID") and
- not line.strip().startswith("BuildID")])
+ not line.strip().startswith("BuildID") and
+ not line.strip().startswith("export_folder") and
+ not line.strip().startswith("build_folder") and
+ not line.strip().startswith("source_folder") and
+ not line.strip().startswith("package_folder")])
# The timestamp is variable so we can't check the equality
self.assertIn(expected_output, clean_output(self.client.user_io.out))
diff --git a/conans/test/path_limit_test.py b/conans/test/path_limit_test.py
index 1e34251d16e..c0968be0a94 100644
--- a/conans/test/path_limit_test.py
+++ b/conans/test/path_limit_test.py
@@ -4,7 +4,6 @@
import os
from conans.model.ref import PackageReference, ConanFileReference
import platform
-import time
base = '''
| [
{
"components": [
{
"doc": "",
"lines": [
47,
65
],
"name": "Printer._print_paths",
"signature": "def _print_paths(self, ref, conan, path_resolver, show):",
"type": "function"
}
],
"file": "conans/client/printer.py"
}
] | [
"conans/test/command/info_folders_test.py::InfoFoldersTest::test_basic",
"conans/test/command/info_folders_test.py::InfoFoldersTest::test_deps_basic",
"conans/test/command/info_folders_test.py::InfoFoldersTest::test_deps_specific_information",
"conans/test/command/info_folders_test.py::InfoFoldersTest::test_s... | [
"conans/test/command/info_folders_test.py::InfoFoldersTest::test_direct_conanfile",
"conans/test/command/info_folders_test.py::InfoFoldersTest::test_short_paths"
] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
extend conan info command to show path information and filter by packages
A new path_info command was added to conan, taking a user and optional
reference, settings and options.
Returns the export, source, build and package folders for the given
configuration.
----------
</request>
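The new `--package_filter` option in the diff above matches references with `fnmatch`-style patterns. The matching behaviour is easy to check standalone; the reference strings below are the ones used in `test_deps_specific_information`.

```python
import fnmatch

refs = ["MyPackage/0.1.0@myUser/testing",
        "MyPackage2/0.2.0@myUser/testing"]

def filtered(pattern):
    # same predicate the printer uses: fnmatch.fnmatch(str(ref), package_filter)
    return [r for r in refs if fnmatch.fnmatch(r, pattern)]

print(filtered("MyPackage/*"))   # only the first reference
print(filtered("MyPackage*"))    # both references
```

Note that `fnmatch` is not path-aware, so `*` also matches `/` — which is why `MyPackage*` matches both packages while `MyPackage/*` pins the package name exactly.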
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in conans/client/printer.py]
(definition of Printer._print_paths:)
def _print_paths(self, ref, conan, path_resolver, show):
[end of new definitions in conans/client/printer.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 4a5b19a75db9225316c8cb022a2dfb9705a2af34 | ||
sympy__sympy-12171 | 12,171 | sympy/sympy | 1.0 | ca6ef27272be31c9dc3753ede9232c39df9a75d8 | 2017-02-13T18:20:56Z | diff --git a/sympy/printing/mathematica.py b/sympy/printing/mathematica.py
index 4c2ffd81e158..ef61abb6af17 100644
--- a/sympy/printing/mathematica.py
+++ b/sympy/printing/mathematica.py
@@ -109,6 +109,9 @@ def _print_Integral(self, expr):
def _print_Sum(self, expr):
return "Hold[Sum[" + ', '.join(self.doprint(a) for a in expr.args) + "]]"
+ def _print_Derivative(self, expr):
+ return "Hold[D[" + ', '.join(self.doprint(a) for a in expr.args) + "]]"
+
def mathematica_code(expr, **settings):
r"""Converts an expr to a string of the Wolfram Mathematica code
| diff --git a/sympy/printing/tests/test_mathematica.py b/sympy/printing/tests/test_mathematica.py
index 8fd05ae544ad..d4fb90bd1e63 100644
--- a/sympy/printing/tests/test_mathematica.py
+++ b/sympy/printing/tests/test_mathematica.py
@@ -1,5 +1,5 @@
from sympy.core import (S, pi, oo, symbols, Function,
- Rational, Integer, Tuple)
+ Rational, Integer, Tuple, Derivative)
from sympy.integrals import Integral
from sympy.concrete import Sum
from sympy.functions import exp, sin, cos
@@ -74,6 +74,14 @@ def test_Integral():
"{y, -Infinity, Infinity}]]"
+def test_Derivative():
+ assert mcode(Derivative(sin(x), x)) == "Hold[D[Sin[x], x]]"
+ assert mcode(Derivative(x, x)) == "Hold[D[x, x]]"
+ assert mcode(Derivative(sin(x)*y**4, x, 2)) == "Hold[D[y^4*Sin[x], x, x]]"
+ assert mcode(Derivative(sin(x)*y**4, x, y, x)) == "Hold[D[y^4*Sin[x], x, y, x]]"
+ assert mcode(Derivative(sin(x)*y**4, x, y, 3, x)) == "Hold[D[y^4*Sin[x], x, y, y, y, x]]"
+
+
def test_Sum():
assert mcode(Sum(sin(x), (x, 0, 10))) == "Hold[Sum[Sin[x], {x, 0, 10}]]"
assert mcode(Sum(exp(-x**2 - y**2),
| [
{
"components": [
{
"doc": "",
"lines": [
112,
113
],
"name": "MCodePrinter._print_Derivative",
"signature": "def _print_Derivative(self, expr):",
"type": "function"
}
],
"file": "sympy/printing/mathematica.py"
}
] | [
"test_Derivative"
] | [
"test_Integer",
"test_Rational",
"test_Function",
"test_Pow",
"test_Mul",
"test_constants",
"test_containers",
"test_Integral"
] | This is a feature request that requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
Added Mathematica printer for derivatives
Fixes #12163
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sympy/printing/mathematica.py]
(definition of MCodePrinter._print_Derivative:)
def _print_Derivative(self, expr):
[end of new definitions in sympy/printing/mathematica.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | Here is the discussion in the issues of the pull request.
<issues>
Mathematica code printer does not handle floats and derivatives correctly
In its current state the Mathematica code printer does not handle Derivative(func(vars), var) expressions,
e.g. Derivative(f(t), t) yields Derivative(f(t), t) instead of D[f[t], t].
Also, floats with exponents are not handled correctly, e.g. 1.0e-4 is not converted to 1.0*^-4.
This has an easy fix by adding the following lines to MCodePrinter:
    def _print_Derivative(self, expr):
        return "D[%s]" % (self.stringify(expr.args, ", "))

    def _print_Float(self, expr):
        res = str(expr)
        return res.replace('e', '*^')
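Taken on their own, the two methods proposed in this issue can be exercised outside SymPy. The sketch below is a simplified stand-in, not SymPy's actual code: `MiniMathematicaPrinter` and its `stringify` are hypothetical stubs for the machinery the real `MCodePrinter` inherits.

```python
# Hypothetical, self-contained stand-in for the printer methods
# suggested in the issue; `stringify` is a simplified stub for the
# helper the real MCodePrinter inherits from StrPrinter.
class MiniMathematicaPrinter:
    def stringify(self, args, sep):
        # Join the already-printed arguments with the separator.
        return sep.join(str(a) for a in args)

    def print_derivative(self, args):
        # Derivative(f(t), t)  ->  "D[f(t), t]"
        return "D[%s]" % self.stringify(args, ", ")

    def print_float(self, value):
        # Mathematica spells scientific notation with *^, e.g. 1*^-10.
        return repr(value).replace("e", "*^")


p = MiniMathematicaPrinter()
print(p.print_derivative(["f(t)", "t"]))  # D[f(t), t]
print(p.print_float(1.0e-10))             # 1*^-10
```

For comparison, the patch that was eventually merged wraps the derivative as `Hold[D[...]]` so that Mathematica does not evaluate the expression on input.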
----------
I would like to work on this issue
So, should I add the lines in printing/mathematica.py ?
I've tested the above code by adding these methods to a class derived from MCodePrinter and I was able to export an ODE system straight to NDSolve in Mathematica.
So I guess simply adding them to MCodePrinter in in printing/mathematica.py would fix the issue
@kemlath it is giving some error, please have a look

--------------------
</issues> | 820363f5b17cbe5809ef0911ea539e135c179c62 | |
conan-io__conan-915 | 915 | conan-io/conan | null | 83ad42139d4a61d5df09ce861dc75bd2a507ce3c | 2017-01-27T12:50:55Z | diff --git a/conans/client/deps_builder.py b/conans/client/deps_builder.py
index eb53e1916bc..38f33220f11 100644
--- a/conans/client/deps_builder.py
+++ b/conans/client/deps_builder.py
@@ -122,8 +122,12 @@ def propagate_info(self):
indirect_reqs,
non_devs)
- # Once we are done, call conan_info() to narrow and change possible values
- conanfile.conan_info()
+ # Once we are done, call package_id() to narrow and change possible values
+ if hasattr(conanfile, "conan_info"):
+ # Deprecated in 0.19
+ conanfile.conan_info()
+ else:
+ conanfile.package_id()
return ordered
def ordered_closure(self, node, flat):
diff --git a/conans/client/manager.py b/conans/client/manager.py
index c1a60ce6cbd..fece65966ec 100644
--- a/conans/client/manager.py
+++ b/conans/client/manager.py
@@ -108,6 +108,8 @@ def export(self, user, conan_file_path, keep_source=False):
if not field_value:
self._user_io.out.warn("Conanfile doesn't have '%s'.\n"
"It is recommended to add it as attribute" % field)
+ if getattr(conan_file, "conan_info", None):
+ self._user_io.out.warn("conan_info() method is deprecated, use package_id() instead")
conan_ref = ConanFileReference(conan_file.name, conan_file.version, user_name, channel)
conan_ref_str = str(conan_ref)
diff --git a/conans/model/conan_file.py b/conans/model/conan_file.py
index bb3e4ec9c54..e4a2a224bd2 100644
--- a/conans/model/conan_file.py
+++ b/conans/model/conan_file.py
@@ -228,7 +228,7 @@ def run(self, command, output=True, cwd=None):
if retcode != 0:
raise ConanException("Error %d while executing %s" % (retcode, command))
- def conan_info(self):
+ def package_id(self):
""" modify the conans info, typically to narrow values
eg.: conaninfo.package_references = []
"""
diff --git a/conans/model/info.py b/conans/model/info.py
index 4ead8f11442..c7e3b33d4d7 100644
--- a/conans/model/info.py
+++ b/conans/model/info.py
@@ -22,11 +22,9 @@ def __init__(self, value_str, indirect=False):
# sha values
if indirect:
- self.name = self.version = None
+ self.unrelated_mode()
else:
- self.name = self.full_name
- self.version = self.full_version.stable()
- self.user = self.channel = self.package_id = None
+ self.semver()
def dumps(self):
return "/".join([n for n in [self.name, self.version, self.user, self.channel,
@@ -45,24 +43,29 @@ def deserialize(data):
ret = RequirementInfo(data)
return ret
- def semver(self):
+ def unrelated_mode(self):
+ self.name = self.version = self.user = self.channel = self.package_id = None
+
+ def semver_mode(self):
self.name = self.full_name
self.version = self.full_version.stable()
self.user = self.channel = self.package_id = None
- def full_version(self):
+ semver = semver_mode
+
+ def full_version_mode(self):
self.name = self.full_name
self.version = self.full_version
self.user = self.channel = self.package_id = None
- def full_recipe(self):
+ def full_recipe_mode(self):
self.name = self.full_name
self.version = self.full_version
self.user = self.full_user
self.channel = self.full_channel
self.package_id = None
- def full_package(self):
+ def full_package_mode(self):
self.name = self.full_name
self.version = self.full_version
self.user = self.full_user
diff --git a/conans/model/values.py b/conans/model/values.py
index c1030557e3f..4cf56ad3584 100644
--- a/conans/model/values.py
+++ b/conans/model/values.py
@@ -14,7 +14,7 @@ def __getattr__(self, attr):
return self._dict[attr]
def clear(self):
- # TODO: Test. DO not delete, might be used by conan_info() to clear settings values
+ # TODO: Test. DO not delete, might be used by package_id() to clear settings values
self._dict.clear()
self._value = ""
| diff --git a/conans/test/integration/package_id_test.py b/conans/test/integration/package_id_test.py
new file mode 100644
index 00000000000..e5b2cfca788
--- /dev/null
+++ b/conans/test/integration/package_id_test.py
@@ -0,0 +1,180 @@
+import unittest
+from conans.test.tools import TestClient
+from conans.util.files import load
+from conans.paths import CONANINFO
+import os
+
+
+class PackageIDTest(unittest.TestCase):
+
+ def setUp(self):
+ self.client = TestClient()
+
+ def _export(self, name, version, package_id_text=None, requires=None,
+ channel=None, default_option_value="off"):
+ hello_file = """
+from conans import ConanFile
+
+class HelloConan(ConanFile):
+ name = "%s"
+ version = "%s"
+ options = {"an_option": ["on", "off"]}
+ default_options = [("an_option", "%s")]
+""" % (name, version, default_option_value)
+ if requires:
+ hello_file += "\n requires="
+ for require in requires:
+ hello_file += '"%s"' % require
+ if package_id_text:
+ hello_file += "\n def package_id(self):\n %s" % package_id_text
+
+ self.client.save({"conanfile.py": hello_file}, clean_first=True)
+ self.client.run("export %s" % (channel or "lasote/stable"))
+
+ @property
+ def conaninfo(self):
+ return load(os.path.join(self.client.current_folder, CONANINFO))
+
+ def test_version_semver_schema(self):
+ self._export("Hello", "1.2.0", package_id_text=None, requires=None)
+ self._export("Hello2", "2.3.8",
+ package_id_text='self.info.requires["Hello"].semver()',
+ requires=["Hello/1.2.0@lasote/stable"])
+
+ # Build the dependencies with --build missing
+ self.client.save({"conanfile.txt": "[requires]\nHello2/2.3.8@lasote/stable"}, clean_first=True)
+ self.client.run("install --build missing")
+ self.assertIn("Hello2/2.Y.Z", [line.strip() for line in self.conaninfo.splitlines()])
+
+ # Now change the Hello version and build it, if we install out requires should not be
+ # needed the --build needed because Hello2 don't need to be rebuilt
+ self._export("Hello", "1.5.0", package_id_text=None, requires=None)
+ self.client.run("install Hello/1.5.0@lasote/stable --build missing")
+ self._export("Hello2", "2.3.8",
+ package_id_text='self.info.requires["Hello"].semver()',
+ requires=["Hello/1.5.0@lasote/stable"])
+
+ self.client.save({"conanfile.txt": "[requires]\nHello2/2.3.8@lasote/stable"}, clean_first=True)
+ self.client.run("install .")
+ self.assertIn("Hello2/2.Y.Z", [line.strip() for line in self.conaninfo.splitlines()])
+
+ # Try to change user and channel too, should be the same, not rebuilt needed
+ self._export("Hello", "1.5.0", package_id_text=None, requires=None, channel="memsharded/testing")
+ self.client.run("install Hello/1.5.0@memsharded/testing --build missing")
+ self._export("Hello2", "2.3.8",
+ package_id_text='self.info.requires["Hello"].semver()',
+ requires=["Hello/1.5.0@memsharded/testing"])
+
+ self.client.save({"conanfile.txt": "[requires]\nHello2/2.3.8@lasote/stable"}, clean_first=True)
+ self.client.run("install .")
+ self.assertIn("Hello2/2.Y.Z", [line.strip() for line in self.conaninfo.splitlines()])
+
+ def test_version_full_version_schema(self):
+ self._export("Hello", "1.2.0", package_id_text=None, requires=None)
+ self._export("Hello2", "2.3.8",
+ package_id_text='self.info.requires["Hello"].full_version_mode()',
+ requires=["Hello/1.2.0@lasote/stable"])
+
+ # Build the dependencies with --build missing
+ self.client.save({"conanfile.txt": "[requires]\nHello2/2.3.8@lasote/stable"}, clean_first=True)
+ self.client.run("install --build missing")
+ self.assertIn("Hello2/2.3.8", self.conaninfo)
+
+ # If we change the user and channel should not be needed to rebuild
+ self._export("Hello", "1.2.0", package_id_text=None, requires=None, channel="memsharded/testing")
+ self.client.run("install Hello/1.2.0@memsharded/testing --build missing")
+ self._export("Hello2", "2.3.8",
+ package_id_text='self.info.requires["Hello"].full_version_mode()',
+ requires=["Hello/1.2.0@memsharded/testing"])
+
+ self.client.save({"conanfile.txt": "[requires]\nHello2/2.3.8@lasote/stable"}, clean_first=True)
+ self.client.run("install .")
+ self.assertIn("Hello2/2.3.8", self.conaninfo)
+
+ # Now change the Hello version and build it, if we install out requires is
+ # needed the --build needed because Hello2 needs to be build
+ self._export("Hello", "1.5.0", package_id_text=None, requires=None)
+ self.client.run("install Hello/1.5.0@lasote/stable --build missing")
+ self._export("Hello2", "2.3.8",
+ package_id_text='self.info.requires["Hello"].full_version_mode()',
+ requires=["Hello/1.5.0@lasote/stable"])
+
+ self.client.save({"conanfile.txt": "[requires]\nHello2/2.3.8@lasote/stable"}, clean_first=True)
+ with self.assertRaises(Exception):
+ self.client.run("install .")
+ self.assertIn("Can't find a 'Hello2/2.3.8@lasote/stable' package", self.client.user_io.out)
+
+ def test_version_full_recipe_schema(self):
+ self._export("Hello", "1.2.0", package_id_text=None, requires=None)
+ self._export("Hello2", "2.3.8",
+ package_id_text='self.info.requires["Hello"].full_recipe_mode()',
+ requires=["Hello/1.2.0@lasote/stable"])
+
+ # Build the dependencies with --build missing
+ self.client.save({"conanfile.txt": "[requires]\nHello2/2.3.8@lasote/stable"}, clean_first=True)
+ self.client.run("install --build missing")
+ self.assertIn("Hello2/2.3.8", self.conaninfo)
+
+ # If we change the user and channel should be needed to rebuild
+ self._export("Hello", "1.2.0", package_id_text=None, requires=None, channel="memsharded/testing")
+ self.client.run("install Hello/1.2.0@memsharded/testing --build missing")
+ self._export("Hello2", "2.3.8",
+ package_id_text='self.info.requires["Hello"].full_recipe_mode()',
+ requires=["Hello/1.2.0@memsharded/testing"])
+
+ self.client.save({"conanfile.txt": "[requires]\nHello2/2.3.8@lasote/stable"}, clean_first=True)
+ with self.assertRaises(Exception):
+ self.client.run("install .")
+ self.assertIn("Can't find a 'Hello2/2.3.8@lasote/stable' package", self.client.user_io.out)
+
+ # If we change only the package ID from hello (one more defaulted option to True) should not affect
+ self._export("Hello", "1.2.0", package_id_text=None, requires=None, default_option_value="on")
+ self.client.run("install Hello/1.2.0@lasote/stable --build missing")
+ self._export("Hello2", "2.3.8",
+ package_id_text='self.info.requires["Hello"].full_recipe_mode()',
+ requires=["Hello/1.2.0@lasote/stable"])
+
+ self.client.save({"conanfile.txt": "[requires]\nHello2/2.3.8@lasote/stable"}, clean_first=True)
+
+ self.client.run("install .")
+
+ def test_version_full_package_schema(self):
+ self._export("Hello", "1.2.0", package_id_text=None, requires=None)
+ self._export("Hello2", "2.3.8",
+ package_id_text='self.info.requires["Hello"].full_package_mode()',
+ requires=["Hello/1.2.0@lasote/stable"])
+
+ # Build the dependencies with --build missing
+ self.client.save({"conanfile.txt": "[requires]\nHello2/2.3.8@lasote/stable"}, clean_first=True)
+ self.client.run("install --build missing")
+ self.assertIn("Hello2/2.3.8", self.conaninfo)
+
+ # If we change only the package ID from hello (one more defaulted option to True) should affect
+ self._export("Hello", "1.2.0", package_id_text=None, requires=None, default_option_value="on")
+ self.client.run("install Hello/1.2.0@lasote/stable --build missing")
+ self.client.save({"conanfile.txt": "[requires]\nHello2/2.3.8@lasote/stable"}, clean_first=True)
+ with self.assertRaises(Exception):
+ self.client.run("install .")
+ self.assertIn("Can't find a 'Hello2/2.3.8@lasote/stable' package", self.client.user_io.out)
+
+ def test_nameless_mode(self):
+ self._export("Hello", "1.2.0", package_id_text=None, requires=None)
+ self._export("Hello2", "2.3.8",
+ package_id_text='self.info.requires["Hello"].unrelated_mode()',
+ requires=["Hello/1.2.0@lasote/stable"])
+
+ # Build the dependencies with --build missing
+ self.client.save({"conanfile.txt": "[requires]\nHello2/2.3.8@lasote/stable"}, clean_first=True)
+ self.client.run("install --build missing")
+ self.assertIn("Hello2/2.3.8", self.conaninfo)
+
+ # If we change even the require, should not affect
+ self._export("HelloNew", "1.2.0")
+ self.client.run("install HelloNew/1.2.0@lasote/stable --build missing")
+ self._export("Hello2", "2.3.8",
+ package_id_text='self.info.requires["HelloNew"].unrelated_mode()',
+ requires=["HelloNew/1.2.0@lasote/stable"])
+
+ self.client.save({"conanfile.txt": "[requires]\nHello2/2.3.8@lasote/stable"}, clean_first=True)
+ # Not needed to rebuild Hello2, it doesn't matter its requires
+ self.client.run("install .")
diff --git a/conans/test/model/transitive_reqs_test.py b/conans/test/model/transitive_reqs_test.py
index 06c8b6309a4..079a9ad422e 100644
--- a/conans/test/model/transitive_reqs_test.py
+++ b/conans/test/model/transitive_reqs_test.py
@@ -377,8 +377,8 @@ class ChatConan(ConanFile):
requires = "Hello/1.2@diego/testing"
def conan_info(self):
- self.info.requires["Hello"].full_package()
- self.info.requires["Say"].semver()
+ self.info.requires["Hello"].full_package_mode()
+ self.info.requires["Say"].semver_mode()
"""
self.retriever.conan(say_ref, say_content)
| [
{
"components": [
{
"doc": "modify the conans info, typically to narrow values\neg.: conaninfo.package_references = []",
"lines": [
231,
232
],
"name": "ConanFile.package_id",
"signature": "def package_id(self):",
"type": "function"
... | [
"conans/test/integration/package_id_test.py::PackageIDTest::test_nameless_mode",
"conans/test/integration/package_id_test.py::PackageIDTest::test_version_full_package_schema",
"conans/test/integration/package_id_test.py::PackageIDTest::test_version_full_recipe_schema",
"conans/test/integration/package_id_test... | [
"conans/test/integration/package_id_test.py::PackageIDTest::test_version_semver_schema",
"conans/test/model/transitive_reqs_test.py::ConanRequirementsTest::test_basic",
"conans/test/model/transitive_reqs_test.py::ConanRequirementsTest::test_basic_option",
"conans/test/model/transitive_reqs_test.py::ConanRequi... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Renamed conan_info to package_id and tests added
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in conans/model/conan_file.py]
(definition of ConanFile.package_id:)
def package_id(self):
"""modify the conans info, typically to narrow values
eg.: conaninfo.package_references = []"""
[end of new definitions in conans/model/conan_file.py]
[start of new definitions in conans/model/info.py]
(definition of RequirementInfo.unrelated_mode:)
def unrelated_mode(self):
(definition of RequirementInfo.semver_mode:)
def semver_mode(self):
(definition of RequirementInfo.full_version_mode:)
def full_version_mode(self):
(definition of RequirementInfo.full_recipe_mode:)
def full_recipe_mode(self):
(definition of RequirementInfo.full_package_mode:)
def full_package_mode(self):
[end of new definitions in conans/model/info.py]
</definitions>
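To make the renamed mode methods above concrete, here is a deliberately simplified, hypothetical model of how each mode narrows what a requirement contributes to the package ID. The real `RequirementInfo` parses a full reference and uses a semver-aware `Version` type; `MiniRequirementInfo` and its plain-string fields are illustrative assumptions, though the `N.Y.Z` masking mirrors the `Hello2/2.Y.Z` expectations in the test patch.

```python
# Hypothetical miniature of the mode methods defined above. Plain
# strings stand in for Conan's reference and Version types.
class MiniRequirementInfo:
    def __init__(self, name, version, user, channel):
        self.full_name, self.full_version = name, version
        self.full_user, self.full_channel = user, channel
        self.semver_mode()  # the patch keeps semver as the default

    def unrelated_mode(self):
        # The requirement contributes nothing to the package ID.
        self.name = self.version = self.user = self.channel = None

    def semver_mode(self):
        # Keep the name and only the major version: "1.2.0" -> "1.Y.Z",
        # so semver-compatible upgrades reuse the same binary package.
        self.name = self.full_name
        self.version = "%s.Y.Z" % self.full_version.split(".")[0]
        self.user = self.channel = None

    def full_version_mode(self):
        # Pin the exact version, but not user/channel.
        self.name, self.version = self.full_name, self.full_version
        self.user = self.channel = None

    def full_recipe_mode(self):
        # Pin the complete recipe reference.
        self.full_version_mode()
        self.user, self.channel = self.full_user, self.full_channel

    def dumps(self):
        parts = [self.name, self.version, self.user, self.channel]
        return "/".join(p for p in parts if p)


req = MiniRequirementInfo("Hello", "1.2.0", "lasote", "stable")
print(req.dumps())      # Hello/1.Y.Z
req.full_recipe_mode()
print(req.dumps())      # Hello/1.2.0/lasote/stable
```

`full_package_mode` follows the same pattern but would additionally pin the dependency's exact `package_id`; it is omitted here for brevity.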
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 4a5b19a75db9225316c8cb022a2dfb9705a2af34 | ||
falconry__falcon-991 | 991 | falconry/falcon | null | 8be7ff7057fc0f6f589353281730d6000dd17b0d | 2017-01-27T00:18:06Z | diff --git a/docs/api/api.rst b/docs/api/api.rst
index c3b72ac48..3b9a29fab 100644
--- a/docs/api/api.rst
+++ b/docs/api/api.rst
@@ -18,4 +18,6 @@ standard-compliant WSGI server.
.. autoclass:: falcon.RequestOptions
:members:
+.. autoclass:: falcon.ResponseOptions
+ :members:
diff --git a/docs/api/cookies.rst b/docs/api/cookies.rst
index 69c89d8c4..1c8e1dec2 100644
--- a/docs/api/cookies.rst
+++ b/docs/api/cookies.rst
@@ -98,14 +98,15 @@ the request.
.. warning::
- For this attribute to be effective, your application will need to
- enforce HTTPS when setting the cookie, as well as in all
- subsequent requests that require the cookie to be sent back from
- the client.
+ For this attribute to be effective, your web server or load
+ balancer will need to enforce HTTPS when setting the cookie, as
+ well as in all subsequent requests that require the cookie to be
+ sent back from the client.
When running your application in a development environment, you can
-disable this behavior by passing `secure=False` to
-:py:meth:`~.Response.set_cookie`. This lets you test your app locally
+disable this default behavior by setting
+:py:attr:`~.ResponseOptions.secure_cookies_by_default` to ``False``
+via :any:`API.resp_options`. This lets you test your app locally
without having to set up TLS. You can make this option configurable to
easily switch between development and production environments.
diff --git a/falcon/__init__.py b/falcon/__init__.py
index 8c4f0920b..b9176bb19 100644
--- a/falcon/__init__.py
+++ b/falcon/__init__.py
@@ -57,4 +57,4 @@
from falcon.hooks import before, after # NOQA
from falcon.request import Request, RequestOptions # NOQA
-from falcon.response import Response # NOQA
+from falcon.response import Response, ResponseOptions # NOQA
diff --git a/falcon/api.py b/falcon/api.py
index b8fe690e0..659eeec48 100644
--- a/falcon/api.py
+++ b/falcon/api.py
@@ -23,7 +23,7 @@
from falcon.http_status import HTTPStatus
from falcon.request import Request, RequestOptions
import falcon.responders
-from falcon.response import Response
+from falcon.response import Response, ResponseOptions
import falcon.status_codes as status
from falcon.util.misc import get_argnames
@@ -110,6 +110,8 @@ def process_response(self, req, resp, resource, req_succeeded)
Attributes:
req_options: A set of behavioral options related to incoming
requests. See also: :py:class:`~.RequestOptions`
+ resp_options: A set of behavioral options related to outgoing
+ responses. See also: :py:class:`~.ResponseOptions`
"""
# PERF(kgriffs): Reference via self since that is faster than
@@ -125,7 +127,7 @@ def process_response(self, req, resp, resource, req_succeeded)
__slots__ = ('_request_type', '_response_type',
'_error_handlers', '_media_type', '_router', '_sinks',
- '_serialize_error', 'req_options',
+ '_serialize_error', 'req_options', 'resp_options',
'_middleware', '_independent_middleware')
def __init__(self, media_type=DEFAULT_MEDIA_TYPE,
@@ -147,7 +149,9 @@ def __init__(self, media_type=DEFAULT_MEDIA_TYPE,
self._error_handlers = []
self._serialize_error = helpers.default_serialize_error
+
self.req_options = RequestOptions()
+ self.resp_options = ResponseOptions()
# NOTE(kgriffs): Add default error handlers
self.add_error_handler(falcon.HTTPError, self._http_error_handler)
@@ -170,7 +174,7 @@ def __call__(self, env, start_response): # noqa: C901
"""
req = self._request_type(env, options=self.req_options)
- resp = self._response_type()
+ resp = self._response_type(options=self.resp_options)
resource = None
params = {}
diff --git a/falcon/request.py b/falcon/request.py
index e909eb5d9..f36074bcb 100644
--- a/falcon/request.py
+++ b/falcon/request.py
@@ -286,12 +286,11 @@ class Request(object):
string, the value mapped to that parameter key will be a list of
all the values in the order seen.
- options (dict): Set of global options passed from the API handler.
-
cookies (dict):
A dict of name/value cookie pairs.
See also: :ref:`Getting Cookies <getting-cookies>`
+ options (dict): Set of global options passed from the API handler.
"""
__slots__ = (
diff --git a/falcon/response.py b/falcon/response.py
index b9cf5754c..fda62c644 100644
--- a/falcon/response.py
+++ b/falcon/response.py
@@ -44,6 +44,9 @@ class Response(object):
Note:
`Response` is not meant to be instantiated directly by responders.
+ Keyword Arguments:
+ options (dict): Set of global options passed from the API handler.
+
Attributes:
status (str): HTTP status line (e.g., '200 OK'). Falcon requires the
full status line, not just the code (e.g., 200). This design
@@ -109,6 +112,8 @@ class Response(object):
opposed to a class), the function is called like a method of
the current Response instance. Therefore the first argument is
the Response instance itself (self).
+
+ options (dict): Set of global options passed from the API handler.
"""
__slots__ = (
@@ -120,16 +125,19 @@ class Response(object):
'stream',
'stream_len',
'context',
+ 'options',
'__dict__',
)
# Child classes may override this
context_type = None
- def __init__(self):
+ def __init__(self, options=None):
self.status = '200 OK'
self._headers = {}
+ self.options = ResponseOptions() if options is None else options
+
# NOTE(tbug): will be set to a SimpleCookie object
# when cookie is set via set_cookie
self._cookies = None
@@ -164,7 +172,7 @@ def set_stream(self, stream, stream_len):
self.stream_len = stream_len
def set_cookie(self, name, value, expires=None, max_age=None,
- domain=None, path=None, secure=True, http_only=True):
+ domain=None, path=None, secure=None, http_only=True):
"""Set a response cookie.
Note:
@@ -223,6 +231,12 @@ def set_cookie(self, name, value, expires=None, max_age=None,
(default: ``True``). This prevents attackers from
reading sensitive cookie data.
+ Note:
+ The default value for this argument is normally
+ ``True``, but can be modified by setting
+ :py:attr:`~.ResponseOptions.secure_cookies_by_default`
+ via :any:`API.resp_options`.
+
Warning:
For the `secure` cookie attribute to be effective,
your application will need to enforce HTTPS.
@@ -295,8 +309,13 @@ def set_cookie(self, name, value, expires=None, max_age=None,
if path:
self._cookies[name]['path'] = path
- if secure:
- self._cookies[name]['secure'] = secure
+ if secure is None:
+ is_secure = self.options.secure_cookies_by_default
+ else:
+ is_secure = secure
+
+ if is_secure:
+ self._cookies[name]['secure'] = True
if http_only:
self._cookies[name]['httponly'] = http_only
@@ -716,3 +735,24 @@ def _wsgi_headers(self, media_type=None, py2=PY2):
items += [('set-cookie', c.OutputString())
for c in self._cookies.values()]
return items
+
+
+class ResponseOptions(object):
+ """Defines a set of configurable response options.
+
+ An instance of this class is exposed via :any:`API.resp_options` for
+ configuring certain :py:class:`~.Response` behaviors.
+
+ Attributes:
+ secure_cookies_by_default (bool): Set to ``False`` in development
+ environments to make the `secure` attribute for all cookies
+ default to ``False``. This can make testing easier by
+ not requiring HTTPS. Note, however, that this setting can
+ be overridden via `set_cookie()`'s `secure` kwarg.
+ """
+ __slots__ = (
+ 'secure_cookies_by_default',
+ )
+
+ def __init__(self):
+ self.secure_cookies_by_default = True
| diff --git a/tests/test_cookies.py b/tests/test_cookies.py
index d2ceb1985..212259967 100644
--- a/tests/test_cookies.py
+++ b/tests/test_cookies.py
@@ -58,7 +58,7 @@ def on_get(self, req, resp):
'foostring', 'bar', max_age='15', secure=False, http_only=False)
-@pytest.fixture(scope='module')
+@pytest.fixture()
def client():
app = falcon.API()
app.add_route('/', CookieResource())
@@ -92,6 +92,18 @@ def test_response_base_case(client):
assert cookie.secure
+def test_response_disable_secure_globally(client):
+ client.app.resp_options.secure_cookies_by_default = False
+ result = client.simulate_get('/')
+ cookie = result.cookies['foo']
+ assert not cookie.secure
+
+ client.app.resp_options.secure_cookies_by_default = True
+ result = client.simulate_get('/')
+ cookie = result.cookies['foo']
+ assert cookie.secure
+
+
def test_response_complex_case(client):
result = client.simulate_head('/')
| diff --git a/docs/api/api.rst b/docs/api/api.rst
index c3b72ac48..3b9a29fab 100644
--- a/docs/api/api.rst
+++ b/docs/api/api.rst
@@ -18,4 +18,6 @@ standard-compliant WSGI server.
.. autoclass:: falcon.RequestOptions
:members:
+.. autoclass:: falcon.ResponseOptions
+ :members:
diff --git a/docs/api/cookies.rst b/docs/api/cookies.rst
index 69c89d8c4..1c8e1dec2 100644
--- a/docs/api/cookies.rst
+++ b/docs/api/cookies.rst
@@ -98,14 +98,15 @@ the request.
.. warning::
- For this attribute to be effective, your application will need to
- enforce HTTPS when setting the cookie, as well as in all
- subsequent requests that require the cookie to be sent back from
- the client.
+ For this attribute to be effective, your web server or load
+ balancer will need to enforce HTTPS when setting the cookie, as
+ well as in all subsequent requests that require the cookie to be
+ sent back from the client.
When running your application in a development environment, you can
-disable this behavior by passing `secure=False` to
-:py:meth:`~.Response.set_cookie`. This lets you test your app locally
+disable this default behavior by setting
+:py:attr:`~.ResponseOptions.secure_cookies_by_default` to ``False``
+via :any:`API.resp_options`. This lets you test your app locally
without having to set up TLS. You can make this option configurable to
easily switch between development and production environments.
| [
{
"components": [
{
"doc": "Defines a set of configurable response options.\n\nAn instance of this class is exposed via :any:`API.resp_options` for\nconfiguring certain :py:class:`~.Response` behaviors.\n\nAttributes:\n secure_cookies_by_default (bool): Set to ``False`` in development\n ... | [
"tests/test_cookies.py::test_response_disable_secure_globally"
] | [
"tests/test_cookies.py::test_response_base_case",
"tests/test_cookies.py::test_response_complex_case",
"tests/test_cookies.py::test_cookie_expires_naive",
"tests/test_cookies.py::test_cookie_expires_aware",
"tests/test_cookies.py::test_cookies_setable",
"tests/test_cookies.py::test_cookie_max_age_float_an... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
feat(Response): Add an option for disabling secure cookies for testing
This should be a more practical approach vs. having to pass it every time to `set_cookie()`.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in falcon/response.py]
(definition of ResponseOptions:)
class ResponseOptions(object):
"""Defines a set of configurable response options.
An instance of this class is exposed via :any:`API.resp_options` for
configuring certain :py:class:`~.Response` behaviors.
Attributes:
secure_cookies_by_default (bool): Set to ``False`` in development
environments to make the `secure` attribute for all cookies
default to ``False``. This can make testing easier by
not requiring HTTPS. Note, however, that this setting can
be overridden via `set_cookie()`'s `secure` kwarg."""
(definition of ResponseOptions.__init__:)
def __init__(self):
[end of new definitions in falcon/response.py]
</definitions>
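As a sanity check on the behavior this request describes, the sketch below is a minimal, hypothetical model (class bodies trimmed to the relevant fields, not Falcon's real classes) of how a three-valued `secure` kwarg (`None` / `True` / `False`) defers to a global option only when the caller passes nothing:

```python
# Hypothetical miniature of the option-driven cookie default.
class ResponseOptions:
    __slots__ = ("secure_cookies_by_default",)

    def __init__(self):
        # Secure by default; flip off only in development environments.
        self.secure_cookies_by_default = True


class MiniResponse:
    def __init__(self, options=None):
        self.options = options if options is not None else ResponseOptions()
        self.cookies = {}

    def set_cookie(self, name, value, secure=None):
        # Only an unspecified kwarg falls back to the global option;
        # an explicit True/False from the caller always wins.
        if secure is None:
            secure = self.options.secure_cookies_by_default
        self.cookies[name] = {"value": value, "secure": secure}


opts = ResponseOptions()
opts.secure_cookies_by_default = False  # e.g. local testing without TLS

resp = MiniResponse(options=opts)
resp.set_cookie("session", "abc")            # inherits the global default
resp.set_cookie("csrf", "xyz", secure=True)  # explicit kwarg still wins
print(resp.cookies["session"]["secure"])  # False
print(resp.cookies["csrf"]["secure"])     # True
```

This is exactly the split the patch makes in `Response.set_cookie`: the kwarg default changes from `True` to `None`, and only the `None` branch consults `self.options.secure_cookies_by_default`.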
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 77d5e6394a88ead151c9469494749f95f06b24bf | |
conan-io__conan-884 | 884 | conan-io/conan | null | 807e94065016ed121a06696efaa337b92b5aef60 | 2017-01-20T17:49:28Z | diff --git a/.codecov.yml b/.codecov.yml
index 087f2d7b36d..43ef6a38928 100644
--- a/.codecov.yml
+++ b/.codecov.yml
@@ -8,8 +8,10 @@ coverage:
range: "85...100"
status:
- project: yes
- patch: yes
+ project:
+ default:
+ threshold: 3%
+ patch: no
changes: no
notify:
diff --git a/conans/client/command.py b/conans/client/command.py
index 013c668eff9..7f38614ef8d 100644
--- a/conans/client/command.py
+++ b/conans/client/command.py
@@ -32,6 +32,7 @@
from conans.util.files import rmdir, load, save_files, exception_message_safe
from conans.util.config_parser import get_bool_from_text
from conans.client.printer import Printer
+from conans.util.tracer import log_command, log_exception
class Extender(argparse.Action):
@@ -193,6 +194,7 @@ def new(self, *args):
'in the configure method')
args = parser.parse_args(*args)
+ log_command("new", vars(args))
root_folder = os.getcwd()
try:
@@ -250,6 +252,7 @@ def test_package(self, *args):
self._parse_args(parser)
args = parser.parse_args(*args)
+ log_command("test_package", vars(args))
current_path = os.getcwd()
root_folder = os.path.normpath(os.path.join(current_path, args.path))
@@ -396,6 +399,7 @@ def install(self, *args):
self._parse_args(parser)
args = parser.parse_args(*args)
+ log_command("install", vars(args))
self._user_io.out.werror_active = args.werror
current_path = os.getcwd()
@@ -488,6 +492,7 @@ def info(self, *args):
parser.add_argument("--scope", "-sc", nargs=1, action=Extender,
help='Use the specified scope in the info command')
args = parser.parse_args(*args)
+ log_command("info", vars(args))
options = self._get_tuples_list_from_extender_arg(args.options)
settings, package_settings = self._get_simple_and_package_tuples(args.settings)
@@ -526,6 +531,7 @@ def build(self, *args):
parser.add_argument("--file", "-f", help="specify conanfile filename")
parser.add_argument("--profile", "-pr", default=None, help='Apply a profile')
args = parser.parse_args(*args)
+ log_command("build", vars(args))
current_path = os.getcwd()
if args.path:
root_path = os.path.abspath(args.path)
@@ -563,6 +569,7 @@ def package(self, *args):
're-packaged')
args = parser.parse_args(*args)
+ log_command("package", vars(args))
current_path = os.getcwd()
try:
@@ -606,6 +613,7 @@ def source(self, *args):
" do nothing.")
args = parser.parse_args(*args)
+ log_command("source", vars(args))
current_path, reference = self._get_reference(args)
self._manager.source(current_path, reference, args.force)
@@ -629,6 +637,7 @@ def imports(self, *args):
help="Undo imports. Remove imported files")
args = parser.parse_args(*args)
+ log_command("imports", vars(args))
if args.undo:
if not os.path.isabs(args.reference):
@@ -656,6 +665,7 @@ def export(self, *args):
help='Optional. Do not remove the source folder in the local cache. '
'Use for testing purposes only')
args = parser.parse_args(*args)
+ log_command("export", vars(args))
current_path = os.path.abspath(args.path or os.getcwd())
keep_source = args.keep_source
@@ -680,6 +690,7 @@ def remove(self, *args):
action='store_true', help='Remove without requesting a confirmation')
parser.add_argument('-r', '--remote', help='Will remove from the specified remote')
args = parser.parse_args(*args)
+ log_command("remove", vars(args))
if args.packages:
args.packages = args.packages.split(",")
@@ -710,6 +721,7 @@ def copy(self, *args):
default=False,
help='Override destination packages and the package recipe')
args = parser.parse_args(*args)
+ log_command("copy", vars(args))
reference = ConanFileReference.loads(args.reference)
new_ref = ConanFileReference.loads("%s/%s@%s" % (reference.name,
@@ -735,6 +747,7 @@ def user(self, *parameters):
parser.add_argument('-c', '--clean', default=False,
action='store_true', help='Remove user and tokens for all remotes')
args = parser.parse_args(*parameters) # To enable -h
+ log_command("user", vars(args))
if args.clean:
localdb = LocalDB(self._client_cache.localdb)
@@ -763,6 +776,7 @@ def search(self, *args):
'reference: MyPackage/1.2'
'@user/channel')
args = parser.parse_args(*args)
+ log_command("search", vars(args))
reference = None
if args.pattern:
@@ -801,6 +815,7 @@ def upload(self, *args):
help='Waits specified seconds before retry again')
args = parser.parse_args(*args)
+ log_command("upload", vars(args))
if args.package and not is_a_reference(args.pattern):
raise ConanException("-p parameter only allowed with a valid recipe reference, not with a pattern")
@@ -844,6 +859,7 @@ def remote(self, *args):
parser_pupd.add_argument('reference', help='package recipe reference')
parser_pupd.add_argument('remote', help='name of the remote')
args = parser.parse_args(*args)
+ log_command("remote", vars(args))
registry = RemoteRegistry(self._client_cache.registry, self._user_io.out)
if args.subcommand == "list":
@@ -883,6 +899,7 @@ def profile(self, *args):
' a profile file in any location.')
parser_show.add_argument('profile', help='name of the profile')
args = parser.parse_args(*args)
+ log_command("profile", vars(args))
if args.subcommand == "list":
folder = self._client_cache.profiles_path
@@ -949,6 +966,11 @@ def run(self, *args):
errors = True
msg = exception_message_safe(exc)
self._user_io.out.error(msg)
+ log_exception(exc, msg)
+ except Exception as exc:
+ msg = exception_message_safe(exc)
+ log_exception(exc, msg)
+ raise exc
return errors
@@ -1007,7 +1029,11 @@ def instance_remote_manager(client_cache):
# Get a search manager
search_adapter = DiskSearchAdapter()
search_manager = DiskSearchManager(client_cache, search_adapter)
- command = Command(client_cache, user_io, ConanRunner(), remote_manager, search_manager)
+ print_commands_to_output = get_env("CONAN_PRINT_RUN_COMMANDS", False)
+ generate_run_log_file = get_env("CONAN_LOG_RUN_TO_FILE", False)
+ log_run_to_output = get_env("CONAN_LOG_RUN_TO_OUTPUT", True)
+ runner = ConanRunner(print_commands_to_output, generate_run_log_file, log_run_to_output)
+ command = Command(client_cache, user_io, runner, remote_manager, search_manager)
return command
diff --git a/conans/client/installer.py b/conans/client/installer.py
index 855e9abca35..e2479986725 100644
--- a/conans/client/installer.py
+++ b/conans/client/installer.py
@@ -4,7 +4,7 @@
import fnmatch
import shutil
-from conans.paths import CONANINFO, BUILD_INFO, CONANENV
+from conans.paths import CONANINFO, BUILD_INFO, CONANENV, RUN_LOG_NAME
from conans.util.files import save, rmdir
from conans.model.ref import PackageReference
from conans.util.log import logger
@@ -17,6 +17,7 @@
from conans.client.source import config_source
from conans.client.generators.env import ConanEnvGenerator
from conans.tools import environment_append
+from conans.util.tracer import log_package_built
def init_package_info(deps_graph, paths):
@@ -130,6 +131,7 @@ def _build(self, nodes_by_level, skip_private_nodes, build_mode):
build_allowed = self._build_allowed(conan_ref, build_mode, conan_file)
if not build_allowed:
self._raise_package_not_found_error(conan_ref, conan_file)
+
output = ScopedOutput(str(conan_ref), self._out)
package_ref = PackageReference(conan_ref, package_id)
package_folder = self._client_cache.package(package_ref, conan_file.short_paths)
@@ -138,6 +140,7 @@ def _build(self, nodes_by_level, skip_private_nodes, build_mode):
elif self._build_forced(conan_ref, build_mode, conan_file):
output.warn('Forced build from source')
+ t1 = time.time()
# Assign to node the propagated info
self._propagate_info(conan_ref, conan_file, flat)
@@ -149,6 +152,12 @@ def _build(self, nodes_by_level, skip_private_nodes, build_mode):
# Call the info method
self._package_info_conanfile(conan_ref, conan_file)
+
+ duration = time.time() - t1
+ log_file = os.path.join(self._client_cache.build(package_ref, conan_file.short_paths),
+ RUN_LOG_NAME)
+ log_file = log_file if os.path.exists(log_file) else None
+ log_package_built(package_ref, duration, log_file)
else:
# Get the package, we have a not outdated remote package
if conan_ref:
diff --git a/conans/client/proxy.py b/conans/client/proxy.py
index a266ca0abe6..126536e6633 100644
--- a/conans/client/proxy.py
+++ b/conans/client/proxy.py
@@ -9,6 +9,8 @@
import os
from conans.paths import rm_conandir
from conans.client.remover import DiskRemover
+from conans.util.tracer import log_package_got_from_local_cache,\
+ log_recipe_got_from_local_cache
class ConanProxy(object):
@@ -93,6 +95,7 @@ def get_package(self, package_ref, short_paths):
if local_package:
output.info('Already installed!')
installed = True
+ log_package_got_from_local_cache(package_ref)
else:
installed = self._retrieve_remote_package(package_ref, package_folder,
output)
@@ -129,6 +132,7 @@ def _refresh():
conanfile_path = self._client_cache.conanfile(conan_reference)
if os.path.exists(conanfile_path):
+ log_recipe_got_from_local_cache(conan_reference)
if self._check_updates:
ret = self.update_available(conan_reference)
if ret != 0: # Found and not equal
diff --git a/conans/client/remote_manager.py b/conans/client/remote_manager.py
index 21cd48e8eae..6500a2ba3ad 100644
--- a/conans/client/remote_manager.py
+++ b/conans/client/remote_manager.py
@@ -15,6 +15,8 @@
from conans.util.files import gzopen_without_timestamps
from conans.util.files import touch
from conans.model.manifest import discarded_file
+from conans.util.tracer import log_package_upload, log_recipe_upload,\
+ log_recipe_download, log_package_download
class RemoteManager(object):
@@ -27,6 +29,7 @@ def __init__(self, client_cache, remote_client, output):
def upload_conan(self, conan_reference, remote, retry, retry_wait):
"""Will upload the conans to the first remote"""
+ t1 = time.time()
export_folder = self._client_cache.export(conan_reference)
rel_files = relative_dirs(export_folder)
the_files = {filename: os.path.join(export_folder, filename) for filename in rel_files}
@@ -38,8 +41,10 @@ def upload_conan(self, conan_reference, remote, retry, retry_wait):
the_files = compress_conan_files(the_files, export_folder, EXPORT_TGZ_NAME,
CONANFILE, self._output)
- ret = self._call_remote(remote, "upload_conan", conan_reference, the_files,
+ ret = self._call_remote(remote, "upload_conan", conan_reference, the_files,
retry, retry_wait)
+ duration = time.time() - t1
+ log_recipe_upload(conan_reference, duration, the_files)
msg = "Uploaded conan recipe '%s' to '%s'" % (str(conan_reference), remote.name)
# FIXME: server dependent
if remote.url == "https://server.conan.io":
@@ -96,7 +101,9 @@ def upload_package(self, package_reference, remote, retry, retry_wait):
tmp = self._call_remote(remote, "upload_package", package_reference, the_files,
retry, retry_wait)
- logger.debug("====> Time remote_manager upload_package: %f" % (time.time() - t1))
+ duration = time.time() - t1
+ log_package_upload(package_reference, duration, the_files)
+ logger.debug("====> Time remote_manager upload_package: %f" % (duration))
return tmp
def get_conan_digest(self, conan_reference, remote):
@@ -130,7 +137,11 @@ def get_recipe(self, conan_reference, dest_folder, remote):
returns (dict relative_filepath:abs_path , remote_name)"""
rmdir(dest_folder) # Remove first the destination folder
+ t1 = time.time()
zipped_files = self._call_remote(remote, "get_recipe", conan_reference, dest_folder)
+ duration = time.time() - t1
+ log_recipe_download(conan_reference, duration, remote, zipped_files)
+
files = unzip_and_get_files(zipped_files, dest_folder, EXPORT_TGZ_NAME)
# Make sure that the source dir is deleted
rm_conandir(self._client_cache.source(conan_reference))
@@ -148,7 +159,10 @@ def get_package(self, package_reference, dest_folder, remote):
returns (dict relative_filepath:abs_path , remote_name)"""
rm_conandir(dest_folder) # Remove first the destination folder
+ t1 = time.time()
zipped_files = self._call_remote(remote, "get_package", package_reference, dest_folder)
+ duration = time.time() - t1
+ log_package_download(package_reference, duration, remote, zipped_files)
files = unzip_and_get_files(zipped_files, dest_folder, PACKAGE_TGZ_NAME)
# Issue #214 https://github.com/conan-io/conan/issues/214
for dirname, _, filenames in os.walk(dest_folder):
diff --git a/conans/client/rest/rest_client.py b/conans/client/rest/rest_client.py
index 8daf82a9a56..1b8550d4fe6 100644
--- a/conans/client/rest/rest_client.py
+++ b/conans/client/rest/rest_client.py
@@ -15,6 +15,7 @@
from conans import COMPLEX_SEARCH_CAPABILITY
from conans.search.search import filter_packages
from conans.model.info import ConanInfo
+from conans.util.tracer import log_client_rest_api_call
def handle_return_deserializer(deserializer=None):
@@ -234,9 +235,12 @@ def upload_package(self, package_reference, the_files, retry, retry_wait):
def authenticate(self, user, password):
'''Sends user + password to get a token'''
auth = HTTPBasicAuth(user, password)
- path = "%s/users/authenticate" % self._remote_api_url
- ret = self.requester.get(path, auth=auth, headers=self.custom_headers,
+ url = "%s/users/authenticate" % self._remote_api_url
+ t1 = time.time()
+ ret = self.requester.get(url, auth=auth, headers=self.custom_headers,
verify=self.verify_ssl)
+ duration = time.time() - t1
+ log_client_rest_api_call(url, "GET", duration, self.custom_headers)
return ret
@handle_return_deserializer()
@@ -244,8 +248,11 @@ def check_credentials(self):
"""If token is not valid will raise AuthenticationException.
User will be asked for new user/pass"""
url = "%s/users/check_credentials" % self._remote_api_url
+ t1 = time.time()
ret = self.requester.get(url, auth=self.auth, headers=self.custom_headers,
verify=self.verify_ssl)
+ duration = time.time() - t1
+ log_client_rest_api_call(url, "GET", duration, self.custom_headers)
return ret
def search(self, pattern=None, ignorecase=True):
@@ -367,19 +374,24 @@ def _post_json(self, url, payload):
return response
def _get_json(self, url, data=None):
+ t1 = time.time()
+ headers = self.custom_headers
if data: # POST request
- headers = {'Content-type': 'application/json',
- 'Accept': 'text/plain',
- 'Accept': 'application/json'}
- headers.update(self.custom_headers)
+ headers.update({'Content-type': 'application/json',
+ 'Accept': 'text/plain',
+ 'Accept': 'application/json'})
response = self.requester.post(url, auth=self.auth, headers=headers,
verify=self.verify_ssl,
stream=True,
data=json.dumps(data))
else:
- response = self.requester.get(url, auth=self.auth, headers=self.custom_headers,
+ response = self.requester.get(url, auth=self.auth, headers=headers,
verify=self.verify_ssl,
stream=True)
+
+ duration = time.time() - t1
+ method = "POST" if data else "GET"
+ log_client_rest_api_call(url, method, duration, headers)
if response.status_code != 200: # Error message is text
response.charset = "utf-8" # To be able to access ret.text (ret.content are bytes)
raise get_exception_from_error(response.status_code)(response.text)
diff --git a/conans/client/rest/uploader_downloader.py b/conans/client/rest/uploader_downloader.py
index 3a7269964df..30cf7d8fafd 100644
--- a/conans/client/rest/uploader_downloader.py
+++ b/conans/client/rest/uploader_downloader.py
@@ -4,6 +4,7 @@
from conans.util.files import save, sha1sum, exception_message_safe
import os
import time
+from conans.util.tracer import log_download
class Uploader(object):
@@ -112,6 +113,7 @@ def download(self, url, file_path=None, auth=None, retry=1, retry_wait=0):
# Should not happen, better to raise, probably we had to remove the dest folder before
raise ConanException("Error, the file to download already exists: '%s'" % file_path)
+ t1 = time.time()
ret = bytearray()
response = call_with_retry(self.output, retry, retry_wait, self._download_file, url, auth)
if not response.ok: # Do not retry if not found or whatever controlled error
@@ -142,6 +144,10 @@ def download(self, url, file_path=None, auth=None, retry=1, retry_wait=0):
if self.output:
print_progress(self.output, units)
last_progress = units
+
+ duration = time.time() - t1
+ log_download(url, duration)
+
if not file_path:
return bytes(ret)
else:
diff --git a/conans/client/runner.py b/conans/client/runner.py
index 692d0128448..c93f23498b9 100644
--- a/conans/client/runner.py
+++ b/conans/client/runner.py
@@ -1,46 +1,86 @@
import os
-from subprocess import Popen, PIPE, STDOUT
+import sys
+from subprocess import Popen, PIPE
from conans.util.files import decode_text
from conans.errors import ConanException
+import six
class ConanRunner(object):
- def __call__(self, command, output, cwd=None):
- """ There are two options, with or without you (sorry, U2 pun :)
- With or without output. Probably the Popen approach would be fine for both cases
- but I found it more error prone, slower, problems with very large outputs (typical
- when building C/C++ projects...) so I prefer to keep the os.system one for
- most cases, in which the user does not want to capture the output, and the Popen
- for cases they want
+ def __init__(self, print_commands_to_output=False, generate_run_log_file=False, log_run_to_output=True):
+ self._print_commands_to_output = print_commands_to_output
+ self._generate_run_log_file = generate_run_log_file
+ self._log_run_to_output = log_run_to_output
+
+ def __call__(self, command, output, log_filepath=None, cwd=None):
+ """
+ @param command: Command to execute
+        @param output: If a stream object is given, write the output there instead of sys.stdout. Can be None
+ @param log_filepath: If specified, also log to a file
+        @param cwd: Directory to change into before executing the command
"""
- if output is True:
- if not cwd:
- return os.system(command)
- else:
- try:
- old_dir = os.getcwd()
- os.chdir(cwd)
- result = os.system(command)
- except Exception as e:
- raise ConanException("Error while executing '%s'\n\t%s" % (command, str(e)))
- finally:
- os.chdir(old_dir)
- return result
+ stream_output = output if output and hasattr(output, "write") else sys.stdout
+
+ if not self._generate_run_log_file:
+ log_filepath = None
+
+ # Log the command call in output and logger
+ call_message = "\n----Running------\n> %s\n-----------------\n" % command
+ if self._print_commands_to_output and stream_output and self._log_run_to_output:
+ stream_output.write(call_message)
+
+ # No output has to be redirected to logs or buffer or omitted
+ if output is True and not log_filepath and self._log_run_to_output:
+ return self._simple_os_call(command, cwd)
+ elif log_filepath:
+ if stream_output:
+ stream_output.write("Logging command output to file '%s'\n" % log_filepath)
+ with open(log_filepath, "a+") as log_handler:
+ if self._print_commands_to_output:
+ log_handler.write(call_message)
+ return self._pipe_os_call(command, stream_output, log_handler, cwd)
+ else:
+ return self._pipe_os_call(command, stream_output, None, cwd)
+
+ def _pipe_os_call(self, command, stream_output, log_handler, cwd):
+
+ try:
+ proc = Popen(command, shell=True, stdout=PIPE, stderr=PIPE, cwd=cwd)
+ except Exception as e:
+ raise ConanException("Error while executing '%s'\n\t%s" % (command, str(e)))
+
+ def get_stream_lines(the_stream):
+ while True:
+ line = the_stream.readline()
+ if not line:
+ break
+ decoded_line = decode_text(line)
+ if stream_output and self._log_run_to_output:
+ stream_output.write(decoded_line)
+ if log_handler:
+ # Write decoded in PY2 causes some ASCII encoding problems
+ # tried to open the log_handler binary but same result.
+ log_handler.write(line if six.PY2 else decoded_line)
+
+ get_stream_lines(proc.stdout)
+ get_stream_lines(proc.stderr)
+
+ proc.communicate()
+ ret = proc.returncode
+ return ret
+
+ def _simple_os_call(self, command, cwd):
+ if not cwd:
+ return os.system(command)
else:
- proc = Popen(command, shell=True, stdout=PIPE, stderr=STDOUT, cwd=cwd)
- if hasattr(output, "write"):
- while True:
- line = proc.stdout.readline()
- if not line:
- break
- output.write(decode_text(line))
- out, err = proc.communicate()
-
- if hasattr(output, "write"):
- if out:
- output.write(decode_text(out))
- if err:
- output.write(decode_text(err))
-
- return proc.returncode
+ try:
+ old_dir = os.getcwd()
+ os.chdir(cwd)
+ result = os.system(command)
+ except Exception as e:
+ raise ConanException("Error while executing"
+ " '%s'\n\t%s" % (command, str(e)))
+ finally:
+ os.chdir(old_dir)
+ return result
diff --git a/conans/model/conan_file.py b/conans/model/conan_file.py
index 7c92e7045f9..283af36194e 100644
--- a/conans/model/conan_file.py
+++ b/conans/model/conan_file.py
@@ -5,6 +5,8 @@
from conans.errors import ConanException
from conans.model.env_info import DepsEnvInfo
import os
+from conans.util.files import mkdir
+from conans.paths import RUN_LOG_NAME
def create_options(conanfile):
@@ -228,7 +230,7 @@ def run(self, command, output=True, cwd=None):
""" runs such a command in the folder the Conan
is defined
"""
- retcode = self._runner(command, output, cwd)
+ retcode = self._runner(command, output, os.path.abspath(RUN_LOG_NAME), cwd)
if retcode != 0:
raise ConanException("Error %d while executing %s" % (retcode, command))
diff --git a/conans/paths.py b/conans/paths.py
index 95d0dafba16..722a0d8a92f 100644
--- a/conans/paths.py
+++ b/conans/paths.py
@@ -34,6 +34,8 @@
EXPORT_TGZ_NAME = "conan_export.tgz"
CONAN_LINK = ".conan_link"
+RUN_LOG_NAME = "conan_run.log"
+
def conan_expand_user(path):
""" wrapper to the original expanduser function, to workaround python returning
diff --git a/conans/util/env_reader.py b/conans/util/env_reader.py
index dd077513fd0..e49e24819f8 100644
--- a/conans/util/env_reader.py
+++ b/conans/util/env_reader.py
@@ -11,18 +11,16 @@
def get_env(env_key, default=None, environment=os.environ):
'''Get the env variable associated with env_key'''
-
env_var = environment.get(env_key, default)
if env_var != default:
if isinstance(default, str):
return env_var
+ elif isinstance(default, bool):
+ return env_var == "1"
elif isinstance(default, int):
return int(env_var)
elif isinstance(default, float):
return float(env_var)
elif isinstance(default, list):
return env_var.split(",")
- elif isinstance(default, bool):
- return env_var == "1"
-
return env_var
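The coercion above keys off the type of the default value, and the reordering in this hunk matters: `bool` must be checked before `int`, because `isinstance(False, int)` is true in Python and the int branch would otherwise win. A standalone sketch of the resulting behavior (mirroring the patched `get_env`, not the exact module):

```python
import os

def get_env(env_key, default=None, environment=os.environ):
    # Coerce the raw string value to the type of `default`
    # (sketch of conans.util.env_reader after this patch).
    env_var = environment.get(env_key, default)
    if env_var != default:
        if isinstance(default, str):
            return env_var
        elif isinstance(default, bool):
            # Checked before int on purpose: bool is a subclass of int,
            # so isinstance(False, int) would match the int branch first.
            return env_var == "1"
        elif isinstance(default, int):
            return int(env_var)
        elif isinstance(default, float):
            return float(env_var)
        elif isinstance(default, list):
            return env_var.split(",")
    return env_var

print(get_env("CONAN_PRINT_RUN_COMMANDS", False,
              {"CONAN_PRINT_RUN_COMMANDS": "1"}))  # True
```

Note the "1"-means-True convention: any other string, including "true", disables a boolean flag.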
diff --git a/conans/util/log.py b/conans/util/log.py
index 7e39f3ad66e..8c5efa3c55c 100644
--- a/conans/util/log.py
+++ b/conans/util/log.py
@@ -3,6 +3,7 @@
import sys
from conans.util.env_reader import get_env
+
# #### LOGGER, MOVED FROM CONF BECAUSE OF MULTIPLE PROBLEM WITH CIRCULAR INCLUDES #####
CONAN_LOGGING_LEVEL = get_env('CONAN_LOGGING_LEVEL', logging.CRITICAL)
CONAN_LOGGING_FILE = get_env('CONAN_LOGGING_FILE', None) # None is stdout
diff --git a/conans/util/tracer.py b/conans/util/tracer.py
new file mode 100644
index 00000000000..5f674e29106
--- /dev/null
+++ b/conans/util/tracer.py
@@ -0,0 +1,131 @@
+import os
+from conans.errors import ConanException
+import fasteners
+from conans.util.log import logger
+import json
+from conans.model.ref import PackageReference, ConanFileReference
+import time
+from os.path import isdir
+import copy
+
+TRACER_ACTIONS = ["UPLOADED_RECIPE", "UPLOADED_PACKAGE",
+ "DOWNLOADED_RECIPE", "DOWNLOADED_PACKAGE",
+ "PACKAGE_BUILT_FROM_SOURCES",
+ "GOT_RECIPE_FROM_LOCAL_CACHE", "GOT_PACKAGE_FROM_LOCAL_CACHE",
+ "REST_API_CALL", "COMMAND",
+ "EXCEPTION",
+ "DOWNLOAD"]
+
+MASKED_FIELD = "**********"
+
+
+def _validate_action(action_name):
+ if action_name not in TRACER_ACTIONS:
+ raise ConanException("Unknown action %s" % action_name)
+
+tracer_file = None
+
+
+def _get_tracer_file():
+ '''
+ If CONAN_TRACE_FILE is a file in an existing dir will log to it creating the file if needed
+ Otherwise won't log anything
+ '''
+ global tracer_file
+ if tracer_file is None:
+ trace_path = os.environ.get("CONAN_TRACE_FILE", None)
+ if trace_path is not None:
+ if not os.path.exists(os.path.dirname(trace_path)):
+ raise ConanException("The specified path doesn't exist: '%s'" % trace_path)
+ if isdir(trace_path):
+ raise ConanException("CONAN_TRACE_FILE is a directory. Please, specify a file path")
+ tracer_file = trace_path
+ return tracer_file
+
+
+def _append_to_log(obj):
+ """Add a new line to the log file locking the file to protect concurrent access"""
+ if _get_tracer_file():
+ filepath = _get_tracer_file()
+ with fasteners.InterProcessLock(filepath + ".lock", logger=logger):
+ with open(filepath, "a") as logfile:
+ logfile.write(json.dumps(obj, sort_keys=True) + "\n")
+
+
+def _append_action(action_name, props):
+ """Validate the action_name and append to logs"""
+ _validate_action(action_name)
+ props["_action"] = action_name
+ props["time"] = time.time()
+ _append_to_log(props)
+
+
+# ############## LOG METHODS ######################
+
+def log_recipe_upload(conan_reference, duration, files_uploaded):
+ assert(isinstance(conan_reference, ConanFileReference))
+ _append_action("UPLOADED_RECIPE", {"_id": str(conan_reference),
+ "duration": duration,
+ "files": files_uploaded})
+
+
+def log_package_upload(package_ref, duration, files_uploaded):
+ '''files_uploaded is a dict with relative path as keys and abs path as values'''
+ assert(isinstance(package_ref, PackageReference))
+ _append_action("UPLOADED_PACKAGE", {"_id": str(package_ref),
+ "duration": duration,
+ "files": files_uploaded})
+
+
+def log_recipe_download(conan_reference, duration, remote, files_downloaded):
+ assert(isinstance(conan_reference, ConanFileReference))
+ _append_action("DOWNLOADED_RECIPE", {"_id": str(conan_reference),
+ "duration": duration,
+ "remote": remote.name,
+ "files": files_downloaded})
+
+
+def log_package_download(package_ref, duration, remote, files_downloaded):
+ assert(isinstance(package_ref, PackageReference))
+ _append_action("DOWNLOADED_PACKAGE", {"_id": str(package_ref),
+ "duration": duration,
+ "remote": remote.name,
+ "files": files_downloaded})
+
+
+def log_recipe_got_from_local_cache(conan_reference):
+ assert(isinstance(conan_reference, ConanFileReference))
+ _append_action("GOT_RECIPE_FROM_LOCAL_CACHE", {"_id": str(conan_reference)})
+
+
+def log_package_got_from_local_cache(package_ref):
+ assert(isinstance(package_ref, PackageReference))
+ _append_action("GOT_PACKAGE_FROM_LOCAL_CACHE", {"_id": str(package_ref)})
+
+
+def log_package_built(package_ref, duration, log_run=None):
+ assert(isinstance(package_ref, PackageReference))
+ _append_action("PACKAGE_BUILT_FROM_SOURCES", {"_id": str(package_ref), "duration": duration, "log": log_run})
+
+
+def log_client_rest_api_call(url, method, duration, headers):
+ headers = copy.copy(headers)
+ headers["Authorization"] = MASKED_FIELD
+ headers["X-Client-Anonymous-Id"] = MASKED_FIELD
+ _append_action("REST_API_CALL", {"method": method, "url": url,
+ "duration": duration, "headers": headers})
+
+
+def log_command(name, parameters):
+ if name == "user" and "password" in parameters:
+ parameters = copy.copy(parameters) # Ensure we don't alter any app object like args
+ parameters["password"] = MASKED_FIELD
+ _append_action("COMMAND", {"name": name, "parameters": parameters})
+
+
+def log_exception(exc, message):
+ _append_action("EXCEPTION", {"class": str(exc.__class__.__name__), "message": message})
+
+
+def log_download(url, duration):
+ _append_action("DOWNLOAD", {"url": url, "duration": duration})
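Each entry `_append_to_log` writes is a single JSON object per line, with the `_action` and `time` fields added by `_append_action`, so a `CONAN_TRACE_FILE` can be consumed as JSON lines. A minimal reader sketch; the sample entries are illustrative, not real tracer output:

```python
import json
import os
import tempfile

# Two illustrative entries shaped like the ones _append_action() appends.
sample = (
    '{"_action": "COMMAND", "name": "export", "parameters": {}, "time": 1.0}\n'
    '{"_action": "UPLOADED_RECIPE", "_id": "Hello0/0.1@lasote/stable",'
    ' "duration": 0.2, "time": 2.0}\n'
)

def read_trace(path):
    """Yield one dict per non-empty line of a CONAN_TRACE_FILE log."""
    with open(path) as logfile:
        for line in logfile:
            if line.strip():
                yield json.loads(line)

with tempfile.NamedTemporaryFile("w", suffix=".log", delete=False) as handle:
    handle.write(sample)
    trace_path = handle.name

actions = [entry["_action"] for entry in read_trace(trace_path)]
os.remove(trace_path)
print(actions)  # ['COMMAND', 'UPLOADED_RECIPE']
```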
| diff --git a/conans/test/conan_trace_file_test.py b/conans/test/conan_trace_file_test.py
new file mode 100644
index 00000000000..c17b613ef53
--- /dev/null
+++ b/conans/test/conan_trace_file_test.py
@@ -0,0 +1,54 @@
+import unittest
+from conans import tools
+from conans.test.utils.test_files import temp_folder
+import os
+from conans.model.ref import ConanFileReference
+from conans.test.utils.cpp_test_files import cpp_hello_conan_files
+from conans.test.tools import TestServer, TestClient
+from conans.util.files import load
+import json
+
+
+class ConanTraceTest(unittest.TestCase):
+
+ def setUp(self):
+ test_server = TestServer()
+ self.servers = {"default": test_server}
+ self.client = TestClient(servers=self.servers, users={"default": [("lasote", "mypass")]})
+
+ def testTraceActions(self):
+ trace_file = os.path.join(temp_folder(), "conan_trace.log")
+ with tools.environment_append({"CONAN_TRACE_FILE": trace_file}):
+ # UPLOAD A PACKAGE
+ conan_reference = ConanFileReference.loads("Hello0/0.1@lasote/stable")
+ files = cpp_hello_conan_files("Hello0", "0.1", need_patch=True, build=False)
+ self.client.save(files)
+ self.client.run("user lasote -p mypass -r default")
+ self.client.run("export lasote/stable")
+ self.client.run("install %s --build missing" % str(conan_reference))
+ self.client.run("upload %s --all" % str(conan_reference))
+
+ traces = load(trace_file)
+ self.assertNotIn("mypass", traces)
+ self.assertIn('"password": "**********"', traces)
+ self.assertIn('"Authorization": "**********"', traces)
+ self.assertIn('"X-Client-Anonymous-Id": "**********"', traces)
+ actions = traces.splitlines()
+ self.assertEquals(len(actions), 17)
+ for trace in actions:
+ doc = json.loads(trace)
+ self.assertIn("_action", doc) # Valid jsons
+
+ self.assertEquals(json.loads(actions[0])["_action"], "COMMAND")
+ self.assertEquals(json.loads(actions[0])["name"], "user")
+
+ self.assertEquals(json.loads(actions[2])["_action"], "COMMAND")
+ self.assertEquals(json.loads(actions[2])["name"], "export")
+
+ self.assertEquals(json.loads(actions[3])["_action"], "COMMAND")
+ self.assertEquals(json.loads(actions[3])["name"], "install")
+
+ self.assertEquals(json.loads(actions[4])["_action"], "GOT_RECIPE_FROM_LOCAL_CACHE")
+ self.assertEquals(json.loads(actions[4])["_id"], "Hello0/0.1@lasote/stable")
+
+ self.assertEquals(json.loads(actions[-1])["_action"], "UPLOADED_PACKAGE")
diff --git a/conans/test/runner_test.py b/conans/test/runner_test.py
index 410f744f0ae..f51a48793fd 100644
--- a/conans/test/runner_test.py
+++ b/conans/test/runner_test.py
@@ -1,10 +1,21 @@
import unittest
from conans.test.tools import TestClient
import os
+from conans.client.runner import ConanRunner
class RunnerTest(unittest.TestCase):
+ def _install_and_build(self, conanfile_text, runner=None):
+ client = TestClient(runner=runner)
+ files = {"conanfile.py": conanfile_text}
+ test_folder = os.path.join(client.current_folder, "test_folder")
+ self.assertFalse(os.path.exists(test_folder))
+ client.save(files)
+ client.run("install")
+ client.run("build")
+ return client
+
def basic_test(self):
conanfile = '''
from conans import ConanFile
@@ -17,16 +28,77 @@ def build(self):
self._runner = ConanRunner()
self.run("mkdir test_folder")
'''
- files = {"conanfile.py": conanfile}
-
- client = TestClient()
+ client = self._install_and_build(conanfile)
test_folder = os.path.join(client.current_folder, "test_folder")
- self.assertFalse(os.path.exists(test_folder))
- client.save(files)
- client.run("install")
- client.run("build")
self.assertTrue(os.path.exists(test_folder))
+ def log_test(self):
+ conanfile = '''
+from conans import ConanFile
+from conans.client.runner import ConanRunner
+import platform
+
+class ConanFileToolsTest(ConanFile):
+
+ def build(self):
+ self.run("cmake --version")
+ '''
+ # A runner logging everything
+ runner = ConanRunner(print_commands_to_output=True,
+ generate_run_log_file=True,
+ log_run_to_output=True)
+ client = self._install_and_build(conanfile, runner=runner)
+ self.assertIn("--Running---", client.user_io.out)
+ self.assertIn("> cmake --version", client.user_io.out)
+ self.assertIn("cmake version", client.user_io.out)
+ self.assertIn("Logging command output to file ", client.user_io.out)
+
+        # A runner printing commands and output, but without a log file
+ runner = ConanRunner(print_commands_to_output=True,
+ generate_run_log_file=False,
+ log_run_to_output=True)
+ client = self._install_and_build(conanfile, runner=runner)
+ self.assertIn("--Running---", client.user_io.out)
+ self.assertIn("> cmake --version", client.user_io.out)
+ self.assertIn("cmake version", client.user_io.out)
+ self.assertNotIn("Logging command output to file ", client.user_io.out)
+
+ runner = ConanRunner(print_commands_to_output=False,
+ generate_run_log_file=True,
+ log_run_to_output=True)
+ client = self._install_and_build(conanfile, runner=runner)
+ self.assertNotIn("--Running---", client.user_io.out)
+ self.assertNotIn("> cmake --version", client.user_io.out)
+ self.assertIn("cmake version", client.user_io.out)
+ self.assertIn("Logging command output to file ", client.user_io.out)
+
+ runner = ConanRunner(print_commands_to_output=False,
+ generate_run_log_file=False,
+ log_run_to_output=True)
+ client = self._install_and_build(conanfile, runner=runner)
+ self.assertNotIn("--Running---", client.user_io.out)
+ self.assertNotIn("> cmake --version", client.user_io.out)
+ self.assertIn("cmake version", client.user_io.out)
+ self.assertNotIn("Logging command output to file ", client.user_io.out)
+
+ runner = ConanRunner(print_commands_to_output=False,
+ generate_run_log_file=False,
+ log_run_to_output=False)
+ client = self._install_and_build(conanfile, runner=runner)
+ self.assertNotIn("--Running---", client.user_io.out)
+ self.assertNotIn("> cmake --version", client.user_io.out)
+ self.assertNotIn("cmake version", client.user_io.out)
+ self.assertNotIn("Logging command output to file ", client.user_io.out)
+
+ runner = ConanRunner(print_commands_to_output=False,
+ generate_run_log_file=True,
+ log_run_to_output=False)
+ client = self._install_and_build(conanfile, runner=runner)
+ self.assertNotIn("--Running---", client.user_io.out)
+ self.assertNotIn("> cmake --version", client.user_io.out)
+ self.assertNotIn("cmake version", client.user_io.out)
+ self.assertIn("Logging command output to file ", client.user_io.out)
+
def cwd_test(self):
conanfile = '''
from conans import ConanFile
diff --git a/conans/test/tools.py b/conans/test/tools.py
index 7df2ea0f4ef..43b01047828 100644
--- a/conans/test/tools.py
+++ b/conans/test/tools.py
@@ -283,7 +283,7 @@ class TestClient(object):
def __init__(self, base_folder=None, current_folder=None,
servers=None, users=None, client_version=CLIENT_VERSION,
min_server_compatible_version=MIN_SERVER_COMPATIBLE_VERSION,
- requester_class=None):
+ requester_class=None, runner=None):
"""
storage_folder: Local storage path
current_folder: Current execution folder
@@ -312,6 +312,7 @@ def __init__(self, base_folder=None, current_folder=None,
get_env("CONAN_LIBCXX", "libstdc++"))
self.requester_class = requester_class
+ self.conan_runner = runner
self.init_dynamic_vars()
@@ -358,7 +359,7 @@ def _init_collaborators(self, user_io=None):
output = TestBufferConanOutput()
self.user_io = user_io or MockedUserIO(self.users, out=output)
- self.runner = TestRunner(output)
+ self.runner = TestRunner(output, runner=self.conan_runner)
# Check if servers are real
real_servers = False
diff --git a/conans/test/utils/runner.py b/conans/test/utils/runner.py
index 65b6a1ed10a..62538d0da3a 100644
--- a/conans/test/utils/runner.py
+++ b/conans/test/utils/runner.py
@@ -5,9 +5,11 @@ class TestRunner(object):
    """Wraps Conan runner and allows redirecting all the output to a StringIO passed
in the __init__ method"""
- def __init__(self, output):
+ def __init__(self, output, runner=None):
self._output = output
- self.runner = ConanRunner()
+ self.runner = runner or ConanRunner(print_commands_to_output=True,
+ generate_run_log_file=True,
+ log_run_to_output=True)
- def __call__(self, command, output=None, cwd=None):
- return self.runner(command, output=self._output, cwd=cwd)
+ def __call__(self, command, output=None, log_filepath=None, cwd=None):
+ return self.runner(command, output=self._output, log_filepath=log_filepath, cwd=cwd)
| diff --git a/.codecov.yml b/.codecov.yml
index 087f2d7b36d..43ef6a38928 100644
--- a/.codecov.yml
+++ b/.codecov.yml
@@ -8,8 +8,10 @@ coverage:
range: "85...100"
status:
- project: yes
- patch: yes
+ project:
+ default:
+ threshold: 3%
+ patch: no
changes: no
notify:
| [
{
"components": [
{
"doc": "",
"lines": [
11,
14
],
"name": "ConanRunner.__init__",
"signature": "def __init__(self, print_commands_to_output=False, generate_run_log_file=False, log_run_to_output=True):",
"type": "function"
},
... | [
"conans/test/conan_trace_file_test.py::ConanTraceTest::testTraceActions"
] | [] | This is a feature request that requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
Feature/conan logger
- Create a log-trace with conan actions. Logged if ``CONAN_TRACE_FILE`` points to a file.
- Improved runner to optionally log the "self.run" commands to a file, controlled with these variables:
  - CONAN_PRINT_RUN_COMMANDS: Print the commands to the output, e.g. ``*****Running*******\n > cmake``
  - CONAN_LOG_RUN_TO_FILE: Create a log file in the current folder (source, build, etc.), appending the output of each self.run execution. Will use PIPE.
  - CONAN_LOG_RUN_TO_OUTPUT: Default True. If False, won't log any output (will use PIPE and discard the output).
  - If CONAN_LOG_RUN_TO_OUTPUT is set, CONAN_LOG_RUN_TO_FILE is not, and no special StringIO was passed to the self.run command, a plain system call will be used (default).
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in conans/client/runner.py]
(definition of ConanRunner.__init__:)
def __init__(self, print_commands_to_output=False, generate_run_log_file=False, log_run_to_output=True):
(definition of ConanRunner._pipe_os_call:)
def _pipe_os_call(self, command, stream_output, log_handler, cwd):
(definition of ConanRunner._pipe_os_call.get_stream_lines:)
def get_stream_lines(the_stream):
(definition of ConanRunner._simple_os_call:)
def _simple_os_call(self, command, cwd):
[end of new definitions in conans/client/runner.py]
[start of new definitions in conans/util/tracer.py]
(definition of _validate_action:)
def _validate_action(action_name):
(definition of _get_tracer_file:)
def _get_tracer_file():
"""If CONAN_TRACE_FILE is a file in an existing dir will log to it creating the file if needed
Otherwise won't log anything"""
(definition of _append_to_log:)
def _append_to_log(obj):
"""Add a new line to the log file locking the file to protect concurrent access"""
(definition of _append_action:)
def _append_action(action_name, props):
"""Validate the action_name and append to logs"""
(definition of log_recipe_upload:)
def log_recipe_upload(conan_reference, duration, files_uploaded):
(definition of log_package_upload:)
def log_package_upload(package_ref, duration, files_uploaded):
"""files_uploaded is a dict with relative path as keys and abs path as values"""
(definition of log_recipe_download:)
def log_recipe_download(conan_reference, duration, remote, files_downloaded):
(definition of log_package_download:)
def log_package_download(package_ref, duration, remote, files_downloaded):
(definition of log_recipe_got_from_local_cache:)
def log_recipe_got_from_local_cache(conan_reference):
(definition of log_package_got_from_local_cache:)
def log_package_got_from_local_cache(package_ref):
(definition of log_package_built:)
def log_package_built(package_ref, duration, log_run=None):
(definition of log_client_rest_api_call:)
def log_client_rest_api_call(url, method, duration, headers):
(definition of log_command:)
def log_command(name, parameters):
(definition of log_exception:)
def log_exception(exc, message):
(definition of log_download:)
def log_download(url, duration):
[end of new definitions in conans/util/tracer.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 4a5b19a75db9225316c8cb022a2dfb9705a2af34 | |
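The tracer definitions in this row only fix signatures. As an illustration of the intended JSON-lines trace format, here is a minimal Python sketch of how `_append_action`/`log_command` could serialize actions. The explicit `trace_path` parameter and the entry keys (`_action`, `time`, `name`, `parameters`) are assumptions for this sketch: the real implementation would resolve the path from the `CONAN_TRACE_FILE` environment variable and lock the file against concurrent writers, both of which are omitted here.

```python
import json
import os
import tempfile
import time

def _append_action(action_name, props, trace_path):
    """Append one action as a single self-describing JSON line to the trace file."""
    entry = dict(props)
    entry["_action"] = action_name
    entry["time"] = time.time()
    with open(trace_path, "a") as handle:
        handle.write(json.dumps(entry) + "\n")

def log_command(name, parameters, trace_path):
    # A command invocation is just one more action kind in the trace.
    _append_action("COMMAND", {"name": name, "parameters": parameters}, trace_path)

# Tiny demonstration against a throwaway trace file.
trace_path = os.path.join(tempfile.mkdtemp(), "conan_trace.log")
log_command("install", ["--build", "missing"], trace_path)
entries = [json.loads(line) for line in open(trace_path)]
```

Because each call appends one JSON object per line, the trace file can be tailed or post-processed line by line without parsing the whole file.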
conan-io__conan-802 | 802 | conan-io/conan | null | 1a5eaec00d5f880190e32558b605314b6016d688 | 2016-12-26T11:39:48Z | diff --git a/conans/client/conf/__init__.py b/conans/client/conf/__init__.py
index 7330cf80b76..152e2d6aa22 100644
--- a/conans/client/conf/__init__.py
+++ b/conans/client/conf/__init__.py
@@ -7,6 +7,7 @@
from conans.model.values import Values
import urllib
from conans.paths import conan_expand_user
+from collections import OrderedDict
MIN_SERVER_COMPATIBLE_VERSION = '0.12.0'
@@ -97,5 +98,36 @@ def proxies(self):
@property
def settings_defaults(self):
default_settings = self.get_conf("settings_defaults")
+ default_settings = self._mix_settings_with_env(default_settings)
values = Values.from_list(default_settings)
return values
+
+ def _mix_settings_with_env(self, settings):
+ """Reads CONAN_ENV_XXXX variables from environment
+ and if it's defined uses these value instead of the default
+ from conf file. If you specify a compiler with ENV variable you
+ need to specify all the subsettings, the file defaulted will be
+ ignored"""
+
+ def get_env_value(name):
+ env_name = "CONAN_ENV_%s" % name.upper().replace(".", "_")
+ return os.getenv(env_name, None)
+
+ def get_setting_name(env_name):
+ return env_name[10:].lower().replace("_", ".")
+
+ ret = OrderedDict()
+ for name, value in settings:
+ if get_env_value(name):
+ ret[name] = get_env_value(name)
+ else:
+ # being a subsetting, if parent exist in env discard this, because
+ # env doesn't define this setting. EX: env=>Visual Studio but
+ # env doesn't define compiler.libcxx
+ if "." not in name or not get_env_value(name.split(".")[0]):
+ ret[name] = value
+ # Now read if there are more env variables
+ for env, value in sorted(os.environ.items()):
+ if env.startswith("CONAN_ENV_") and get_setting_name(env) not in ret:
+ ret[get_setting_name(env)] = value
+ return list(ret.items())
| diff --git a/conans/test/client_conf_test.py b/conans/test/client_conf_test.py
new file mode 100644
index 00000000000..006ef8fc51f
--- /dev/null
+++ b/conans/test/client_conf_test.py
@@ -0,0 +1,80 @@
+import unittest
+from conans.test.utils.test_files import temp_folder
+from conans.client.conf import ConanClientConfigParser
+from conans.util.files import save
+from conans.client.client_cache import CONAN_CONF
+import os
+from conans import tools
+
+
+default_client_conf = '''[storage]
+path: ~/.conan/data
+
+[proxies]
+[settings_defaults]
+arch=x86_64
+build_type=Release
+compiler=gcc
+compiler.libcxx=libstdc++
+compiler.version=4.9
+os=Linux
+'''
+
+
+class ClientConfTest(unittest.TestCase):
+
+ def env_setting_override_test(self):
+ tmp_dir = temp_folder()
+ save(os.path.join(tmp_dir, CONAN_CONF), default_client_conf)
+ config = ConanClientConfigParser(os.path.join(tmp_dir, CONAN_CONF))
+
+ # If I don't specify an ENV for compiler, the subsettings are kept,
+ # except the compiler version that I'm overriding
+ with tools.environment_append({"CONAN_ENV_COMPILER_VERSION": "4.6"}):
+ settings = config.settings_defaults
+ self.assertEquals(settings.as_list(), [("arch", "x86_64"),
+ ("build_type", "Release"),
+ ("compiler", "gcc"),
+ ("compiler.libcxx", "libstdc++"),
+ ("compiler.version", "4.6"),
+ ("os", "Linux")])
+ with tools.environment_append({}):
+ settings = config.settings_defaults
+ self.assertEquals(settings.as_list(), [("arch", "x86_64"),
+ ("build_type", "Release"),
+ ("compiler", "gcc"),
+ ("compiler.libcxx", "libstdc++"),
+ ("compiler.version", "4.9"),
+ ("os", "Linux")])
+
+ # If compiler is overwritten compiler subsettings are not assigned
+ with tools.environment_append({"CONAN_ENV_COMPILER": "Visual Studio"}):
+ settings = config.settings_defaults
+ self.assertEquals(settings.as_list(), [("arch", "x86_64"),
+ ("build_type", "Release"),
+ ("compiler", "Visual Studio"),
+ ("os", "Linux")])
+
+ with tools.environment_append({"CONAN_ENV_COMPILER": "Visual Studio",
+ "CONAN_ENV_COMPILER_VERSION": "14",
+ "CONAN_ENV_COMPILER_RUNTIME": "MDd"}):
+ settings = config.settings_defaults
+ self.assertEquals(dict(settings.as_list()), dict([("arch", "x86_64"),
+ ("build_type", "Release"),
+ ("compiler", "Visual Studio"),
+ ("compiler.version", "14"),
+ ("compiler.runtime", "MDd"),
+ ("os", "Linux")]))
+
+ # Specified settings are applied in order (first fake and then fake.setting)
+ with tools.environment_append({"CONAN_ENV_FAKE": "Fake1",
+ "CONAN_ENV_FAKE_SETTING": "Fake"}):
+ settings = config.settings_defaults
+ self.assertEquals(dict(settings.as_list()), dict([("arch", "x86_64"),
+ ("build_type", "Release"),
+ ("compiler", "gcc"),
+ ("compiler.libcxx", "libstdc++"),
+ ("compiler.version", "4.9"),
+ ("os", "Linux"),
+ ("fake", "Fake1"),
+ ("fake.setting", "Fake")]))
diff --git a/conans/test/integration/settings_override_test.py b/conans/test/integration/settings_override_test.py
index d848aac6595..099e90dec1b 100644
--- a/conans/test/integration/settings_override_test.py
+++ b/conans/test/integration/settings_override_test.py
@@ -5,6 +5,7 @@
from conans.model.ref import ConanFileReference
from conans.util.files import load
import os
+from conans import tools
class SettingsOverrideTest(unittest.TestCase):
@@ -68,6 +69,19 @@ def test_override_in_non_existing_recipe(self):
self.assertIn("COMPILER=> MinGWBuild Visual Studio", self.client.user_io.out)
self.assertIn("COMPILER=> VisualBuild Visual Studio", self.client.user_io.out)
+ def test_override_setting_with_env_variables(self):
+ files = cpp_hello_conan_files(name="VisualBuild",
+ version="0.1", build=False, deps=["MinGWBuild/0.1@lasote/testing"])
+ self._patch_build_to_print_compiler(files)
+ self.client.save(files)
+ self.client.run("export lasote/testing")
+ with tools.environment_append({"CONAN_ENV_COMPILER": "Visual Studio",
+ "CONAN_ENV_COMPILER_VERSION": "14",
+ "CONAN_ENV_COMPILER_RUNTIME": "MD"}):
+ self.client.run("install VisualBuild/0.1@lasote/testing --build missing")
+
+ self.assertIn("COMPILER=> MinGWBuild Visual Studio", self.client.user_io.out)
+
def _patch_build_to_print_compiler(self, files):
files[CONANFILE] = files[CONANFILE] + '''
def build(self):
| [
{
"components": [
{
"doc": "Reads CONAN_ENV_XXXX variables from environment\nand if it's defined uses these value instead of the default\nfrom conf file. If you specify a compiler with ENV variable you\nneed to specify all the subsettings, the file defaulted will be\nignored",
"lines": [
... | [
"conans/test/integration/settings_override_test.py::SettingsOverrideTest::test_override_setting_with_env_variables"
] | [
"conans/test/integration/settings_override_test.py::SettingsOverrideTest::test_non_existing_setting",
"conans/test/integration/settings_override_test.py::SettingsOverrideTest::test_override",
"conans/test/integration/settings_override_test.py::SettingsOverrideTest::test_override_in_non_existing_recipe"
] | This is a feature request that requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
Allow to define default settings as Environment variables
#489
It allows overriding the default settings with environment variables. If a setting with subsettings, like compiler, is defined through an environment variable (CONAN_ENV_COMPILER), its subsettings from the conf file are ignored and need to be defined as environment variables too.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in conans/client/conf/__init__.py]
(definition of ConanClientConfigParser._mix_settings_with_env:)
def _mix_settings_with_env(self, settings):
"""Reads CONAN_ENV_XXXX variables from environment
and if it's defined uses these value instead of the default
from conf file. If you specify a compiler with ENV variable you
need to specify all the subsettings, the file defaulted will be
ignored"""
(definition of ConanClientConfigParser._mix_settings_with_env.get_env_value:)
def get_env_value(name):
(definition of ConanClientConfigParser._mix_settings_with_env.get_setting_name:)
def get_setting_name(env_name):
[end of new definitions in conans/client/conf/__init__.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 4a5b19a75db9225316c8cb022a2dfb9705a2af34 | ||
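To make the behaviour of `_mix_settings_with_env` from the patch above concrete, here is a standalone sketch of the same merging logic. Unlike the patched method, it takes an explicit `environ` mapping instead of always reading `os.environ` — an assumption made purely so the example is self-contained and testable.

```python
import os
from collections import OrderedDict

def mix_settings_with_env(settings, environ=None):
    """Merge conf-file (name, value) pairs with CONAN_ENV_* overrides."""
    environ = os.environ if environ is None else environ

    def env_value(name):
        return environ.get("CONAN_ENV_%s" % name.upper().replace(".", "_"))

    def setting_name(env_name):
        return env_name[len("CONAN_ENV_"):].lower().replace("_", ".")

    ret = OrderedDict()
    for name, value in settings:
        override = env_value(name)
        if override is not None:
            ret[name] = override
        # A subsetting is dropped when its parent was overridden by env:
        # e.g. CONAN_ENV_COMPILER discards compiler.libcxx from the conf file.
        elif "." not in name or env_value(name.split(".")[0]) is None:
            ret[name] = value
    # Env variables with no conf-file counterpart are appended at the end.
    for env_name, value in sorted(environ.items()):
        if env_name.startswith("CONAN_ENV_") and setting_name(env_name) not in ret:
            ret[setting_name(env_name)] = value
    return list(ret.items())

conf_settings = [("arch", "x86_64"), ("compiler", "gcc"),
                 ("compiler.libcxx", "libstdc++"),
                 ("compiler.version", "4.9"), ("os", "Linux")]
env = {"CONAN_ENV_COMPILER": "Visual Studio",
       "CONAN_ENV_COMPILER_VERSION": "14"}
result = mix_settings_with_env(conf_settings, env)
```

This mirrors the expectations in `client_conf_test.py` above: overriding the compiler discards `compiler.libcxx` from the conf file, while `compiler.version` takes its value from the environment.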
scikit-learn__scikit-learn-8075 | 8,075 | scikit-learn/scikit-learn | 0.20 | beb2aa0395cdbc65843db9acf1a9fdfd25e18e7e | 2016-12-18T10:10:43Z | diff --git a/doc/modules/classes.rst b/doc/modules/classes.rst
index f217d7848c9b2..5a1d5fcda6d6e 100644
--- a/doc/modules/classes.rst
+++ b/doc/modules/classes.rst
@@ -656,7 +656,8 @@ Kernels:
impute.SimpleImputer
impute.ChainedImputer
-
+ impute.MissingIndicator
+
.. _kernel_approximation_ref:
:mod:`sklearn.kernel_approximation` Kernel Approximation
diff --git a/doc/modules/impute.rst b/doc/modules/impute.rst
index 0f9089c981782..84c8538f1f05b 100644
--- a/doc/modules/impute.rst
+++ b/doc/modules/impute.rst
@@ -121,7 +121,6 @@ Both :class:`SimpleImputer` and :class:`ChainedImputer` can be used in a Pipelin
as a way to build a composite estimator that supports imputation.
See :ref:`sphx_glr_auto_examples_plot_missing_values.py`.
-
.. _multiple_imputation:
Multiple vs. Single Imputation
@@ -142,3 +141,49 @@ random seeds with the ``n_imputations`` parameter set to 1.
Note that a call to the ``transform`` method of :class:`ChainedImputer` is not
allowed to change the number of samples. Therefore multiple imputations cannot be
achieved by a single call to ``transform``.
+
+.. _missing_indicator:
+
+Marking imputed values
+======================
+
+The :class:`MissingIndicator` transformer is useful to transform a dataset into
+corresponding binary matrix indicating the presence of missing values in the
+dataset. This transformation is useful in conjunction with imputation. When
+using imputation, preserving the information about which values had been
+missing can be informative.
+
+``NaN`` is usually used as the placeholder for missing values. However, it
+enforces the data type to be float. The parameter ``missing_values`` allows to
+specify other placeholder such as integer. In the following example, we will
+use ``-1`` as missing values::
+
+ >>> from sklearn.impute import MissingIndicator
+ >>> X = np.array([[-1, -1, 1, 3],
+ ... [4, -1, 0, -1],
+ ... [8, -1, 1, 0]])
+ >>> indicator = MissingIndicator(missing_values=-1)
+ >>> mask_missing_values_only = indicator.fit_transform(X)
+ >>> mask_missing_values_only
+ array([[ True, True, False],
+ [False, True, True],
+ [False, True, False]])
+
+The ``features`` parameter is used to choose the features for which the mask is
+constructed. By default, it is ``'missing-only'`` which returns the imputer
+mask of the features containing missing values at ``fit`` time::
+
+ >>> indicator.features_
+ array([0, 1, 3])
+
+The ``features`` parameter can be set to ``'all'`` to returned all features
+whether or not they contain missing values::
+
+ >>> indicator = MissingIndicator(missing_values=-1, features="all")
+ >>> mask_all = indicator.fit_transform(X)
+ >>> mask_all
+ array([[ True, True, False, False],
+ [False, True, False, True],
+ [False, True, False, False]])
+ >>> indicator.features_
+ array([0, 1, 2, 3])
diff --git a/doc/whats_new/v0.20.rst b/doc/whats_new/v0.20.rst
index d9aa052d740db..cfb7835397e38 100644
--- a/doc/whats_new/v0.20.rst
+++ b/doc/whats_new/v0.20.rst
@@ -149,6 +149,10 @@ Preprocessing
back to the original space via an inverse transform. :issue:`9041` by
`Andreas Müller`_ and :user:`Guillaume Lemaitre <glemaitre>`.
+- Added :class:`MissingIndicator` which generates a binary indicator for
+ missing values. :issue:`8075` by :user:`Maniteja Nandana <maniteja123>` and
+ :user:`Guillaume Lemaitre <glemaitre>`.
+
- Added :class:`impute.ChainedImputer`, which is a strategy for imputing missing
values by modeling each feature with missing values as a function of
other features in a round-robin fashion. :issue:`8478` by
diff --git a/examples/plot_missing_values.py b/examples/plot_missing_values.py
index d238a16592edb..8cd20087dfb0f 100644
--- a/examples/plot_missing_values.py
+++ b/examples/plot_missing_values.py
@@ -4,15 +4,19 @@
====================================================
Missing values can be replaced by the mean, the median or the most frequent
-value using the basic ``SimpleImputer``.
+value using the basic :func:`sklearn.impute.SimpleImputer`.
The median is a more robust estimator for data with high magnitude variables
which could dominate results (otherwise known as a 'long tail').
-Another option is the ``ChainedImputer``. This uses round-robin linear
-regression, treating every variable as an output in turn. The version
-implemented assumes Gaussian (output) variables. If your features are obviously
-non-Normal, consider transforming them to look more Normal so as to improve
-performance.
+Another option is the :func:`sklearn.impute.ChainedImputer`. This uses
+round-robin linear regression, treating every variable as an output in
+turn. The version implemented assumes Gaussian (output) variables. If your
+features are obviously non-Normal, consider transforming them to look more
+Normal so as to improve performance.
+
+In addition of using an imputing method, we can also keep an indication of the
+missing information using :func:`sklearn.impute.MissingIndicator` which might
+carry some information.
"""
import numpy as np
@@ -21,8 +25,8 @@
from sklearn.datasets import load_diabetes
from sklearn.datasets import load_boston
from sklearn.ensemble import RandomForestRegressor
-from sklearn.pipeline import Pipeline
-from sklearn.impute import SimpleImputer, ChainedImputer
+from sklearn.pipeline import make_pipeline, make_union
+from sklearn.impute import SimpleImputer, ChainedImputer, MissingIndicator
from sklearn.model_selection import cross_val_score
rng = np.random.RandomState(0)
@@ -60,18 +64,18 @@ def get_results(dataset):
X_missing = X_full.copy()
X_missing[np.where(missing_samples)[0], missing_features] = 0
y_missing = y_full.copy()
- estimator = Pipeline([("imputer", SimpleImputer(missing_values=0,
- strategy="mean")),
- ("forest", RandomForestRegressor(random_state=0,
- n_estimators=100))])
+ estimator = make_pipeline(
+ make_union(SimpleImputer(missing_values=0, strategy="mean"),
+ MissingIndicator(missing_values=0)),
+ RandomForestRegressor(random_state=0, n_estimators=100))
mean_impute_scores = cross_val_score(estimator, X_missing, y_missing,
scoring='neg_mean_squared_error')
# Estimate the score after chained imputation of the missing values
- estimator = Pipeline([("imputer", ChainedImputer(missing_values=0,
- random_state=0)),
- ("forest", RandomForestRegressor(random_state=0,
- n_estimators=100))])
+ estimator = make_pipeline(
+ make_union(ChainedImputer(missing_values=0, random_state=0),
+ MissingIndicator(missing_values=0)),
+ RandomForestRegressor(random_state=0, n_estimators=100))
chained_impute_scores = cross_val_score(estimator, X_missing, y_missing,
scoring='neg_mean_squared_error')
diff --git a/sklearn/impute.py b/sklearn/impute.py
index 72dd1ac5c24ca..fec9d8b0d7a8d 100644
--- a/sklearn/impute.py
+++ b/sklearn/impute.py
@@ -35,6 +35,7 @@
'predictor'])
__all__ = [
+ 'MissingIndicator',
'SimpleImputer',
'ChainedImputer',
]
@@ -975,3 +976,225 @@ def fit(self, X, y=None):
"""
self.fit_transform(X)
return self
+
+
+class MissingIndicator(BaseEstimator, TransformerMixin):
+ """Binary indicators for missing values.
+
+ Parameters
+ ----------
+ missing_values : number, string, np.nan (default) or None
+ The placeholder for the missing values. All occurrences of
+ `missing_values` will be imputed.
+
+ features : str, optional
+ Whether the imputer mask should represent all or a subset of
+ features.
+
+ - If "missing-only" (default), the imputer mask will only represent
+ features containing missing values during fit time.
+ - If "all", the imputer mask will represent all features.
+
+ sparse : boolean or "auto", optional
+ Whether the imputer mask format should be sparse or dense.
+
+ - If "auto" (default), the imputer mask will be of same type as
+ input.
+ - If True, the imputer mask will be a sparse matrix.
+ - If False, the imputer mask will be a numpy array.
+
+ error_on_new : boolean, optional
+ If True (default), transform will raise an error when there are
+ features with missing values in transform that have no missing values
+ in fit This is applicable only when ``features="missing-only"``.
+
+ Attributes
+ ----------
+ features_ : ndarray, shape (n_missing_features,) or (n_features,)
+ The features indices which will be returned when calling ``transform``.
+ They are computed during ``fit``. For ``features='all'``, it is
+ to ``range(n_features)``.
+
+ Examples
+ --------
+ >>> import numpy as np
+ >>> from sklearn.impute import MissingIndicator
+ >>> X1 = np.array([[np.nan, 1, 3],
+ ... [4, 0, np.nan],
+ ... [8, 1, 0]])
+ >>> X2 = np.array([[5, 1, np.nan],
+ ... [np.nan, 2, 3],
+ ... [2, 4, 0]])
+ >>> indicator = MissingIndicator()
+ >>> indicator.fit(X1)
+ MissingIndicator(error_on_new=True, features='missing-only',
+ missing_values=nan, sparse='auto')
+ >>> X2_tr = indicator.transform(X2)
+ >>> X2_tr
+ array([[False, True],
+ [ True, False],
+ [False, False]])
+
+ """
+
+ def __init__(self, missing_values=np.nan, features="missing-only",
+ sparse="auto", error_on_new=True):
+ self.missing_values = missing_values
+ self.features = features
+ self.sparse = sparse
+ self.error_on_new = error_on_new
+
+ def _get_missing_features_info(self, X):
+ """Compute the imputer mask and the indices of the features
+ containing missing values.
+
+ Parameters
+ ----------
+ X : {ndarray or sparse matrix}, shape (n_samples, n_features)
+ The input data with missing values. Note that ``X`` has been
+ checked in ``fit`` and ``transform`` before to call this function.
+
+ Returns
+ -------
+ imputer_mask : {ndarray or sparse matrix}, shape \
+(n_samples, n_features) or (n_samples, n_features_with_missing)
+ The imputer mask of the original data.
+
+ features_with_missing : ndarray, shape (n_features_with_missing)
+ The features containing missing values.
+
+ """
+ if sparse.issparse(X) and self.missing_values != 0:
+ mask = _get_mask(X.data, self.missing_values)
+
+ # The imputer mask will be constructed with the same sparse format
+ # as X.
+ sparse_constructor = (sparse.csr_matrix if X.format == 'csr'
+ else sparse.csc_matrix)
+ imputer_mask = sparse_constructor(
+ (mask, X.indices.copy(), X.indptr.copy()),
+ shape=X.shape, dtype=bool)
+
+ missing_values_mask = imputer_mask.copy()
+ missing_values_mask.eliminate_zeros()
+ features_with_missing = (
+ np.flatnonzero(np.diff(missing_values_mask.indptr))
+ if missing_values_mask.format == 'csc'
+ else np.unique(missing_values_mask.indices))
+
+ if self.sparse is False:
+ imputer_mask = imputer_mask.toarray()
+ elif imputer_mask.format == 'csr':
+ imputer_mask = imputer_mask.tocsc()
+ else:
+ if sparse.issparse(X):
+ # case of sparse matrix with 0 as missing values. Implicit and
+ # explicit zeros are considered as missing values.
+ X = X.toarray()
+ imputer_mask = _get_mask(X, self.missing_values)
+ features_with_missing = np.flatnonzero(imputer_mask.sum(axis=0))
+
+ if self.sparse is True:
+ imputer_mask = sparse.csc_matrix(imputer_mask)
+
+ return imputer_mask, features_with_missing
+
+ def fit(self, X, y=None):
+ """Fit the transformer on X.
+
+ Parameters
+ ----------
+ X : {array-like, sparse matrix}, shape (n_samples, n_features)
+ Input data, where ``n_samples`` is the number of samples and
+ ``n_features`` is the number of features.
+
+ Returns
+ -------
+ self : object
+ Returns self.
+ """
+ if not is_scalar_nan(self.missing_values):
+ force_all_finite = True
+ else:
+ force_all_finite = "allow-nan"
+ X = check_array(X, accept_sparse=('csc', 'csr'),
+ force_all_finite=force_all_finite)
+ _check_inputs_dtype(X, self.missing_values)
+
+ self._n_features = X.shape[1]
+
+ if self.features not in ('missing-only', 'all'):
+ raise ValueError("'features' has to be either 'missing-only' or "
+ "'all'. Got {} instead.".format(self.features))
+
+ if not ((isinstance(self.sparse, six.string_types) and
+ self.sparse == "auto") or isinstance(self.sparse, bool)):
+ raise ValueError("'sparse' has to be a boolean or 'auto'. "
+ "Got {!r} instead.".format(self.sparse))
+
+ self.features_ = (self._get_missing_features_info(X)[1]
+ if self.features == 'missing-only'
+ else np.arange(self._n_features))
+
+ return self
+
+ def transform(self, X):
+ """Generate missing values indicator for X.
+
+ Parameters
+ ----------
+ X : {array-like, sparse matrix}, shape (n_samples, n_features)
+ The input data to complete.
+
+ Returns
+ -------
+ Xt : {ndarray or sparse matrix}, shape (n_samples, n_features)
+ The missing indicator for input data. The data type of ``Xt``
+ will be boolean.
+
+ """
+ check_is_fitted(self, "features_")
+
+ if not is_scalar_nan(self.missing_values):
+ force_all_finite = True
+ else:
+ force_all_finite = "allow-nan"
+ X = check_array(X, accept_sparse=('csc', 'csr'),
+ force_all_finite=force_all_finite)
+ _check_inputs_dtype(X, self.missing_values)
+
+ if X.shape[1] != self._n_features:
+ raise ValueError("X has a different number of features "
+ "than during fitting.")
+
+ imputer_mask, features = self._get_missing_features_info(X)
+
+ if self.features == "missing-only":
+ features_diff_fit_trans = np.setdiff1d(features, self.features_)
+ if (self.error_on_new and features_diff_fit_trans.size > 0):
+ raise ValueError("The features {} have missing values "
+ "in transform but have no missing values "
+ "in fit.".format(features_diff_fit_trans))
+
+ if (self.features_.size > 0 and
+ self.features_.size < self._n_features):
+ imputer_mask = imputer_mask[:, self.features_]
+
+ return imputer_mask
+
+ def fit_transform(self, X, y=None):
+ """Generate missing values indicator for X.
+
+ Parameters
+ ----------
+ X : {array-like, sparse matrix}, shape (n_samples, n_features)
+ The input data to complete.
+
+ Returns
+ -------
+ Xt : {ndarray or sparse matrix}, shape (n_samples, n_features)
+ The missing indicator for input data. The data type of ``Xt``
+ will be boolean.
+
+ """
+ return self.fit(X, y).transform(X)
| diff --git a/sklearn/tests/test_impute.py b/sklearn/tests/test_impute.py
index b286c5006d431..7fb1b0ac3280b 100644
--- a/sklearn/tests/test_impute.py
+++ b/sklearn/tests/test_impute.py
@@ -13,6 +13,7 @@
from sklearn.utils.testing import assert_array_almost_equal
from sklearn.utils.testing import assert_false
+from sklearn.impute import MissingIndicator
from sklearn.impute import SimpleImputer, ChainedImputer
from sklearn.dummy import DummyRegressor
from sklearn.linear_model import BayesianRidge, ARDRegression
@@ -707,6 +708,121 @@ def test_chained_imputer_additive_matrix():
assert_allclose(X_test_filled, X_test_est, atol=0.01)
+@pytest.mark.parametrize(
+ "X_fit, X_trans, params, msg_err",
+ [(np.array([[-1, 1], [1, 2]]), np.array([[-1, 1], [1, -1]]),
+ {'features': 'missing-only', 'sparse': 'auto'},
+ 'have missing values in transform but have no missing values in fit'),
+ (np.array([[-1, 1], [1, 2]]), np.array([[-1, 1], [1, 2]]),
+ {'features': 'random', 'sparse': 'auto'},
+ "'features' has to be either 'missing-only' or 'all'"),
+ (np.array([[-1, 1], [1, 2]]), np.array([[-1, 1], [1, 2]]),
+ {'features': 'all', 'sparse': 'random'},
+ "'sparse' has to be a boolean or 'auto'")]
+)
+def test_missing_indicator_error(X_fit, X_trans, params, msg_err):
+ indicator = MissingIndicator(missing_values=-1)
+ indicator.set_params(**params)
+ with pytest.raises(ValueError, match=msg_err):
+ indicator.fit(X_fit).transform(X_trans)
+
+
+@pytest.mark.parametrize(
+ "missing_values, dtype",
+ [(np.nan, np.float64),
+ (0, np.int32),
+ (-1, np.int32)])
+@pytest.mark.parametrize(
+ "arr_type",
+ [np.array, sparse.csc_matrix, sparse.csr_matrix, sparse.coo_matrix,
+ sparse.lil_matrix, sparse.bsr_matrix])
+@pytest.mark.parametrize(
+ "param_features, n_features, features_indices",
+ [('missing-only', 2, np.array([0, 1])),
+ ('all', 3, np.array([0, 1, 2]))])
+def test_missing_indicator_new(missing_values, arr_type, dtype, param_features,
+ n_features, features_indices):
+ X_fit = np.array([[missing_values, missing_values, 1],
+ [4, missing_values, 2]])
+ X_trans = np.array([[missing_values, missing_values, 1],
+ [4, 12, 10]])
+ X_fit_expected = np.array([[1, 1, 0], [0, 1, 0]])
+ X_trans_expected = np.array([[1, 1, 0], [0, 0, 0]])
+
+ # convert the input to the right array format and right dtype
+ X_fit = arr_type(X_fit).astype(dtype)
+ X_trans = arr_type(X_trans).astype(dtype)
+ X_fit_expected = X_fit_expected.astype(dtype)
+ X_trans_expected = X_trans_expected.astype(dtype)
+
+ indicator = MissingIndicator(missing_values=missing_values,
+ features=param_features,
+ sparse=False)
+ X_fit_mask = indicator.fit_transform(X_fit)
+ X_trans_mask = indicator.transform(X_trans)
+
+ assert X_fit_mask.shape[1] == n_features
+ assert X_trans_mask.shape[1] == n_features
+
+ assert_array_equal(indicator.features_, features_indices)
+ assert_allclose(X_fit_mask, X_fit_expected[:, features_indices])
+ assert_allclose(X_trans_mask, X_trans_expected[:, features_indices])
+
+ assert X_fit_mask.dtype == bool
+ assert X_trans_mask.dtype == bool
+ assert isinstance(X_fit_mask, np.ndarray)
+ assert isinstance(X_trans_mask, np.ndarray)
+
+ indicator.set_params(sparse=True)
+ X_fit_mask_sparse = indicator.fit_transform(X_fit)
+ X_trans_mask_sparse = indicator.transform(X_trans)
+
+ assert X_fit_mask_sparse.dtype == bool
+ assert X_trans_mask_sparse.dtype == bool
+ assert X_fit_mask_sparse.format == 'csc'
+ assert X_trans_mask_sparse.format == 'csc'
+ assert_allclose(X_fit_mask_sparse.toarray(), X_fit_mask)
+ assert_allclose(X_trans_mask_sparse.toarray(), X_trans_mask)
+
+
+@pytest.mark.parametrize("param_sparse", [True, False, 'auto'])
+@pytest.mark.parametrize("missing_values", [np.nan, 0])
+@pytest.mark.parametrize(
+ "arr_type",
+ [np.array, sparse.csc_matrix, sparse.csr_matrix, sparse.coo_matrix])
+def test_missing_indicator_sparse_param(arr_type, missing_values,
+ param_sparse):
+ # check the format of the output with different sparse parameter
+ X_fit = np.array([[missing_values, missing_values, 1],
+ [4, missing_values, 2]])
+ X_trans = np.array([[missing_values, missing_values, 1],
+ [4, 12, 10]])
+ X_fit = arr_type(X_fit).astype(np.float64)
+ X_trans = arr_type(X_trans).astype(np.float64)
+
+ indicator = MissingIndicator(missing_values=missing_values,
+ sparse=param_sparse)
+ X_fit_mask = indicator.fit_transform(X_fit)
+ X_trans_mask = indicator.transform(X_trans)
+
+ if param_sparse is True:
+ assert X_fit_mask.format == 'csc'
+ assert X_trans_mask.format == 'csc'
+ elif param_sparse == 'auto' and missing_values == 0:
+ assert isinstance(X_fit_mask, np.ndarray)
+ assert isinstance(X_trans_mask, np.ndarray)
+ elif param_sparse is False:
+ assert isinstance(X_fit_mask, np.ndarray)
+ assert isinstance(X_trans_mask, np.ndarray)
+ else:
+ if sparse.issparse(X_fit):
+ assert X_fit_mask.format == 'csc'
+ assert X_trans_mask.format == 'csc'
+ else:
+ assert isinstance(X_fit_mask, np.ndarray)
+ assert isinstance(X_trans_mask, np.ndarray)
+
+
@pytest.mark.parametrize("imputer_constructor",
[SimpleImputer, ChainedImputer])
@pytest.mark.parametrize(
diff --git a/sklearn/utils/estimator_checks.py b/sklearn/utils/estimator_checks.py
index d25abbe6377db..65112ad9d382e 100644
--- a/sklearn/utils/estimator_checks.py
+++ b/sklearn/utils/estimator_checks.py
@@ -77,7 +77,7 @@
'RANSACRegressor', 'RadiusNeighborsRegressor',
'RandomForestRegressor', 'Ridge', 'RidgeCV']
-ALLOW_NAN = ['Imputer', 'SimpleImputer', 'ChainedImputer',
+ALLOW_NAN = ['Imputer', 'SimpleImputer', 'ChainedImputer', 'MissingIndicator',
'MaxAbsScaler', 'MinMaxScaler', 'RobustScaler', 'StandardScaler',
'PowerTransformer', 'QuantileTransformer']
| diff --git a/doc/modules/classes.rst b/doc/modules/classes.rst
index f217d7848c9b2..5a1d5fcda6d6e 100644
--- a/doc/modules/classes.rst
+++ b/doc/modules/classes.rst
@@ -656,7 +656,8 @@ Kernels:
impute.SimpleImputer
impute.ChainedImputer
-
+ impute.MissingIndicator
+
.. _kernel_approximation_ref:
:mod:`sklearn.kernel_approximation` Kernel Approximation
diff --git a/doc/modules/impute.rst b/doc/modules/impute.rst
index 0f9089c981782..84c8538f1f05b 100644
--- a/doc/modules/impute.rst
+++ b/doc/modules/impute.rst
@@ -121,7 +121,6 @@ Both :class:`SimpleImputer` and :class:`ChainedImputer` can be used in a Pipelin
as a way to build a composite estimator that supports imputation.
See :ref:`sphx_glr_auto_examples_plot_missing_values.py`.
-
.. _multiple_imputation:
Multiple vs. Single Imputation
@@ -142,3 +141,49 @@ random seeds with the ``n_imputations`` parameter set to 1.
Note that a call to the ``transform`` method of :class:`ChainedImputer` is not
allowed to change the number of samples. Therefore multiple imputations cannot be
achieved by a single call to ``transform``.
+
+.. _missing_indicator:
+
+Marking imputed values
+======================
+
+The :class:`MissingIndicator` transformer is useful to transform a dataset into
+a corresponding binary matrix indicating the presence of missing values in the
+dataset. This transformation is useful in conjunction with imputation. When
+using imputation, preserving the information about which values had been
+missing can be informative.
+
+``NaN`` is usually used as the placeholder for missing values. However, it
+enforces the data type to be float. The parameter ``missing_values`` allows
+specifying another placeholder, such as an integer. In the following example, we will
+use ``-1`` as missing values::
+
+ >>> from sklearn.impute import MissingIndicator
+ >>> X = np.array([[-1, -1, 1, 3],
+ ... [4, -1, 0, -1],
+ ... [8, -1, 1, 0]])
+ >>> indicator = MissingIndicator(missing_values=-1)
+ >>> mask_missing_values_only = indicator.fit_transform(X)
+ >>> mask_missing_values_only
+ array([[ True, True, False],
+ [False, True, True],
+ [False, True, False]])
+
+The ``features`` parameter is used to choose the features for which the mask is
+constructed. By default, it is ``'missing-only'`` which returns the imputer
+mask of the features containing missing values at ``fit`` time::
+
+ >>> indicator.features_
+ array([0, 1, 3])
+
+The ``features`` parameter can be set to ``'all'`` to return all features
+whether or not they contain missing values::
+
+ >>> indicator = MissingIndicator(missing_values=-1, features="all")
+ >>> mask_all = indicator.fit_transform(X)
+ >>> mask_all
+ array([[ True, True, False, False],
+ [False, True, False, True],
+ [False, True, False, False]])
+ >>> indicator.features_
+ array([0, 1, 2, 3])
diff --git a/doc/whats_new/v0.20.rst b/doc/whats_new/v0.20.rst
index d9aa052d740db..cfb7835397e38 100644
--- a/doc/whats_new/v0.20.rst
+++ b/doc/whats_new/v0.20.rst
@@ -149,6 +149,10 @@ Preprocessing
back to the original space via an inverse transform. :issue:`9041` by
`Andreas Müller`_ and :user:`Guillaume Lemaitre <glemaitre>`.
+- Added :class:`MissingIndicator` which generates a binary indicator for
+ missing values. :issue:`8075` by :user:`Maniteja Nandana <maniteja123>` and
+ :user:`Guillaume Lemaitre <glemaitre>`.
+
- Added :class:`impute.ChainedImputer`, which is a strategy for imputing missing
values by modeling each feature with missing values as a function of
other features in a round-robin fashion. :issue:`8478` by
| [
{
"components": [
{
"doc": "Binary indicators for missing values.\n\nParameters\n----------\nmissing_values : number, string, np.nan (default) or None\n The placeholder for the missing values. All occurrences of\n `missing_values` will be imputed.\n\nfeatures : str, optional\n Whether the... | [
"sklearn/tests/test_impute.py::test_imputation_shape",
"sklearn/tests/test_impute.py::test_imputation_error_invalid_strategy[const]",
"sklearn/tests/test_impute.py::test_imputation_error_invalid_strategy[101]",
"sklearn/tests/test_impute.py::test_imputation_error_invalid_strategy[None]",
"sklearn/tests/test... | [] | This is a feature request which requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
[MRG+1] MissingIndicator transformer
MissingIndicator transformer for the missing values indicator mask.
see #6556
#### What does this implement/fix? Explain your changes.
The current implementation returns an indicator mask for the missing values.
#### Any other comments?
It is a very initial attempt and currently no tests are present. Please do have a look and give suggestions on the design. Thanks !
- [X] Implementation
- [x] Documentation
- [x] Tests
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sklearn/impute.py]
(definition of MissingIndicator:)
class MissingIndicator(BaseEstimator, TransformerMixin):
"""Binary indicators for missing values.
Parameters
----------
missing_values : number, string, np.nan (default) or None
The placeholder for the missing values. All occurrences of
`missing_values` will be imputed.
features : str, optional
Whether the imputer mask should represent all or a subset of
features.
- If "missing-only" (default), the imputer mask will only represent
features containing missing values during fit time.
- If "all", the imputer mask will represent all features.
sparse : boolean or "auto", optional
Whether the imputer mask format should be sparse or dense.
- If "auto" (default), the imputer mask will be of same type as
input.
- If True, the imputer mask will be a sparse matrix.
- If False, the imputer mask will be a numpy array.
error_on_new : boolean, optional
If True (default), transform will raise an error when there are
features with missing values in transform that have no missing values
in fit. This is applicable only when ``features="missing-only"``.
Attributes
----------
features_ : ndarray, shape (n_missing_features,) or (n_features,)
The features indices which will be returned when calling ``transform``.
They are computed during ``fit``. For ``features='all'``, it is
equal to ``range(n_features)``.
Examples
--------
>>> import numpy as np
>>> from sklearn.impute import MissingIndicator
>>> X1 = np.array([[np.nan, 1, 3],
... [4, 0, np.nan],
... [8, 1, 0]])
>>> X2 = np.array([[5, 1, np.nan],
... [np.nan, 2, 3],
... [2, 4, 0]])
>>> indicator = MissingIndicator()
>>> indicator.fit(X1)
MissingIndicator(error_on_new=True, features='missing-only',
missing_values=nan, sparse='auto')
>>> X2_tr = indicator.transform(X2)
>>> X2_tr
array([[False, True],
[ True, False],
[False, False]])"""
(definition of MissingIndicator.__init__:)
def __init__(self, missing_values=np.nan, features="missing-only", sparse="auto", error_on_new=True):
(definition of MissingIndicator._get_missing_features_info:)
def _get_missing_features_info(self, X):
"""Compute the imputer mask and the indices of the features
containing missing values.
Parameters
----------
X : {ndarray or sparse matrix}, shape (n_samples, n_features)
The input data with missing values. Note that ``X`` has been
checked in ``fit`` and ``transform`` before to call this function.
Returns
-------
imputer_mask : {ndarray or sparse matrix}, shape (n_samples, n_features) or (n_samples, n_features_with_missing)
The imputer mask of the original data.
features_with_missing : ndarray, shape (n_features_with_missing)
The features containing missing values."""
(definition of MissingIndicator.fit:)
def fit(self, X, y=None):
"""Fit the transformer on X.
Parameters
----------
X : {array-like, sparse matrix}, shape (n_samples, n_features)
Input data, where ``n_samples`` is the number of samples and
``n_features`` is the number of features.
Returns
-------
self : object
Returns self."""
(definition of MissingIndicator.transform:)
def transform(self, X):
"""Generate missing values indicator for X.
Parameters
----------
X : {array-like, sparse matrix}, shape (n_samples, n_features)
The input data to complete.
Returns
-------
Xt : {ndarray or sparse matrix}, shape (n_samples, n_features)
The missing indicator for input data. The data type of ``Xt``
will be boolean."""
(definition of MissingIndicator.fit_transform:)
def fit_transform(self, X, y=None):
"""Generate missing values indicator for X.
Parameters
----------
X : {array-like, sparse matrix}, shape (n_samples, n_features)
The input data to complete.
Returns
-------
Xt : {ndarray or sparse matrix}, shape (n_samples, n_features)
The missing indicator for input data. The data type of ``Xt``
will be boolean."""
[end of new definitions in sklearn/impute.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 51407623e4f491f00e3b465626dd5c4b55860bd0 | |
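The mask-building behaviour described by the `MissingIndicator` definitions above (a boolean matrix, optionally restricted to the columns that contained missing values at fit time) can be sketched in plain Python. The helper name below is hypothetical and not part of scikit-learn, and the simple equality test stands in for the NaN-aware check the real transformer performs:

```python
def missing_indicator_mask(rows, missing_values, features="missing-only"):
    # Boolean mask of missing entries, one flag per cell.
    mask = [[value == missing_values for value in row] for row in rows]
    n_features = len(rows[0])
    if features == "missing-only":
        # Keep only the columns that contained at least one missing value.
        kept = [j for j in range(n_features) if any(row[j] for row in mask)]
    else:  # features == "all"
        kept = list(range(n_features))
    return [[row[j] for j in kept] for row in mask], kept
```

With the `-1` placeholder example from the impute.rst documentation above, this keeps columns 0, 1 and 3 under `"missing-only"` and all four columns under `"all"`.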
conan-io__conan-765 | 765 | conan-io/conan | null | 366eb7ac13c9e5cc09b7fbad7f6704bfb99ee7c1 | 2016-12-14T15:46:10Z | diff --git a/conans/client/generators/__init__.py b/conans/client/generators/__init__.py
index 28a04b588bf..f808f5553eb 100644
--- a/conans/client/generators/__init__.py
+++ b/conans/client/generators/__init__.py
@@ -6,6 +6,7 @@
from .cmake import CMakeGenerator
from .qmake import QmakeGenerator
from .qbs import QbsGenerator
+from .scons import SConsGenerator
from .visualstudio import VisualStudioGenerator
from .xcode import XCodeGenerator
from .ycm import YouCompleteMeGenerator
@@ -22,6 +23,7 @@ def _save_generator(name, klass):
_save_generator("cmake", CMakeGenerator)
_save_generator("qmake", QmakeGenerator)
_save_generator("qbs", QbsGenerator)
+_save_generator("scons", SConsGenerator)
_save_generator("visual_studio", VisualStudioGenerator)
_save_generator("xcode", XCodeGenerator)
_save_generator("ycm", YouCompleteMeGenerator)
diff --git a/conans/client/generators/scons.py b/conans/client/generators/scons.py
new file mode 100644
index 00000000000..232eb8ac2ef
--- /dev/null
+++ b/conans/client/generators/scons.py
@@ -0,0 +1,37 @@
+from conans.model import Generator
+
+class SConsGenerator(Generator):
+ @property
+ def filename(self):
+ return "SConscript_conan"
+
+ @property
+ def content(self):
+ template = (' "{dep}" : {{\n'
+ ' "CPPPATH" : {info.include_paths},\n'
+ ' "LIBPATH" : {info.lib_paths},\n'
+ ' "BINPATH" : {info.bin_paths},\n'
+ ' "LIBS" : {info.libs},\n'
+ ' "CPPDEFINES" : {info.defines},\n'
+ ' "CPPFLAGS" : {info.cppflags},\n'
+ ' "CCFLAGS" : {info.cflags},\n'
+ ' "SHLINKFLAGS" : {info.sharedlinkflags},\n'
+ ' "LINKFLAGS" : {info.exelinkflags},\n'
+ ' }},\n')
+
+ sections = []
+ sections.append("conan = {\n")
+
+ all_flags = template.format(dep="conan", info=self.deps_build_info)
+ sections.append(all_flags)
+
+ for dep_name, info in self.deps_build_info.dependencies:
+ dep_name = dep_name.replace("-", "_")
+ dep_flags = template.format(dep=dep_name, info=info)
+ sections.append(dep_flags)
+
+ sections.append("}\n")
+
+ sections.append("Return('conan')\n")
+
+ return "\n".join(sections)
| diff --git a/conans/test/generators/scons_test.py b/conans/test/generators/scons_test.py
new file mode 100644
index 00000000000..fd2e577439a
--- /dev/null
+++ b/conans/test/generators/scons_test.py
@@ -0,0 +1,27 @@
+import re
+import unittest
+from conans.model.settings import Settings
+from conans.model.conan_file import ConanFile
+from conans.client.generators.scons import SConsGenerator
+from conans.model.build_info import DepsCppInfo
+from conans.model.ref import ConanFileReference
+
+
+class SConsGeneratorTest(unittest.TestCase):
+ def variables_setup_test(self):
+ conanfile = ConanFile(None, None, Settings({}), None)
+ ref = ConanFileReference.loads("MyPkg/0.1@lasote/stables")
+ cpp_info = DepsCppInfo()
+ cpp_info.defines = ["MYDEFINE1"]
+ conanfile.deps_cpp_info.update(cpp_info, ref)
+ ref = ConanFileReference.loads("MyPkg2/0.1@lasote/stables")
+ cpp_info = DepsCppInfo()
+ cpp_info.defines = ["MYDEFINE2"]
+ conanfile.deps_cpp_info.update(cpp_info, ref)
+ generator = SConsGenerator(conanfile)
+ content = generator.content
+ scons_lines = content.splitlines()
+ self.assertIn(" \"CPPDEFINES\" : [\'MYDEFINE2\', \'MYDEFINE1\'],", scons_lines)
+ self.assertIn(" \"CPPDEFINES\" : [\'MYDEFINE1\'],", scons_lines)
+ self.assertIn(" \"CPPDEFINES\" : [\'MYDEFINE2\'],", scons_lines)
+
diff --git a/conans/test/generators_test.py b/conans/test/generators_test.py
index 1e02ea5270e..e22f10d2894 100644
--- a/conans/test/generators_test.py
+++ b/conans/test/generators_test.py
@@ -12,6 +12,7 @@ def test_base(self):
gcc
qbs
qmake
+scons
txt
visual_studio
xcode
@@ -23,6 +24,6 @@ def test_base(self):
client.run("install --build")
self.assertEqual(sorted(['conanfile.txt', 'conaninfo.txt', 'conanbuildinfo.cmake',
'conanbuildinfo.gcc', 'conanbuildinfo.qbs', 'conanbuildinfo.pri',
- 'conanbuildinfo.txt', 'conanbuildinfo.props',
+ 'SConscript_conan', 'conanbuildinfo.txt', 'conanbuildinfo.props',
'conanbuildinfo.xcconfig', '.ycm_extra_conf.py']),
sorted(os.listdir(client.current_folder)))
| [
{
"components": [
{
"doc": "",
"lines": [
3,
37
],
"name": "SConsGenerator",
"signature": "class SConsGenerator(Generator): @property",
"type": "class"
},
{
"doc": "",
"lines": [
5,
6
... | [
"conans/test/generators_test.py::GeneratorsTest::test_base"
] | [] | This is a feature request which requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
Add SCons generator
Suggested usage in SConstruct file:
conan = SConscript('SConscript-conan')
completeEnv = Environment()
completeEnv.MergeFlags(conan['conan'])
completeEnv.Program("foo", "foo.c")
partialEnv = Environment()
partialEnv.MergeFlags(conan['zlib'])
partialEnv.MergeFlags(conan['poco'])
partialEnv.Program("bar", "bar.c")
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in conans/client/generators/scons.py]
(definition of SConsGenerator:)
class SConsGenerator(Generator): @property
(definition of SConsGenerator.filename:)
def filename(self):
(definition of SConsGenerator.content:)
def content(self):
[end of new definitions in conans/client/generators/scons.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 4a5b19a75db9225316c8cb022a2dfb9705a2af34 | ||
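The string-template approach used by `SConsGenerator.content` in the patch above can be exercised in isolation. The sketch below is illustrative only (the helper name and the reduced set of keys are not the full generator); it renders one dependency section the same way the patch does, including the dash-to-underscore normalization of dependency names:

```python
def render_dep_section(dep_name, cpp_info):
    # cpp_info is a plain dict standing in for conan's per-dependency info object.
    template = ('    "{dep}" : {{\n'
                '        "CPPPATH"    : {include_paths},\n'
                '        "LIBS"       : {libs},\n'
                '        "CPPDEFINES" : {defines},\n'
                '    }},\n')
    # SCons dict keys are looked up by name in the SConstruct, so '-' becomes '_'.
    return template.format(dep=dep_name.replace("-", "_"), **cpp_info)
```

A SConstruct can then index the returned dictionary per dependency, as the feature request sketches with `conan['zlib']` and `conan['poco']`.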
scrapy__scrapy-2306 | 2,306 | scrapy/scrapy | null | 7b49b9c0f53396ac89cbd74930bc4c6e41d41901 | 2016-10-05T15:26:13Z | diff --git a/docs/topics/request-response.rst b/docs/topics/request-response.rst
index 3d110b02d76..67f8ec28599 100644
--- a/docs/topics/request-response.rst
+++ b/docs/topics/request-response.rst
@@ -307,6 +307,7 @@ Those are:
* :reqmeta:`proxy`
* ``ftp_user`` (See :setting:`FTP_USER` for more info)
* ``ftp_password`` (See :setting:`FTP_PASSWORD` for more info)
+* :reqmeta:`referrer_policy`
.. reqmeta:: bindaddress
diff --git a/docs/topics/spider-middleware.rst b/docs/topics/spider-middleware.rst
index 8360827e8d7..9a0ccd0c172 100644
--- a/docs/topics/spider-middleware.rst
+++ b/docs/topics/spider-middleware.rst
@@ -95,7 +95,7 @@ following methods:
it has processed the response.
:meth:`process_spider_output` must return an iterable of
- :class:`~scrapy.http.Request`, dict or :class:`~scrapy.item.Item`
+ :class:`~scrapy.http.Request`, dict or :class:`~scrapy.item.Item`
objects.
:param response: the response which generated this output from the
@@ -328,6 +328,90 @@ Default: ``True``
Whether to enable referer middleware.
+.. setting:: REFERRER_POLICY
+
+REFERRER_POLICY
+^^^^^^^^^^^^^^^
+
+.. versionadded:: 1.4
+
+Default: ``'scrapy.spidermiddlewares.referer.DefaultReferrerPolicy'``
+
+.. reqmeta:: referrer_policy
+
+`Referrer Policy`_ to apply when populating Request "Referer" header.
+
+.. note::
+ You can also set the Referrer Policy per request,
+ using the special ``"referrer_policy"`` :ref:`Request.meta <topics-request-meta>` key,
+ with the same acceptable values as for the ``REFERRER_POLICY`` setting.
+
+Acceptable values for REFERRER_POLICY
+*************************************
+
+- either a path to a ``scrapy.spidermiddlewares.referer.ReferrerPolicy``
+ subclass — a custom policy or one of the built-in ones (see classes below),
+- or one of the standard W3C-defined string values,
+- or the special ``"scrapy-default"``.
+
+======================================= ========================================================================
+String value Class name (as a string)
+======================================= ========================================================================
+``"scrapy-default"`` (default) :class:`scrapy.spidermiddlewares.referer.DefaultReferrerPolicy`
+`"no-referrer"`_ :class:`scrapy.spidermiddlewares.referer.NoReferrerPolicy`
+`"no-referrer-when-downgrade"`_ :class:`scrapy.spidermiddlewares.referer.NoReferrerWhenDowngradePolicy`
+`"same-origin"`_ :class:`scrapy.spidermiddlewares.referer.SameOriginPolicy`
+`"origin"`_ :class:`scrapy.spidermiddlewares.referer.OriginPolicy`
+`"strict-origin"`_ :class:`scrapy.spidermiddlewares.referer.StrictOriginPolicy`
+`"origin-when-cross-origin"`_ :class:`scrapy.spidermiddlewares.referer.OriginWhenCrossOriginPolicy`
+`"strict-origin-when-cross-origin"`_ :class:`scrapy.spidermiddlewares.referer.StrictOriginWhenCrossOriginPolicy`
+`"unsafe-url"`_ :class:`scrapy.spidermiddlewares.referer.UnsafeUrlPolicy`
+======================================= ========================================================================
+
+.. autoclass:: DefaultReferrerPolicy
+.. warning::
+ Scrapy's default referrer policy — just like `"no-referrer-when-downgrade"`_,
+ the W3C-recommended value for browsers — will send a non-empty
+ "Referer" header from any ``http(s)://`` to any ``https://`` URL,
+ even if the domain is different.
+
+ `"same-origin"`_ may be a better choice if you want to remove referrer
+ information for cross-domain requests.
+
+.. autoclass:: NoReferrerPolicy
+
+.. autoclass:: NoReferrerWhenDowngradePolicy
+.. note::
+ "no-referrer-when-downgrade" policy is the W3C-recommended default,
+ and is used by major web browsers.
+
+ However, it is NOT Scrapy's default referrer policy (see :class:`DefaultReferrerPolicy`).
+
+.. autoclass:: SameOriginPolicy
+
+.. autoclass:: OriginPolicy
+
+.. autoclass:: StrictOriginPolicy
+
+.. autoclass:: OriginWhenCrossOriginPolicy
+
+.. autoclass:: StrictOriginWhenCrossOriginPolicy
+
+.. autoclass:: UnsafeUrlPolicy
+.. warning::
+ "unsafe-url" policy is NOT recommended.
+
+.. _Referrer Policy: https://www.w3.org/TR/referrer-policy
+.. _"no-referrer": https://www.w3.org/TR/referrer-policy/#referrer-policy-no-referrer
+.. _"no-referrer-when-downgrade": https://www.w3.org/TR/referrer-policy/#referrer-policy-no-referrer-when-downgrade
+.. _"same-origin": https://www.w3.org/TR/referrer-policy/#referrer-policy-same-origin
+.. _"origin": https://www.w3.org/TR/referrer-policy/#referrer-policy-origin
+.. _"strict-origin": https://www.w3.org/TR/referrer-policy/#referrer-policy-strict-origin
+.. _"origin-when-cross-origin": https://www.w3.org/TR/referrer-policy/#referrer-policy-origin-when-cross-origin
+.. _"strict-origin-when-cross-origin": https://www.w3.org/TR/referrer-policy/#referrer-policy-strict-origin-when-cross-origin
+.. _"unsafe-url": https://www.w3.org/TR/referrer-policy/#referrer-policy-unsafe-url
+
+
UrlLengthMiddleware
-------------------
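As a quick illustration of the two selection mechanisms documented above (project-wide setting versus per-request `meta` key), a project might combine them as follows. This is a configuration fragment under stated assumptions, not part of the patch; the per-request form is shown as a comment because it belongs inside a spider callback:

```python
# settings.py -- project-wide default: a W3C policy name or a dotted class path
REFERRER_POLICY = 'same-origin'

# Per-request override (inside a spider callback):
#   yield scrapy.Request(url, meta={'referrer_policy': 'no-referrer'})
```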
diff --git a/scrapy/settings/default_settings.py b/scrapy/settings/default_settings.py
index a5931a3d5d6..35d9844a757 100644
--- a/scrapy/settings/default_settings.py
+++ b/scrapy/settings/default_settings.py
@@ -234,6 +234,7 @@
REDIRECT_PRIORITY_ADJUST = +2
REFERER_ENABLED = True
+REFERRER_POLICY = 'scrapy.spidermiddlewares.referer.DefaultReferrerPolicy'
RETRY_ENABLED = True
RETRY_TIMES = 2 # initial response + 2 retries = 3 requests
diff --git a/scrapy/spidermiddlewares/referer.py b/scrapy/spidermiddlewares/referer.py
index 6a8c4654388..b444e34bb68 100644
--- a/scrapy/spidermiddlewares/referer.py
+++ b/scrapy/spidermiddlewares/referer.py
@@ -2,22 +2,355 @@
RefererMiddleware: populates Request referer field, based on the Response which
originated it.
"""
+from six.moves.urllib.parse import urlparse
+import warnings
-from scrapy.http import Request
+from w3lib.url import safe_url_string
+
+from scrapy.http import Request, Response
from scrapy.exceptions import NotConfigured
+from scrapy import signals
+from scrapy.utils.python import to_native_str
+from scrapy.utils.httpobj import urlparse_cached
+from scrapy.utils.misc import load_object
+from scrapy.utils.url import strip_url
+
+
+LOCAL_SCHEMES = ('about', 'blob', 'data', 'filesystem',)
+
+POLICY_NO_REFERRER = "no-referrer"
+POLICY_NO_REFERRER_WHEN_DOWNGRADE = "no-referrer-when-downgrade"
+POLICY_SAME_ORIGIN = "same-origin"
+POLICY_ORIGIN = "origin"
+POLICY_STRICT_ORIGIN = "strict-origin"
+POLICY_ORIGIN_WHEN_CROSS_ORIGIN = "origin-when-cross-origin"
+POLICY_STRICT_ORIGIN_WHEN_CROSS_ORIGIN = "strict-origin-when-cross-origin"
+POLICY_UNSAFE_URL = "unsafe-url"
+POLICY_SCRAPY_DEFAULT = "scrapy-default"
+
+
+class ReferrerPolicy(object):
+
+ NOREFERRER_SCHEMES = LOCAL_SCHEMES
+
+ def referrer(self, response_url, request_url):
+ raise NotImplementedError()
+
+ def stripped_referrer(self, url):
+ if urlparse(url).scheme not in self.NOREFERRER_SCHEMES:
+ return self.strip_url(url)
+
+ def origin_referrer(self, url):
+ if urlparse(url).scheme not in self.NOREFERRER_SCHEMES:
+ return self.origin(url)
+
+ def strip_url(self, url, origin_only=False):
+ """
+ https://www.w3.org/TR/referrer-policy/#strip-url
+
+ If url is null, return no referrer.
+ If url's scheme is a local scheme, then return no referrer.
+ Set url's username to the empty string.
+ Set url's password to null.
+ Set url's fragment to null.
+ If the origin-only flag is true, then:
+ Set url's path to null.
+ Set url's query to null.
+ Return url.
+ """
+ if not url:
+ return None
+ return strip_url(url,
+ strip_credentials=True,
+ strip_fragment=True,
+ strip_default_port=True,
+ origin_only=origin_only)
+
+ def origin(self, url):
+ """Return serialized origin (scheme, host, path) for a request or response URL."""
+ return self.strip_url(url, origin_only=True)
+
+ def potentially_trustworthy(self, url):
+ # Note: this does not follow https://w3c.github.io/webappsec-secure-contexts/#is-url-trustworthy
+ parsed_url = urlparse(url)
+ if parsed_url.scheme in ('data',):
+ return False
+ return self.tls_protected(url)
+
+ def tls_protected(self, url):
+ return urlparse(url).scheme in ('https', 'ftps')
+
+
+class NoReferrerPolicy(ReferrerPolicy):
+ """
+ https://www.w3.org/TR/referrer-policy/#referrer-policy-no-referrer
+
+ The simplest policy is "no-referrer", which specifies that no referrer information
+ is to be sent along with requests made from a particular request client to any origin.
+ The header will be omitted entirely.
+ """
+ name = POLICY_NO_REFERRER
+
+ def referrer(self, response_url, request_url):
+ return None
+
+
+class NoReferrerWhenDowngradePolicy(ReferrerPolicy):
+ """
+ https://www.w3.org/TR/referrer-policy/#referrer-policy-no-referrer-when-downgrade
+
+ The "no-referrer-when-downgrade" policy sends a full URL along with requests
+ from a TLS-protected environment settings object to a potentially trustworthy URL,
+ and requests from clients which are not TLS-protected to any origin.
+
+ Requests from TLS-protected clients to non-potentially trustworthy URLs,
+ on the other hand, will contain no referrer information.
+ A Referer HTTP header will not be sent.
+
+ This is a user agent's default behavior, if no policy is otherwise specified.
+ """
+ name = POLICY_NO_REFERRER_WHEN_DOWNGRADE
+
+ def referrer(self, response_url, request_url):
+ if not self.tls_protected(response_url) or self.tls_protected(request_url):
+ return self.stripped_referrer(response_url)
+
+
+class SameOriginPolicy(ReferrerPolicy):
+ """
+ https://www.w3.org/TR/referrer-policy/#referrer-policy-same-origin
+
+ The "same-origin" policy specifies that a full URL, stripped for use as a referrer,
+ is sent as referrer information when making same-origin requests from a particular request client.
+
+ Cross-origin requests, on the other hand, will contain no referrer information.
+ A Referer HTTP header will not be sent.
+ """
+ name = POLICY_SAME_ORIGIN
+
+ def referrer(self, response_url, request_url):
+ if self.origin(response_url) == self.origin(request_url):
+ return self.stripped_referrer(response_url)
+
+
+class OriginPolicy(ReferrerPolicy):
+ """
+ https://www.w3.org/TR/referrer-policy/#referrer-policy-origin
+
+ The "origin" policy specifies that only the ASCII serialization
+ of the origin of the request client is sent as referrer information
+ when making both same-origin requests and cross-origin requests
+ from a particular request client.
+ """
+ name = POLICY_ORIGIN
+
+ def referrer(self, response_url, request_url):
+ return self.origin_referrer(response_url)
+
+
+class StrictOriginPolicy(ReferrerPolicy):
+ """
+ https://www.w3.org/TR/referrer-policy/#referrer-policy-strict-origin
+
+ The "strict-origin" policy sends the ASCII serialization
+ of the origin of the request client when making requests:
+ - from a TLS-protected environment settings object to a potentially trustworthy URL, and
+ - from non-TLS-protected environment settings objects to any origin.
+
+ Requests from TLS-protected request clients to non- potentially trustworthy URLs,
+ on the other hand, will contain no referrer information.
+ A Referer HTTP header will not be sent.
+ """
+ name = POLICY_STRICT_ORIGIN
+
+ def referrer(self, response_url, request_url):
+ if ((self.tls_protected(response_url) and
+ self.potentially_trustworthy(request_url))
+ or not self.tls_protected(response_url)):
+ return self.origin_referrer(response_url)
+
+
+class OriginWhenCrossOriginPolicy(ReferrerPolicy):
+ """
+ https://www.w3.org/TR/referrer-policy/#referrer-policy-origin-when-cross-origin
+
+ The "origin-when-cross-origin" policy specifies that a full URL,
+ stripped for use as a referrer, is sent as referrer information
+ when making same-origin requests from a particular request client,
+ and only the ASCII serialization of the origin of the request client
+ is sent as referrer information when making cross-origin requests
+ from a particular request client.
+ """
+ name = POLICY_ORIGIN_WHEN_CROSS_ORIGIN
+
+ def referrer(self, response_url, request_url):
+ origin = self.origin(response_url)
+ if origin == self.origin(request_url):
+ return self.stripped_referrer(response_url)
+ else:
+ return origin
+
+
+class StrictOriginWhenCrossOriginPolicy(ReferrerPolicy):
+ """
+ https://www.w3.org/TR/referrer-policy/#referrer-policy-strict-origin-when-cross-origin
+
+ The "strict-origin-when-cross-origin" policy specifies that a full URL,
+ stripped for use as a referrer, is sent as referrer information
+ when making same-origin requests from a particular request client,
+ and only the ASCII serialization of the origin of the request client
+ when making cross-origin requests:
+
+ - from a TLS-protected environment settings object to a potentially trustworthy URL, and
+ - from non-TLS-protected environment settings objects to any origin.
+
+ Requests from TLS-protected clients to non- potentially trustworthy URLs,
+ on the other hand, will contain no referrer information.
+ A Referer HTTP header will not be sent.
+ """
+ name = POLICY_STRICT_ORIGIN_WHEN_CROSS_ORIGIN
+
+ def referrer(self, response_url, request_url):
+ origin = self.origin(response_url)
+ if origin == self.origin(request_url):
+ return self.stripped_referrer(response_url)
+ elif ((self.tls_protected(response_url) and
+ self.potentially_trustworthy(request_url))
+ or not self.tls_protected(response_url)):
+ return self.origin_referrer(response_url)
+
+
+class UnsafeUrlPolicy(ReferrerPolicy):
+ """
+ https://www.w3.org/TR/referrer-policy/#referrer-policy-unsafe-url
+
+ The "unsafe-url" policy specifies that a full URL, stripped for use as a referrer,
+ is sent along with both cross-origin requests
+ and same-origin requests made from a particular request client.
+
+ Note: The policy's name doesn't lie; it is unsafe.
+ This policy will leak origins and paths from TLS-protected resources
+ to insecure origins.
+ Carefully consider the impact of setting such a policy for potentially sensitive documents.
+ """
+ name = POLICY_UNSAFE_URL
+
+ def referrer(self, response_url, request_url):
+ return self.stripped_referrer(response_url)
+
+
+class DefaultReferrerPolicy(NoReferrerWhenDowngradePolicy):
+ """
+ A variant of "no-referrer-when-downgrade",
+ with the addition that "Referer" is not sent if the parent request was
+ using ``file://`` or ``s3://`` scheme.
+ """
+ NOREFERRER_SCHEMES = LOCAL_SCHEMES + ('file', 's3')
+ name = POLICY_SCRAPY_DEFAULT
+
+
+_policy_classes = {p.name: p for p in (
+ NoReferrerPolicy,
+ NoReferrerWhenDowngradePolicy,
+ SameOriginPolicy,
+ OriginPolicy,
+ StrictOriginPolicy,
+ OriginWhenCrossOriginPolicy,
+ StrictOriginWhenCrossOriginPolicy,
+ UnsafeUrlPolicy,
+ DefaultReferrerPolicy,
+)}
+
+
+def _load_policy_class(policy, warning_only=False):
+ """
+ Expect a string for the path to the policy class,
+ otherwise try to interpret the string as a standard value
+ from https://www.w3.org/TR/referrer-policy/#referrer-policies
+ """
+ try:
+ return load_object(policy)
+ except ValueError:
+ try:
+ return _policy_classes[policy.lower()]
+ except KeyError:
+ msg = "Could not load referrer policy %r" % policy
+ if not warning_only:
+ raise RuntimeError(msg)
+ else:
+ warnings.warn(msg, RuntimeWarning)
+ return None
+
class RefererMiddleware(object):
+ def __init__(self, settings=None):
+ self.default_policy = DefaultReferrerPolicy
+ if settings is not None:
+ self.default_policy = _load_policy_class(
+ settings.get('REFERRER_POLICY'))
+
@classmethod
def from_crawler(cls, crawler):
if not crawler.settings.getbool('REFERER_ENABLED'):
raise NotConfigured
- return cls()
+ mw = cls(crawler.settings)
+
+ # Note: this hook is a bit of a hack to intercept redirections
+ crawler.signals.connect(mw.request_scheduled, signal=signals.request_scheduled)
+
+ return mw
+
+ def policy(self, resp_or_url, request):
+ """
+ Determine Referrer-Policy to use from a parent Response (or URL),
+ and a Request to be sent.
+
+ - if a valid policy is set in Request meta, it is used.
+ - if the policy is set in meta but is wrong (e.g. a typo error),
+ the policy from settings is used
+ - if the policy is not set in Request meta,
+ but there is a Referrer-policy header in the parent response,
+ it is used if valid
+ - otherwise, the policy from settings is used.
+ """
+ policy_name = request.meta.get('referrer_policy')
+ if policy_name is None:
+ if isinstance(resp_or_url, Response):
+                policy_header = resp_or_url.headers.get('Referrer-Policy')
+                if policy_header is not None:
+                    policy_name = to_native_str(policy_header.decode('latin1'))
+ if policy_name is None:
+ return self.default_policy()
+
+ cls = _load_policy_class(policy_name, warning_only=True)
+ return cls() if cls else self.default_policy()
def process_spider_output(self, response, result, spider):
def _set_referer(r):
if isinstance(r, Request):
- r.headers.setdefault('Referer', response.url)
+ referrer = self.policy(response, r).referrer(response.url, r.url)
+ if referrer is not None:
+ r.headers.setdefault('Referer', referrer)
return r
return (_set_referer(r) for r in result or ())
+ def request_scheduled(self, request, spider):
+ # check redirected request to patch "Referer" header if necessary
+ redirected_urls = request.meta.get('redirect_urls', [])
+ if redirected_urls:
+ request_referrer = request.headers.get('Referer')
+ # we don't patch the referrer value if there is none
+ if request_referrer is not None:
+ # the request's referrer header value acts as a surrogate
+ # for the parent response URL
+ #
+ # Note: if the 3xx response contained a Referrer-Policy header,
+ # the information is not available using this hook
+ parent_url = safe_url_string(request_referrer)
+ policy_referrer = self.policy(parent_url, request).referrer(
+ parent_url, request.url)
+ if policy_referrer != request_referrer:
+ if policy_referrer is None:
+ request.headers.pop('Referer')
+ else:
+ request.headers['Referer'] = policy_referrer
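As a standalone illustration of the downgrade rule that `NoReferrerWhenDowngradePolicy` (the basis of the default policy) applies, here is a simplified check using only `urllib.parse`. This is a sketch: the real policy additionally strips credentials, default ports and fragments, and refuses to leak `file://` or `s3://` origins:

```python
from urllib.parse import urlparse

# schemes considered "secure transport" for the downgrade check
SECURE_SCHEMES = ('https', 'ftps')

def no_referrer_when_downgrade(response_url, request_url):
    """Return the referrer to send, or None on a TLS -> non-TLS downgrade."""
    parent_secure = urlparse(response_url).scheme in SECURE_SCHEMES
    target_secure = urlparse(request_url).scheme in SECURE_SCHEMES
    if parent_secure and not target_secure:
        return None  # never leak a secure URL to an insecure origin
    return response_url
```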
diff --git a/scrapy/utils/url.py b/scrapy/utils/url.py
index dc1cce4acb9..8eed31060ac 100644
--- a/scrapy/utils/url.py
+++ b/scrapy/utils/url.py
@@ -7,7 +7,7 @@
"""
import posixpath
import re
-from six.moves.urllib.parse import (ParseResult, urldefrag, urlparse)
+from six.moves.urllib.parse import (ParseResult, urldefrag, urlparse, urlunparse)
# scrapy.utils.url was moved to w3lib.url and import * ensures this
# move doesn't break old code
@@ -103,3 +103,34 @@ def guess_scheme(url):
return any_to_uri(url)
else:
return add_http_if_no_scheme(url)
+
+
+def strip_url(url, strip_credentials=True, strip_default_port=True, origin_only=False, strip_fragment=True):
+ """Strip URL string from some of its components:
+
+ - `strip_credentials` removes "user:password@"
+ - `strip_default_port` removes ":80" (resp. ":443", ":21")
+ from http:// (resp. https://, ftp://) URLs
+    - `origin_only` replaces the path component with "/", also dropping
+      the query and fragment components; it also strips credentials
+ - `strip_fragment` drops any #fragment component
+ """
+
+ parsed_url = urlparse(url)
+ netloc = parsed_url.netloc
+ if (strip_credentials or origin_only) and (parsed_url.username or parsed_url.password):
+ netloc = netloc.split('@')[-1]
+ if strip_default_port and parsed_url.port:
+ if (parsed_url.scheme, parsed_url.port) in (('http', 80),
+ ('https', 443),
+ ('ftp', 21)):
+ netloc = netloc.replace(':{p.port}'.format(p=parsed_url), '')
+ return urlunparse((
+ parsed_url.scheme,
+ netloc,
+ '/' if origin_only else parsed_url.path,
+ '' if origin_only else parsed_url.params,
+ '' if origin_only else parsed_url.query,
+ '' if strip_fragment else parsed_url.fragment
+ ))
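The effect of the new `strip_url` helper can be seen with a condensed, self-contained copy of its logic (importing `urlparse`/`urlunparse` from `urllib.parse` instead of `six.moves` for the sake of the example):

```python
from urllib.parse import urlparse, urlunparse

def strip_url(url, strip_credentials=True, strip_default_port=True,
              origin_only=False, strip_fragment=True):
    # condensed copy of the helper added to scrapy/utils/url.py
    p = urlparse(url)
    netloc = p.netloc
    if (strip_credentials or origin_only) and (p.username or p.password):
        netloc = netloc.split('@')[-1]  # drop "user:password@"
    if strip_default_port and p.port and (p.scheme, p.port) in (
            ('http', 80), ('https', 443), ('ftp', 21)):
        netloc = netloc.replace(':%d' % p.port, '')
    return urlunparse((
        p.scheme,
        netloc,
        '/' if origin_only else p.path,
        '' if origin_only else p.params,
        '' if origin_only else p.query,
        '' if strip_fragment else p.fragment,
    ))
```

With `origin_only=True` the result is the origin with a trailing slash, which is exactly the shape the origin-based referrer policies send.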
| diff --git a/tests/test_spidermiddleware_referer.py b/tests/test_spidermiddleware_referer.py
index bd7673efb8b..b1c81587670 100644
--- a/tests/test_spidermiddleware_referer.py
+++ b/tests/test_spidermiddleware_referer.py
@@ -1,21 +1,867 @@
+from six.moves.urllib.parse import urlparse
from unittest import TestCase
+import warnings
+from scrapy.exceptions import NotConfigured
from scrapy.http import Response, Request
+from scrapy.settings import Settings
from scrapy.spiders import Spider
-from scrapy.spidermiddlewares.referer import RefererMiddleware
+from scrapy.downloadermiddlewares.redirect import RedirectMiddleware
+from scrapy.spidermiddlewares.referer import RefererMiddleware, \
+ POLICY_NO_REFERRER, POLICY_NO_REFERRER_WHEN_DOWNGRADE, \
+ POLICY_SAME_ORIGIN, POLICY_ORIGIN, POLICY_ORIGIN_WHEN_CROSS_ORIGIN, \
+ POLICY_SCRAPY_DEFAULT, POLICY_UNSAFE_URL, \
+ POLICY_STRICT_ORIGIN, POLICY_STRICT_ORIGIN_WHEN_CROSS_ORIGIN, \
+ DefaultReferrerPolicy, \
+ NoReferrerPolicy, NoReferrerWhenDowngradePolicy, \
+ OriginWhenCrossOriginPolicy, OriginPolicy, \
+ StrictOriginWhenCrossOriginPolicy, StrictOriginPolicy, \
+ SameOriginPolicy, UnsafeUrlPolicy, ReferrerPolicy
class TestRefererMiddleware(TestCase):
+ req_meta = {}
+ resp_headers = {}
+ settings = {}
+ scenarii = [
+ ('http://scrapytest.org', 'http://scrapytest.org/', b'http://scrapytest.org'),
+ ]
+
def setUp(self):
self.spider = Spider('foo')
- self.mw = RefererMiddleware()
+ settings = Settings(self.settings)
+ self.mw = RefererMiddleware(settings)
+
+ def get_request(self, target):
+ return Request(target, meta=self.req_meta)
+
+ def get_response(self, origin):
+ return Response(origin, headers=self.resp_headers)
+
+ def test(self):
+
+ for origin, target, referrer in self.scenarii:
+ response = self.get_response(origin)
+ request = self.get_request(target)
+ out = list(self.mw.process_spider_output(response, [request], self.spider))
+ self.assertEquals(out[0].headers.get('Referer'), referrer)
+
+
+class MixinDefault(object):
+ """
+ Based on https://www.w3.org/TR/referrer-policy/#referrer-policy-no-referrer-when-downgrade
+
+ with some additional filtering of s3://
+ """
+ scenarii = [
+ ('https://example.com/', 'https://scrapy.org/', b'https://example.com/'),
+ ('http://example.com/', 'http://scrapy.org/', b'http://example.com/'),
+ ('http://example.com/', 'https://scrapy.org/', b'http://example.com/'),
+ ('https://example.com/', 'http://scrapy.org/', None),
+
+ # no credentials leak
+ ('http://user:password@example.com/', 'https://scrapy.org/', b'http://example.com/'),
+
+ # no referrer leak for local schemes
+ ('file:///home/path/to/somefile.html', 'https://scrapy.org/', None),
+ ('file:///home/path/to/somefile.html', 'http://scrapy.org/', None),
+
+ # no referrer leak for s3 origins
+ ('s3://mybucket/path/to/data.csv', 'https://scrapy.org/', None),
+ ('s3://mybucket/path/to/data.csv', 'http://scrapy.org/', None),
+ ]
+
+
+class MixinNoReferrer(object):
+ scenarii = [
+ ('https://example.com/page.html', 'https://example.com/', None),
+ ('http://www.example.com/', 'https://scrapy.org/', None),
+ ('http://www.example.com/', 'http://scrapy.org/', None),
+ ('https://www.example.com/', 'http://scrapy.org/', None),
+ ('file:///home/path/to/somefile.html', 'http://scrapy.org/', None),
+ ]
+
+
+class MixinNoReferrerWhenDowngrade(object):
+ scenarii = [
+ # TLS to TLS: send non-empty referrer
+ ('https://example.com/page.html', 'https://not.example.com/', b'https://example.com/page.html'),
+ ('https://example.com/page.html', 'https://scrapy.org/', b'https://example.com/page.html'),
+ ('https://example.com:443/page.html', 'https://scrapy.org/', b'https://example.com/page.html'),
+ ('https://example.com:444/page.html', 'https://scrapy.org/', b'https://example.com:444/page.html'),
+ ('ftps://example.com/urls.zip', 'https://scrapy.org/', b'ftps://example.com/urls.zip'),
+
+ # TLS to non-TLS: do not send referrer
+ ('https://example.com/page.html', 'http://not.example.com/', None),
+ ('https://example.com/page.html', 'http://scrapy.org/', None),
+ ('ftps://example.com/urls.zip', 'http://scrapy.org/', None),
+
+ # non-TLS to TLS or non-TLS: send referrer
+ ('http://example.com/page.html', 'https://not.example.com/', b'http://example.com/page.html'),
+ ('http://example.com/page.html', 'https://scrapy.org/', b'http://example.com/page.html'),
+ ('http://example.com:8080/page.html', 'https://scrapy.org/', b'http://example.com:8080/page.html'),
+ ('http://example.com:80/page.html', 'http://not.example.com/', b'http://example.com/page.html'),
+ ('http://example.com/page.html', 'http://scrapy.org/', b'http://example.com/page.html'),
+ ('http://example.com:443/page.html', 'http://scrapy.org/', b'http://example.com:443/page.html'),
+ ('ftp://example.com/urls.zip', 'http://scrapy.org/', b'ftp://example.com/urls.zip'),
+ ('ftp://example.com/urls.zip', 'https://scrapy.org/', b'ftp://example.com/urls.zip'),
+
+ # test for user/password stripping
+ ('http://user:password@example.com/page.html', 'https://not.example.com/', b'http://example.com/page.html'),
+ ]
+
+
+class MixinSameOrigin(object):
+ scenarii = [
+ # Same origin (protocol, host, port): send referrer
+ ('https://example.com/page.html', 'https://example.com/not-page.html', b'https://example.com/page.html'),
+ ('http://example.com/page.html', 'http://example.com/not-page.html', b'http://example.com/page.html'),
+ ('https://example.com:443/page.html', 'https://example.com/not-page.html', b'https://example.com/page.html'),
+ ('http://example.com:80/page.html', 'http://example.com/not-page.html', b'http://example.com/page.html'),
+ ('http://example.com/page.html', 'http://example.com:80/not-page.html', b'http://example.com/page.html'),
+ ('http://example.com:8888/page.html', 'http://example.com:8888/not-page.html', b'http://example.com:8888/page.html'),
+
+ # Different host: do NOT send referrer
+ ('https://example.com/page.html', 'https://not.example.com/otherpage.html', None),
+ ('http://example.com/page.html', 'http://not.example.com/otherpage.html', None),
+ ('http://example.com/page.html', 'http://www.example.com/otherpage.html', None),
+
+ # Different port: do NOT send referrer
+ ('https://example.com:444/page.html', 'https://example.com/not-page.html', None),
+ ('http://example.com:81/page.html', 'http://example.com/not-page.html', None),
+ ('http://example.com/page.html', 'http://example.com:81/not-page.html', None),
+
+        # Different protocols: do NOT send referrer
+ ('https://example.com/page.html', 'http://example.com/not-page.html', None),
+ ('https://example.com/page.html', 'http://not.example.com/', None),
+ ('ftps://example.com/urls.zip', 'https://example.com/not-page.html', None),
+ ('ftp://example.com/urls.zip', 'http://example.com/not-page.html', None),
+ ('ftps://example.com/urls.zip', 'https://example.com/not-page.html', None),
+
+ # test for user/password stripping
+ ('https://user:password@example.com/page.html', 'https://example.com/not-page.html', b'https://example.com/page.html'),
+ ('https://user:password@example.com/page.html', 'http://example.com/not-page.html', None),
+ ]
+
+
+class MixinOrigin(object):
+ scenarii = [
+ # TLS or non-TLS to TLS or non-TLS: referrer origin is sent (yes, even for downgrades)
+ ('https://example.com/page.html', 'https://example.com/not-page.html', b'https://example.com/'),
+ ('https://example.com/page.html', 'https://scrapy.org', b'https://example.com/'),
+ ('https://example.com/page.html', 'http://scrapy.org', b'https://example.com/'),
+ ('http://example.com/page.html', 'http://scrapy.org', b'http://example.com/'),
+
+ # test for user/password stripping
+ ('https://user:password@example.com/page.html', 'http://scrapy.org', b'https://example.com/'),
+ ]
+
+
+class MixinStrictOrigin(object):
+ scenarii = [
+ # TLS or non-TLS to TLS or non-TLS: referrer origin is sent but not for downgrades
+ ('https://example.com/page.html', 'https://example.com/not-page.html', b'https://example.com/'),
+ ('https://example.com/page.html', 'https://scrapy.org', b'https://example.com/'),
+ ('http://example.com/page.html', 'http://scrapy.org', b'http://example.com/'),
+
+ # downgrade: send nothing
+ ('https://example.com/page.html', 'http://scrapy.org', None),
+
+ # upgrade: send origin
+ ('http://example.com/page.html', 'https://scrapy.org', b'http://example.com/'),
+
+ # test for user/password stripping
+ ('https://user:password@example.com/page.html', 'https://scrapy.org', b'https://example.com/'),
+ ('https://user:password@example.com/page.html', 'http://scrapy.org', None),
+ ]
+
+
+class MixinOriginWhenCrossOrigin(object):
+ scenarii = [
+ # Same origin (protocol, host, port): send referrer
+ ('https://example.com/page.html', 'https://example.com/not-page.html', b'https://example.com/page.html'),
+ ('http://example.com/page.html', 'http://example.com/not-page.html', b'http://example.com/page.html'),
+ ('https://example.com:443/page.html', 'https://example.com/not-page.html', b'https://example.com/page.html'),
+ ('http://example.com:80/page.html', 'http://example.com/not-page.html', b'http://example.com/page.html'),
+ ('http://example.com/page.html', 'http://example.com:80/not-page.html', b'http://example.com/page.html'),
+ ('http://example.com:8888/page.html', 'http://example.com:8888/not-page.html', b'http://example.com:8888/page.html'),
+
+ # Different host: send origin as referrer
+ ('https://example2.com/page.html', 'https://scrapy.org/otherpage.html', b'https://example2.com/'),
+ ('https://example2.com/page.html', 'https://not.example2.com/otherpage.html', b'https://example2.com/'),
+ ('http://example2.com/page.html', 'http://not.example2.com/otherpage.html', b'http://example2.com/'),
+ # exact match required
+ ('http://example2.com/page.html', 'http://www.example2.com/otherpage.html', b'http://example2.com/'),
+
+ # Different port: send origin as referrer
+ ('https://example3.com:444/page.html', 'https://example3.com/not-page.html', b'https://example3.com:444/'),
+ ('http://example3.com:81/page.html', 'http://example3.com/not-page.html', b'http://example3.com:81/'),
+
+ # Different protocols: send origin as referrer
+ ('https://example4.com/page.html', 'http://example4.com/not-page.html', b'https://example4.com/'),
+ ('https://example4.com/page.html', 'http://not.example4.com/', b'https://example4.com/'),
+ ('ftps://example4.com/urls.zip', 'https://example4.com/not-page.html', b'ftps://example4.com/'),
+ ('ftp://example4.com/urls.zip', 'http://example4.com/not-page.html', b'ftp://example4.com/'),
+ ('ftps://example4.com/urls.zip', 'https://example4.com/not-page.html', b'ftps://example4.com/'),
+
+ # test for user/password stripping
+ ('https://user:password@example5.com/page.html', 'https://example5.com/not-page.html', b'https://example5.com/page.html'),
+ # TLS to non-TLS downgrade: send origin
+ ('https://user:password@example5.com/page.html', 'http://example5.com/not-page.html', b'https://example5.com/'),
+ ]
+
+
+class MixinStrictOriginWhenCrossOrigin(object):
+ scenarii = [
+ # Same origin (protocol, host, port): send referrer
+ ('https://example.com/page.html', 'https://example.com/not-page.html', b'https://example.com/page.html'),
+ ('http://example.com/page.html', 'http://example.com/not-page.html', b'http://example.com/page.html'),
+ ('https://example.com:443/page.html', 'https://example.com/not-page.html', b'https://example.com/page.html'),
+ ('http://example.com:80/page.html', 'http://example.com/not-page.html', b'http://example.com/page.html'),
+ ('http://example.com/page.html', 'http://example.com:80/not-page.html', b'http://example.com/page.html'),
+ ('http://example.com:8888/page.html', 'http://example.com:8888/not-page.html', b'http://example.com:8888/page.html'),
+
+ # Different host: send origin as referrer
+ ('https://example2.com/page.html', 'https://scrapy.org/otherpage.html', b'https://example2.com/'),
+ ('https://example2.com/page.html', 'https://not.example2.com/otherpage.html', b'https://example2.com/'),
+ ('http://example2.com/page.html', 'http://not.example2.com/otherpage.html', b'http://example2.com/'),
+ # exact match required
+ ('http://example2.com/page.html', 'http://www.example2.com/otherpage.html', b'http://example2.com/'),
+
+ # Different port: send origin as referrer
+ ('https://example3.com:444/page.html', 'https://example3.com/not-page.html', b'https://example3.com:444/'),
+ ('http://example3.com:81/page.html', 'http://example3.com/not-page.html', b'http://example3.com:81/'),
+
+ # downgrade
+ ('https://example4.com/page.html', 'http://example4.com/not-page.html', None),
+ ('https://example4.com/page.html', 'http://not.example4.com/', None),
+
+ # non-TLS to non-TLS
+ ('ftp://example4.com/urls.zip', 'http://example4.com/not-page.html', b'ftp://example4.com/'),
+
+ # upgrade
+ ('http://example4.com/page.html', 'https://example4.com/not-page.html', b'http://example4.com/'),
+ ('http://example4.com/page.html', 'https://not.example4.com/', b'http://example4.com/'),
+
+ # Different protocols: send origin as referrer
+ ('ftps://example4.com/urls.zip', 'https://example4.com/not-page.html', b'ftps://example4.com/'),
+ ('ftps://example4.com/urls.zip', 'https://example4.com/not-page.html', b'ftps://example4.com/'),
+
+ # test for user/password stripping
+ ('https://user:password@example5.com/page.html', 'https://example5.com/not-page.html', b'https://example5.com/page.html'),
+
+ # TLS to non-TLS downgrade: send nothing
+ ('https://user:password@example5.com/page.html', 'http://example5.com/not-page.html', None),
+ ]
+
+
+class MixinUnsafeUrl(object):
+ scenarii = [
+ # TLS to TLS: send referrer
+ ('https://example.com/sekrit.html', 'http://not.example.com/', b'https://example.com/sekrit.html'),
+ ('https://example1.com/page.html', 'https://not.example1.com/', b'https://example1.com/page.html'),
+ ('https://example1.com/page.html', 'https://scrapy.org/', b'https://example1.com/page.html'),
+ ('https://example1.com:443/page.html', 'https://scrapy.org/', b'https://example1.com/page.html'),
+ ('https://example1.com:444/page.html', 'https://scrapy.org/', b'https://example1.com:444/page.html'),
+ ('ftps://example1.com/urls.zip', 'https://scrapy.org/', b'ftps://example1.com/urls.zip'),
+
+ # TLS to non-TLS: send referrer (yes, it's unsafe)
+ ('https://example2.com/page.html', 'http://not.example2.com/', b'https://example2.com/page.html'),
+ ('https://example2.com/page.html', 'http://scrapy.org/', b'https://example2.com/page.html'),
+ ('ftps://example2.com/urls.zip', 'http://scrapy.org/', b'ftps://example2.com/urls.zip'),
+
+ # non-TLS to TLS or non-TLS: send referrer (yes, it's unsafe)
+ ('http://example3.com/page.html', 'https://not.example3.com/', b'http://example3.com/page.html'),
+ ('http://example3.com/page.html', 'https://scrapy.org/', b'http://example3.com/page.html'),
+ ('http://example3.com:8080/page.html', 'https://scrapy.org/', b'http://example3.com:8080/page.html'),
+ ('http://example3.com:80/page.html', 'http://not.example3.com/', b'http://example3.com/page.html'),
+ ('http://example3.com/page.html', 'http://scrapy.org/', b'http://example3.com/page.html'),
+ ('http://example3.com:443/page.html', 'http://scrapy.org/', b'http://example3.com:443/page.html'),
+ ('ftp://example3.com/urls.zip', 'http://scrapy.org/', b'ftp://example3.com/urls.zip'),
+ ('ftp://example3.com/urls.zip', 'https://scrapy.org/', b'ftp://example3.com/urls.zip'),
+
+ # test for user/password stripping
+ ('http://user:password@example4.com/page.html', 'https://not.example4.com/', b'http://example4.com/page.html'),
+ ('https://user:password@example4.com/page.html', 'http://scrapy.org/', b'https://example4.com/page.html'),
+ ]
+
+
+class TestRefererMiddlewareDefault(MixinDefault, TestRefererMiddleware):
+ pass
+
+
+# --- Tests using settings to set policy using class path
+class TestSettingsNoReferrer(MixinNoReferrer, TestRefererMiddleware):
+ settings = {'REFERRER_POLICY': 'scrapy.spidermiddlewares.referer.NoReferrerPolicy'}
+
+
+class TestSettingsNoReferrerWhenDowngrade(MixinNoReferrerWhenDowngrade, TestRefererMiddleware):
+ settings = {'REFERRER_POLICY': 'scrapy.spidermiddlewares.referer.NoReferrerWhenDowngradePolicy'}
+
+
+class TestSettingsSameOrigin(MixinSameOrigin, TestRefererMiddleware):
+ settings = {'REFERRER_POLICY': 'scrapy.spidermiddlewares.referer.SameOriginPolicy'}
+
+
+class TestSettingsOrigin(MixinOrigin, TestRefererMiddleware):
+ settings = {'REFERRER_POLICY': 'scrapy.spidermiddlewares.referer.OriginPolicy'}
+
+
+class TestSettingsStrictOrigin(MixinStrictOrigin, TestRefererMiddleware):
+ settings = {'REFERRER_POLICY': 'scrapy.spidermiddlewares.referer.StrictOriginPolicy'}
+
+
+class TestSettingsOriginWhenCrossOrigin(MixinOriginWhenCrossOrigin, TestRefererMiddleware):
+ settings = {'REFERRER_POLICY': 'scrapy.spidermiddlewares.referer.OriginWhenCrossOriginPolicy'}
+
+
+class TestSettingsStrictOriginWhenCrossOrigin(MixinStrictOriginWhenCrossOrigin, TestRefererMiddleware):
+ settings = {'REFERRER_POLICY': 'scrapy.spidermiddlewares.referer.StrictOriginWhenCrossOriginPolicy'}
+
+
+class TestSettingsUnsafeUrl(MixinUnsafeUrl, TestRefererMiddleware):
+ settings = {'REFERRER_POLICY': 'scrapy.spidermiddlewares.referer.UnsafeUrlPolicy'}
+
+
+class CustomPythonOrgPolicy(ReferrerPolicy):
+ """
+ A dummy policy that returns referrer as http(s)://python.org
+ depending on the scheme of the target URL.
+ """
+ def referrer(self, response, request):
+ scheme = urlparse(request).scheme
+ if scheme == 'https':
+ return b'https://python.org/'
+ elif scheme == 'http':
+ return b'http://python.org/'
+
+
+class TestSettingsCustomPolicy(TestRefererMiddleware):
+ settings = {'REFERRER_POLICY': 'tests.test_spidermiddleware_referer.CustomPythonOrgPolicy'}
+ scenarii = [
+ ('https://example.com/', 'https://scrapy.org/', b'https://python.org/'),
+ ('http://example.com/', 'http://scrapy.org/', b'http://python.org/'),
+ ('http://example.com/', 'https://scrapy.org/', b'https://python.org/'),
+ ('https://example.com/', 'http://scrapy.org/', b'http://python.org/'),
+ ('file:///home/path/to/somefile.html', 'https://scrapy.org/', b'https://python.org/'),
+ ('file:///home/path/to/somefile.html', 'http://scrapy.org/', b'http://python.org/'),
+
+ ]
+
+# --- Tests using Request meta dict to set policy
+class TestRequestMetaDefault(MixinDefault, TestRefererMiddleware):
+ req_meta = {'referrer_policy': POLICY_SCRAPY_DEFAULT}
+
+
+class TestRequestMetaNoReferrer(MixinNoReferrer, TestRefererMiddleware):
+ req_meta = {'referrer_policy': POLICY_NO_REFERRER}
+
+
+class TestRequestMetaNoReferrerWhenDowngrade(MixinNoReferrerWhenDowngrade, TestRefererMiddleware):
+ req_meta = {'referrer_policy': POLICY_NO_REFERRER_WHEN_DOWNGRADE}
+
+
+class TestRequestMetaSameOrigin(MixinSameOrigin, TestRefererMiddleware):
+ req_meta = {'referrer_policy': POLICY_SAME_ORIGIN}
+
+
+class TestRequestMetaOrigin(MixinOrigin, TestRefererMiddleware):
+ req_meta = {'referrer_policy': POLICY_ORIGIN}
+
+
+class TestRequestMetaStrictOrigin(MixinStrictOrigin, TestRefererMiddleware):
+ req_meta = {'referrer_policy': POLICY_STRICT_ORIGIN}
+
+
+class TestRequestMetaOriginWhenCrossOrigin(MixinOriginWhenCrossOrigin, TestRefererMiddleware):
+ req_meta = {'referrer_policy': POLICY_ORIGIN_WHEN_CROSS_ORIGIN}
+
+
+class TestRequestMetaStrictOriginWhenCrossOrigin(MixinStrictOriginWhenCrossOrigin, TestRefererMiddleware):
+ req_meta = {'referrer_policy': POLICY_STRICT_ORIGIN_WHEN_CROSS_ORIGIN}
+
+
+class TestRequestMetaUnsafeUrl(MixinUnsafeUrl, TestRefererMiddleware):
+ req_meta = {'referrer_policy': POLICY_UNSAFE_URL}
+
+
+class TestRequestMetaPrecedence001(MixinUnsafeUrl, TestRefererMiddleware):
+ settings = {'REFERRER_POLICY': 'scrapy.spidermiddlewares.referer.SameOriginPolicy'}
+ req_meta = {'referrer_policy': POLICY_UNSAFE_URL}
+
+
+class TestRequestMetaPrecedence002(MixinNoReferrer, TestRefererMiddleware):
+ settings = {'REFERRER_POLICY': 'scrapy.spidermiddlewares.referer.NoReferrerWhenDowngradePolicy'}
+ req_meta = {'referrer_policy': POLICY_NO_REFERRER}
+
+
+class TestRequestMetaPrecedence003(MixinUnsafeUrl, TestRefererMiddleware):
+ settings = {'REFERRER_POLICY': 'scrapy.spidermiddlewares.referer.OriginWhenCrossOriginPolicy'}
+ req_meta = {'referrer_policy': POLICY_UNSAFE_URL}
+
+
+class TestRequestMetaSettingFallback(TestCase):
+
+ params = [
+ (
+ # When an unknown policy is referenced in Request.meta
+            # (here, a typo),
+ # the policy defined in settings takes precedence
+ {'REFERRER_POLICY': 'scrapy.spidermiddlewares.referer.OriginWhenCrossOriginPolicy'},
+ {},
+ {'referrer_policy': 'ssscrapy-default'},
+ OriginWhenCrossOriginPolicy,
+ True
+ ),
+ (
+ # same as above but with string value for settings policy
+ {'REFERRER_POLICY': 'origin-when-cross-origin'},
+ {},
+ {'referrer_policy': 'ssscrapy-default'},
+ OriginWhenCrossOriginPolicy,
+ True
+ ),
+ (
+ # request meta references a wrong policy but it is set,
+ # so the Referrer-Policy header in response is not used,
+ # and the settings' policy is applied
+ {'REFERRER_POLICY': 'origin-when-cross-origin'},
+ {'Referrer-Policy': 'unsafe-url'},
+ {'referrer_policy': 'ssscrapy-default'},
+ OriginWhenCrossOriginPolicy,
+ True
+ ),
+ (
+ # here, request meta does not set the policy
+ # so response headers take precedence
+ {'REFERRER_POLICY': 'origin-when-cross-origin'},
+ {'Referrer-Policy': 'unsafe-url'},
+ {},
+ UnsafeUrlPolicy,
+ False
+ ),
+ (
+ # here, request meta does not set the policy,
+ # but response headers also use an unknown policy,
+ # so the settings' policy is used
+ {'REFERRER_POLICY': 'origin-when-cross-origin'},
+ {'Referrer-Policy': 'unknown'},
+ {},
+ OriginWhenCrossOriginPolicy,
+ True
+ )
+ ]
+
+ def test(self):
+
+ origin = 'http://www.scrapy.org'
+ target = 'http://www.example.com'
+
+        for settings, response_headers, request_meta, policy_class, check_warning in self.params:
+ spider = Spider('foo')
+ mw = RefererMiddleware(Settings(settings))
+
+ response = Response(origin, headers=response_headers)
+ request = Request(target, meta=request_meta)
+
+ with warnings.catch_warnings(record=True) as w:
+ policy = mw.policy(response, request)
+ self.assertIsInstance(policy, policy_class)
+
+ if check_warning:
+ self.assertEqual(len(w), 1)
+ self.assertEqual(w[0].category, RuntimeWarning, w[0].message)
+
+
+class TestSettingsPolicyByName(TestCase):
+
+ def test_valid_name(self):
+ for s, p in [
+ (POLICY_SCRAPY_DEFAULT, DefaultReferrerPolicy),
+ (POLICY_NO_REFERRER, NoReferrerPolicy),
+ (POLICY_NO_REFERRER_WHEN_DOWNGRADE, NoReferrerWhenDowngradePolicy),
+ (POLICY_SAME_ORIGIN, SameOriginPolicy),
+ (POLICY_ORIGIN, OriginPolicy),
+ (POLICY_STRICT_ORIGIN, StrictOriginPolicy),
+ (POLICY_ORIGIN_WHEN_CROSS_ORIGIN, OriginWhenCrossOriginPolicy),
+ (POLICY_STRICT_ORIGIN_WHEN_CROSS_ORIGIN, StrictOriginWhenCrossOriginPolicy),
+ (POLICY_UNSAFE_URL, UnsafeUrlPolicy),
+ ]:
+ settings = Settings({'REFERRER_POLICY': s})
+ mw = RefererMiddleware(settings)
+ self.assertEquals(mw.default_policy, p)
+
+ def test_valid_name_casevariants(self):
+ for s, p in [
+ (POLICY_SCRAPY_DEFAULT, DefaultReferrerPolicy),
+ (POLICY_NO_REFERRER, NoReferrerPolicy),
+ (POLICY_NO_REFERRER_WHEN_DOWNGRADE, NoReferrerWhenDowngradePolicy),
+ (POLICY_SAME_ORIGIN, SameOriginPolicy),
+ (POLICY_ORIGIN, OriginPolicy),
+ (POLICY_STRICT_ORIGIN, StrictOriginPolicy),
+ (POLICY_ORIGIN_WHEN_CROSS_ORIGIN, OriginWhenCrossOriginPolicy),
+ (POLICY_STRICT_ORIGIN_WHEN_CROSS_ORIGIN, StrictOriginWhenCrossOriginPolicy),
+ (POLICY_UNSAFE_URL, UnsafeUrlPolicy),
+ ]:
+ settings = Settings({'REFERRER_POLICY': s.upper()})
+ mw = RefererMiddleware(settings)
+ self.assertEquals(mw.default_policy, p)
+
+ def test_invalid_name(self):
+ settings = Settings({'REFERRER_POLICY': 'some-custom-unknown-policy'})
+ with self.assertRaises(RuntimeError):
+ mw = RefererMiddleware(settings)
+
+
+class TestPolicyHeaderPrecedence001(MixinUnsafeUrl, TestRefererMiddleware):
+ settings = {'REFERRER_POLICY': 'scrapy.spidermiddlewares.referer.SameOriginPolicy'}
+ resp_headers = {'Referrer-Policy': POLICY_UNSAFE_URL.upper()}
+
+class TestPolicyHeaderPrecedence002(MixinNoReferrer, TestRefererMiddleware):
+ settings = {'REFERRER_POLICY': 'scrapy.spidermiddlewares.referer.NoReferrerWhenDowngradePolicy'}
+ resp_headers = {'Referrer-Policy': POLICY_NO_REFERRER.swapcase()}
+
+class TestPolicyHeaderPrecedence003(MixinNoReferrerWhenDowngrade, TestRefererMiddleware):
+ settings = {'REFERRER_POLICY': 'scrapy.spidermiddlewares.referer.OriginWhenCrossOriginPolicy'}
+ resp_headers = {'Referrer-Policy': POLICY_NO_REFERRER_WHEN_DOWNGRADE.title()}
+
+
+class TestReferrerOnRedirect(TestRefererMiddleware):
+
+ settings = {'REFERRER_POLICY': 'scrapy.spidermiddlewares.referer.UnsafeUrlPolicy'}
+ scenarii = [
+ ( 'http://scrapytest.org/1', # parent
+ 'http://scrapytest.org/2', # target
+ (
+ # redirections: code, URL
+ (301, 'http://scrapytest.org/3'),
+ (301, 'http://scrapytest.org/4'),
+ ),
+ b'http://scrapytest.org/1', # expected initial referer
+ b'http://scrapytest.org/1', # expected referer for the redirection request
+ ),
+ ( 'https://scrapytest.org/1',
+ 'https://scrapytest.org/2',
+ (
+ # redirecting to non-secure URL
+ (301, 'http://scrapytest.org/3'),
+ ),
+ b'https://scrapytest.org/1',
+ b'https://scrapytest.org/1',
+ ),
+ ( 'https://scrapytest.org/1',
+ 'https://scrapytest.com/2',
+ (
+ # redirecting to non-secure URL: different origin
+ (301, 'http://scrapytest.com/3'),
+ ),
+ b'https://scrapytest.org/1',
+ b'https://scrapytest.org/1',
+ ),
+ ]
+
+ def setUp(self):
+ self.spider = Spider('foo')
+ settings = Settings(self.settings)
+ self.referrermw = RefererMiddleware(settings)
+ self.redirectmw = RedirectMiddleware(settings)
+
+ def test(self):
+
+ for parent, target, redirections, init_referrer, final_referrer in self.scenarii:
+ response = self.get_response(parent)
+ request = self.get_request(target)
+
+ out = list(self.referrermw.process_spider_output(response, [request], self.spider))
+ self.assertEquals(out[0].headers.get('Referer'), init_referrer)
+
+ for status, url in redirections:
+ response = Response(request.url, headers={'Location': url}, status=status)
+ request = self.redirectmw.process_response(request, response, self.spider)
+ self.referrermw.request_scheduled(request, self.spider)
+
+ assert isinstance(request, Request)
+ self.assertEquals(request.headers.get('Referer'), final_referrer)
+
+
+class TestReferrerOnRedirectNoReferrer(TestReferrerOnRedirect):
+ """
+ No Referrer policy never sets the "Referer" header.
+ HTTP redirections should not change that.
+ """
+ settings = {'REFERRER_POLICY': 'no-referrer'}
+ scenarii = [
+ ( 'http://scrapytest.org/1', # parent
+ 'http://scrapytest.org/2', # target
+ (
+ # redirections: code, URL
+ (301, 'http://scrapytest.org/3'),
+ (301, 'http://scrapytest.org/4'),
+ ),
+ None, # expected initial "Referer"
+ None, # expected "Referer" for the redirection request
+ ),
+ ( 'https://scrapytest.org/1',
+ 'https://scrapytest.org/2',
+ (
+ (301, 'http://scrapytest.org/3'),
+ ),
+ None,
+ None,
+ ),
+ ( 'https://scrapytest.org/1',
+ 'https://example.com/2', # different origin
+ (
+ (301, 'http://scrapytest.com/3'),
+ ),
+ None,
+ None,
+ ),
+ ]
+
+
+class TestReferrerOnRedirectSameOrigin(TestReferrerOnRedirect):
+ """
+ Same Origin policy sends the full URL as "Referer" if the target origin
+ is the same as the parent response (same protocol, same domain, same port).
+
+ HTTP redirections to a different domain or a lower secure level
+ should have the "Referer" removed.
+ """
+ settings = {'REFERRER_POLICY': 'same-origin'}
+ scenarii = [
+ ( 'http://scrapytest.org/101', # origin
+ 'http://scrapytest.org/102', # target
+ (
+ # redirections: code, URL
+ (301, 'http://scrapytest.org/103'),
+ (301, 'http://scrapytest.org/104'),
+ ),
+ b'http://scrapytest.org/101', # expected initial "Referer"
+ b'http://scrapytest.org/101', # expected referer for the redirection request
+ ),
+ ( 'https://scrapytest.org/201',
+ 'https://scrapytest.org/202',
+ (
+ # redirecting from secure to non-secure URL == different origin
+ (301, 'http://scrapytest.org/203'),
+ ),
+ b'https://scrapytest.org/201',
+ None,
+ ),
+ ( 'https://scrapytest.org/301',
+ 'https://scrapytest.org/302',
+ (
+ # different domain == different origin
+ (301, 'http://example.com/303'),
+ ),
+ b'https://scrapytest.org/301',
+ None,
+ ),
+ ]
+
+
+class TestReferrerOnRedirectStrictOrigin(TestReferrerOnRedirect):
+ """
+ Strict Origin policy will always send the "origin" as referrer
+ (think of it as the parent URL without the path part),
+    unless the security level is lower, in which case no "Referer" is sent.
+
+ Redirections from secure to non-secure URLs should have the
+    "Referer" header removed if necessary.
+ """
+ settings = {'REFERRER_POLICY': POLICY_STRICT_ORIGIN}
+ scenarii = [
+ ( 'http://scrapytest.org/101',
+ 'http://scrapytest.org/102',
+ (
+ (301, 'http://scrapytest.org/103'),
+ (301, 'http://scrapytest.org/104'),
+ ),
+ b'http://scrapytest.org/', # send origin
+ b'http://scrapytest.org/', # redirects to same origin: send origin
+ ),
+ ( 'https://scrapytest.org/201',
+ 'https://scrapytest.org/202',
+ (
+ # redirecting to non-secure URL: no referrer
+ (301, 'http://scrapytest.org/203'),
+ ),
+ b'https://scrapytest.org/',
+ None,
+ ),
+ ( 'https://scrapytest.org/301',
+ 'https://scrapytest.org/302',
+ (
+ # redirecting to non-secure URL (different domain): no referrer
+ (301, 'http://example.com/303'),
+ ),
+ b'https://scrapytest.org/',
+ None,
+ ),
+ ( 'http://scrapy.org/401',
+ 'http://example.com/402',
+ (
+ (301, 'http://scrapytest.org/403'),
+ ),
+ b'http://scrapy.org/',
+ b'http://scrapy.org/',
+ ),
+ ( 'https://scrapy.org/501',
+ 'https://example.com/502',
+ (
+ # HTTPS all along, so origin referrer is kept as-is
+ (301, 'https://google.com/503'),
+ (301, 'https://facebook.com/504'),
+ ),
+ b'https://scrapy.org/',
+ b'https://scrapy.org/',
+ ),
+ ( 'https://scrapytest.org/601',
+ 'http://scrapytest.org/602', # TLS to non-TLS: no referrer
+ (
+ (301, 'https://scrapytest.org/603'), # TLS URL again: (still) no referrer
+ ),
+ None,
+ None,
+ ),
+ ]
+
+
+class TestReferrerOnRedirectOriginWhenCrossOrigin(TestReferrerOnRedirect):
+ """
+ Origin When Cross-Origin policy sends the full URL as "Referer",
+ unless the target's origin is different (different domain, different protocol)
+ in which case only the origin is sent.
+
+ Redirections to a different origin should strip the "Referer"
+ to the parent origin.
+ """
+ settings = {'REFERRER_POLICY': POLICY_ORIGIN_WHEN_CROSS_ORIGIN}
+ scenarii = [
+ ( 'http://scrapytest.org/101', # origin
+ 'http://scrapytest.org/102', # target + redirection
+ (
+ # redirections: code, URL
+ (301, 'http://scrapytest.org/103'),
+ (301, 'http://scrapytest.org/104'),
+ ),
+ b'http://scrapytest.org/101', # expected initial referer
+ b'http://scrapytest.org/101', # expected referer for the redirection request
+ ),
+ ( 'https://scrapytest.org/201',
+ 'https://scrapytest.org/202',
+ (
+ # redirecting to non-secure URL: send origin
+ (301, 'http://scrapytest.org/203'),
+ ),
+ b'https://scrapytest.org/201',
+ b'https://scrapytest.org/',
+ ),
+ ( 'https://scrapytest.org/301',
+ 'https://scrapytest.org/302',
+ (
+ # redirecting to non-secure URL (different domain): send origin
+ (301, 'http://example.com/303'),
+ ),
+ b'https://scrapytest.org/301',
+ b'https://scrapytest.org/',
+ ),
+ ( 'http://scrapy.org/401',
+ 'http://example.com/402',
+ (
+ (301, 'http://scrapytest.org/403'),
+ ),
+ b'http://scrapy.org/',
+ b'http://scrapy.org/',
+ ),
+ ( 'https://scrapy.org/501',
+ 'https://example.com/502',
+ (
+ # all different domains: send origin
+ (301, 'https://google.com/503'),
+ (301, 'https://facebook.com/504'),
+ ),
+ b'https://scrapy.org/',
+ b'https://scrapy.org/',
+ ),
+ ( 'https://scrapytest.org/301',
+ 'http://scrapytest.org/302', # TLS to non-TLS: send origin
+ (
+ (301, 'https://scrapytest.org/303'), # TLS URL again: send origin (also)
+ ),
+ b'https://scrapytest.org/',
+ b'https://scrapytest.org/',
+ ),
+ ]
- def test_process_spider_output(self):
- res = Response('http://scrapytest.org')
- reqs = [Request('http://scrapytest.org/')]
- out = list(self.mw.process_spider_output(res, reqs, self.spider))
- self.assertEquals(out[0].headers.get('Referer'),
- b'http://scrapytest.org')
+class TestReferrerOnRedirectStrictOriginWhenCrossOrigin(TestReferrerOnRedirect):
+ """
+ Strict Origin When Cross-Origin policy sends the full URL as "Referer",
+ unless the target's origin is different (different domain, different protocol),
+ in which case only the origin is sent.
+ If there is also a security downgrade, the "Referer" header is not sent at all.
+ Redirections to a different origin reduce the "Referer" to the parent origin,
+ and redirections from https:// to http:// remove the "Referer" header.
+ """
+ settings = {'REFERRER_POLICY': POLICY_STRICT_ORIGIN_WHEN_CROSS_ORIGIN}
+ scenarii = [
+ ( 'http://scrapytest.org/101', # origin
+ 'http://scrapytest.org/102', # target + redirection
+ (
+ # redirections: code, URL
+ (301, 'http://scrapytest.org/103'),
+ (301, 'http://scrapytest.org/104'),
+ ),
+ b'http://scrapytest.org/101', # expected initial referer
+ b'http://scrapytest.org/101', # expected referer for the redirection request
+ ),
+ ( 'https://scrapytest.org/201',
+ 'https://scrapytest.org/202',
+ (
+ # redirecting to non-secure URL: do not send the "Referer" header
+ (301, 'http://scrapytest.org/203'),
+ ),
+ b'https://scrapytest.org/201',
+ None,
+ ),
+ ( 'https://scrapytest.org/301',
+ 'https://scrapytest.org/302',
+ (
+ # redirecting to non-secure URL (different domain): send origin
+ (301, 'http://example.com/303'),
+ ),
+ b'https://scrapytest.org/301',
+ None,
+ ),
+ ( 'http://scrapy.org/401',
+ 'http://example.com/402',
+ (
+ (301, 'http://scrapytest.org/403'),
+ ),
+ b'http://scrapy.org/',
+ b'http://scrapy.org/',
+ ),
+ ( 'https://scrapy.org/501',
+ 'https://example.com/502',
+ (
+ # all different domains: send origin
+ (301, 'https://google.com/503'),
+ (301, 'https://facebook.com/504'),
+ ),
+ b'https://scrapy.org/',
+ b'https://scrapy.org/',
+ ),
+ ( 'https://scrapytest.org/601',
+ 'http://scrapytest.org/602', # TLS to non-TLS: do not send "Referer"
+ (
+ (301, 'https://scrapytest.org/603'), # TLS URL again: (still) send nothing
+ ),
+ None,
+ None,
+ ),
+ ]
diff --git a/tests/test_utils_url.py b/tests/test_utils_url.py
index f46d1d927af..c2b9fc17622 100644
--- a/tests/test_utils_url.py
+++ b/tests/test_utils_url.py
@@ -6,7 +6,8 @@
from scrapy.spiders import Spider
from scrapy.utils.url import (url_is_from_any_domain, url_is_from_spider,
- add_http_if_no_scheme, guess_scheme, parse_url)
+ add_http_if_no_scheme, guess_scheme,
+ parse_url, strip_url)
__doctests__ = ['scrapy.utils.url']
@@ -241,5 +242,171 @@ def do_expected(self):
setattr (GuessSchemeTest, t_method.__name__, t_method)
+class StripUrl(unittest.TestCase):
+
+ def test_noop(self):
+ self.assertEqual(strip_url(
+ 'http://www.example.com/index.html'),
+ 'http://www.example.com/index.html')
+
+ def test_noop_query_string(self):
+ self.assertEqual(strip_url(
+ 'http://www.example.com/index.html?somekey=somevalue'),
+ 'http://www.example.com/index.html?somekey=somevalue')
+
+ def test_fragments(self):
+ self.assertEqual(strip_url(
+ 'http://www.example.com/index.html?somekey=somevalue#section', strip_fragment=False),
+ 'http://www.example.com/index.html?somekey=somevalue#section')
+
+ def test_path(self):
+ for input_url, origin, output_url in [
+ ('http://www.example.com/',
+ False,
+ 'http://www.example.com/'),
+
+ ('http://www.example.com',
+ False,
+ 'http://www.example.com'),
+
+ ('http://www.example.com',
+ True,
+ 'http://www.example.com/'),
+ ]:
+ self.assertEqual(strip_url(input_url, origin_only=origin), output_url)
+
+ def test_credentials(self):
+ for i, o in [
+ ('http://username@www.example.com/index.html?somekey=somevalue#section',
+ 'http://www.example.com/index.html?somekey=somevalue'),
+
+ ('https://username:@www.example.com/index.html?somekey=somevalue#section',
+ 'https://www.example.com/index.html?somekey=somevalue'),
+
+ ('ftp://username:password@www.example.com/index.html?somekey=somevalue#section',
+ 'ftp://www.example.com/index.html?somekey=somevalue'),
+ ]:
+ self.assertEqual(strip_url(i, strip_credentials=True), o)
+
+ def test_credentials_encoded_delims(self):
+ for i, o in [
+ # user: "username@"
+ # password: none
+ ('http://username%40@www.example.com/index.html?somekey=somevalue#section',
+ 'http://www.example.com/index.html?somekey=somevalue'),
+
+ # user: "username:pass"
+ # password: ""
+ ('https://username%3Apass:@www.example.com/index.html?somekey=somevalue#section',
+ 'https://www.example.com/index.html?somekey=somevalue'),
+
+ # user: "me"
+ # password: "user@domain.com"
+ ('ftp://me:user%40domain.com@www.example.com/index.html?somekey=somevalue#section',
+ 'ftp://www.example.com/index.html?somekey=somevalue'),
+ ]:
+ self.assertEqual(strip_url(i, strip_credentials=True), o)
+
+ def test_default_ports_creds_off(self):
+ for i, o in [
+ ('http://username:password@www.example.com:80/index.html?somekey=somevalue#section',
+ 'http://www.example.com/index.html?somekey=somevalue'),
+
+ ('http://username:password@www.example.com:8080/index.html#section',
+ 'http://www.example.com:8080/index.html'),
+
+ ('http://username:password@www.example.com:443/index.html?somekey=somevalue&someotherkey=sov#section',
+ 'http://www.example.com:443/index.html?somekey=somevalue&someotherkey=sov'),
+
+ ('https://username:password@www.example.com:443/index.html',
+ 'https://www.example.com/index.html'),
+
+ ('https://username:password@www.example.com:442/index.html',
+ 'https://www.example.com:442/index.html'),
+
+ ('https://username:password@www.example.com:80/index.html',
+ 'https://www.example.com:80/index.html'),
+
+ ('ftp://username:password@www.example.com:21/file.txt',
+ 'ftp://www.example.com/file.txt'),
+
+ ('ftp://username:password@www.example.com:221/file.txt',
+ 'ftp://www.example.com:221/file.txt'),
+ ]:
+ self.assertEqual(strip_url(i), o)
+
+ def test_default_ports(self):
+ for i, o in [
+ ('http://username:password@www.example.com:80/index.html',
+ 'http://username:password@www.example.com/index.html'),
+
+ ('http://username:password@www.example.com:8080/index.html',
+ 'http://username:password@www.example.com:8080/index.html'),
+
+ ('http://username:password@www.example.com:443/index.html',
+ 'http://username:password@www.example.com:443/index.html'),
+
+ ('https://username:password@www.example.com:443/index.html',
+ 'https://username:password@www.example.com/index.html'),
+
+ ('https://username:password@www.example.com:442/index.html',
+ 'https://username:password@www.example.com:442/index.html'),
+
+ ('https://username:password@www.example.com:80/index.html',
+ 'https://username:password@www.example.com:80/index.html'),
+
+ ('ftp://username:password@www.example.com:21/file.txt',
+ 'ftp://username:password@www.example.com/file.txt'),
+
+ ('ftp://username:password@www.example.com:221/file.txt',
+ 'ftp://username:password@www.example.com:221/file.txt'),
+ ]:
+ self.assertEqual(strip_url(i, strip_default_port=True, strip_credentials=False), o)
+
+ def test_default_ports_keep(self):
+ for i, o in [
+ ('http://username:password@www.example.com:80/index.html?somekey=somevalue&someotherkey=sov#section',
+ 'http://username:password@www.example.com:80/index.html?somekey=somevalue&someotherkey=sov'),
+
+ ('http://username:password@www.example.com:8080/index.html?somekey=somevalue&someotherkey=sov#section',
+ 'http://username:password@www.example.com:8080/index.html?somekey=somevalue&someotherkey=sov'),
+
+ ('http://username:password@www.example.com:443/index.html',
+ 'http://username:password@www.example.com:443/index.html'),
+
+ ('https://username:password@www.example.com:443/index.html',
+ 'https://username:password@www.example.com:443/index.html'),
+
+ ('https://username:password@www.example.com:442/index.html',
+ 'https://username:password@www.example.com:442/index.html'),
+
+ ('https://username:password@www.example.com:80/index.html',
+ 'https://username:password@www.example.com:80/index.html'),
+
+ ('ftp://username:password@www.example.com:21/file.txt',
+ 'ftp://username:password@www.example.com:21/file.txt'),
+
+ ('ftp://username:password@www.example.com:221/file.txt',
+ 'ftp://username:password@www.example.com:221/file.txt'),
+ ]:
+ self.assertEqual(strip_url(i, strip_default_port=False, strip_credentials=False), o)
+
+ def test_origin_only(self):
+ for i, o in [
+ ('http://username:password@www.example.com/index.html',
+ 'http://www.example.com/'),
+
+ ('http://username:password@www.example.com:80/foo/bar?query=value#somefrag',
+ 'http://www.example.com/'),
+
+ ('http://username:password@www.example.com:8008/foo/bar?query=value#somefrag',
+ 'http://www.example.com:8008/'),
+
+ ('https://username:password@www.example.com:443/index.html',
+ 'https://www.example.com/'),
+ ]:
+ self.assertEqual(strip_url(i, origin_only=True), o)
+
+
if __name__ == "__main__":
unittest.main()
| diff --git a/docs/topics/request-response.rst b/docs/topics/request-response.rst
index 3d110b02d76..67f8ec28599 100644
--- a/docs/topics/request-response.rst
+++ b/docs/topics/request-response.rst
@@ -307,6 +307,7 @@ Those are:
* :reqmeta:`proxy`
* ``ftp_user`` (See :setting:`FTP_USER` for more info)
* ``ftp_password`` (See :setting:`FTP_PASSWORD` for more info)
+* :reqmeta:`referrer_policy`
.. reqmeta:: bindaddress
diff --git a/docs/topics/spider-middleware.rst b/docs/topics/spider-middleware.rst
index 8360827e8d7..9a0ccd0c172 100644
--- a/docs/topics/spider-middleware.rst
+++ b/docs/topics/spider-middleware.rst
@@ -95,7 +95,7 @@ following methods:
it has processed the response.
:meth:`process_spider_output` must return an iterable of
- :class:`~scrapy.http.Request`, dict or :class:`~scrapy.item.Item`
+ :class:`~scrapy.http.Request`, dict or :class:`~scrapy.item.Item`
objects.
:param response: the response which generated this output from the
@@ -328,6 +328,90 @@ Default: ``True``
Whether to enable referer middleware.
+.. setting:: REFERRER_POLICY
+
+REFERRER_POLICY
+^^^^^^^^^^^^^^^
+
+.. versionadded:: 1.4
+
+Default: ``'scrapy.spidermiddlewares.referer.DefaultReferrerPolicy'``
+
+.. reqmeta:: referrer_policy
+
+`Referrer Policy`_ to apply when populating Request "Referer" header.
+
+.. note::
+ You can also set the Referrer Policy per request,
+ using the special ``"referrer_policy"`` :ref:`Request.meta <topics-request-meta>` key,
+ with the same acceptable values as for the ``REFERRER_POLICY`` setting.
+
+Acceptable values for REFERRER_POLICY
+*************************************
+
+- either a path to a ``scrapy.spidermiddlewares.referer.ReferrerPolicy``
+ subclass — a custom policy or one of the built-in ones (see classes below),
+- or one of the standard W3C-defined string values,
+- or the special ``"scrapy-default"``.
+
+======================================= ========================================================================
+String value Class name (as a string)
+======================================= ========================================================================
+``"scrapy-default"`` (default) :class:`scrapy.spidermiddlewares.referer.DefaultReferrerPolicy`
+`"no-referrer"`_ :class:`scrapy.spidermiddlewares.referer.NoReferrerPolicy`
+`"no-referrer-when-downgrade"`_ :class:`scrapy.spidermiddlewares.referer.NoReferrerWhenDowngradePolicy`
+`"same-origin"`_ :class:`scrapy.spidermiddlewares.referer.SameOriginPolicy`
+`"origin"`_ :class:`scrapy.spidermiddlewares.referer.OriginPolicy`
+`"strict-origin"`_ :class:`scrapy.spidermiddlewares.referer.StrictOriginPolicy`
+`"origin-when-cross-origin"`_ :class:`scrapy.spidermiddlewares.referer.OriginWhenCrossOriginPolicy`
+`"strict-origin-when-cross-origin"`_ :class:`scrapy.spidermiddlewares.referer.StrictOriginWhenCrossOriginPolicy`
+`"unsafe-url"`_ :class:`scrapy.spidermiddlewares.referer.UnsafeUrlPolicy`
+======================================= ========================================================================
+
+.. autoclass:: DefaultReferrerPolicy
+.. warning::
+ Scrapy's default referrer policy — just like `"no-referrer-when-downgrade"`_,
+ the W3C-recommended value for browsers — will send a non-empty
+ "Referer" header from any ``http(s)://`` to any ``https://`` URL,
+ even if the domain is different.
+
+ `"same-origin"`_ may be a better choice if you want to remove referrer
+ information for cross-domain requests.
+
+.. autoclass:: NoReferrerPolicy
+
+.. autoclass:: NoReferrerWhenDowngradePolicy
+.. note::
+ "no-referrer-when-downgrade" policy is the W3C-recommended default,
+ and is used by major web browsers.
+
+ However, it is NOT Scrapy's default referrer policy (see :class:`DefaultReferrerPolicy`).
+
+.. autoclass:: SameOriginPolicy
+
+.. autoclass:: OriginPolicy
+
+.. autoclass:: StrictOriginPolicy
+
+.. autoclass:: OriginWhenCrossOriginPolicy
+
+.. autoclass:: StrictOriginWhenCrossOriginPolicy
+
+.. autoclass:: UnsafeUrlPolicy
+.. warning::
+ "unsafe-url" policy is NOT recommended.
+
+.. _Referrer Policy: https://www.w3.org/TR/referrer-policy
+.. _"no-referrer": https://www.w3.org/TR/referrer-policy/#referrer-policy-no-referrer
+.. _"no-referrer-when-downgrade": https://www.w3.org/TR/referrer-policy/#referrer-policy-no-referrer-when-downgrade
+.. _"same-origin": https://www.w3.org/TR/referrer-policy/#referrer-policy-same-origin
+.. _"origin": https://www.w3.org/TR/referrer-policy/#referrer-policy-origin
+.. _"strict-origin": https://www.w3.org/TR/referrer-policy/#referrer-policy-strict-origin
+.. _"origin-when-cross-origin": https://www.w3.org/TR/referrer-policy/#referrer-policy-origin-when-cross-origin
+.. _"strict-origin-when-cross-origin": https://www.w3.org/TR/referrer-policy/#referrer-policy-strict-origin-when-cross-origin
+.. _"unsafe-url": https://www.w3.org/TR/referrer-policy/#referrer-policy-unsafe-url
+
+
UrlLengthMiddleware
-------------------
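For reference, enabling one of the policies documented in the patch above is a one-line change in a Scrapy project's `settings.py`; the value chosen here is just an example, any entry from the table of acceptable values works the same way.

```python
# settings.py -- illustrative; any value from the acceptable-values table
# (a standard W3C string, "scrapy-default", or a policy class path) is valid.
REFERRER_POLICY = 'same-origin'
```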
| [
{
"components": [
{
"doc": "",
"lines": [
32,
81
],
"name": "ReferrerPolicy",
"signature": "class ReferrerPolicy(object):",
"type": "class"
},
{
"doc": "",
"lines": [
36,
37
],
... | [
"tests/test_spidermiddleware_referer.py::TestRefererMiddleware::test",
"tests/test_spidermiddleware_referer.py::TestRefererMiddlewareDefault::test",
"tests/test_spidermiddleware_referer.py::TestSettingsNoReferrer::test",
"tests/test_spidermiddleware_referer.py::TestSettingsNoReferrerWhenDowngrade::test",
"t... | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
[MRG] Referrer policies in RefererMiddleware
Fixes #2142
I've based the tests and implementation on https://www.w3.org/TR/referrer-policy/
By default, with this change, `RefererMiddleware` does not send a `Referer` header (referrer value):
- when the source response is `file://` or `s3://`,
- nor from `https://` response to an `http://` request.
However, it does send a referrer value for any `https://` to another `https://` (as browsers do, [`"no-referrer-when-downgrade"` being the default policy](https://www.w3.org/TR/referrer-policy/#referrer-policy-no-referrer-when-downgrade))
This needs discussion.
Users can change the policy per request, using a new `referrer_policy` meta key, with values from "no-referrer" / "no-referrer-when-downgrade" / "same-origin" / "origin" / "origin-when-cross-origin" / "unsafe-url".
~~Still missing:~~
- [x] Try to use `urlparse_cached()`
- [x] Make default policy customizable through a `REFERER_DEFAULT_POLICY` setting
- [x] Test custom policies
- [x] Docs updates
- [x] Add tests for policy given by response headers (`Referrer-Policy`) -- is this even used in practice by web servers?
- [x] Handle referrers [during redirects](https://www.w3.org/TR/referrer-policy/#set-requests-referrer-policy-on-redirect)
To handle redirects, I added a signal handler on `request_scheduled`. I did not find a way to test the redirect middleware and the referer middleware together (though I did not search very hard).
Suggestions on how to do that are welcome.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in scrapy/spidermiddlewares/referer.py]
(definition of ReferrerPolicy:)
class ReferrerPolicy(object):
(definition of ReferrerPolicy.referrer:)
def referrer(self, response_url, request_url):
(definition of ReferrerPolicy.stripped_referrer:)
def stripped_referrer(self, url):
(definition of ReferrerPolicy.origin_referrer:)
def origin_referrer(self, url):
(definition of ReferrerPolicy.strip_url:)
def strip_url(self, url, origin_only=False):
"""https://www.w3.org/TR/referrer-policy/#strip-url
If url is null, return no referrer.
If url's scheme is a local scheme, then return no referrer.
Set url's username to the empty string.
Set url's password to null.
Set url's fragment to null.
If the origin-only flag is true, then:
Set url's path to null.
Set url's query to null.
Return url."""
(definition of ReferrerPolicy.origin:)
def origin(self, url):
"""Return serialized origin (scheme, host, path) for a request or response URL."""
(definition of ReferrerPolicy.potentially_trustworthy:)
def potentially_trustworthy(self, url):
(definition of ReferrerPolicy.tls_protected:)
def tls_protected(self, url):
(definition of NoReferrerPolicy:)
class NoReferrerPolicy(ReferrerPolicy):
"""https://www.w3.org/TR/referrer-policy/#referrer-policy-no-referrer
The simplest policy is "no-referrer", which specifies that no referrer information
is to be sent along with requests made from a particular request client to any origin.
The header will be omitted entirely."""
(definition of NoReferrerPolicy.referrer:)
def referrer(self, response_url, request_url):
(definition of NoReferrerWhenDowngradePolicy:)
class NoReferrerWhenDowngradePolicy(ReferrerPolicy):
"""https://www.w3.org/TR/referrer-policy/#referrer-policy-no-referrer-when-downgrade
The "no-referrer-when-downgrade" policy sends a full URL along with requests
from a TLS-protected environment settings object to a potentially trustworthy URL,
and requests from clients which are not TLS-protected to any origin.
Requests from TLS-protected clients to non-potentially trustworthy URLs,
on the other hand, will contain no referrer information.
A Referer HTTP header will not be sent.
This is a user agent's default behavior, if no policy is otherwise specified."""
(definition of NoReferrerWhenDowngradePolicy.referrer:)
def referrer(self, response_url, request_url):
(definition of SameOriginPolicy:)
class SameOriginPolicy(ReferrerPolicy):
"""https://www.w3.org/TR/referrer-policy/#referrer-policy-same-origin
The "same-origin" policy specifies that a full URL, stripped for use as a referrer,
is sent as referrer information when making same-origin requests from a particular request client.
Cross-origin requests, on the other hand, will contain no referrer information.
A Referer HTTP header will not be sent."""
(definition of SameOriginPolicy.referrer:)
def referrer(self, response_url, request_url):
(definition of OriginPolicy:)
class OriginPolicy(ReferrerPolicy):
"""https://www.w3.org/TR/referrer-policy/#referrer-policy-origin
The "origin" policy specifies that only the ASCII serialization
of the origin of the request client is sent as referrer information
when making both same-origin requests and cross-origin requests
from a particular request client."""
(definition of OriginPolicy.referrer:)
def referrer(self, response_url, request_url):
(definition of StrictOriginPolicy:)
class StrictOriginPolicy(ReferrerPolicy):
"""https://www.w3.org/TR/referrer-policy/#referrer-policy-strict-origin
The "strict-origin" policy sends the ASCII serialization
of the origin of the request client when making requests:
- from a TLS-protected environment settings object to a potentially trustworthy URL, and
- from non-TLS-protected environment settings objects to any origin.
Requests from TLS-protected request clients to non-potentially trustworthy URLs,
on the other hand, will contain no referrer information.
A Referer HTTP header will not be sent."""
(definition of StrictOriginPolicy.referrer:)
def referrer(self, response_url, request_url):
(definition of OriginWhenCrossOriginPolicy:)
class OriginWhenCrossOriginPolicy(ReferrerPolicy):
"""https://www.w3.org/TR/referrer-policy/#referrer-policy-origin-when-cross-origin
The "origin-when-cross-origin" policy specifies that a full URL,
stripped for use as a referrer, is sent as referrer information
when making same-origin requests from a particular request client,
and only the ASCII serialization of the origin of the request client
is sent as referrer information when making cross-origin requests
from a particular request client."""
(definition of OriginWhenCrossOriginPolicy.referrer:)
def referrer(self, response_url, request_url):
(definition of StrictOriginWhenCrossOriginPolicy:)
class StrictOriginWhenCrossOriginPolicy(ReferrerPolicy):
"""https://www.w3.org/TR/referrer-policy/#referrer-policy-strict-origin-when-cross-origin
The "strict-origin-when-cross-origin" policy specifies that a full URL,
stripped for use as a referrer, is sent as referrer information
when making same-origin requests from a particular request client,
and only the ASCII serialization of the origin of the request client
when making cross-origin requests:
- from a TLS-protected environment settings object to a potentially trustworthy URL, and
- from non-TLS-protected environment settings objects to any origin.
Requests from TLS-protected clients to non-potentially trustworthy URLs,
on the other hand, will contain no referrer information.
A Referer HTTP header will not be sent."""
(definition of StrictOriginWhenCrossOriginPolicy.referrer:)
def referrer(self, response_url, request_url):
(definition of UnsafeUrlPolicy:)
class UnsafeUrlPolicy(ReferrerPolicy):
"""https://www.w3.org/TR/referrer-policy/#referrer-policy-unsafe-url
The "unsafe-url" policy specifies that a full URL, stripped for use as a referrer,
is sent along with both cross-origin requests
and same-origin requests made from a particular request client.
Note: The policy's name doesn't lie; it is unsafe.
This policy will leak origins and paths from TLS-protected resources
to insecure origins.
Carefully consider the impact of setting such a policy for potentially sensitive documents."""
(definition of UnsafeUrlPolicy.referrer:)
def referrer(self, response_url, request_url):
(definition of DefaultReferrerPolicy:)
class DefaultReferrerPolicy(NoReferrerWhenDowngradePolicy):
"""A variant of "no-referrer-when-downgrade",
with the addition that "Referer" is not sent if the parent request was
using ``file://`` or ``s3://`` scheme."""
(definition of _load_policy_class:)
def _load_policy_class(policy, warning_only=False):
"""Expect a string for the path to the policy class,
otherwise try to interpret the string as a standard value
from https://www.w3.org/TR/referrer-policy/#referrer-policies"""
(definition of RefererMiddleware.__init__:)
def __init__(self, settings=None):
(definition of RefererMiddleware.policy:)
def policy(self, resp_or_url, request):
"""Determine Referrer-Policy to use from a parent Response (or URL),
and a Request to be sent.
- if a valid policy is set in Request meta, it is used.
- if the policy is set in meta but is wrong (e.g. a typo error),
the policy from settings is used
- if the policy is not set in Request meta,
but there is a Referrer-policy header in the parent response,
it is used if valid
- otherwise, the policy from settings is used."""
(definition of RefererMiddleware.request_scheduled:)
def request_scheduled(self, request, spider):
[end of new definitions in scrapy/spidermiddlewares/referer.py]
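The policy definitions above can be illustrated with a minimal, self-contained sketch of the "same-origin" rule. This is not Scrapy's implementation: the class and helper names (`SameOriginPolicySketch`, `_origin_tuple`, `_stripped`) are invented here, the real classes share helpers such as `origin()` and `stripped_referrer()` from `ReferrerPolicy`, and this sketch does not normalize default ports (so `http://host` and `http://host:80` compare as different origins).

```python
from urllib.parse import urlparse, urlunparse

class SameOriginPolicySketch:
    """Illustrative sketch of the "same-origin" policy described above."""

    def _origin_tuple(self, url):
        # An origin is essentially the (scheme, host, port) triple.
        # Default ports are NOT normalized in this simplified version.
        parts = urlparse(url)
        return (parts.scheme, parts.hostname, parts.port)

    def _stripped(self, url):
        # Drop credentials and the fragment, as the strip-url algorithm does.
        parts = urlparse(url)
        netloc = parts.hostname or ''
        if parts.port is not None:
            netloc += ':%d' % parts.port
        return urlunparse((parts.scheme, netloc, parts.path,
                           parts.params, parts.query, ''))

    def referrer(self, response_url, request_url):
        # Full (stripped) parent URL for same-origin requests, nothing otherwise.
        if self._origin_tuple(response_url) == self._origin_tuple(request_url):
            return self._stripped(response_url)
        return None
```

Run against the scenarii in the tests above, this yields the full parent URL for same-origin hops and `None` for cross-origin or downgraded ones.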
[start of new definitions in scrapy/utils/url.py]
(definition of strip_url:)
def strip_url(url, strip_credentials=True, strip_default_port=True, origin_only=False, strip_fragment=True):
"""Strip URL string from some of its components:
- `strip_credentials` removes "user:password@"
- `strip_default_port` removes ":80" (resp. ":443", ":21")
from http:// (resp. https://, ftp://) URLs
- `origin_only` replaces path component with "/", also dropping
query and fragment components; it also strips credentials
- `strip_fragment` drops any #fragment component"""
[end of new definitions in scrapy/utils/url.py]
</definitions>
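A rough approximation of the `strip_url` behavior documented above, for illustration only: the name `strip_url_sketch` is made up, and the real implementation in `scrapy/utils/url.py` may differ in edge cases (IPv6 hosts, passwords that happen to contain a `:port` substring, malformed ports).

```python
from urllib.parse import urlparse, urlunparse

def strip_url_sketch(url, strip_credentials=True, strip_default_port=True,
                     origin_only=False, strip_fragment=True):
    """Approximation of the strip_url() contract described above."""
    parts = urlparse(url)
    netloc = parts.netloc
    # origin_only also implies stripping credentials.
    if (strip_credentials or origin_only) and (parts.username or parts.password):
        netloc = netloc.split('@')[-1]
    # Drop the port only when it is the scheme's well-known default.
    if strip_default_port and parts.port:
        if (parts.scheme, parts.port) in (('http', 80), ('https', 443), ('ftp', 21)):
            netloc = netloc.replace(':%d' % parts.port, '')
    path = '/' if origin_only else parts.path
    query = '' if origin_only else parts.query
    fragment = '' if strip_fragment else parts.fragment
    return urlunparse((parts.scheme, netloc, path, parts.params, query, fragment))
```

The expectations in `tests/test_utils_url.py` above (default ports, credentials, `origin_only`) hold for this sketch as well.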
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | Here is the discussion from the issues related to the pull request.
<issues>
Scrapy uses `file://` URL as referer
If I use a local file in a request (e.g.
``` python
def start_requests(self):
yield Request(path_to_file_uri(filename))
def parse(self, response):
for url in json.loads(response.text):
yield Request(url)
```
and then make a HTTP request in the callback, scrapy uses the file URL as the `referer` in the HTTP header.
I think `file://` URLs (or generally any protocol other than http/s) should be avoided as the `referer` value because
1. it is not part of the normal hyperlink navigation
2. it exposes the local file structure to the remote site
----------
+1, makes sense.
It seems there are more rules for the Referer header which browsers implement. I haven't checked it myself, but
http://webmasters.stackexchange.com/questions/47405/how-can-i-pass-referrer-header-from-my-https-domain-to-http-domains suggests that the Referer is not sent for requests from https pages to any pages not in the same domain. The RFC doesn't require browsers to do so, but it looks like they do it for security/privacy reasons.
http://www.redpill-linpro.com/sysadvent//2015/12/17/referer-header.html gives another list of rules.
--------------------
</issues> | 57a5460529ff71c42e4d0381265b1b512b1eb09b |
docker__docker-py-1230 | 1,230 | docker/docker-py | null | 52c2cc845346884218f566eeaeee5a5ca3e714ab | 2016-09-27T18:37:20Z | diff --git a/docker/api/swarm.py b/docker/api/swarm.py
index 7481c67532..2fc877448a 100644
--- a/docker/api/swarm.py
+++ b/docker/api/swarm.py
@@ -69,6 +69,13 @@ def nodes(self, filters=None):
return self._result(self._get(url, params=params), True)
+ @utils.minimum_version('1.24')
+ def update_node(self, node_id, version, node_spec=None):
+ url = self._url('/nodes/{0}/update?version={1}', node_id, str(version))
+ res = self._post_json(url, data=node_spec)
+ self._raise_for_status(res)
+ return True
+
@utils.minimum_version('1.24')
def update_swarm(self, version, swarm_spec=None, rotate_worker_token=False,
rotate_manager_token=False):
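A typical call pattern for the new `update_node` endpoint can be sketched with sample node data; the dict below mimics the shape returned by `Client.nodes()` (the ID, version index, and label values are invented). Because the daemon removes any field omitted from `node_spec`, the current spec is merged rather than replaced, and the version index guards against conflicting concurrent writes.

```python
# Sample node data, shaped like the dicts returned by Client.nodes()
# (ID, version index, and labels are invented for illustration).
node = {
    'ID': '24ifsmvkjbyhk',
    'Version': {'Index': 8},
    'Spec': {'Availability': 'active', 'Name': 'node-name', 'Role': 'manager'},
}

# Start from the current spec and merge changes: any field omitted from
# node_spec is removed by the daemon, so never send a partial dict.
new_spec = dict(node['Spec'], Labels={'env': 'staging'})

# Against a live daemon, the request would then be:
# client.update_node(node_id=node['ID'],
#                    version=node['Version']['Index'],
#                    node_spec=new_spec)
```

The integration test below follows the same merge-then-revert pattern using `inspect_node` to refresh the version index between updates.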
diff --git a/docs/api.md b/docs/api.md
index 1699344a66..5cadb83081 100644
--- a/docs/api.md
+++ b/docs/api.md
@@ -1129,6 +1129,11 @@ Update resource configs of one or more containers.
**Returns** (dict): Dictionary containing a `Warnings` key.
+## update_node
+
+Update a node.
+See the [Swarm documentation](swarm.md#clientupdate_node).
+
## update_service
Update a service, similar to the `docker service update` command. See the
diff --git a/docs/swarm.md b/docs/swarm.md
index 3cc44f8741..20c3945352 100644
--- a/docs/swarm.md
+++ b/docs/swarm.md
@@ -232,6 +232,30 @@ List Swarm nodes
**Returns:** A list of dictionaries containing data about each swarm node.
+### Client.update_node
+
+Update the Node's configuration
+
+**Params:**
+
+* version (int): The version number of the node object being updated. This
+ is required to avoid conflicting writes.
+* node_spec (dict): Configuration settings to update. Any values not provided
+ will be removed. See the official [Docker API documentation](https://docs.docker.com/engine/reference/api/docker_remote_api_v1.24/#/update-a-node) for more details.
+ Default: `None`.
+
+**Returns:** `True` if the request went through. Raises an `APIError` if it
+ fails.
+
+```python
+node_spec = {'Availability': 'active',
+ 'Name': 'node-name',
+ 'Role': 'manager',
+ 'Labels': {'foo': 'bar'}
+ }
+client.update_node(node_id='24ifsmvkjbyhk', version=8, node_spec=node_spec)
+```
+
### Client.update_swarm
Update the Swarm's configuration
| diff --git a/tests/integration/swarm_test.py b/tests/integration/swarm_test.py
index 8c62f2ec06..7f02c71170 100644
--- a/tests/integration/swarm_test.py
+++ b/tests/integration/swarm_test.py
@@ -1,3 +1,4 @@
+import copy
import docker
import pytest
@@ -138,3 +139,26 @@ def test_inspect_node(self):
node_data = self.client.inspect_node(node['ID'])
assert node['ID'] == node_data['ID']
assert node['Version'] == node_data['Version']
+
+ @requires_api_version('1.24')
+ def test_update_node(self):
+ assert self.client.init_swarm('eth0')
+ nodes_list = self.client.nodes()
+ node = nodes_list[0]
+ orig_spec = node['Spec']
+
+ # add a new label
+ new_spec = copy.deepcopy(orig_spec)
+ new_spec['Labels'] = {'new.label': 'new value'}
+ self.client.update_node(node_id=node['ID'],
+ version=node['Version']['Index'],
+ node_spec=new_spec)
+ updated_node = self.client.inspect_node(node['ID'])
+ assert new_spec == updated_node['Spec']
+
+ # Revert the changes
+ self.client.update_node(node_id=node['ID'],
+ version=updated_node['Version']['Index'],
+ node_spec=orig_spec)
+ reverted_node = self.client.inspect_node(node['ID'])
+ assert orig_spec == reverted_node['Spec']
diff --git a/tests/unit/fake_api.py b/tests/unit/fake_api.py
index 1e9d318df5..cfe6ef777f 100644
--- a/tests/unit/fake_api.py
+++ b/tests/unit/fake_api.py
@@ -14,6 +14,7 @@
FAKE_URL = 'myurl'
FAKE_PATH = '/path'
FAKE_VOLUME_NAME = 'perfectcherryblossom'
+FAKE_NODE_ID = '24ifsmvkjbyhk'
# Each method is prefixed with HTTP method (get, post...)
# for clarity and readability
@@ -406,6 +407,10 @@ def post_fake_update_container():
return 200, {'Warnings': []}
+def post_fake_update_node():
+ return 200, None
+
+
# Maps real api url to fake response callback
prefix = 'http+docker://localunixsocket'
fake_responses = {
@@ -504,4 +509,8 @@ def post_fake_update_container():
CURRENT_VERSION, prefix, FAKE_VOLUME_NAME
), 'DELETE'):
fake_remove_volume,
+ ('{1}/{0}/nodes/{2}/update?version=1'.format(
+ CURRENT_VERSION, prefix, FAKE_NODE_ID
+ ), 'POST'):
+ post_fake_update_node,
}
diff --git a/tests/unit/swarm_test.py b/tests/unit/swarm_test.py
new file mode 100644
index 0000000000..5580383406
--- /dev/null
+++ b/tests/unit/swarm_test.py
@@ -0,0 +1,32 @@
+# -*- coding: utf-8 -*-
+
+import json
+
+from . import fake_api
+from ..base import requires_api_version
+from .api_test import (DockerClientTest, url_prefix, fake_request)
+
+
+class SwarmTest(DockerClientTest):
+ @requires_api_version('1.24')
+ def test_node_update(self):
+ node_spec = {
+ 'Availability': 'active',
+ 'Name': 'node-name',
+ 'Role': 'manager',
+ 'Labels': {'foo': 'bar'}
+ }
+
+ self.client.update_node(
+ node_id=fake_api.FAKE_NODE_ID, version=1, node_spec=node_spec
+ )
+ args = fake_request.call_args
+ self.assertEqual(
+ args[0][1], url_prefix + 'nodes/24ifsmvkjbyhk/update?version=1'
+ )
+ self.assertEqual(
+ json.loads(args[1]['data']), node_spec
+ )
+ self.assertEqual(
+ args[1]['headers']['Content-Type'], 'application/json'
+ )
| diff --git a/docs/api.md b/docs/api.md
index 1699344a66..5cadb83081 100644
--- a/docs/api.md
+++ b/docs/api.md
@@ -1129,6 +1129,11 @@ Update resource configs of one or more containers.
**Returns** (dict): Dictionary containing a `Warnings` key.
+## update_node
+
+Update a node.
+See the [Swarm documentation](swarm.md#clientupdate_node).
+
## update_service
Update a service, similar to the `docker service update` command. See the
diff --git a/docs/swarm.md b/docs/swarm.md
index 3cc44f8741..20c3945352 100644
--- a/docs/swarm.md
+++ b/docs/swarm.md
@@ -232,6 +232,30 @@ List Swarm nodes
**Returns:** A list of dictionaries containing data about each swarm node.
+### Client.update_node
+
+Update the Node's configuration
+
+**Params:**
+
+* version (int): The version number of the node object being updated. This
+ is required to avoid conflicting writes.
+* node_spec (dict): Configuration settings to update. Any values not provided
+ will be removed. See the official [Docker API documentation](https://docs.docker.com/engine/reference/api/docker_remote_api_v1.24/#/update-a-node) for more details.
+ Default: `None`.
+
+**Returns:** `True` if the request went through. Raises an `APIError` if it
+ fails.
+
+```python
+node_spec = {'Availability': 'active',
+ 'Name': 'node-name',
+ 'Role': 'manager',
+ 'Labels': {'foo': 'bar'}
+ }
+client.update_node(node_id='24ifsmvkjbyhk', version=8, node_spec=node_spec)
+```
+
### Client.update_swarm
Update the Swarm's configuration
| [
{
"components": [
{
"doc": "",
"lines": [
73,
77
],
"name": "SwarmApiMixin.update_node",
"signature": "def update_node(self, node_id, version, node_spec=None):",
"type": "function"
}
],
"file": "docker/api/swarm.py"
}
] | [
"tests/unit/swarm_test.py::SwarmTest::test_node_update"
] | [] | This is a feature request that requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
enable setting of node labels #1225
Added update_node function to enable setting labels on nodes. This
exposes the Update a Node function from the Docker API and should
enable promoting/demoting manager nodes inside a swarm.
Signed-off-by: Nathan Shirlberg nshirlberg@labattfood.com
----------
</request>
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in docker/api/swarm.py]
(definition of SwarmApiMixin.update_node:)
def update_node(self, node_id, version, node_spec=None):
[end of new definitions in docker/api/swarm.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
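For context, here is a minimal sketch of the request that `update_node` must construct, inferred from the unit test expectations above (URL path, JSON body, content type). The helper name `build_update_node_request` is hypothetical — the real method would go through docker-py's internal request plumbing rather than returning a tuple:

```python
import json

def build_update_node_request(node_id, version, node_spec=None):
    # The unit test expects a POST to nodes/<id>/update?version=<n>
    # with the node spec serialized as a JSON body.
    url = "nodes/{0}/update?version={1}".format(node_id, version)
    body = json.dumps(node_spec or {})
    headers = {"Content-Type": "application/json"}
    return url, body, headers
```

The real `SwarmApiMixin.update_node` would POST this through the client and return `True` once the response status has been checked.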
<<END>> | 26753c81defff28a1a38a34788e9653c8eb87c3d | |
conan-io__conan-508 | 508 | conan-io/conan | null | 2167f1f59f670b87acb69efd117f79ff506ed99f | 2016-09-27T16:24:01Z | diff --git a/conans/client/deps_builder.py b/conans/client/deps_builder.py
index 341d1c41afb..83c9c88214f 100644
--- a/conans/client/deps_builder.py
+++ b/conans/client/deps_builder.py
@@ -416,7 +416,7 @@ def _config_node(self, conanfile, conanref, down_reqs, down_ref, down_options):
def _create_new_node(self, current_node, dep_graph, requirement, public_deps, name_req):
""" creates and adds a new node to the dependency graph
"""
- conanfile_path = self._retriever.get_conanfile(requirement.conan_reference)
+ conanfile_path = self._retriever.get_recipe(requirement.conan_reference)
output = ScopedOutput(str(requirement.conan_reference), self._output)
dep_conanfile = self._loader.load_conan(conanfile_path, output)
if dep_conanfile:
diff --git a/conans/client/proxy.py b/conans/client/proxy.py
index 57ee121b91e..cdecc8855c0 100644
--- a/conans/client/proxy.py
+++ b/conans/client/proxy.py
@@ -70,17 +70,17 @@ def handle_package_manifest(self, package_reference, installed):
remote = self._registry.get_ref(package_reference.conan)
self._manifest_manager.check_package(package_reference, remote)
- def get_conanfile(self, conan_reference):
+ def get_recipe(self, conan_reference):
output = ScopedOutput(str(conan_reference), self._out)
def _refresh():
- conan_dir_path = self._client_cache.export(conan_reference)
- rmdir(conan_dir_path)
+ export_path = self._client_cache.export(conan_reference)
+ rmdir(export_path)
# It might need to remove shortpath
rmdir(self._client_cache.source(conan_reference), True)
current_remote, _ = self._get_remote(conan_reference)
output.info("Retrieving from remote '%s'..." % current_remote.name)
- self._remote_manager.get_conanfile(conan_reference, current_remote)
+ self._remote_manager.get_recipe(conan_reference, export_path, current_remote)
if self._update:
output.info("Updated!")
else:
@@ -88,7 +88,6 @@ def _refresh():
# check if it is in disk
conanfile_path = self._client_cache.conanfile(conan_reference)
-
path_exist = path_exists(conanfile_path, self._client_cache.store)
if path_exist:
@@ -122,7 +121,7 @@ def _refresh():
"to replace it." % (remote.name, conan_reference))
else:
- self._retrieve_conanfile(conan_reference, output)
+ self._retrieve_recipe(conan_reference, output)
if self._manifest_manager:
remote = self._registry.get_ref(conan_reference)
@@ -146,13 +145,14 @@ def update_available(self, conan_reference):
return 0
- def _retrieve_conanfile(self, conan_reference, output):
+ def _retrieve_recipe(self, conan_reference, output):
""" returns the requested conanfile object, retrieving it from
remotes if necessary. Can raise NotFoundException
"""
def _retrieve_from_remote(remote):
output.info("Trying with '%s'..." % remote.name)
- result = self._remote_manager.get_conanfile(conan_reference, remote)
+ export_path = self._client_cache.export(conan_reference)
+ result = self._remote_manager.get_recipe(conan_reference, export_path, remote)
self._registry.set_ref(conan_reference, remote)
return result
@@ -261,7 +261,8 @@ def remove_packages(self, conan_ref, remove_ids):
def download_packages(self, reference, package_ids):
assert(isinstance(package_ids, list))
remote, _ = self._get_remote(reference)
- self._remote_manager.get_conanfile(reference, remote)
+ export_path = self._client_cache.export(reference)
+ self._remote_manager.get_recipe(reference, export_path, remote)
self._registry.set_ref(reference, remote)
output = ScopedOutput(str(reference), self._out)
for package_id in package_ids:
@@ -280,7 +281,8 @@ def _retrieve_remote_package(self, package_reference, output, remote=None):
try:
output.info("Looking for package %s in remote '%s' " % (package_id, remote.name))
# Will raise if not found NotFoundException
- self._remote_manager.get_package(package_reference, remote)
+ package_path = self._client_cache.package(package_reference)
+ self._remote_manager.get_package(package_reference, package_path, remote)
output.success('Package installed %s' % package_id)
return True
except ConanConnectionError:
diff --git a/conans/client/remote_manager.py b/conans/client/remote_manager.py
index 819cfa96c3d..1d716f8cb13 100644
--- a/conans/client/remote_manager.py
+++ b/conans/client/remote_manager.py
@@ -1,16 +1,17 @@
-from conans.errors import ConanException, ConanConnectionError
+import os
+import shutil
+import tarfile
+import time
+import traceback
+
from requests.exceptions import ConnectionError
-from conans.util.files import save, tar_extract, rmdir
+
+from conans.errors import ConanException, ConanConnectionError
+from conans.util.files import tar_extract, rmdir, relative_dirs, mkdir
from conans.util.log import logger
-import traceback
-import os
from conans.paths import PACKAGE_TGZ_NAME, CONANINFO, CONAN_MANIFEST, CONANFILE, EXPORT_TGZ_NAME
-from io import BytesIO
-import tarfile
from conans.util.files import gzopen_without_timestamps
from conans.util.files import touch
-import shutil
-import time
class RemoteManager(object):
@@ -85,35 +86,35 @@ def get_package_digest(self, package_reference, remote):
returns (ConanDigest, remote_name)"""
return self._call_remote(remote, "get_package_digest", package_reference)
- def get_conanfile(self, conan_reference, remote):
+ def get_recipe(self, conan_reference, dest_folder, remote):
"""
Read the conans from remotes
Will iterate the remotes to find the conans unless remote was specified
- returns (dict relative_filepath:content , remote_name)"""
- export_files = self._call_remote(remote, "get_conanfile", conan_reference)
- export_folder = self._client_cache.export(conan_reference)
- uncompress_files(export_files, export_folder, EXPORT_TGZ_NAME)
+ returns (dict relative_filepath:abs_path , remote_name)"""
+ zipped_files = self._call_remote(remote, "get_recipe", conan_reference, dest_folder)
+ files = unzip_and_get_files(zipped_files, dest_folder, EXPORT_TGZ_NAME)
# Make sure that the source dir is deleted
rmdir(self._client_cache.source(conan_reference), True)
# TODO: Download only the CONANFILE file and only download the rest of files
# in install if needed (not found remote package)
+ return files
- def get_package(self, package_reference, remote):
+ def get_package(self, package_reference, dest_folder, remote):
"""
Read the conans package from remotes
Will iterate the remotes to find the conans unless remote was specified
- returns (dict relative_filepath:content , remote_name)"""
- package_files = self._call_remote(remote, "get_package", package_reference)
- destination_dir = self._client_cache.package(package_reference)
- uncompress_files(package_files, destination_dir, PACKAGE_TGZ_NAME)
-
+ returns (dict relative_filepath:abs_path , remote_name)"""
+ zipped_files = self._call_remote(remote, "get_package", package_reference, dest_folder)
+ files = unzip_and_get_files(zipped_files, dest_folder, PACKAGE_TGZ_NAME)
# Issue #214 https://github.com/conan-io/conan/issues/214
- for dirname, _, files in os.walk(destination_dir):
+ for dirname, _, files in os.walk(dest_folder):
for fname in files:
touch(os.path.join(dirname, fname))
+ return files
+
def search(self, remote, pattern=None, ignorecase=True):
"""
Search exported conans information from remotes
@@ -211,20 +212,27 @@ def addfile(name, abs_path, tar):
return ret
-def uncompress_files(files, folder, name):
+def unzip_and_get_files(files, destination_dir, tgz_name):
+ '''Moves all files from package_files, {relative_name: tmp_abs_path}
+ to destination_dir, unzipping the "tgz_name" if found'''
+
+ tgz_file = files.pop(tgz_name, None)
+ if tgz_file:
+ uncompress_file(tgz_file, destination_dir)
+
+ return relative_dirs(destination_dir)
+
+
+def uncompress_file(src_path, dest_folder):
try:
- for file_name, content in files:
- if os.path.basename(file_name) == name:
- # Unzip the file and not keep the tgz
- tar_extract(BytesIO(content), folder)
- else:
- save(os.path.join(folder, file_name), content)
+ with open(src_path, 'rb') as file_handler:
+ tar_extract(file_handler, dest_folder)
except Exception as e:
- error_msg = "Error while downloading/extracting files to %s\n%s\n" % (folder, str(e))
+ error_msg = "Error while downloading/extracting files to %s\n%s\n" % (dest_folder, str(e))
# try to remove the files
try:
- if os.path.exists(folder):
- shutil.rmtree(folder)
+ if os.path.exists(dest_folder):
+ shutil.rmtree(dest_folder)
error_msg += "Folder removed"
except Exception as e:
error_msg += "Folder not removed, files/package might be damaged, remove manually"
diff --git a/conans/client/rest/auth_manager.py b/conans/client/rest/auth_manager.py
index 966a2773d08..24cb1f43af5 100644
--- a/conans/client/rest/auth_manager.py
+++ b/conans/client/rest/auth_manager.py
@@ -142,12 +142,12 @@ def get_package_digest(self, package_reference):
return self._rest_client.get_package_digest(package_reference)
@input_credentials_if_unauthorized
- def get_conanfile(self, conan_reference):
- return self._rest_client.get_conanfile(conan_reference)
+ def get_recipe(self, conan_reference, dest_folder):
+ return self._rest_client.get_recipe(conan_reference, dest_folder)
@input_credentials_if_unauthorized
- def get_package(self, package_reference):
- return self._rest_client.get_package(package_reference)
+ def get_package(self, package_reference, dest_folder):
+ return self._rest_client.get_package(package_reference, dest_folder)
@input_credentials_if_unauthorized
def search(self, pattern, ignorecase):
diff --git a/conans/client/rest/rest_client.py b/conans/client/rest/rest_client.py
index 68e28d8e751..fcf6affe212 100644
--- a/conans/client/rest/rest_client.py
+++ b/conans/client/rest/rest_client.py
@@ -12,6 +12,7 @@
from conans.client.rest.uploader_downloader import Uploader, Downloader
from conans.model.ref import ConanFileReference
from six.moves.urllib.parse import urlsplit, parse_qs
+import tempfile
def handle_return_deserializer(deserializer=None):
@@ -107,7 +108,7 @@ def get_package_digest(self, package_reference):
contents = {key: decode_text(value) for key, value in dict(contents).items()}
return FileTreeManifest.loads(contents[CONAN_MANIFEST])
- def get_conanfile(self, conan_reference):
+ def get_recipe(self, conan_reference, dest_folder):
"""Gets a dict of filename:contents from conans"""
# Get the conanfile snapshot first
url = "%s/conans/%s/download_urls" % (self._remote_api_url, "/".join(conan_reference))
@@ -117,12 +118,10 @@ def get_conanfile(self, conan_reference):
raise NotFoundException("Conan '%s' doesn't have a %s!" % (conan_reference, CONANFILE))
# TODO: Get fist an snapshot and compare files and download only required?
+ file_paths = self.download_files_to_folder(urls, dest_folder, self._output)
+ return file_paths
- # Download the resources
- contents = self.download_files(urls, self._output)
- return contents
-
- def get_package(self, package_reference):
+ def get_package(self, package_reference, dest_folder):
"""Gets a dict of filename:contents from package"""
url = "%s/conans/%s/packages/%s/download_urls" % (self._remote_api_url,
"/".join(package_reference.conan),
@@ -133,8 +132,8 @@ def get_package(self, package_reference):
# TODO: Get fist an snapshot and compare files and download only required?
# Download the resources
- contents = self.download_files(urls, self._output)
- return contents
+ file_paths = self.download_files_to_folder(urls, dest_folder, self._output)
+ return file_paths
def upload_conan(self, conan_reference, the_files):
"""
@@ -361,6 +360,25 @@ def download_files(self, file_urls, output=None):
output.writeln("")
yield os.path.normpath(filename), contents
+ def download_files_to_folder(self, file_urls, to_folder, output=None):
+ """
+ :param: file_urls is a dict with {filename: abs_path}
+
+ It writes downloaded files to disk (appending to file, only keeps chunks in memory)
+ """
+ downloader = Downloader(self.requester, output, self.VERIFY_SSL)
+ ret = {}
+ for filename, resource_url in file_urls.items():
+ if output:
+ output.writeln("Downloading %s" % filename)
+ auth, _ = self._file_server_capabilities(resource_url)
+ abs_path = os.path.join(to_folder, filename)
+ downloader.download(resource_url, abs_path, auth=auth)
+ if output:
+ output.writeln("")
+ ret[filename] = abs_path
+ return ret
+
def upload_files(self, file_urls, files, output):
t1 = time.time()
failed = {}
diff --git a/conans/client/rest/uploader_downloader.py b/conans/client/rest/uploader_downloader.py
index eec4d8e846d..475dfb45024 100644
--- a/conans/client/rest/uploader_downloader.py
+++ b/conans/client/rest/uploader_downloader.py
@@ -125,8 +125,10 @@ def download(self, url, file_path=None, auth=None):
if self.output:
print_progress(self.output, units)
last_progress = units
-
- return bytes(ret)
+ if not file_path:
+ return bytes(ret)
+ else:
+ return
except Exception as e:
logger.debug(e.__class__)
logger.debug(traceback.format_exc())
diff --git a/conans/server/store/file_manager.py b/conans/server/store/file_manager.py
index a7f95ab6513..6b6d4295bea 100644
--- a/conans/server/store/file_manager.py
+++ b/conans/server/store/file_manager.py
@@ -17,7 +17,7 @@ def __init__(self, paths, storage_adapter):
self._storage_adapter = storage_adapter
# ############ SNAPSHOTS
- def get_conanfile(self, conan_reference):
+ def get_recipe(self, conan_reference):
conanfile_path = self.paths.conanfile(conan_reference)
return self._storage_adapter.get_file(conanfile_path)
| diff --git a/conans/test/download_test.py b/conans/test/download_test.py
index e1ed5a9861b..e41c6e29a2c 100644
--- a/conans/test/download_test.py
+++ b/conans/test/download_test.py
@@ -69,7 +69,7 @@ def complete_test(self):
client2.remote_manager,
"default")
- installer.get_conanfile(conan_ref)
+ installer.get_recipe(conan_ref)
installer.get_package(package_ref, force_build=False)
reg_path = client2.paths.export(ConanFileReference.loads("Hello/1.2.1/frodo/stable"))
diff --git a/conans/test/model/order_libs_test.py b/conans/test/model/order_libs_test.py
index 5e70ef0ccc3..99dfa93aa02 100644
--- a/conans/test/model/order_libs_test.py
+++ b/conans/test/model/order_libs_test.py
@@ -48,7 +48,7 @@ def conan(self, name, requires=None):
content = base_content % (name, self._reqs(requires), name, self._libs(name))
save(conan_path, content)
- def get_conanfile(self, conan_ref):
+ def get_recipe(self, conan_ref):
conan_path = os.path.join(self.folder, "/".join(conan_ref), CONANFILE)
return conan_path
diff --git a/conans/test/model/transitive_reqs_test.py b/conans/test/model/transitive_reqs_test.py
index 441b7f8f5d7..a8ec49e2e04 100644
--- a/conans/test/model/transitive_reqs_test.py
+++ b/conans/test/model/transitive_reqs_test.py
@@ -36,7 +36,7 @@ def conan(self, conan_ref, content):
conan_path = os.path.join(self.folder, "/".join(conan_ref), CONANFILE)
save(conan_path, content)
- def get_conanfile(self, conan_ref):
+ def get_recipe(self, conan_ref):
conan_path = os.path.join(self.folder, "/".join(conan_ref), CONANFILE)
return conan_path
diff --git a/conans/test/remote_manager_test.py b/conans/test/remote_manager_test.py
index 8e02eaf41bd..3362245b264 100644
--- a/conans/test/remote_manager_test.py
+++ b/conans/test/remote_manager_test.py
@@ -1,17 +1,20 @@
+import os
+import tempfile
import unittest
-from conans.client.remote_manager import RemoteManager
+
from mock import Mock
+
+from conans.client.client_cache import ClientCache
+from conans.client.remote_manager import RemoteManager
+from conans.client.remote_registry import Remote
from conans.errors import NotFoundException
from conans.model.ref import ConanFileReference, PackageReference
+from conans.model.manifest import FileTreeManifest
+from conans.paths import CONANFILE, CONAN_MANIFEST, CONANINFO
from conans.test.tools import TestBufferConanOutput, TestClient
from conans.test.utils.test_files import temp_folder
from conans.test.utils.cpp_test_files import cpp_hello_conan_files
-from conans.client.remote_registry import Remote
-from conans.client.client_cache import ClientCache
from conans.util.files import save
-from conans.paths import CONANFILE, CONAN_MANIFEST, CONANINFO
-import os
-from conans.model.manifest import FileTreeManifest
class MockRemoteClient(object):
@@ -19,8 +22,13 @@ class MockRemoteClient(object):
def __init__(self):
self.upload_package = Mock()
self.get_conan_digest = Mock()
- self.get_conanfile = Mock(return_value=[("one.txt", "ONE")])
- self.get_package = Mock(return_value=[("one.txt", "ONE")])
+ tmp_folder = tempfile.mkdtemp(suffix='conan_download')
+ save(os.path.join(tmp_folder, "one.txt"), "ONE")
+ self.get_recipe = Mock(return_value={"one.txt": os.path.join(tmp_folder, "one.txt")})
+
+ tmp_folder = tempfile.mkdtemp(suffix='conan_download')
+ save(os.path.join(tmp_folder, "one.txt"), "ONE")
+ self.get_package = Mock(return_value={"one.txt": os.path.join(tmp_folder, "one.txt")})
self.remote_url = None
self.raise_count = 0
@@ -78,10 +86,10 @@ def method_called_test(self):
self.manager.get_conan_digest(self.conan_reference, Remote("other", "url"))
self.assertTrue(self.remote_client.get_conan_digest.called)
- self.assertFalse(self.remote_client.get_conanfile.called)
- self.manager.get_conanfile(self.conan_reference, Remote("other", "url"))
- self.assertTrue(self.remote_client.get_conanfile.called)
+ self.assertFalse(self.remote_client.get_recipe.called)
+ self.manager.get_recipe(self.conan_reference, temp_folder(), Remote("other", "url"))
+ self.assertTrue(self.remote_client.get_recipe.called)
self.assertFalse(self.remote_client.get_package.called)
- self.manager.get_package(self.package_reference, Remote("other", "url"))
+ self.manager.get_package(self.package_reference, temp_folder(), Remote("other", "url"))
self.assertTrue(self.remote_client.get_package.called)
| [
{
"components": [
{
"doc": "",
"lines": [
73,
130
],
"name": "ConanProxy.get_recipe",
"signature": "def get_recipe(self, conan_reference):",
"type": "function"
},
{
"doc": "",
"lines": [
76,
... | [
"conans/test/model/order_libs_test.py::ConanRequirementsTest::test_diamond_no_conflict",
"conans/test/model/transitive_reqs_test.py::ConanRequirementsTest::test_basic_transitive_option",
"conans/test/model/transitive_reqs_test.py::ConanRequirementsTest::test_conditional",
"conans/test/model/transitive_reqs_te... | [
"conans/test/model/transitive_reqs_test.py::ConanRequirementsTest::test_basic",
"conans/test/model/transitive_reqs_test.py::ConanRequirementsTest::test_basic_option",
"conans/test/model/transitive_reqs_test.py::CoreSettingsTest::test_basic",
"conans/test/model/transitive_reqs_test.py::CoreSettingsTest::test_c... | This is a feature request that requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
Feature/download to tmp
Download recipes and packages to a tmp file (appending from network buffer).
Fixes #501
----------
</request>
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in conans/client/proxy.py]
(definition of ConanProxy.get_recipe:)
def get_recipe(self, conan_reference):
(definition of ConanProxy.get_recipe._refresh:)
def _refresh():
(definition of ConanProxy._retrieve_recipe:)
def _retrieve_recipe(self, conan_reference, output):
"""returns the requested conanfile object, retrieving it from
remotes if necessary. Can raise NotFoundException"""
(definition of ConanProxy._retrieve_recipe._retrieve_from_remote:)
def _retrieve_from_remote(remote):
[end of new definitions in conans/client/proxy.py]
[start of new definitions in conans/client/remote_manager.py]
(definition of RemoteManager.get_recipe:)
def get_recipe(self, conan_reference, dest_folder, remote):
"""Read the conans from remotes
Will iterate the remotes to find the conans unless remote was specified
returns (dict relative_filepath:abs_path , remote_name)"""
(definition of unzip_and_get_files:)
def unzip_and_get_files(files, destination_dir, tgz_name):
"""Moves all files from package_files, {relative_name: tmp_abs_path}
to destination_dir, unzipping the "tgz_name" if found"""
(definition of uncompress_file:)
def uncompress_file(src_path, dest_folder):
[end of new definitions in conans/client/remote_manager.py]
[start of new definitions in conans/client/rest/auth_manager.py]
(definition of ConanApiAuthManager.get_recipe:)
def get_recipe(self, conan_reference, dest_folder):
[end of new definitions in conans/client/rest/auth_manager.py]
[start of new definitions in conans/client/rest/rest_client.py]
(definition of RestApiClient.get_recipe:)
def get_recipe(self, conan_reference, dest_folder):
"""Gets a dict of filename:contents from conans"""
(definition of RestApiClient.download_files_to_folder:)
def download_files_to_folder(self, file_urls, to_folder, output=None):
""":param: file_urls is a dict with {filename: abs_path}
It writes downloaded files to disk (appending to file, only keeps chunks in memory)"""
[end of new definitions in conans/client/rest/rest_client.py]
[start of new definitions in conans/server/store/file_manager.py]
(definition of FileManager.get_recipe:)
def get_recipe(self, conan_reference):
[end of new definitions in conans/server/store/file_manager.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
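The heart of the change is writing each downloaded file to its destination path as chunks arrive, instead of holding whole payloads in memory. A minimal sketch under that assumption (the function name and signature are illustrative, not conan's actual API):

```python
import os

def save_stream(chunks, file_path):
    # Append each network chunk to disk so memory use stays bounded
    # by the chunk size rather than by the total payload size.
    parent = os.path.dirname(file_path)
    if parent and not os.path.isdir(parent):
        os.makedirs(parent)
    with open(file_path, "wb") as handle:
        for chunk in chunks:
            handle.write(chunk)
    return file_path
```

`download_files_to_folder` applies this idea per file, returning a `{filename: abs_path}` map instead of `{filename: contents}`.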
<<END>> | Here is the discussion in the issues of the pull request.
<issues>
Memory error when downloading very big packages
`conan install node/6.1.0@silkedit/stable -s compiler="Visual Studio" -s compiler.version=14`
The Downloader.download method keeps too much data in memory.
```
DEBUG :uploader_downloader.py[74]: <type 'exceptions.MemoryError'> [2016-09-23 15:15:02,983]
DEBUG :uploader_downloader.py[75]: Traceback (most recent call last):
File "c:\python27\lib\site-packages\conans\client\rest\uploader_downloader.py", line 62, in download
ret.extend(data)
MemoryError
```
----------
--------------------
</issues> | 4a5b19a75db9225316c8cb022a2dfb9705a2af34 | |
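The `ret.extend(data)` line in the traceback above is the accumulation that blows up. A simplified sketch of the patched `Downloader.download` behavior — keep the in-memory path only when no `file_path` is given, otherwise stream straight to disk (the real method reads chunks from an HTTP response and reports progress, both omitted here):

```python
def download(iter_chunks, file_path=None):
    # Mirrors the patched Downloader.download: accumulate in memory only
    # when the caller wants the bytes back; otherwise append to the file.
    if file_path is None:
        ret = bytearray()
        for data in iter_chunks:
            ret.extend(data)  # old behavior; acceptable for small payloads
        return bytes(ret)
    with open(file_path, "wb") as handle:
        for data in iter_chunks:
            handle.write(data)
    return None
```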
joke2k__faker-389 | 389 | joke2k/faker | null | fb2878c5452965d5d89e3e7bf64a57eee0f5853e | 2016-09-20T14:19:57Z | diff --git a/faker/providers/date_time/__init__.py b/faker/providers/date_time/__init__.py
index d5fc3d6dec..3ffa68945e 100644
--- a/faker/providers/date_time/__init__.py
+++ b/faker/providers/date_time/__init__.py
@@ -287,6 +287,14 @@ def date(cls, pattern='%Y-%m-%d'):
"""
return cls.date_time().strftime(pattern)
+ @classmethod
+ def date_object(cls):
+ """
+ Get a date object between January 1, 1970 and now
+ :example datetime.date(2016, 9, 20)
+ """
+ return cls.date_time().date()
+
@classmethod
def time(cls, pattern='%H:%M:%S'):
"""
@@ -296,6 +304,14 @@ def time(cls, pattern='%H:%M:%S'):
"""
return cls.date_time().time().strftime(pattern)
+ @classmethod
+ def time_object(cls):
+ """
+ Get a time object
+ :example datetime.time(15, 56, 56, 772876)
+ """
+ return cls.date_time().time()
+
@classmethod
def _parse_date_time(cls, text, tzinfo=None):
if isinstance(text, (datetime, date, real_datetime, real_date)):
| diff --git a/faker/tests/__init__.py b/faker/tests/__init__.py
index 2b17711493..cd5240bd8d 100644
--- a/faker/tests/__init__.py
+++ b/faker/tests/__init__.py
@@ -333,6 +333,18 @@ def test_datetimes_with_and_without_tzinfo(self):
self.assertFalse(provider.iso8601().endswith('+00:00'))
self.assertTrue(provider.iso8601(utc).endswith('+00:00'))
+ def test_date_object(self):
+ from faker.providers.date_time import Provider
+ provider = Provider
+
+ self.assertIsInstance(provider.date_object(), datetime.date)
+
+ def test_time_object(self):
+ from faker.providers.date_time import Provider
+ provider = Provider
+
+ self.assertIsInstance(provider.time_object(), datetime.time)
+
def test_date_time_between_dates(self):
from faker.providers.date_time import Provider
provider = Provider
| [
{
"components": [
{
"doc": "Get a date object between January 1, 1970 and now\n:example datetime.date(2016, 9, 20)",
"lines": [
291,
296
],
"name": "Provider.date_object",
"signature": "def date_object(cls):",
"type": "function"
},
... | [
"faker/tests/__init__.py::FactoryTestCase::test_date_object",
"faker/tests/__init__.py::FactoryTestCase::test_time_object"
] | [
"faker/tests/__init__.py::UtilsTestCase::test_add_dicts",
"faker/tests/__init__.py::UtilsTestCase::test_choice_distribution",
"faker/tests/__init__.py::UtilsTestCase::test_find_available_locales",
"faker/tests/__init__.py::UtilsTestCase::test_find_available_providers",
"faker/tests/__init__.py::FactoryTestC... | This is a feature request that requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
Add date and time object providers
----------
</request>
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in faker/providers/date_time/__init__.py]
(definition of Provider.date_object:)
def date_object(cls):
"""Get a date object between January 1, 1970 and now
:example datetime.date(2016, 9, 20)"""
(definition of Provider.time_object:)
def time_object(cls):
"""Get a time object
:example datetime.time(15, 56, 56, 772876)"""
[end of new definitions in faker/providers/date_time/__init__.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
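The patch implements both providers as thin wrappers over the existing `date_time()` provider. A standalone sketch of the same idea, with the random-datetime source simplified (the real provider draws its timestamp the same way but lives as classmethods on `Provider`):

```python
import datetime
import random
import time

def _random_datetime():
    # A datetime between the Unix epoch and now, mirroring Provider.date_time()
    ts = random.randint(0, int(time.time()))
    return datetime.datetime.fromtimestamp(ts)

def date_object():
    # datetime.date counterpart of the string-returning date() provider
    return _random_datetime().date()

def time_object():
    # datetime.time counterpart of the string-returning time() provider
    return _random_datetime().time()
```

This is exactly what the new tests assert: `date_object()` yields a `datetime.date` and `time_object()` a `datetime.time`.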
<<END>> | 6edfdbf6ae90b0153309e3bf066aa3b2d16494a7 | ||
falconry__falcon-901 | 901 | falconry/falcon | null | b36ffe6179e6fe3c8a7f4eae3c6070d282de7129 | 2016-09-17T22:50:41Z | diff --git a/.travis.yml b/.travis.yml
index 081ee6f8c..1ac94eb07 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -9,10 +9,10 @@ matrix:
include:
- python: 2.7 # these are just to make travis's UI a bit prettier
env: JYTHON=true
- - python: pypy
+ - python: pypy-5.3
env: TOXENV=pypy
- - python: pypy3
- env: TOXENV=pypy3
+ # - python: pypy3
+ # env: TOXENV=pypy3
- python: 2.7
env: TOXENV=pep8
- python: 2.6
diff --git a/falcon/request_helpers.py b/falcon/request_helpers.py
index c027e7267..bacd87bff 100644
--- a/falcon/request_helpers.py
+++ b/falcon/request_helpers.py
@@ -14,6 +14,8 @@
"""Utilities for the Request class."""
+import io
+
def header_property(wsgi_name):
"""Creates a read-only header property.
@@ -36,7 +38,7 @@ def fget(self):
return property(fget)
-class BoundedStream(object):
+class BoundedStream(io.IOBase):
"""Wrap *wsgi.input* streams to make them more robust.
``socket._fileobject`` and ``io.BufferedReader`` are sometimes used
@@ -96,6 +98,18 @@ def _read(self, size, target):
self._bytes_remaining -= size
return target(size)
+ def readable(self):
+ """Always returns ``True``."""
+ return True
+
+ def seekable(self):
+ """Always returns ``False``."""
+ return False
+
+ def writeable(self):
+ """Always returns ``False``."""
+ return False
+
def read(self, size=None):
"""Read from the stream.
@@ -138,6 +152,11 @@ def readlines(self, hint=None):
return self._read(hint, self.stream.readlines)
+ def write(self, data):
+ """Always raises IOError; writing is not supported."""
+
+ raise IOError('Stream is not writeable')
+
# NOTE(kgriffs): Alias for backwards-compat
Body = BoundedStream
diff --git a/tox.ini b/tox.ini
index 68c0f05c9..31772f686 100644
--- a/tox.ini
+++ b/tox.ini
@@ -58,6 +58,7 @@ deps = -r{toxinidir}/tools/test-requires
[testenv:py27_debug]
deps = {[with-debug-tools]deps}
+ funcsigs
[testenv:py34_debug]
deps = {[with-debug-tools]deps}
| diff --git a/docs/api/testing.rst b/docs/api/testing.rst
index c11007d5d..940edb50a 100644
--- a/docs/api/testing.rst
+++ b/docs/api/testing.rst
@@ -4,7 +4,7 @@ Testing
=======
.. automodule:: falcon.testing
- :members: Result,
+ :members: Result, Cookie,
simulate_request, simulate_get, simulate_head, simulate_post,
simulate_put, simulate_options, simulate_patch, simulate_delete,
TestClient, TestCase, SimpleTestResource, StartResponseMock,
diff --git a/falcon/testing/client.py b/falcon/testing/client.py
index 65bf245b6..d197db0bd 100644
--- a/falcon/testing/client.py
+++ b/falcon/testing/client.py
@@ -33,11 +33,15 @@
"""
import json
+import re
+import sys
import wsgiref.validate
+from six.moves import http_cookies
+
from falcon.testing import helpers
from falcon.testing.srmock import StartResponseMock
-from falcon.util import CaseInsensitiveDict, to_query_str
+from falcon.util import CaseInsensitiveDict, http_date_to_dt, to_query_str
class Result(object):
@@ -55,7 +59,19 @@ class Result(object):
status (str): HTTP status string given in the response
status_code (int): The code portion of the HTTP status string
headers (CaseInsensitiveDict): A case-insensitive dictionary
- containing all the headers in the response
+ containing all the headers in the response, except for
+ cookies, which may be accessed via the `cookies`
+ attribute.
+
+ Note:
+
+ Multiple instances of a header in the response are
+ currently not supported; it is unspecified which value
+ will "win" and be represented in `headers`.
+
+ cookies (dict): A dictionary of
+ :py:class:`falcon.testing.Cookie` values parsed from the
+ response, by name.
encoding (str): Text encoding of the response body, or ``None``
if the encoding can not be determined.
content (bytes): Raw response body, or ``bytes`` if the
@@ -79,6 +95,41 @@ def __init__(self, iterable, status, headers):
self._status_code = int(status[:3])
self._headers = CaseInsensitiveDict(headers)
+ cookies = http_cookies.SimpleCookie()
+ for name, value in headers:
+ if name.lower() == 'set-cookie':
+ cookies.load(value)
+
+ if sys.version_info < (2, 7):
+ match = re.match('([^=]+)=', value)
+ assert match
+
+ cookie_name = match.group(1)
+
+ # NOTE(kgriffs): py26 has a bug that causes
+ # SimpleCookie to incorrectly parse the "expires"
+ # attribute, so we have to do it ourselves. This
+ # algorithm is obviously very naive, but it should
+ # work well enough until we stop supporting
+ # 2.6, at which time we can remove this code.
+ match = re.search('expires=([^;]+)', value)
+ if match:
+ cookies[cookie_name]['expires'] = match.group(1)
+
+ # NOTE(kgriffs): py26's SimpleCookie won't parse
+ # the "httponly" and "secure" attributes, so we
+ # have to do it ourselves.
+ if 'httponly' in value:
+ cookies[cookie_name]['httponly'] = True
+
+ if 'secure' in value:
+ cookies[cookie_name]['secure'] = True
+
+ self._cookies = dict(
+ (morsel.key, Cookie(morsel))
+ for morsel in cookies.values()
+ )
+
self._encoding = helpers.get_encoding_from_headers(self._headers)
@property
@@ -93,6 +144,10 @@ def status_code(self):
def headers(self):
return self._headers
+ @property
+ def cookies(self):
+ return self._cookies
+
@property
def encoding(self):
return self._encoding
@@ -121,6 +176,81 @@ def json(self):
return json.loads(self.text)
+class Cookie(object):
+ """Represents a cookie returned by a simulated request.
+
+ Args:
+ morsel: A ``Morsel`` object from which to derive the cookie
+ data.
+
+ Attributes:
+ name (str): The cookie's name.
+ value (str): The value of the cookie.
+ expires(datetime.datetime): Expiration timestamp for the cookie,
+ or ``None`` if not specified.
+ path (str): The path prefix to which this cookie is restricted,
+ or ``None`` if not specified.
+ domain (str): The domain to which this cookie is restricted,
+ or ``None`` if not specified.
+ max_age (int): The lifetime of the cookie in seconds, or
+ ``None`` if not specified.
+ secure (bool): Whether or not the cookie may only only be
+ transmitted from the client via HTTPS.
+ http_only (bool): Whether or not the cookie may only be
+ included in unscripted requests from the client.
+ """
+
+ def __init__(self, morsel):
+ self._name = morsel.key
+ self._value = morsel.value
+
+ for name in (
+ 'expires',
+ 'path',
+ 'domain',
+ 'max_age',
+ 'secure',
+ 'httponly',
+ ):
+ value = morsel[name.replace('_', '-')] or None
+ setattr(self, '_' + name, value)
+
+ @property
+ def name(self):
+ return self._name
+
+ @property
+ def value(self):
+ return self._value
+
+ @property
+ def expires(self):
+ if self._expires:
+ return http_date_to_dt(self._expires, obs_date=True)
+
+ return None
+
+ @property
+ def path(self):
+ return self._path
+
+ @property
+ def domain(self):
+ return self._domain
+
+ @property
+ def max_age(self):
+ return int(self._max_age) if self._max_age else None
+
+ @property
+ def secure(self):
+ return bool(self._secure)
+
+ @property
+ def http_only(self):
+ return bool(self._httponly)
+
+
def simulate_request(app, method='GET', path='/', query_string=None,
headers=None, body=None, file_wrapper=None,
params=None, params_csv=True):
diff --git a/tests/test_access_route.py b/tests/test_access_route.py
index c66eb186f..006fb2b61 100644
--- a/tests/test_access_route.py
+++ b/tests/test_access_route.py
@@ -2,81 +2,98 @@
import falcon.testing as testing
-class TestAccessRoute(testing.TestBase):
-
- def test_remote_addr_only(self):
- req = Request(testing.create_environ(
- host='example.com',
- path='/access_route',
- headers={
- 'Forwarded': ('for=192.0.2.43, for="[2001:db8:cafe::17]:555",'
- 'for="unknown", by=_hidden,for="\\"\\\\",'
- 'for="198\\.51\\.100\\.17\\:1236";'
- 'proto=https;host=example.com')
- }))
- self.assertEqual(req.remote_addr, '127.0.0.1')
-
- def test_rfc_forwarded(self):
- req = Request(testing.create_environ(
- host='example.com',
- path='/access_route',
- headers={
- 'Forwarded': ('for=192.0.2.43,for=,'
- 'for="[2001:db8:cafe::17]:555",'
- 'for=x,'
- 'for="unknown", by=_hidden,for="\\"\\\\",'
- 'for="_don\\\"t_\\try_this\\\\at_home_\\42",'
- 'for="198\\.51\\.100\\.17\\:1236";'
- 'proto=https;host=example.com')
- }))
- compares = ['192.0.2.43', '2001:db8:cafe::17', 'x',
- 'unknown', '"\\', '_don"t_try_this\\at_home_42',
- '198.51.100.17']
- self.assertEqual(req.access_route, compares)
- # test cached
- self.assertEqual(req.access_route, compares)
-
- def test_malformed_rfc_forwarded(self):
- req = Request(testing.create_environ(
- host='example.com',
- path='/access_route',
- headers={
- 'Forwarded': 'for'
- }))
- self.assertEqual(req.access_route, [])
- # test cached
- self.assertEqual(req.access_route, [])
-
- def test_x_forwarded_for(self):
- req = Request(testing.create_environ(
- host='example.com',
- path='/access_route',
- headers={
- 'X-Forwarded-For': ('192.0.2.43, 2001:db8:cafe::17,'
- 'unknown, _hidden, 203.0.113.60')
- }))
- self.assertEqual(req.access_route,
- ['192.0.2.43', '2001:db8:cafe::17',
- 'unknown', '_hidden', '203.0.113.60'])
-
- def test_x_real_ip(self):
- req = Request(testing.create_environ(
- host='example.com',
- path='/access_route',
- headers={
- 'X-Real-IP': '2001:db8:cafe::17'
- }))
- self.assertEqual(req.access_route, ['2001:db8:cafe::17'])
-
- def test_remote_addr(self):
- req = Request(testing.create_environ(
- host='example.com',
- path='/access_route'))
- self.assertEqual(req.access_route, ['127.0.0.1'])
-
- def test_remote_addr_missing(self):
- env = testing.create_environ(host='example.com', path='/access_route')
- del env['REMOTE_ADDR']
-
- req = Request(env)
- self.assertEqual(req.access_route, [])
+def test_remote_addr_only():
+ req = Request(testing.create_environ(
+ host='example.com',
+ path='/access_route',
+ headers={
+ 'Forwarded': ('for=192.0.2.43, for="[2001:db8:cafe::17]:555",'
+ 'for="unknown", by=_hidden,for="\\"\\\\",'
+ 'for="198\\.51\\.100\\.17\\:1236";'
+ 'proto=https;host=example.com')
+ }))
+
+ assert req.remote_addr == '127.0.0.1'
+
+
+def test_rfc_forwarded():
+ req = Request(testing.create_environ(
+ host='example.com',
+ path='/access_route',
+ headers={
+ 'Forwarded': ('for=192.0.2.43,for=,'
+ 'for="[2001:db8:cafe::17]:555",'
+ 'for=x,'
+ 'for="unknown", by=_hidden,for="\\"\\\\",'
+ 'for="_don\\\"t_\\try_this\\\\at_home_\\42",'
+ 'for="198\\.51\\.100\\.17\\:1236";'
+ 'proto=https;host=example.com')
+ }))
+
+ compares = ['192.0.2.43', '2001:db8:cafe::17', 'x',
+ 'unknown', '"\\', '_don"t_try_this\\at_home_42',
+ '198.51.100.17']
+
+ req.access_route == compares
+
+ # test cached
+ req.access_route == compares
+
+
+def test_malformed_rfc_forwarded():
+ req = Request(testing.create_environ(
+ host='example.com',
+ path='/access_route',
+ headers={
+ 'Forwarded': 'for'
+ }))
+
+ req.access_route == []
+
+ # test cached
+ req.access_route == []
+
+
+def test_x_forwarded_for():
+ req = Request(testing.create_environ(
+ host='example.com',
+ path='/access_route',
+ headers={
+ 'X-Forwarded-For': ('192.0.2.43, 2001:db8:cafe::17,'
+ 'unknown, _hidden, 203.0.113.60')
+ }))
+
+ assert req.access_route == [
+ '192.0.2.43',
+ '2001:db8:cafe::17',
+ 'unknown',
+ '_hidden',
+ '203.0.113.60'
+ ]
+
+
+def test_x_real_ip():
+ req = Request(testing.create_environ(
+ host='example.com',
+ path='/access_route',
+ headers={
+ 'X-Real-IP': '2001:db8:cafe::17'
+ }))
+
+ assert req.access_route == ['2001:db8:cafe::17']
+
+
+def test_remote_addr():
+ req = Request(testing.create_environ(
+ host='example.com',
+ path='/access_route'))
+
+ assert req.access_route == ['127.0.0.1']
+
+
+def test_remote_addr_missing():
+ env = testing.create_environ(host='example.com', path='/access_route')
+ del env['REMOTE_ADDR']
+
+ req = Request(env)
+ assert req.access_route == []
diff --git a/tests/test_before_hooks.py b/tests/test_before_hooks.py
index bfdb14a63..6a904f4e4 100644
--- a/tests/test_before_hooks.py
+++ b/tests/test_before_hooks.py
@@ -173,11 +173,13 @@ def on_get(self, req, resp, bunnies, frogs, fish):
self.fish = fish
-class TestHooks(testing.TestBase):
+class TestHooks(testing.TestCase):
+
+ def setUp(self):
+ super(TestHooks, self).setUp()
- def before(self):
self.resource = WrappedRespondersResource()
- self.api.add_route(self.test_route, self.resource)
+ self.api.add_route('/', self.resource)
self.field_resource = TestFieldResource()
self.api.add_route('/queue/{id}/messages', self.field_resource)
@@ -190,76 +192,73 @@ def before(self):
def test_multiple_resource_hooks(self):
zoo_resource = ZooResource()
- self.api.add_route(self.test_route, zoo_resource)
+ self.api.add_route('/', zoo_resource)
- self.simulate_request(self.test_route)
+ result = self.simulate_get('/')
- self.assertEqual('not fluffy', self.srmock.headers_dict['X-Frogs'])
- self.assertEqual('fluffy', self.srmock.headers_dict['X-Bunnies'])
+ self.assertEqual('not fluffy', result.headers['X-Frogs'])
+ self.assertEqual('fluffy', result.headers['X-Bunnies'])
self.assertEqual('fluffy', zoo_resource.bunnies)
self.assertEqual('not fluffy', zoo_resource.frogs)
self.assertEqual('slippery', zoo_resource.fish)
def test_input_validator(self):
- self.simulate_request(self.test_route, method='PUT')
- self.assertEqual(falcon.HTTP_400, self.srmock.status)
+ result = self.simulate_put('/')
+ self.assertEqual(result.status_code, 400)
def test_param_validator(self):
- self.simulate_request(self.test_route, query_string='limit=10',
- body='{}')
- self.assertEqual(falcon.HTTP_200, self.srmock.status)
+ result = self.simulate_get('/', query_string='limit=10', body='{}')
+ self.assertEqual(result.status_code, 200)
- self.simulate_request(self.test_route, query_string='limit=101')
- self.assertEqual(falcon.HTTP_400, self.srmock.status)
+ result = self.simulate_get('/', query_string='limit=101')
+ self.assertEqual(result.status_code, 400)
def test_field_validator(self):
- self.simulate_request('/queue/10/messages')
- self.assertEqual(falcon.HTTP_200, self.srmock.status)
+ result = self.simulate_get('/queue/10/messages')
+ self.assertEqual(result.status_code, 200)
self.assertEqual(self.field_resource.id, 10)
- self.simulate_request('/queue/bogus/messages')
- self.assertEqual(falcon.HTTP_400, self.srmock.status)
+ result = self.simulate_get('/queue/bogus/messages')
+ self.assertEqual(result.status_code, 400)
def test_parser(self):
- self.simulate_request(self.test_route,
- body=json.dumps({'animal': 'falcon'}))
-
+ self.simulate_get('/', body=json.dumps({'animal': 'falcon'}))
self.assertEqual(self.resource.doc, {'animal': 'falcon'})
def test_wrapped_resource(self):
- self.simulate_request('/wrapped', method='PATCH')
- self.assertEqual(falcon.HTTP_405, self.srmock.status)
+ result = self.simulate_patch('/wrapped')
+ self.assertEqual(result.status_code, 405)
- self.simulate_request('/wrapped', query_string='limit=10')
- self.assertEqual(falcon.HTTP_200, self.srmock.status)
+ result = self.simulate_get('/wrapped', query_string='limit=10')
+ self.assertEqual(result.status_code, 200)
self.assertEqual('fuzzy', self.wrapped_resource.bunnies)
- self.simulate_request('/wrapped', method='HEAD')
- self.assertEqual(falcon.HTTP_200, self.srmock.status)
+ result = self.simulate_head('/wrapped')
+ self.assertEqual(result.status_code, 200)
self.assertEqual('fuzzy', self.wrapped_resource.bunnies)
- self.simulate_request('/wrapped', method='POST')
- self.assertEqual(falcon.HTTP_200, self.srmock.status)
+ result = self.simulate_post('/wrapped')
+ self.assertEqual(result.status_code, 200)
self.assertEqual('slippery', self.wrapped_resource.fish)
- self.simulate_request('/wrapped', query_string='limit=101')
- self.assertEqual(falcon.HTTP_400, self.srmock.status)
- self.assertEqual('fuzzy', self.wrapped_resource.bunnies)
+ result = self.simulate_get('/wrapped', query_string='limit=101')
+ self.assertEqual(result.status_code, 400)
+ self.assertEqual(self.wrapped_resource.bunnies, 'fuzzy')
def test_wrapped_resource_with_hooks_aware_of_resource(self):
- self.simulate_request('/wrapped_aware', method='PATCH')
- self.assertEqual(falcon.HTTP_405, self.srmock.status)
+ result = self.simulate_patch('/wrapped_aware')
+ self.assertEqual(result.status_code, 405)
- self.simulate_request('/wrapped_aware', query_string='limit=10')
- self.assertEqual(falcon.HTTP_200, self.srmock.status)
- self.assertEqual('fuzzy', self.wrapped_aware_resource.bunnies)
+ result = self.simulate_get('/wrapped_aware', query_string='limit=10')
+ self.assertEqual(result.status_code, 200)
+ self.assertEqual(self.wrapped_aware_resource.bunnies, 'fuzzy')
for method in ('HEAD', 'PUT', 'POST'):
- self.simulate_request('/wrapped_aware', method=method)
- self.assertEqual(falcon.HTTP_200, self.srmock.status)
- self.assertEqual('fuzzy', self.wrapped_aware_resource.bunnies)
+ result = self.simulate_request(method, '/wrapped_aware')
+ self.assertEqual(result.status_code, 200)
+ self.assertEqual(self.wrapped_aware_resource.bunnies, 'fuzzy')
- self.simulate_request('/wrapped_aware', query_string='limit=101')
- self.assertEqual(falcon.HTTP_400, self.srmock.status)
- self.assertEqual('fuzzy', self.wrapped_aware_resource.bunnies)
+ result = self.simulate_get('/wrapped_aware', query_string='limit=101')
+ self.assertEqual(result.status_code, 400)
+ self.assertEqual(self.wrapped_aware_resource.bunnies, 'fuzzy')
diff --git a/tests/test_boundedstream.py b/tests/test_boundedstream.py
new file mode 100644
index 000000000..d28ce61a4
--- /dev/null
+++ b/tests/test_boundedstream.py
@@ -0,0 +1,17 @@
+import io
+
+import pytest
+
+from falcon.request_helpers import BoundedStream
+
+
+@pytest.fixture
+def bounded_stream():
+ return BoundedStream(io.BytesIO(), 1024)
+
+
+def test_not_writeable(bounded_stream):
+ assert not bounded_stream.writeable()
+
+ with pytest.raises(IOError):
+ bounded_stream.write(b'something something')
diff --git a/tests/test_cookies.py b/tests/test_cookies.py
index d71421696..e966bfdd9 100644
--- a/tests/test_cookies.py
+++ b/tests/test_cookies.py
@@ -1,10 +1,8 @@
from datetime import datetime, timedelta, tzinfo
import re
-import sys
-import ddt
+import pytest
from six.moves.http_cookies import Morsel
-from testtools.matchers import LessThan
import falcon
import falcon.testing as testing
@@ -59,167 +57,241 @@ def on_get(self, req, resp):
'foostring', 'bar', max_age='15', secure=False, http_only=False)
-@ddt.ddt
-class TestCookies(testing.TestBase):
-
- #
- # Response
- #
-
- def test_response_base_case(self):
- self.resource = CookieResource()
- self.api.add_route(self.test_route, self.resource)
- self.simulate_request(self.test_route, method='GET')
- if sys.version_info >= (3, 4, 3):
- value = 'foo=bar; Domain=example.com; HttpOnly; Path=/; Secure'
- else:
- value = 'foo=bar; Domain=example.com; httponly; Path=/; secure'
- self.assertIn(('set-cookie', value), self.srmock.headers)
-
- def test_response_complex_case(self):
- self.resource = CookieResource()
- self.api.add_route(self.test_route, self.resource)
- self.simulate_request(self.test_route, method='HEAD')
- if sys.version_info >= (3, 4, 3):
- value = 'foo=bar; HttpOnly; Max-Age=300; Secure'
- else:
- value = 'foo=bar; httponly; Max-Age=300; secure'
- self.assertIn(('set-cookie', value), self.srmock.headers)
- if sys.version_info >= (3, 4, 3):
- value = 'bar=baz; Secure'
- else:
- value = 'bar=baz; secure'
- self.assertIn(('set-cookie', value), self.srmock.headers)
- self.assertNotIn(('set-cookie', 'bad=cookie'), self.srmock.headers)
-
- def test_cookie_expires_naive(self):
- self.resource = CookieResource()
- self.api.add_route(self.test_route, self.resource)
- self.simulate_request(self.test_route, method='POST')
- self.assertIn(
- ('set-cookie', 'foo=bar; expires=Sat, 01 Jan 2050 00:00:00 GMT'),
- self.srmock.headers)
-
- def test_cookie_expires_aware(self):
- self.resource = CookieResource()
- self.api.add_route(self.test_route, self.resource)
- self.simulate_request(self.test_route, method='PUT')
- self.assertIn(
- ('set-cookie', 'foo=bar; expires=Fri, 31 Dec 2049 23:00:00 GMT'),
- self.srmock.headers)
-
- def test_cookies_setable(self):
- resp = falcon.Response()
-
- self.assertIsNone(resp._cookies)
-
- resp.set_cookie('foo', 'wrong-cookie', max_age=301)
- resp.set_cookie('foo', 'bar', max_age=300)
- morsel = resp._cookies['foo']
-
- self.assertIsInstance(morsel, Morsel)
- self.assertEqual(morsel.key, 'foo')
- self.assertEqual(morsel.value, 'bar')
- self.assertEqual(morsel['max-age'], 300)
-
- def test_cookie_max_age_float_and_string(self):
- # Falcon implicitly converts max-age values to integers,
- # for ensuring RFC 6265-compliance of the attribute value.
- self.resource = CookieResourceMaxAgeFloatString()
- self.api.add_route(self.test_route, self.resource)
- self.simulate_request(self.test_route, method='GET')
- self.assertIn(
- ('set-cookie', 'foofloat=bar; Max-Age=15'), self.srmock.headers)
- self.assertIn(
- ('set-cookie', 'foostring=bar; Max-Age=15'), self.srmock.headers)
-
- def test_response_unset_cookie(self):
- resp = falcon.Response()
- resp.unset_cookie('bad')
- resp.set_cookie('bad', 'cookie', max_age=300)
- resp.unset_cookie('bad')
+@pytest.fixture(scope='module')
+def client():
+ app = falcon.API()
+ app.add_route('/', CookieResource())
+ app.add_route('/test-convert', CookieResourceMaxAgeFloatString())
+
+ return testing.TestClient(app)
+
+
+# =====================================================================
+# Response
+# =====================================================================
+
+
+def test_response_base_case(client):
+ result = client.simulate_get('/')
+
+ cookie = result.cookies['foo']
+ assert cookie.name == 'foo'
+ assert cookie.value == 'bar'
+ assert cookie.domain == 'example.com'
+ assert cookie.http_only
+
+ # NOTE(kgriffs): Explicitly test for None to ensure
+ # falcon.testing.Cookie is returning exactly what we
+ # expect. Apps using falcon.testing.Cookie can be a
+ # bit more cavalier if they wish.
+ assert cookie.max_age is None
+ assert cookie.expires is None
+
+ assert cookie.path == '/'
+ assert cookie.secure
+
+
+def test_response_complex_case(client):
+ result = client.simulate_head('/')
+
+ assert len(result.cookies) == 3
+
+ cookie = result.cookies['foo']
+ assert cookie.value == 'bar'
+ assert cookie.domain is None
+ assert cookie.expires is None
+ assert cookie.http_only
+ assert cookie.max_age == 300
+ assert cookie.path is None
+ assert cookie.secure
+
+ cookie = result.cookies['bar']
+ assert cookie.value == 'baz'
+ assert cookie.domain is None
+ assert cookie.expires is None
+ assert not cookie.http_only
+ assert cookie.max_age is None
+ assert cookie.path is None
+ assert cookie.secure
+
+ cookie = result.cookies['bad']
+ assert cookie.value == '' # An unset cookie has an empty value
+ assert cookie.domain is None
+
+ assert cookie.expires < datetime.utcnow()
+
+ # NOTE(kgriffs): I know accessing a private attr like this is
+ # naughty of me, but we just need to sanity-check that the
+ # string is GMT.
+ assert cookie._expires.endswith('GMT')
+
+ assert cookie.http_only
+ assert cookie.max_age is None
+ assert cookie.path is None
+ assert cookie.secure
+
+
+def test_cookie_expires_naive(client):
+ result = client.simulate_post('/')
- morsels = list(resp._cookies.values())
- self.assertEqual(len(morsels), 1)
+ cookie = result.cookies['foo']
+ assert cookie.value == 'bar'
+ assert cookie.domain is None
+ assert cookie.expires == datetime(year=2050, month=1, day=1)
+ assert not cookie.http_only
+ assert cookie.max_age is None
+ assert cookie.path is None
+ assert not cookie.secure
- bad_cookie = morsels[0]
- self.assertEqual(bad_cookie['expires'], -1)
- output = bad_cookie.OutputString()
- self.assertTrue('bad=;' in output or 'bad="";' in output)
+def test_cookie_expires_aware(client):
+ result = client.simulate_put('/')
- match = re.search('expires=([^;]+)', output)
- self.assertIsNotNone(match)
+ cookie = result.cookies['foo']
+ assert cookie.value == 'bar'
+ assert cookie.domain is None
+ assert cookie.expires == datetime(year=2049, month=12, day=31, hour=23)
+ assert not cookie.http_only
+ assert cookie.max_age is None
+ assert cookie.path is None
+ assert not cookie.secure
- expiration = http_date_to_dt(match.group(1), obs_date=True)
- self.assertThat(expiration, LessThan(datetime.utcnow()))
- def test_cookie_timezone(self):
- tz = TimezoneGMT()
- self.assertEqual('GMT', tz.tzname(timedelta(0)))
+def test_cookies_setable(client):
+ resp = falcon.Response()
- #
- # Request
- #
+ assert resp._cookies is None
- def test_request_cookie_parsing(self):
- # testing with a github-ish set of cookies
- headers = [
- ('Cookie', '''
- logged_in=no;_gh_sess=eyJzZXXzaW9uX2lkIjoiN2;
- tz=Europe/Berlin; _ga=GA1.2.332347814.1422308165;
- _gat=1;
- _octo=GH1.1.201722077.1422308165'''),
- ]
+ resp.set_cookie('foo', 'wrong-cookie', max_age=301)
+ resp.set_cookie('foo', 'bar', max_age=300)
+ morsel = resp._cookies['foo']
- environ = testing.create_environ(headers=headers)
- req = falcon.Request(environ)
+ assert isinstance(morsel, Morsel)
+ assert morsel.key == 'foo'
+ assert morsel.value == 'bar'
+ assert morsel['max-age'] == 300
- self.assertEqual('no', req.cookies['logged_in'])
- self.assertEqual('Europe/Berlin', req.cookies['tz'])
- self.assertEqual('GH1.1.201722077.1422308165', req.cookies['_octo'])
- self.assertIn('logged_in', req.cookies)
- self.assertIn('_gh_sess', req.cookies)
- self.assertIn('tz', req.cookies)
- self.assertIn('_ga', req.cookies)
- self.assertIn('_gat', req.cookies)
- self.assertIn('_octo', req.cookies)
+@pytest.mark.parametrize('cookie_name', ('foofloat', 'foostring'))
+def test_cookie_max_age_float_and_string(client, cookie_name):
+ # NOTE(tbug): Falcon implicitly converts max-age values to integers,
+ # to ensure RFC 6265-compliance of the attribute value.
- def test_unicode_inside_ascii_range(self):
- resp = falcon.Response()
+ result = client.simulate_get('/test-convert')
- # should be ok
- resp.set_cookie('non_unicode_ascii_name_1', 'ascii_value')
- resp.set_cookie(u'unicode_ascii_name_1', 'ascii_value')
- resp.set_cookie('non_unicode_ascii_name_2', u'unicode_ascii_value')
- resp.set_cookie(u'unicode_ascii_name_2', u'unicode_ascii_value')
+ cookie = result.cookies[cookie_name]
+ assert cookie.value == 'bar'
+ assert cookie.domain is None
+ assert cookie.expires is None
+ assert not cookie.http_only
+ assert cookie.max_age == 15
+ assert cookie.path is None
+ assert not cookie.secure
- @ddt.data(
+
+def test_response_unset_cookie(client):
+ resp = falcon.Response()
+ resp.unset_cookie('bad')
+ resp.set_cookie('bad', 'cookie', max_age=300)
+ resp.unset_cookie('bad')
+
+ morsels = list(resp._cookies.values())
+ len(morsels) == 1
+
+ bad_cookie = morsels[0]
+ bad_cookie['expires'] == -1
+
+ output = bad_cookie.OutputString()
+ assert 'bad=;' in output or 'bad="";' in output
+
+ match = re.search('expires=([^;]+)', output)
+ assert match
+
+ expiration = http_date_to_dt(match.group(1), obs_date=True)
+ assert expiration < datetime.utcnow()
+
+
+def test_cookie_timezone(client):
+ tz = TimezoneGMT()
+ assert tz.tzname(timedelta(0)) == 'GMT'
+
+
+# =====================================================================
+# Request
+# =====================================================================
+
+
+def test_request_cookie_parsing():
+ # testing with a github-ish set of cookies
+ headers = [
+ (
+ 'Cookie',
+ '''
+ logged_in=no;_gh_sess=eyJzZXXzaW9uX2lkIjoiN2;
+ tz=Europe/Berlin; _ga=GA1.2.332347814.1422308165;
+ _gat=1;
+ _octo=GH1.1.201722077.1422308165
+ '''
+ ),
+ ]
+
+ environ = testing.create_environ(headers=headers)
+ req = falcon.Request(environ)
+
+ assert req.cookies['logged_in'] == 'no'
+ assert req.cookies['tz'] == 'Europe/Berlin'
+ assert req.cookies['_octo'] == 'GH1.1.201722077.1422308165'
+
+ assert 'logged_in' in req.cookies
+ assert '_gh_sess' in req.cookies
+ assert 'tz' in req.cookies
+ assert '_ga' in req.cookies
+ assert '_gat' in req.cookies
+ assert '_octo' in req.cookies
+
+
+def test_unicode_inside_ascii_range():
+ resp = falcon.Response()
+
+ # should be ok
+ resp.set_cookie('non_unicode_ascii_name_1', 'ascii_value')
+ resp.set_cookie(u'unicode_ascii_name_1', 'ascii_value')
+ resp.set_cookie('non_unicode_ascii_name_2', u'unicode_ascii_value')
+ resp.set_cookie(u'unicode_ascii_name_2', u'unicode_ascii_value')
+
+
+@pytest.mark.parametrize(
+ 'name',
+ (
UNICODE_TEST_STRING,
UNICODE_TEST_STRING.encode('utf-8'),
42
)
- def test_non_ascii_name(self, name):
- resp = falcon.Response()
- self.assertRaises(KeyError, resp.set_cookie,
- name, 'ok_value')
+)
+def test_non_ascii_name(name):
+ resp = falcon.Response()
+ with pytest.raises(KeyError):
+ resp.set_cookie(name, 'ok_value')
+
- @ddt.data(
+@pytest.mark.parametrize(
+ 'value',
+ (
UNICODE_TEST_STRING,
UNICODE_TEST_STRING.encode('utf-8'),
42
)
- def test_non_ascii_value(self, value):
- resp = falcon.Response()
-
- # NOTE(tbug): we need to grab the exception to check
- # that it is not instance of UnicodeEncodeError, so
- # we cannot simply use assertRaises
- try:
- resp.set_cookie('ok_name', value)
- except ValueError as e:
- self.assertIsInstance(e, ValueError)
- self.assertNotIsInstance(e, UnicodeEncodeError)
- else:
- self.fail('set_bad_cookie_value did not fail as expected')
+)
+def test_non_ascii_value(value):
+ resp = falcon.Response()
+
+ # NOTE(tbug): we need to grab the exception to check
+ # that it is not instance of UnicodeEncodeError, so
+ # we cannot simply use pytest.raises
+ try:
+ resp.set_cookie('ok_name', value)
+ except ValueError as e:
+ assert isinstance(e, ValueError)
+ assert not isinstance(e, UnicodeEncodeError)
+ else:
+ pytest.fail('set_bad_cookie_value did not fail as expected')
| diff --git a/.travis.yml b/.travis.yml
index 081ee6f8c..1ac94eb07 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -9,10 +9,10 @@ matrix:
include:
- python: 2.7 # these are just to make travis's UI a bit prettier
env: JYTHON=true
- - python: pypy
+ - python: pypy-5.3
env: TOXENV=pypy
- - python: pypy3
- env: TOXENV=pypy3
+ # - python: pypy3
+ # env: TOXENV=pypy3
- python: 2.7
env: TOXENV=pep8
- python: 2.6
diff --git a/tox.ini b/tox.ini
index 68c0f05c9..31772f686 100644
--- a/tox.ini
+++ b/tox.ini
@@ -58,6 +58,7 @@ deps = -r{toxinidir}/tools/test-requires
[testenv:py27_debug]
deps = {[with-debug-tools]deps}
+ funcsigs
[testenv:py34_debug]
deps = {[with-debug-tools]deps}
| [
{
"components": [
{
"doc": "Always returns ``True``.",
"lines": [
101,
103
],
"name": "BoundedStream.readable",
"signature": "def readable(self):",
"type": "function"
},
{
"doc": "Always returns ``False``.",
"l... | [
"tests/test_before_hooks.py::TestHooks::test_param_validator",
"tests/test_before_hooks.py::TestHooks::test_parser",
"tests/test_boundedstream.py::test_not_writeable"
] | [
"tests/test_access_route.py::test_remote_addr_only",
"tests/test_access_route.py::test_rfc_forwarded",
"tests/test_access_route.py::test_malformed_rfc_forwarded",
"tests/test_access_route.py::test_x_forwarded_for",
"tests/test_access_route.py::test_x_real_ip",
"tests/test_access_route.py::test_remote_addr... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
test: Migrate several test modules away from the deprecated framework
- Migrate additional test modules away from the deprecated framework,
taking care to demonstrate both unittest- and pytest-style testing.
- Add an additional Cookie class and parse cookies in the Result object
to facilitate cookie testing.
- Extend BoundedStream to implement more of the IO protocol so that
it works with mock WSGI input objects (and potentially improves
compatibility as a drop-in replacement for the native WSGI input
object).
- Disable pypy3 testing on Travis, since it uses a buggy cookie impl,
and also we don't support pypy3 (yet) since overall it isn't fully
baked.
- Specify latest pypy version for Travis testing, which has improved
cookie handling.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in falcon/request_helpers.py]
(definition of BoundedStream.readable:)
def readable(self):
"""Always returns ``True``."""
(definition of BoundedStream.seekable:)
def seekable(self):
"""Always returns ``False``."""
(definition of BoundedStream.writeable:)
def writeable(self):
"""Always returns ``False``."""
(definition of BoundedStream.write:)
def write(self, data):
"""Always raises IOError; writing is not supported."""
[end of new definitions in falcon/request_helpers.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 77d5e6394a88ead151c9469494749f95f06b24bf | |
conan-io__conan-463 | 463 | conan-io/conan | null | 18baa9739d738c286e38eb463dff6d7b948d54e8 | 2016-09-06T19:24:02Z | diff --git a/conans/client/command.py b/conans/client/command.py
index cbe56b01132..5fc51819800 100644
--- a/conans/client/command.py
+++ b/conans/client/command.py
@@ -24,7 +24,7 @@
from conans.client.remote_registry import RemoteRegistry
from conans.model.scope import Scopes
import re
-from conans.search import DiskSearchManager
+from conans.search import DiskSearchManager, DiskSearchAdapter
import sys
import os
from conans.client.client_cache import ClientCache
@@ -748,10 +748,12 @@ def instance_remote_manager(client_cache):
client_cache = ClientCache(user_folder, None, out)
# obtain a temp ConanManager instance to execute the migrations
remote_manager = instance_remote_manager(client_cache)
- # Get a SearchManager
- search_manager = DiskSearchManager(client_cache)
+
+ # Get a DiskSearchManager
+ search_adapter = DiskSearchAdapter()
+ search_manager = DiskSearchManager(client_cache, search_adapter)
manager = ConanManager(client_cache, user_io, ConanRunner(), remote_manager, search_manager)
-
+
client_cache = migrate_and_get_client_cache(user_folder, out, manager)
except Exception as e:
out.error(str(e))
@@ -761,7 +763,8 @@ def instance_remote_manager(client_cache):
remote_manager = instance_remote_manager(client_cache)
# Get a search manager
- search_manager = DiskSearchManager(client_cache)
+ search_adapter = DiskSearchAdapter()
+ search_manager = DiskSearchManager(client_cache, search_adapter)
command = Command(client_cache, user_io, ConanRunner(), remote_manager, search_manager)
return command
diff --git a/conans/client/manager.py b/conans/client/manager.py
index 14d0df86f89..1a5804fcbef 100644
--- a/conans/client/manager.py
+++ b/conans/client/manager.py
@@ -131,8 +131,7 @@ def download(self, reference, package_ids, remote=None):
else: # Not specified packages, download all
packages_props = remote_proxy.search_packages(reference, None) # No filter by properties
if not packages_props:
- remote = remote or self._remote_manager.default_remote
- raise ConanException("'%s' not found in remote '%s'" % (str(reference), remote))
+ raise ConanException("'%s' not found in remote" % str(reference))
remote_proxy.download_packages(reference, list(packages_props.keys()))
diff --git a/conans/info.py b/conans/info.py
deleted file mode 100644
index 5333b426f39..00000000000
--- a/conans/info.py
+++ /dev/null
@@ -1,27 +0,0 @@
-from conans.model.ref import ConanFileReference
-from conans.model.info import ConanInfo
-import json
-from conans.util.files import decode_text
-
-
-class SearchInfo(dict):
- """ {ConanFileReference: dict{package_id: ConanInfo}
- """
-
- def serialize(self):
- serialize_info = {}
- for ref, conan_info in self.items():
- serialize_info[repr(ref)] = {k: v.serialize() for k, v in conan_info.items()}
- return serialize_info
-
- @staticmethod
- def deserialize(data):
- tmp = json.loads(decode_text(data))
- ret = SearchInfo()
- for conan_ref, packages in tmp.items():
- conan_ref = ConanFileReference.loads(conan_ref)
- ret[conan_ref] = {}
- for package_id, info in packages.items():
- ret[conan_ref][package_id] = ConanInfo.deserialize(info)
-
- return ret
diff --git a/conans/search.py b/conans/search.py
index 1ea883a270b..cb14531f373 100644
--- a/conans/search.py
+++ b/conans/search.py
@@ -1,4 +1,3 @@
-import os
import re
from abc import ABCMeta, abstractmethod
@@ -9,10 +8,50 @@
from conans.model.ref import PackageReference, ConanFileReference
from conans.paths import CONANINFO
from conans.util.log import logger
-from conans.util.files import load, path_exists, list_folder_subdirs
-class SearchManager(object):
+class SearchAdapterABC(object):
+ """Methods that allows access to disk or s3 or whatever to make a search"""
+ __metaclass__ = ABCMeta
+
+ @abstractmethod
+ def list_folder_subdirs(self, basedir, level):
+ pass
+
+ @abstractmethod
+ def path_exists(self, path):
+ pass
+
+ @abstractmethod
+ def load(self, filepath):
+ pass
+
+ @abstractmethod
+ def join_paths(self, *args):
+ pass
+
+
+class DiskSearchAdapter(SearchAdapterABC):
+
+ def list_folder_subdirs(self, basedir, level):
+ from conans.util.files import list_folder_subdirs
+ return list_folder_subdirs(basedir, level)
+
+ def path_exists(self, path, basedir=None):
+ from conans.util.files import path_exists
+ return path_exists(path, basedir)
+
+ def load(self, filepath):
+ from conans.util.files import load
+ return load(filepath)
+
+ def join_paths(self, *args):
+ import os
+ return os.path.join(*args)
+
+
+class SearchManagerABC(object):
+ """Methods that allows access to disk or s3 or whatever to make a search"""
__metaclass__ = ABCMeta
@abstractmethod
@@ -24,11 +63,13 @@ def search_packages(self, reference, query):
pass
-class DiskSearchManager(SearchManager):
- """Will search recipes and packages using a file system"""
+class DiskSearchManager(SearchManagerABC):
+ """Will search recipes and packages using a file system.
+ Can be used with a SearchAdapter"""
- def __init__(self, paths):
+ def __init__(self, paths, disk_search_adapter):
self._paths = paths
+ self._adapter = disk_search_adapter
def search(self, pattern=None, ignorecase=True):
# Conan references in main storage
@@ -36,7 +77,7 @@ def search(self, pattern=None, ignorecase=True):
pattern = translate(pattern)
pattern = re.compile(pattern, re.IGNORECASE) if ignorecase else re.compile(pattern)
- subdirs = list_folder_subdirs(basedir=self._paths.store, level=4)
+ subdirs = self._adapter.list_folder_subdirs(basedir=self._paths.store, level=4)
if not pattern:
return sorted([ConanFileReference(*folder.split("/")) for folder in subdirs])
@@ -72,14 +113,14 @@ def search_packages(self, reference, query):
logger.debug("SEARCH PACKAGE PROPERTIES: %s" % properties)
result = {}
packages_path = self._paths.packages(reference)
- subdirs = list_folder_subdirs(packages_path, level=1)
+ subdirs = self._adapter.list_folder_subdirs(packages_path, level=1)
for package_id in subdirs:
try:
package_reference = PackageReference(reference, package_id)
- info_path = os.path.join(self._paths.package(package_reference), CONANINFO)
- if not path_exists(info_path, self._paths.store):
+ info_path = self._adapter.join_paths(self._paths.package(package_reference), CONANINFO)
+ if not self._adapter.path_exists(info_path, self._paths.store):
raise NotFoundException("")
- conan_info_content = load(info_path)
+ conan_info_content = self._adapter.load(info_path)
conan_vars_info = ConanInfo.loads(conan_info_content)
if not self._filtered_by_properties(conan_vars_info, properties):
result[package_id] = conan_vars_info.serialize_min()
@@ -91,25 +132,27 @@ def search_packages(self, reference, query):
return result
def _filtered_by_properties(self, conan_vars_info, properties):
-
+
def compatible_prop(setting_value, prop_value):
return setting_value is None or prop_value == setting_value
-
+
for prop_name, prop_value in properties.items():
if prop_name == "os" and not compatible_prop(conan_vars_info.settings.os, prop_value):
return True
- elif prop_name == "compiler" and not compatible_prop(conan_vars_info.settings.compiler, prop_value):
+ elif prop_name == "compiler" and not compatible_prop(conan_vars_info.settings.compiler,
+ prop_value):
return True
elif prop_name.startswith("compiler."):
subsetting = prop_name[9:]
- if not compatible_prop(getattr(conan_vars_info.settings.compiler, subsetting), prop_value):
+ if not compatible_prop(getattr(conan_vars_info.settings.compiler, subsetting),
+ prop_value):
return True
elif prop_name == "arch" and not compatible_prop(conan_vars_info.settings.arch, prop_value):
return True
- elif prop_name == "build_type" and not compatible_prop(conan_vars_info.settings.build_type, prop_value):
+ elif prop_name == "build_type" and not compatible_prop(conan_vars_info.settings.build_type,
+ prop_value):
return True
else:
if getattr(conan_vars_info.options, prop_name) == prop_value:
- return True
- return False
-
\ No newline at end of file
+ return True
+ return False
diff --git a/conans/server/server_launcher.py b/conans/server/server_launcher.py
index 3b1ef7ed952..bee0a666861 100644
--- a/conans/server/server_launcher.py
+++ b/conans/server/server_launcher.py
@@ -9,7 +9,7 @@
from conans.server.migrate import migrate_and_get_server_config
from conans import __version__ as SERVER_VERSION
from conans.paths import conan_expand_user, SimplePaths
-from conans.search import DiskSearchManager
+from conans.search import DiskSearchManager, SearchAdapter
class ServerLauncher(object):
@@ -29,7 +29,10 @@ def __init__(self):
server_config.authorize_timeout)
file_manager = get_file_manager(server_config, updown_auth_manager=updown_auth_manager)
- search_manager = DiskSearchManager(SimplePaths(server_config.disk_storage_path))
+
+ search_adapter = DiskSearchAdapter()
+ search_manager = DiskSearchManager(SimplePaths(server_config.disk_storage_path),
+ search_adapter)
self.ra = ConanServer(server_config.port, server_config.ssl_enabled,
credentials_manager, updown_auth_manager,
authorizer, authenticator, file_manager, search_manager,
| diff --git a/conans/server/test/service/service_test.py b/conans/server/test/service/service_test.py
index a1efae49941..0cf94116b50 100644
--- a/conans/server/test/service/service_test.py
+++ b/conans/server/test/service/service_test.py
@@ -12,11 +12,10 @@
from conans.server.crypto.jwt.jwt_updown_manager import JWTUpDownAuthManager
from datetime import timedelta
from time import sleep
-from conans.model.info import ConanInfo
from conans.model.manifest import FileTreeManifest
from conans.test.utils.test_files import temp_folder
from conans.server.store.disk_adapter import ServerDiskAdapter
-from conans.search import DiskSearchManager
+from conans.search import DiskSearchManager, DiskSearchAdapter
class MockFileSaver():
@@ -91,7 +90,9 @@ def setUp(self):
adapter = ServerDiskAdapter(self.fake_url, self.tmp_dir, updown_auth_manager)
self.paths = SimplePaths(self.tmp_dir)
self.file_manager = FileManager(self.paths, adapter)
- self.search_manager = DiskSearchManager(self.paths)
+
+ search_adapter = DiskSearchAdapter()
+ self.search_manager = DiskSearchManager(self.paths, search_adapter)
self.service = ConanService(authorizer, self.file_manager, "lasote")
self.search_service = SearchService(authorizer, self.search_manager, "lasote")
diff --git a/conans/server/test/utils/server_launcher.py b/conans/server/test/utils/server_launcher.py
index 75dcdc9eafd..b9dfde7c88a 100644
--- a/conans/server/test/utils/server_launcher.py
+++ b/conans/server/test/utils/server_launcher.py
@@ -9,7 +9,7 @@
from conans.util.files import mkdir
from conans.test.utils.test_files import temp_folder
from conans.server.migrate import migrate_and_get_server_config
-from conans.search import DiskSearchManager
+from conans.search import DiskSearchAdapter, DiskSearchManager
from conans.paths import SimplePaths
@@ -47,7 +47,8 @@ def __init__(self, base_path=None, read_permissions=None,
self.file_manager = get_file_manager(server_config, public_url=base_url,
updown_auth_manager=updown_auth_manager)
- self.search_manager = DiskSearchManager(SimplePaths(server_config.disk_storage_path))
+ search_adapter = DiskSearchAdapter()
+ self.search_manager = DiskSearchManager(SimplePaths(server_config.disk_storage_path), search_adapter)
# Prepare some test users
if not read_permissions:
read_permissions = server_config.read_permissions
diff --git a/conans/test/paths_test.py b/conans/test/paths_test.py
index 3cd46e61bd3..397ae9ea1b6 100644
--- a/conans/test/paths_test.py
+++ b/conans/test/paths_test.py
@@ -5,7 +5,7 @@
SimplePaths, CONANINFO)
from conans.model.ref import ConanFileReference, PackageReference
from conans.test.utils.test_files import temp_folder
-from conans.search import DiskSearchManager
+from conans.search import DiskSearchManager, DiskSearchAdapter
from conans.util.files import save
from conans.model.info import ConanInfo
@@ -44,7 +44,8 @@ def basic_test(self):
def basic_test2(self):
folder = temp_folder()
paths = SimplePaths(folder)
- search_manager = DiskSearchManager(paths)
+ search_adapter = DiskSearchAdapter()
+ search_manager = DiskSearchManager(paths, search_adapter)
os.chdir(paths.store)
@@ -92,7 +93,8 @@ def basic_test2(self):
os.makedirs("%s/%s" % (root_folder5, EXPORT_FOLDER))
# Case insensitive searches
- search_manager = DiskSearchManager(paths)
+ search_adapter = DiskSearchAdapter()
+ search_manager = DiskSearchManager(paths, search_adapter)
reg_conans = sorted([str(_reg) for _reg in search_manager.search("*")])
self.assertEqual(reg_conans, [str(conan_ref5),
diff --git a/conans/test/tools.py b/conans/test/tools.py
index 312861a417d..c4d87e9d908 100644
--- a/conans/test/tools.py
+++ b/conans/test/tools.py
@@ -32,7 +32,7 @@
import six
from conans.client.rest.uploader_downloader import IterableToFileAdapter
from conans.client.client_cache import ClientCache
-from conans.search import DiskSearchManager
+from conans.search import DiskSearchManager, DiskSearchAdapter
class TestingResponse(object):
@@ -298,7 +298,10 @@ def __init__(self, base_folder=None, current_folder=None,
# Define storage_folder, if not, it will be read from conf file & pointed to real user home
self.storage_folder = os.path.join(self.base_folder, ".conan", "data")
self.client_cache = ClientCache(self.base_folder, self.storage_folder, TestBufferConanOutput())
- self.search_manager = DiskSearchManager(self.client_cache)
+
+ search_adapter = DiskSearchAdapter()
+ self.search_manager = DiskSearchManager(self.client_cache, search_adapter)
+
self.default_settings(get_env("CONAN_COMPILER", "gcc"),
get_env("CONAN_COMPILER_VERSION", "4.8"),
get_env("CONAN_LIBCXX", "libstdc++"))
| [
{
"components": [
{
"doc": "Methods that allows access to disk or s3 or whatever to make a search",
"lines": [
13,
31
],
"name": "SearchAdapterABC",
"signature": "class SearchAdapterABC(object):",
"type": "class"
},
{
... | [
"conans/server/test/service/service_test.py::FileUploadDownloadServiceTest::test_file_download",
"conans/server/test/service/service_test.py::FileUploadDownloadServiceTest::test_file_upload",
"conans/server/test/service/service_test.py::ConanServiceTest::test_get_conanfile_download_urls",
"conans/server/test/... | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Feature/better search manager
Prepared to use the same search manager with different storage types
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in conans/search.py]
(definition of SearchAdapterABC:)
class SearchAdapterABC(object):
"""Methods that allows access to disk or s3 or whatever to make a search"""
(definition of SearchAdapterABC.list_folder_subdirs:)
def list_folder_subdirs(self, basedir, level):
(definition of SearchAdapterABC.path_exists:)
def path_exists(self, path):
(definition of SearchAdapterABC.load:)
def load(self, filepath):
(definition of SearchAdapterABC.join_paths:)
def join_paths(self, *args):
(definition of DiskSearchAdapter:)
class DiskSearchAdapter(SearchAdapterABC):
(definition of DiskSearchAdapter.list_folder_subdirs:)
def list_folder_subdirs(self, basedir, level):
(definition of DiskSearchAdapter.path_exists:)
def path_exists(self, path, basedir=None):
(definition of DiskSearchAdapter.load:)
def load(self, filepath):
(definition of DiskSearchAdapter.join_paths:)
def join_paths(self, *args):
(definition of SearchManagerABC:)
class SearchManagerABC(object):
"""Methods that allows access to disk or s3 or whatever to make a search"""
(definition of SearchManagerABC.search:)
def search(self, pattern=None, ignorecase=True):
(definition of SearchManagerABC.search_packages:)
def search_packages(self, reference, query):
[end of new definitions in conans/search.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 4a5b19a75db9225316c8cb022a2dfb9705a2af34 | ||
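The definitions in the record above describe an adapter pattern: an abstract storage-access interface plus a disk-backed implementation, so the same search logic can later run against disk, S3, or another backend. A hedged sketch under those assumptions follows; the method bodies here are stand-ins for the real helpers in `conans.util.files`, not the shipped code:

```python
import os
from abc import ABC, abstractmethod


class SearchAdapterSketchABC(ABC):
    """Storage-access methods a search needs, so the search logic
    itself stays independent of the storage backend."""

    @abstractmethod
    def list_folder_subdirs(self, basedir, level):
        ...

    @abstractmethod
    def path_exists(self, path):
        ...

    @abstractmethod
    def load(self, filepath):
        ...

    @abstractmethod
    def join_paths(self, *args):
        ...


class DiskSearchAdapterSketch(SearchAdapterSketchABC):
    """Disk-backed implementation; these bodies are assumptions
    standing in for the helpers in conans.util.files."""

    def list_folder_subdirs(self, basedir, level):
        # Walk exactly `level` directory levels and return the
        # relative sub-paths found at that depth.
        subdirs = ['']
        for _ in range(level):
            subdirs = [
                os.path.join(parent, name)
                for parent in subdirs
                for name in sorted(os.listdir(os.path.join(basedir, parent)))
                if os.path.isdir(os.path.join(basedir, parent, name))
            ]
        return subdirs

    def path_exists(self, path, basedir=None):
        return os.path.exists(path)

    def load(self, filepath):
        with open(filepath) as f:
            return f.read()

    def join_paths(self, *args):
        return os.path.join(*args)


adapter = DiskSearchAdapterSketch()
```

With this shape, `DiskSearchManager` only ever calls the adapter, which is why the patch threads a `disk_search_adapter` argument through every construction site in the client, server, and tests.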
falconry__falcon-890 | 890 | falconry/falcon | null | c2cb7091e28680671cb9092bf5f8a1840f6e6308 | 2016-09-06T05:35:08Z | diff --git a/docs/api/middleware.rst b/docs/api/middleware.rst
index f947b7fd6..b6e9036df 100644
--- a/docs/api/middleware.rst
+++ b/docs/api/middleware.rst
@@ -47,7 +47,7 @@ Falcon's middleware interface is defined as follows:
method as keyword arguments.
"""
- def process_response(self, req, resp, resource):
+ def process_response(self, req, resp, resource, req_succeeded):
"""Post-processing of the response (after routing).
Args:
@@ -56,6 +56,9 @@ Falcon's middleware interface is defined as follows:
resource: Resource object to which the request was
routed. May be None if no route was found
for the request.
+ req_succeeded: True if no exceptions were raised while
+ the framework processed and routed the request;
+ otherwise False.
"""
.. Tip::
diff --git a/falcon/api.py b/falcon/api.py
index 853abb18a..e1e985a78 100644
--- a/falcon/api.py
+++ b/falcon/api.py
@@ -73,7 +73,7 @@ def process_resource(self, req, resp, resource, params):
arguments.
\"\"\"
- def process_response(self, req, resp, resource)
+ def process_response(self, req, resp, resource, req_succeeded)
\"\"\"Post-processing of the response (after routing).
Args:
@@ -82,6 +82,9 @@ def process_response(self, req, resp, resource)
resource: Resource object to which the request was
routed. May be None if no route was found
for the request.
+ req_succeeded: True if no exceptions were raised
+ while the framework processed and routed the
+ request; otherwise False.
\"\"\"
See also :ref:`Middleware <middleware>`.
@@ -161,9 +164,11 @@ def __call__(self, env, start_response): # noqa: C901
req = self._request_type(env, options=self.req_options)
resp = self._response_type()
resource = None
- mw_pr_stack = [] # Keep track of executed middleware components
params = {}
+ mw_pr_stack = [] # Keep track of executed middleware components
+ req_succeeded = False
+
try:
try:
# NOTE(ealogar): The execution of request middleware
@@ -202,6 +207,7 @@ def __call__(self, env, start_response): # noqa: C901
process_resource(req, resp, resource, params)
responder(req, resp, **params)
+ req_succeeded = True
except Exception as ex:
if not self._handle_exception(ex, req, resp, params):
raise
@@ -217,11 +223,13 @@ def __call__(self, env, start_response): # noqa: C901
while mw_pr_stack:
process_response = mw_pr_stack.pop()
try:
- process_response(req, resp, resource)
+ process_response(req, resp, resource, req_succeeded)
except Exception as ex:
if not self._handle_exception(ex, req, resp, params):
raise
+ req_succeeded = False
+
#
# Set status and headers
#
diff --git a/falcon/api_helpers.py b/falcon/api_helpers.py
index 9f91608ed..04a5979da 100644
--- a/falcon/api_helpers.py
+++ b/falcon/api_helpers.py
@@ -14,6 +14,9 @@
"""Utilities for the API class."""
+from functools import wraps
+import inspect
+
from falcon import util
@@ -49,6 +52,21 @@ def prepare_middleware(middleware=None):
msg = '{0} does not implement the middleware interface'
raise TypeError(msg.format(component))
+ if process_response:
+ # NOTE(kgriffs): Shim older implementations to ensure
+ # backwards-compatibility.
+ spec = inspect.getargspec(process_response)
+
+ if len(spec.args) == 4: # (self, req, resp, resource)
+ def let(process_response=process_response):
+ @wraps(process_response)
+ def shim(req, resp, resource, req_succeeded):
+ process_response(req, resp, resource)
+
+ return shim
+
+ process_response = let()
+
prepared_middleware.append((process_request, process_resource,
process_response))
| diff --git a/tests/test_middleware.py b/tests/test_middleware.py
index 652306fef..9d48c4cc7 100644
--- a/tests/test_middleware.py
+++ b/tests/test_middleware.py
@@ -11,8 +11,11 @@
class CaptureResponseMiddleware(object):
- def process_response(self, req, resp, resource):
+ def process_response(self, req, resp, resource, req_succeeded):
+ self.req = req
self.resp = resp
+ self.resource = resource
+ self.req_succeeded = req_succeeded
class RequestTimeMiddleware(object):
@@ -25,9 +28,10 @@ def process_resource(self, req, resp, resource, params):
global context
context['mid_time'] = datetime.utcnow()
- def process_response(self, req, resp, resource):
+ def process_response(self, req, resp, resource, req_succeeded):
global context
context['end_time'] = datetime.utcnow()
+ context['req_succeeded'] = req_succeeded
class TransactionIdMiddleware(object):
@@ -36,6 +40,9 @@ def process_request(self, req, resp):
global context
context['transaction_id'] = 'unique-req-id'
+ def process_response(self, req, resp, resource):
+ pass
+
class ExecutedFirstMiddleware(object):
@@ -49,11 +56,18 @@ def process_resource(self, req, resp, resource, params):
context['executed_methods'].append(
'{0}.{1}'.format(self.__class__.__name__, 'process_resource'))
+ # NOTE(kgriffs): This also tests that the framework can continue to
+ # call process_response() methods that do not have a 'req_succeeded'
+ # arg.
def process_response(self, req, resp, resource):
global context
context['executed_methods'].append(
'{0}.{1}'.format(self.__class__.__name__, 'process_response'))
+ context['req'] = req
+ context['resp'] = resp
+ context['resource'] = resource
+
class ExecutedLastMiddleware(ExecutedFirstMiddleware):
pass
@@ -110,6 +124,7 @@ def test_skip_process_resource(self):
self.assertIn('start_time', context)
self.assertNotIn('mid_time', context)
self.assertIn('end_time', context)
+ self.assertFalse(context['req_succeeded'])
def test_add_invalid_middleware(self):
"""Test than an invalid class can not be added as middleware"""
@@ -147,6 +162,7 @@ def test_log_get_request(self):
body = self.simulate_json_request(self.test_route)
self.assertEqual(_EXPECTED_BODY, body)
self.assertEqual(self.srmock.status, falcon.HTTP_200)
+
self.assertIn('start_time', context)
self.assertIn('mid_time', context)
self.assertIn('end_time', context)
@@ -155,6 +171,8 @@ def test_log_get_request(self):
self.assertTrue(context['end_time'] >= context['start_time'],
'process_response not executed after request')
+ self.assertTrue(context['req_succeeded'])
+
class TestTransactionIdMiddleware(TestMiddleware):
@@ -194,6 +212,16 @@ def test_generate_trans_id_and_time_with_request(self):
self.assertTrue(context['end_time'] >= context['start_time'],
'process_response not executed after request')
+ def test_legacy_middleware_called_with_correct_args(self):
+ global context
+ self.api = falcon.API(middleware=[ExecutedFirstMiddleware()])
+ self.api.add_route(self.test_route, MiddlewareClassResource())
+
+ self.simulate_request(self.test_route)
+ self.assertIsInstance(context['req'], falcon.Request)
+ self.assertIsInstance(context['resp'], falcon.Response)
+ self.assertIsInstance(context['resource'], MiddlewareClassResource)
+
def test_middleware_execution_order(self):
global context
self.api = falcon.API(middleware=[ExecutedFirstMiddleware(),
@@ -220,6 +248,8 @@ def test_multiple_reponse_mw_throw_exception(self):
"""Test that error in inner middleware leaves"""
global context
+ context['req_succeeded'] = []
+
class RaiseStatusMiddleware(object):
def process_response(self, req, resp, resource):
raise falcon.HTTPStatus(falcon.HTTP_201)
@@ -229,10 +259,12 @@ def process_response(self, req, resp, resource):
raise falcon.HTTPError(falcon.HTTP_748)
class ProcessResponseMiddleware(object):
- def process_response(self, req, resp, resource):
+ def process_response(self, req, resp, resource, req_succeeded):
context['executed_methods'].append('process_response')
+ context['req_succeeded'].append(req_succeeded)
- self.api = falcon.API(middleware=[RaiseErrorMiddleware(),
+ self.api = falcon.API(middleware=[ProcessResponseMiddleware(),
+ RaiseErrorMiddleware(),
ProcessResponseMiddleware(),
RaiseStatusMiddleware(),
ProcessResponseMiddleware()])
@@ -242,8 +274,9 @@ def process_response(self, req, resp, resource):
self.assertEqual(self.srmock.status, falcon.HTTP_748)
- expected_methods = ['process_response', 'process_response']
+ expected_methods = ['process_response'] * 3
self.assertEqual(context['executed_methods'], expected_methods)
+ self.assertEqual(context['req_succeeded'], [True, False, False])
def test_inner_mw_throw_exception(self):
"""Test that error in inner middleware leaves"""
@@ -475,6 +508,13 @@ def test_error_composed_before_resp_middleware_called(self):
composed_body = json.loads(self.mw.resp.body)
self.assertEqual(composed_body['title'], self.srmock.status)
+ self.assertFalse(self.mw.req_succeeded)
+
+ # NOTE(kgriffs): Sanity-check the other params passed to
+ # process_response()
+ self.assertIsInstance(self.mw.req, falcon.Request)
+ self.assertIsInstance(self.mw.resource, MiddlewareClassResource)
+
def test_http_status_raised_from_error_handler(self):
def _http_error_handler(error, req, resp, params):
raise falcon.HTTPStatus(falcon.HTTP_201)
| diff --git a/docs/api/middleware.rst b/docs/api/middleware.rst
index f947b7fd6..b6e9036df 100644
--- a/docs/api/middleware.rst
+++ b/docs/api/middleware.rst
@@ -47,7 +47,7 @@ Falcon's middleware interface is defined as follows:
method as keyword arguments.
"""
- def process_response(self, req, resp, resource):
+ def process_response(self, req, resp, resource, req_succeeded):
"""Post-processing of the response (after routing).
Args:
@@ -56,6 +56,9 @@ Falcon's middleware interface is defined as follows:
resource: Resource object to which the request was
routed. May be None if no route was found
for the request.
+ req_succeeded: True if no exceptions were raised while
+ the framework processed and routed the request;
+ otherwise False.
"""
.. Tip::
| [
{
"components": [
{
"doc": "",
"lines": [
61,
66
],
"name": "prepare_middleware.let",
"signature": "def let(process_response=process_response): @wraps(process_response)",
"type": "function"
},
{
"doc": "",
"lin... | [
"tests/test_middleware.py::TestRequestTimeMiddleware::test_log_get_request",
"tests/test_middleware.py::TestRequestTimeMiddleware::test_skip_process_resource",
"tests/test_middleware.py::TestSeveralMiddlewares::test_generate_trans_id_and_time_with_request",
"tests/test_middleware.py::TestSeveralMiddlewares::t... | [
"tests/test_middleware.py::TestRequestTimeMiddleware::test_add_invalid_middleware",
"tests/test_middleware.py::TestRequestTimeMiddleware::test_response_middleware_raises_exception",
"tests/test_middleware.py::TestTransactionIdMiddleware::test_generate_trans_id_with_request",
"tests/test_middleware.py::TestSev... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
feat(middleware): Pass a request status flag to process_request()
Pass a flag to process_request() to indicate whether or not the request
succeeded (i.e., no exceptions were raised by other middleware methods
hooks, responders, etc.)
Closes #606
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in falcon/api_helpers.py]
(definition of prepare_middleware.let:)
def let(process_response=process_response): @wraps(process_response)
(definition of prepare_middleware.let.shim:)
def shim(req, resp, resource, req_succeeded):
[end of new definitions in falcon/api_helpers.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | Here is the discussion in the issues of the pull request.
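The shim definitions above exist so that older middleware whose `process_response()` lacks the new `req_succeeded` argument keeps working. A hedged sketch of that backward-compatibility check follows; it uses `inspect.signature` where the patch itself used the older `getargspec`, and the two middleware classes are hypothetical examples, not falcon code:

```python
import inspect
from functools import wraps


def shim_process_response(process_response):
    # For a bound method, inspect.signature() excludes ``self``, so a
    # legacy hook exposes three parameters (req, resp, resource) and a
    # current one exposes four.
    params = inspect.signature(process_response).parameters
    if len(params) == 3:
        @wraps(process_response)
        def shim(req, resp, resource, req_succeeded):
            # Drop the flag the legacy hook does not understand.
            process_response(req, resp, resource)
        return shim
    return process_response


class LegacyMiddleware:
    def process_response(self, req, resp, resource):
        resp.append('legacy')


class ModernMiddleware:
    def process_response(self, req, resp, resource, req_succeeded):
        resp.append('modern:%s' % req_succeeded)


calls = []
legacy_hook = shim_process_response(LegacyMiddleware().process_response)
modern_hook = shim_process_response(ModernMiddleware().process_response)
legacy_hook(None, calls, None, True)
modern_hook(None, calls, None, True)
```

Wrapping once at registration time, as `prepare_middleware()` does in the patch, means the request path can always pass `req_succeeded` without re-inspecting each component per request.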
<issues>
Middleware post process error handling
When middleware is added to the api and we have a post_process on it, any exceptions thrown in the responder are not aggregated to an appropriate response. Instead, they are ignored and the post_process attempts to run on as though nothing happened.
It does not matter whether an exception handler is added to the api or not.
Let me know if I just misunderstood how to handle these
----------
Hi @esmalifesum, thanks for bringing up this issue. You are correct in that the framework does not skip `process_response` methods when an exception is raised. This is done in case those methods need to perform critical post-processing steps.
When an instance of `falcon.HTTPError` is raised in a responder, the framework should render that into a response, while also executing `process_response`. Similarly, when other types of exception are raised, custom error handlers are given the chance to render that into a response. If you are seeing a situation in which the errors are not rendering into a response, that's definitely a bug.
Also, while looking at the code, I found a a case in which the exception is rendered to a response only after passing through the middleware components. Conceptually, as well as for the sake of consistency, I think it would make more sense to always render the response from the exception first.
Is there a particular type of middleware you hand in mind that should be short-circuited when an error is raised? If that functionality is useful, we can brainstorm some ways to support it.
Hi Kurt,
We have an exception handler currently wrapped around the responders. If
any of these raise an exception, we will handle the exception as our app
needs within the handler. If this happens, I would expect for my post
processors not to even be called as they are there to process only
valid/sane responses from the responders.
Are you saying that each middleware's post process method should have an
exception handler on its own as well?
Maybe I'm not thinking around this the right way and there's a better
pattern for it. If so, can you provide a more thorough scenario/pattern for
us to follow?
Esma
On Thu, Sep 17, 2015 at 5:31 AM, Kurt Griffiths notifications@github.com
wrote:
> Hi @esmalifesum https://github.com/esmalifesum, thanks for bringing up
> this issue. You are correct in that the framework does not skip
> process_response methods when an exception is raised. This is done in
> case those methods need to perform critical post-processing steps.
>
> When an instance of falcon.HTTPError is raised in a responder, the
> framework should render that into a response, while also executing
> process_response. Similarly, when other types of exception are raised,
> custom error handlers are given the chance to render that into a response.
> If you are seeing a situation in which the errors are not rendering into a
> response, that's definitely a bug.
>
> Also, while looking at the code, I found a a case in which the exception
> is rendered to a response only after passing through the middleware
> components. Conceptually, as well as for the sake of consistency, I think
> it would make more sense to always render the response from the exception
> first.
>
> Is there a particular type of middleware you hand in mind that should be
> short-circuited when an error is raised? If that functionality is useful,
> we can brainstorm some ways to support it.
>
> —
> Reply to this email directly or view it on GitHub
> https://github.com/falconry/falcon/issues/606#issuecomment-140959511.
There are times when you may want to perform post-processing regardless of whether an error was raised. But I can see your point about a need to be more selective in some cases.
One way we could support this would be by adding an additional middleware method, `process_successful_response()`. Thoughts?
Yes, I think something like that would allow us more flexibility and let us avoid coding nonsense into middleware just to handle exceptions when there is already an exceptions handler. My team would definitely use your suggested method.
Is there any progress regarding this issue?
I need to validate responses agains swagger schema.
When a falcon HTTP exception is raised in a responder or in process_resource middleware method (which raises HTTPBadRequest when request body is invalid), the status code of the response is still 200 when it comes to process_response.
At which point does falcon set appropriate response status and body when an HTTP error is raised?
How can I process only successful responses?
How can I identify that a response is not successful in a process_response middleware method?
I've created a stack overflow question where you can find a minimal example: http://stackoverflow.com/questions/38079209/error-handling-in-falcon-middleware
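The status check discussed in this thread can be sketched as a plain-Python middleware. `StubResponse` is a hypothetical stand-in for `falcon.Response` so the sketch runs without falcon installed, and the "status line starts with `'2'`" heuristic is an assumption about how error responses look by the time `process_response` runs — by then a raised `HTTPError` has already been rendered into a 4xx/5xx status:

```python
class ValidationMiddleware:
    """Sketch: post-process only successful responses, as a workaround
    until a dedicated hook like process_successful_response() exists."""

    def process_response(self, req, resp, resource):
        if not resp.status.startswith('2'):
            # An error handler already rendered this response; skip the
            # success-only post-processing (e.g. schema validation).
            return
        self._validate(resp)

    def _validate(self, resp):
        # Placeholder for e.g. validating resp.body against a swagger schema.
        resp.validated = True


class StubResponse:
    """Minimal stand-in for falcon.Response, for illustration only."""

    def __init__(self, status):
        self.status = status          # falcon uses strings like '200 OK'
        self.validated = False
```

With real falcon objects the same `resp.status` check applies; only the stub class is invented here.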
--------------------
</issues> | 77d5e6394a88ead151c9469494749f95f06b24bf |
boto__boto3-739 | 739 | boto/boto3 | null | 231c0ecade7787da1f6e6dc7c0725759c10b5a1d | 2016-07-28T19:17:44Z | diff --git a/.changes/next-release/feature-Session-9354.json b/.changes/next-release/feature-Session-9354.json
new file mode 100644
index 0000000000..c3a55c3402
--- /dev/null
+++ b/.changes/next-release/feature-Session-9354.json
@@ -0,0 +1,5 @@
+{
+ "category": "Session",
+ "type": "feature",
+ "description": "Expose available_profiles property for Session"
+}
diff --git a/boto3/session.py b/boto3/session.py
index 6a216dc838..6efe41bc3c 100644
--- a/boto3/session.py
+++ b/boto3/session.py
@@ -106,6 +106,13 @@ def events(self):
"""
return self._session.get_component('event_emitter')
+ @property
+ def available_profiles(self):
+ """
+ The profiles available to the session credentials
+ """
+ return self._session.available_profiles
+
def _setup_loader(self):
"""
Setup loader paths so that we can load resources.
| diff --git a/tests/unit/test_session.py b/tests/unit/test_session.py
index f5892d8b3f..d7df5aad3c 100644
--- a/tests/unit/test_session.py
+++ b/tests/unit/test_session.py
@@ -22,6 +22,7 @@
class TestSession(BaseTestCase):
+
def test_repr(self):
bc_session = self.bc_session_cls.return_value
bc_session.get_credentials.return_value.access_key = 'abc123'
@@ -56,7 +57,7 @@ def test_arguments_not_required(self):
Session()
self.assertTrue(self.bc_session_cls.called,
- 'Botocore session was not created')
+ 'Botocore session was not created')
def test_credentials_can_be_set(self):
bc_session = self.bc_session_cls.return_value
@@ -67,9 +68,9 @@ def test_credentials_can_be_set(self):
aws_session_token='token')
self.assertTrue(self.bc_session_cls.called,
- 'Botocore session was not created')
+ 'Botocore session was not created')
self.assertTrue(bc_session.set_credentials.called,
- 'Botocore session set_credentials not called from constructor')
+ 'Botocore session set_credentials not called from constructor')
bc_session.set_credentials.assert_called_with(
'key', 'secret', 'token')
@@ -115,6 +116,13 @@ def test_profile_default(self):
self.assertEqual(session.profile_name, 'default')
+ def test_available_profiles(self):
+ bc_session = mock.Mock()
+ bc_session.available_profiles.return_value = ['foo','bar']
+ session = Session(botocore_session=bc_session)
+ profiles = session.available_profiles
+ self.assertEqual(len(profiles.return_value), 2)
+
def test_custom_session(self):
bc_session = self.bc_session_cls()
self.bc_session_cls.reset_mock()
@@ -170,7 +178,7 @@ def test_get_available_services(self):
session.get_available_services()
self.assertTrue(bc_session.get_available_services.called,
- 'Botocore session get_available_services not called')
+ 'Botocore session get_available_services not called')
def test_get_available_resources(self):
mock_bc_session = mock.Mock()
@@ -207,7 +215,7 @@ def test_create_client(self):
client = session.client('sqs', region_name='us-west-2')
self.assertTrue(client,
- 'No low-level client was returned')
+ 'No low-level client was returned')
def test_create_client_with_args(self):
bc_session = self.bc_session_cls.return_value
@@ -225,7 +233,8 @@ def test_create_resource_with_args(self):
mock_bc_session = mock.Mock()
loader = mock.Mock(spec=loaders.Loader)
loader.determine_latest_version.return_value = '2014-11-02'
- loader.load_service_model.return_value = {'resources': [], 'service': []}
+ loader.load_service_model.return_value = {
+ 'resources': [], 'service': []}
mock_bc_session.get_component.return_value = loader
session = Session(botocore_session=mock_bc_session)
session.resource_factory.load_from_definition = mock.Mock()
@@ -246,7 +255,8 @@ def test_create_resource_with_config(self):
mock_bc_session = mock.Mock()
loader = mock.Mock(spec=loaders.Loader)
loader.determine_latest_version.return_value = '2014-11-02'
- loader.load_service_model.return_value = {'resources': [], 'service': []}
+ loader.load_service_model.return_value = {
+ 'resources': [], 'service': []}
mock_bc_session.get_component.return_value = loader
session = Session(botocore_session=mock_bc_session)
session.resource_factory.load_from_definition = mock.Mock()
@@ -268,7 +278,8 @@ def test_create_resource_with_config_override_user_agent_extra(self):
mock_bc_session = mock.Mock()
loader = mock.Mock(spec=loaders.Loader)
loader.determine_latest_version.return_value = '2014-11-02'
- loader.load_service_model.return_value = {'resources': [], 'service': []}
+ loader.load_service_model.return_value = {
+ 'resources': [], 'service': []}
mock_bc_session.get_component.return_value = loader
session = Session(botocore_session=mock_bc_session)
session.resource_factory.load_from_definition = mock.Mock()
@@ -290,7 +301,8 @@ def test_create_resource_latest_version(self):
mock_bc_session = mock.Mock()
loader = mock.Mock(spec=loaders.Loader)
loader.determine_latest_version.return_value = '2014-11-02'
- loader.load_service_model.return_value = {'resources': [], 'service': []}
+ loader.load_service_model.return_value = {
+ 'resources': [], 'service': []}
mock_bc_session.get_component.return_value = loader
session = Session(botocore_session=mock_bc_session)
session.resource_factory.load_from_definition = mock.Mock()
| diff --git a/.changes/next-release/feature-Session-9354.json b/.changes/next-release/feature-Session-9354.json
new file mode 100644
index 0000000000..c3a55c3402
--- /dev/null
+++ b/.changes/next-release/feature-Session-9354.json
@@ -0,0 +1,5 @@
+{
+ "category": "Session",
+ "type": "feature",
+ "description": "Expose available_profiles property for Session"
+}
| [
{
"components": [
{
"doc": "The profiles available to the session credentials",
"lines": [
110,
114
],
"name": "Session.available_profiles",
"signature": "def available_profiles(self):",
"type": "function"
}
],
"file": "boto... | [
"tests/unit/test_session.py::TestSession::test_available_profiles"
] | [
"tests/unit/test_session.py::TestSession::test_arguments_not_required",
"tests/unit/test_session.py::TestSession::test_bad_resource_name",
"tests/unit/test_session.py::TestSession::test_bad_resource_name_with_no_client_has_simple_err_msg",
"tests/unit/test_session.py::TestSession::test_can_access_region_name"... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Expose session.available_profiles from botocore to Session
Implements https://github.com/boto/boto3/issues/704
Exposes Session.available_profiles from botocore to session in boto3
also my linter wanted to cleanup the test_session.py file in addition to adding my test there.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in boto3/session.py]
(definition of Session.available_profiles:)
def available_profiles(self):
"""The profiles available to the session credentials"""
[end of new definitions in boto3/session.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
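The delegation pattern the feature uses can be sketched in plain Python. `StubBotocoreSession` is a hypothetical stand-in for `botocore.session.Session` so the example runs without botocore installed; the profile names are made up:

```python
class StubBotocoreSession:
    """Hypothetical stand-in for botocore.session.Session."""

    @property
    def available_profiles(self):
        # botocore derives this list from the profiles in the config files.
        return ['default', 'dev']


class Session:
    """Sketch of the boto3 wrapper delegating to the botocore session."""

    def __init__(self, botocore_session=None):
        self._session = botocore_session or StubBotocoreSession()

    @property
    def available_profiles(self):
        """The profiles available to the session credentials"""
        return self._session.available_profiles
```

With the real classes, `boto3.session.Session().available_profiles` simply forwards to the underlying botocore session's property of the same name.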
<<END>> | 196a2da7490a1a661a0103b8770bd31e34e147f2 | |
sympy__sympy-11431 | 11,431 | sympy/sympy | 1.0 | 17d046e8386fb4098d442b04f4b7bcf8a798f5b9 | 2016-07-25T17:47:23Z | diff --git a/doc/src/modules/physics/mechanics/api/system.rst b/doc/src/modules/physics/mechanics/api/system.rst
new file mode 100644
index 000000000000..fb2da1898af2
--- /dev/null
+++ b/doc/src/modules/physics/mechanics/api/system.rst
@@ -0,0 +1,9 @@
+===========================
+SymbolicSystem (Docstrings)
+===========================
+
+SymbolicSystem
+==============
+
+.. automodule:: sympy.physics.mechanics.system
+ :members:
diff --git a/doc/src/modules/physics/mechanics/index.rst b/doc/src/modules/physics/mechanics/index.rst
index e6cb40f9dac9..7a40f880f09e 100644
--- a/doc/src/modules/physics/mechanics/index.rst
+++ b/doc/src/modules/physics/mechanics/index.rst
@@ -76,6 +76,7 @@ Guide to Mechanics
masses.rst
kane.rst
lagrange.rst
+ symsystem.rst
linearize.rst
examples.rst
advanced.rst
@@ -89,6 +90,7 @@ Mechanics API
api/part_bod.rst
api/kane_lagrange.rst
+ api/system.rst
api/linearize.rst
api/expr_manip.rst
api/printing.rst
diff --git a/doc/src/modules/physics/mechanics/symsystem.rst b/doc/src/modules/physics/mechanics/symsystem.rst
new file mode 100644
index 000000000000..a5ca30feb8b7
--- /dev/null
+++ b/doc/src/modules/physics/mechanics/symsystem.rst
@@ -0,0 +1,202 @@
+=====================================
+Symbolic Systems in Physics/Mechanics
+=====================================
+
+The `SymbolicSystem` class in physics/mechanics is a location for the pertinent
+information of a multibody dynamic system. In its most basic form it contains
+the equations of motion for the dynamic system, however, it can also contain
+information regarding the loads that the system is subject to, the bodies that
+the system is comprised of and any additional equations the user feels is
+important for the system. The goal of this class is to provide a unified output
+format for the equations of motion that numerical analysis code can be designed
+around.
+
+SymbolicSystem Example Usage
+============================
+
+This code will go over manually entering the equations of motion for a simple
+pendulum into `SymbolicSystem`, using the Cartesian location of the mass as
+the generalized coordinates.
+
+The equations of motion are derived in the physics/mechanics/examples_. There
+the variables q1 and q2 are used in place of x and y, and the reference frame
+is rotated 90 degrees.
+
+.. _examples: ../examples/lin_pend_nonmin_example.html
+
+::
+
+ >>> from sympy import atan, symbols, Matrix
+ >>> from sympy.physics.mechanics import (dynamicsymbols, ReferenceFrame,
+ ... Particle, Point)
+ >>> import sympy.physics.mechanics.system as system
+
+The first step will be to initialize all of the dynamic and constant symbols. ::
+
+ >>> x, y, u, v, lam = dynamicsymbols('x y u v lambda')
+ >>> m, l, g = symbols('m l g')
+
+Next step is to define the equations of motion in multiple forms:
+
+ [1] Explicit form where the kinematics and dynamics are combined
+ x' = F_1(x, t, r, p)
+
+ [2] Implicit form where the kinematics and dynamics are combined
+ M_2(x, p) x' = F_2(x, t, r, p)
+
+ [3] Implicit form where the kinematics and dynamics are separate
+ M_3(q, p) u' = F_3(q, u, t, r, p)
+ q' = G(q, u, t, r, p)
+
+where
+
+ x : states, e.g. [q, u]
+ t : time
+ r : specified (exogenous) inputs
+ p : constants
+ q : generalized coordinates
+ u : generalized speeds
+ F_1 : right hand side of the combined equations in explicit form
+ F_2 : right hand side of the combined equations in implicit form
+ F_3 : right hand side of the dynamical equations in implicit form
+ M_2 : mass matrix of the combined equations in implicit form
+ M_3 : mass matrix of the dynamical equations in implicit form
+ G : right hand side of the kinematical differential equations
+
+::
+
+ >>> dyn_implicit_mat = Matrix([[1, 0, -x/m],
+ ... [0, 1, -y/m],
+ ... [0, 0, l**2/m]])
+ >>> dyn_implicit_rhs = Matrix([0, 0, u**2 + v**2 - g*y])
+ >>> comb_implicit_mat = Matrix([[1, 0, 0, 0, 0],
+ ... [0, 1, 0, 0, 0],
+ ... [0, 0, 1, 0, -x/m],
+ ... [0, 0, 0, 1, -y/m],
+ ... [0, 0, 0, 0, l**2/m]])
+ >>> comb_implicit_rhs = Matrix([u, v, 0, 0, u**2 + v**2 - g*y])
+ >>> kin_explicit_rhs = Matrix([u, v])
+ >>> comb_explicit_rhs = comb_implicit_mat.LUsolve(comb_implicit_rhs)
+
+Now the reference frames, points and particles will be set up so this
+information can be passed into `system.SymbolicSystem` in the form of a bodies
+and loads iterable. ::
+
+ >>> theta = atan(x/y)
+ >>> omega = dynamicsymbols('omega')
+ >>> N = ReferenceFrame('N')
+ >>> A = N.orientnew('A', 'Axis', [theta, N.z])
+ >>> A.set_ang_vel(N, omega * N.z)
+ >>> O = Point('O')
+ >>> O.set_vel(N, 0)
+ >>> P = O.locatenew('P', l * A.x)
+ >>> P.v2pt_theory(O, N, A)
+ l*omega*A.y
+ >>> Pa = Particle('Pa', P, m)
+
+Now the bodies and loads iterables need to be initialized. ::
+
+ >>> bodies = [Pa]
+ >>> loads = [(P, g * m * N.x)]
+
+The equations of motion are in the form of a differential algebraic equation
+(DAE) and DAE solvers need to know which of the equations are the algebraic
+expressions. This information is passed into `SymbolicSystem` as a list
+specifying which rows are the algebraic equations. In this example it is a
+different row depending on the chosen equations of motion format. On input,
+the row index should correspond to the mass matrix being passed to the
+`SymbolicSystem` class, but when accessed from the `SymbolicSystem` class the
+indexes will always refer to the rows of the combined dynamics and
+kinematics. ::
+
+ >>> alg_con = [2]
+ >>> alg_con_full = [4]
+
+An iterable containing the states now needs to be created for the system. The
+`SymbolicSystem` class can determine which of the states are considered
+coordinates or speeds by passing in the indexes of the coordinates and speeds.
+If these indexes are not passed in the object will not be able to differentiate
+between coordinates and speeds. ::
+
+ >>> states = (x, y, u, v, lam)
+ >>> coord_idxs = (0, 1)
+ >>> speed_idxs = (2, 3)
+
+Now the equations of motion instances can be created using the above mentioned
+equations of motion formats. ::
+
+ >>> symsystem1 = system.SymbolicSystem(states, comb_explicit_rhs,
+ ... alg_con=alg_con_full, bodies=bodies,
+ ... loads=loads)
+ >>> symsystem2 = system.SymbolicSystem(states, comb_implicit_rhs,
+ ... mass_matrix=comb_implicit_mat,
+ ... alg_con=alg_con_full,
+ ... coord_idxs=coord_idxs)
+ >>> symsystem3 = system.SymbolicSystem(states, dyn_implicit_rhs,
+ ... mass_matrix=dyn_implicit_mat,
+ ... coordinate_derivatives=kin_explicit_rhs,
+ ... alg_con=alg_con,
+ ... coord_idxs=coord_idxs,
+ ... speed_idxs=speed_idxs)
+
+Like coordinates and speeds, the bodies and loads attributes can only be
+accessed if they are specified during initialization of the `SymbolicSystem`
+class. Lastly here are some attributes accessible from the `SymbolicSystem`
+class. ::
+
+ >>> symsystem1.states
+ Matrix([
+ [ x(t)],
+ [ y(t)],
+ [ u(t)],
+ [ v(t)],
+ [lambda(t)]])
+ >>> symsystem2.coordinates
+ Matrix([
+ [x(t)],
+ [y(t)]])
+ >>> symsystem3.speeds
+ Matrix([
+ [u(t)],
+ [v(t)]])
+ >>> symsystem1.comb_explicit_rhs
+ Matrix([
+ [ u(t)],
+ [ v(t)],
+ [(-g*y(t) + u(t)**2 + v(t)**2)*x(t)/l**2],
+ [(-g*y(t) + u(t)**2 + v(t)**2)*y(t)/l**2],
+ [ m*(-g*y(t) + u(t)**2 + v(t)**2)/l**2]])
+ >>> symsystem2.comb_implicit_rhs
+ Matrix([
+ [ u(t)],
+ [ v(t)],
+ [ 0],
+ [ 0],
+ [-g*y(t) + u(t)**2 + v(t)**2]])
+ >>> symsystem2.comb_implicit_mat
+ Matrix([
+ [1, 0, 0, 0, 0],
+ [0, 1, 0, 0, 0],
+ [0, 0, 1, 0, -x(t)/m],
+ [0, 0, 0, 1, -y(t)/m],
+ [0, 0, 0, 0, l**2/m]])
+ >>> symsystem3.dyn_implicit_rhs
+ Matrix([
+ [ 0],
+ [ 0],
+ [-g*y(t) + u(t)**2 + v(t)**2]])
+ >>> symsystem3.dyn_implicit_mat
+ Matrix([
+ [1, 0, -x(t)/m],
+ [0, 1, -y(t)/m],
+ [0, 0, l**2/m]])
+ >>> symsystem3.kin_explicit_rhs
+ Matrix([
+ [u(t)],
+ [v(t)]])
+ >>> symsystem1.alg_con
+ [4]
+ >>> symsystem1.bodies
+ (Pa,)
+ >>> symsystem1.loads
+ ((P, g*m*N.x),)
diff --git a/sympy/physics/mechanics/__init__.py b/sympy/physics/mechanics/__init__.py
index 20dfe76d1f14..56fb5f7ee2a6 100644
--- a/sympy/physics/mechanics/__init__.py
+++ b/sympy/physics/mechanics/__init__.py
@@ -42,3 +42,7 @@
from . import body
from .body import *
__all__.extend(body.__all__)
+
+from . import system
+from .system import *
+__all__.extend(system.__all__)
diff --git a/sympy/physics/mechanics/system.py b/sympy/physics/mechanics/system.py
new file mode 100644
index 000000000000..15bbce6d7141
--- /dev/null
+++ b/sympy/physics/mechanics/system.py
@@ -0,0 +1,441 @@
+from sympy import eye, Matrix, zeros
+from sympy.physics.mechanics import dynamicsymbols
+from sympy.physics.mechanics.functions import find_dynamicsymbols
+
+__all__ = ['SymbolicSystem']
+
+
+class SymbolicSystem(object):
+ """SymbolicSystem is a class that contains all the information about a
+ system in a symbolic format such as the equations of motions and the bodies
+ and loads in the system.
+
+ There are three ways that the equations of motion can be described for
+ Symbolic System:
+
+
+ [1] Explicit form where the kinematics and dynamics are combined
+ x' = F_1(x, t, r, p)
+
+ [2] Implicit form where the kinematics and dynamics are combined
+ M_2(x, p) x' = F_2(x, t, r, p)
+
+ [3] Implicit form where the kinematics and dynamics are separate
+ M_3(q, p) u' = F_3(q, u, t, r, p)
+ q' = G(q, u, t, r, p)
+
+ where
+
+ x : states, e.g. [q, u]
+ t : time
+ r : specified (exogenous) inputs
+ p : constants
+ q : generalized coordinates
+ u : generalized speeds
+ F_1 : right hand side of the combined equations in explicit form
+ F_2 : right hand side of the combined equations in implicit form
+ F_3 : right hand side of the dynamical equations in implicit form
+ M_2 : mass matrix of the combined equations in implicit form
+ M_3 : mass matrix of the dynamical equations in implicit form
+ G : right hand side of the kinematical differential equations
+
+ Parameters
+ ==========
+
+ coord_states : ordered iterable of functions of time
+ This input will either be a collection of the coordinates or states
+ of the system depending on whether or not the speeds are also given.
+ If speeds are specified this input will be assumed to be the
+ coordinates otherwise this input will be assumed to be the states.
+
+ right_hand_side : Matrix
+ This variable is the right hand side of the equations of motion in
+ any of the forms. The specific form will be assumed depending on
+ whether a mass matrix or coordinate derivatives are given.
+
+ speeds : ordered iterable of functions of time, optional
+ This is a collection of the generalized speeds of the system. If
+ given it will be assumed that the first argument (coord_states) will
+ represent the generalized coordinates of the system.
+
+ mass_matrix : Matrix, optional
+ The matrix of the implicit forms of the equations of motion (forms
+ [2] and [3]). The distinction between the forms is determined by
+ whether or not the coordinate derivatives are passed in. If they are
+ given form [3] will be assumed otherwise form [2] is assumed.
+
+ coordinate_derivatives : Matrix, optional
+ The right hand side of the kinematical equations in explicit form.
+ If given it will be assumed that the equations of motion are being
+ entered in form [3].
+
+ alg_con : Iterable, optional
+ The indexes of the rows in the equations of motion that contain
+ algebraic constraints instead of differential equations. If the
+ equations are input in form [3], it will be assumed the indexes are
+ referencing the mass_matrix/right_hand_side combination and not the
+ coordinate_derivatives.
+
+ output_eqns : Dictionary, optional
+ Any output equations that are desired to be tracked are stored in a
+ dictionary where the key corresponds to the name given for the
+ specific equation and the value is the equation itself in symbolic
+ form
+
+ coord_idxs : Iterable, optional
+ If coord_states corresponds to the states rather than the
+ coordinates this variable will tell SymbolicSystem which indexes of
+ the states correspond to generalized coordinates.
+
+ speed_idxs : Iterable, optional
+ If coord_states corresponds to the states rather than the
+ coordinates this variable will tell SymbolicSystem which indexes of
+ the states correspond to generalized speeds.
+
+ bodies : iterable of Body/Rigidbody objects, optional
+ Iterable containing the bodies of the system
+
+ loads : iterable of load instances (described below), optional
+ Iterable containing the loads of the system where forces are given
+ by (point of application, force vector) and torques are given by
+ (reference frame acting upon, torque vector). Ex [(point, force),
+ (ref_frame, torque)]
+
+ Attributes
+ ==========
+
+ coordinates : Matrix, shape(n, 1)
+ This is a matrix containing the generalized coordinates of the system
+
+ speeds : Matrix, shape(m, 1)
+ This is a matrix containing the generalized speeds of the system
+
+ states : Matrix, shape(o, 1)
+ This is a matrix containing the state variables of the system
+
+ alg_con : List
+ This list contains the indices of the algebraic constraints in the
+ combined equations of motion. The presence of these constraints requires
+ that a DAE solver be used instead of an ODE solver. If the system is
+ given in form [3] the alg_con variable will be adjusted such that it is
+ a representation of the combined kinematics and dynamics thus make sure
+ it always matches the mass matrix entered.
+
+ dyn_implicit_mat : Matrix, shape(m, m)
+ This is the M matrix in form [3] of the equations of motion (the mass
+ matrix or generalized inertia matrix of the dynamical equations of
+ motion in implicit form).
+
+ dyn_implicit_rhs : Matrix, shape(m, 1)
+ This is the F vector in form [3] of the equations of motion (the right
+ hand side of the dynamical equations of motion in implicit form).
+
+ comb_implicit_mat : Matrix, shape(o, o)
+ This is the M matrix in form [2] of the equations of motion. This matrix
+ contains a block diagonal structure where the top left block (the first
+ rows) represent the matrix in the implicit form of the kinematical
+ equations and the bottom right block (the last rows) represent the
+ matrix in the implicit form of the dynamical equations.
+
+ comb_implicit_rhs : Matrix, shape(o, 1)
+ This is the F vector in form [2] of the equations of motion. The top
+ part of the vector represents the right hand side of the implicit form
+        of the kinematical equations and the bottom of the vector represents the
+ right hand side of the implicit form of the dynamical equations of
+ motion.
+
+ comb_explicit_rhs : Matrix, shape(o, 1)
+ This vector represents the right hand side of the combined equations of
+ motion in explicit form (form [1] from above).
+
+ kin_explicit_rhs : Matrix, shape(m, 1)
+ This is the right hand side of the explicit form of the kinematical
+ equations of motion as can be seen in form [3] (the G matrix).
+
+ output_eqns : Dictionary
+ If output equations were given they are stored in a dictionary where the
+ key corresponds to the name given for the specific equation and the
+ value is the equation itself in symbolic form
+
+ bodies : Tuple
+ If the bodies in the system were given they are stored in a tuple for
+ future access
+
+ loads : Tuple
+ If the loads in the system were given they are stored in a tuple for
+ future access. This includes forces and torques where forces are given
+ by (point of application, force vector) and torques are given by
+ (reference frame acted upon, torque vector).
+
+ Example
+ =======
+
+ As a simple example, the dynamics of a simple pendulum will be input into a
+ SymbolicSystem object manually. First some imports will be needed and then
+ symbols will be set up for the length of the pendulum (l), mass at the end
+ of the pendulum (m), and a constant for gravity (g). ::
+
+ >>> from sympy import Matrix, sin, symbols
+ >>> from sympy.physics.mechanics import dynamicsymbols, SymbolicSystem
+ >>> l, m, g = symbols('l m g')
+
+ The system will be defined by an angle of theta from the vertical and a
+ generalized speed of omega will be used where omega = theta_dot. ::
+
+ >>> theta, omega = dynamicsymbols('theta omega')
+
+ Now the equations of motion are ready to be formed and passed to the
+ SymbolicSystem object. ::
+
+ >>> kin_explicit_rhs = Matrix([omega])
+ >>> dyn_implicit_mat = Matrix([l**2 * m])
+ >>> dyn_implicit_rhs = Matrix([-g * l * m * sin(theta)])
+ >>> symsystem = SymbolicSystem([theta], dyn_implicit_rhs, [omega],
+ ... dyn_implicit_mat)
+
+ Notes
+ =====
+
+ m : number of generalized speeds
+ n : number of generalized coordinates
+ o : number of states
+
+ """
+
+ def __init__(self, coord_states, right_hand_side, speeds=None,
+ mass_matrix=None, coordinate_derivatives=None, alg_con=None,
+ output_eqns={}, coord_idxs=None, speed_idxs=None, bodies=None,
+ loads=None):
+ """Initializes a SymbolicSystem object"""
+
+ # Extract information on speeds, coordinates and states
+ if speeds is None:
+ self._states = Matrix(coord_states)
+
+ if coord_idxs is None:
+ self._coordinates = None
+ else:
+ coords = [coord_states[i] for i in coord_idxs]
+ self._coordinates = Matrix(coords)
+
+ if speed_idxs is None:
+ self._speeds = None
+ else:
+ speeds_inter = [coord_states[i] for i in speed_idxs]
+ self._speeds = Matrix(speeds_inter)
+ else:
+ self._coordinates = Matrix(coord_states)
+ self._speeds = Matrix(speeds)
+ self._states = self._coordinates.col_join(self._speeds)
+
+ # Extract equations of motion form
+ if coordinate_derivatives is not None:
+ self._kin_explicit_rhs = coordinate_derivatives
+ self._dyn_implicit_rhs = right_hand_side
+ self._dyn_implicit_mat = mass_matrix
+ self._comb_implicit_rhs = None
+ self._comb_implicit_mat = None
+ self._comb_explicit_rhs = None
+ elif mass_matrix is not None:
+ self._kin_explicit_rhs = None
+ self._dyn_implicit_rhs = None
+ self._dyn_implicit_mat = None
+ self._comb_implicit_rhs = right_hand_side
+ self._comb_implicit_mat = mass_matrix
+ self._comb_explicit_rhs = None
+ else:
+ self._kin_explicit_rhs = None
+ self._dyn_implicit_rhs = None
+ self._dyn_implicit_mat = None
+ self._comb_implicit_rhs = None
+ self._comb_implicit_mat = None
+ self._comb_explicit_rhs = right_hand_side
+
+ # Set the remainder of the inputs as instance attributes
+ if alg_con is not None and coordinate_derivatives is not None:
+ alg_con = [i + len(coordinate_derivatives) for i in alg_con]
+ self._alg_con = alg_con
+ self.output_eqns = output_eqns
+
+ # Change the body and loads iterables to tuples if they are not tuples
+ # already
+ if type(bodies) != tuple and bodies is not None:
+ bodies = tuple(bodies)
+ if type(loads) != tuple and loads is not None:
+ loads = tuple(loads)
+ self._bodies = bodies
+ self._loads = loads
+
+ @property
+ def coordinates(self):
+ """Returns the column matrix of the generalized coordinates"""
+ if self._coordinates is None:
+ raise AttributeError("The coordinates were not specified.")
+ else:
+ return self._coordinates
+
+ @property
+ def speeds(self):
+ """Returns the column matrix of generalized speeds"""
+ if self._speeds is None:
+ raise AttributeError("The speeds were not specified.")
+ else:
+ return self._speeds
+
+ @property
+ def states(self):
+ """Returns the column matrix of the state variables"""
+ return self._states
+
+ @property
+ def alg_con(self):
+ """Returns a list with the indices of the rows containing algebraic
+ constraints in the combined form of the equations of motion"""
+ return self._alg_con
+
+ @property
+ def dyn_implicit_mat(self):
+ """Returns the matrix, M, corresponding to the dynamic equations in
+ implicit form, M x' = F, where the kinematical equations are not
+ included"""
+ if self._dyn_implicit_mat is None:
+ raise AttributeError("dyn_implicit_mat is not specified for "
+ "equations of motion form [1] or [2].")
+ else:
+ return self._dyn_implicit_mat
+
+ @property
+ def dyn_implicit_rhs(self):
+ """Returns the column matrix, F, corresponding to the dynamic equations
+ in implicit form, M x' = F, where the kinematical equations are not
+ included"""
+ if self._dyn_implicit_rhs is None:
+ raise AttributeError("dyn_implicit_rhs is not specified for "
+ "equations of motion form [1] or [2].")
+ else:
+ return self._dyn_implicit_rhs
+
+ @property
+ def comb_implicit_mat(self):
+ """Returns the matrix, M, corresponding to the equations of motion in
+ implicit form (form [2]), M x' = F, where the kinematical equations are
+ included"""
+ if self._comb_implicit_mat is None:
+ if self._dyn_implicit_mat is not None:
+ num_kin_eqns = len(self._kin_explicit_rhs)
+ num_dyn_eqns = len(self._dyn_implicit_rhs)
+ zeros1 = zeros(num_kin_eqns, num_dyn_eqns)
+ zeros2 = zeros(num_dyn_eqns, num_kin_eqns)
+ inter1 = eye(num_kin_eqns).row_join(zeros1)
+ inter2 = zeros2.row_join(self._dyn_implicit_mat)
+ self._comb_implicit_mat = inter1.col_join(inter2)
+ return self._comb_implicit_mat
+ else:
+ raise AttributeError("comb_implicit_mat is not specified for "
+ "equations of motion form [1].")
+ else:
+ return self._comb_implicit_mat
+
+ @property
+ def comb_implicit_rhs(self):
+ """Returns the column matrix, F, corresponding to the equations of
+ motion in implicit form (form [2]), M x' = F, where the kinematical
+ equations are included"""
+ if self._comb_implicit_rhs is None:
+ if self._dyn_implicit_rhs is not None:
+ kin_inter = self._kin_explicit_rhs
+ dyn_inter = self._dyn_implicit_rhs
+ self._comb_implicit_rhs = kin_inter.col_join(dyn_inter)
+ return self._comb_implicit_rhs
+ else:
+ raise AttributeError("comb_implicit_mat is not specified for "
+ "equations of motion in form [1].")
+ else:
+ return self._comb_implicit_rhs
+
+ def compute_explicit_form(self):
+        """If the explicit right hand side of the combined equations of motion
+        is not provided upon initialization, this method will calculate it.
+        This calculation can potentially take a while to compute."""
+ if self._comb_explicit_rhs is not None:
+ raise AttributeError("comb_explicit_rhs is already formed.")
+ else:
+ try:
+ inter1 = self.kin_explicit_rhs
+ inter2 = self._dyn_implicit_mat.LUsolve(self._dyn_implicit_rhs)
+ out = inter1.col_join(inter2)
+ except AttributeError:
+ out = self._comb_implicit_mat.LUsolve(self._comb_implicit_rhs)
+
+ self._comb_explicit_rhs = out
+
+ @property
+ def comb_explicit_rhs(self):
+ """Returns the right hand side of the equations of motion in explicit
+ form, x' = F, where the kinematical equations are included"""
+ if self._comb_explicit_rhs is None:
+            raise AttributeError("Please run .compute_explicit_form before "
+ "attempting to access comb_explicit_rhs.")
+ else:
+ return self._comb_explicit_rhs
+
+ @property
+ def kin_explicit_rhs(self):
+ """Returns the right hand side of the kinematical equations in explicit
+ form, q' = G"""
+ if self._kin_explicit_rhs is None:
+ raise AttributeError("kin_explicit_rhs is not specified for "
+ "equations of motion form [1] or [2].")
+ else:
+ return self._kin_explicit_rhs
+
+ def dynamic_symbols(self):
+        """Returns a tuple containing all of the symbols in the system that
+        depend on time"""
+ # Create a list of all of the expressions in the equations of motion
+ if self._comb_explicit_rhs is None:
+ eom_expressions = (self.comb_implicit_mat[:] +
+ self.comb_implicit_rhs[:])
+ else:
+ eom_expressions = (self._comb_explicit_rhs[:])
+
+ functions_of_time = set()
+ for expr in eom_expressions:
+ functions_of_time = functions_of_time.union(
+ find_dynamicsymbols(expr))
+ functions_of_time = functions_of_time.union(self._states)
+
+ return tuple(functions_of_time)
+
+ def constant_symbols(self):
+        """Returns a tuple containing all of the symbols in the system that do
+        not depend on time"""
+ # Create a list of all of the expressions in the equations of motion
+ if self._comb_explicit_rhs is None:
+ eom_expressions = (self.comb_implicit_mat[:] +
+ self.comb_implicit_rhs[:])
+ else:
+ eom_expressions = (self._comb_explicit_rhs[:])
+
+ constants = set()
+ for expr in eom_expressions:
+ constants = constants.union(expr.free_symbols)
+ constants.remove(dynamicsymbols._t)
+
+ return tuple(constants)
+
+ @property
+ def bodies(self):
+ """Returns the bodies in the system"""
+ if self._bodies is None:
+ raise AttributeError("bodies were not specified for the system.")
+ else:
+ return self._bodies
+
+ @property
+ def loads(self):
+ """Returns the loads in the system"""
+ if self._loads is None:
+ raise AttributeError("loads were not specified for the system.")
+ else:
+ return self._loads
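The properties above follow a guard-and-lazily-compute pattern: `_comb_explicit_rhs` starts as `None`, `compute_explicit_form` fills it in via `LUsolve`, and the property raises `AttributeError` until that has happened. A minimal standalone sketch of the same pattern (the toy class and its diagonal "mass matrix" are illustrative, not part of the patch):

```python
class LazySystem:
    """Toy mirror of SymbolicSystem's guarded, lazily computed attribute."""

    def __init__(self, mat_diag, rhs):
        self._mat_diag = mat_diag   # stand-in for comb_implicit_mat (diagonal)
        self._rhs = rhs             # stand-in for comb_implicit_rhs
        self._explicit = None       # not computed yet

    def compute_explicit_form(self):
        # Stand-in for the potentially slow LUsolve call.
        self._explicit = [r / m for r, m in zip(self._rhs, self._mat_diag)]

    @property
    def explicit_rhs(self):
        if self._explicit is None:
            raise AttributeError("run .compute_explicit_form first")
        return self._explicit


sys_ = LazySystem([2.0, 4.0], [6.0, 8.0])
try:
    sys_.explicit_rhs
except AttributeError:
    pass                            # guarded until computed
sys_.compute_explicit_form()
print(sys_.explicit_rhs)            # [3.0, 2.0]
```

The guard makes the expensive solve opt-in while still surfacing a clear error when the attribute is accessed too early.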
| diff --git a/sympy/physics/mechanics/tests/test_system.py b/sympy/physics/mechanics/tests/test_system.py
new file mode 100644
index 000000000000..99d340241b0d
--- /dev/null
+++ b/sympy/physics/mechanics/tests/test_system.py
@@ -0,0 +1,244 @@
+from sympy import symbols, Matrix, atan, simplify, zeros
+from sympy.physics.mechanics import (dynamicsymbols, Particle, Point,
+ ReferenceFrame, SymbolicSystem)
+from sympy.utilities.pytest import raises
+
+# This class is going to be tested using a simple pendulum set up in x and y
+# coordinates
+x, y, u, v, lam = dynamicsymbols('x y u v lambda')
+m, l, g = symbols('m l g')
+
+# Set up the different forms the equations can take
+# [1] Explicit form where the kinematics and dynamics are combined
+# x' = F(x, t, r, p)
+#
+# [2] Implicit form where the kinematics and dynamics are combined
+# M(x, p) x' = F(x, t, r, p)
+#
+# [3] Implicit form where the kinematics and dynamics are separate
+# M(q, p) u' = F(q, u, t, r, p)
+# q' = G(q, u, t, r, p)
+dyn_implicit_mat = Matrix([[1, 0, -x/m],
+ [0, 1, -y/m],
+ [0, 0, l**2/m]])
+
+dyn_implicit_rhs = Matrix([0, 0, u**2 + v**2 - g*y])
+
+comb_implicit_mat = Matrix([[1, 0, 0, 0, 0],
+ [0, 1, 0, 0, 0],
+ [0, 0, 1, 0, -x/m],
+ [0, 0, 0, 1, -y/m],
+ [0, 0, 0, 0, l**2/m]])
+
+comb_implicit_rhs = Matrix([u, v, 0, 0, u**2 + v**2 - g*y])
+
+kin_explicit_rhs = Matrix([u, v])
+
+comb_explicit_rhs = comb_implicit_mat.LUsolve(comb_implicit_rhs)
+
+# Set up a body and load to pass into the system
+theta = atan(x / y)
+N = ReferenceFrame('N')
+A = N.orientnew('A', 'Axis', [theta, N.z])
+O = Point('O')
+P = O.locatenew('P', l * A.x)
+
+Pa = Particle('Pa', P, m)
+
+bodies = [Pa]
+loads = [(P, g * m * N.x)]
+
+# Set up some output equations to be given to SymbolicSystem
+# Change to make these fit the pendulum
+PE = symbols("PE")
+out_eqns = {PE: m*g*(l+y)}
+
+# Set up remaining arguments that can be passed to SymbolicSystem
+alg_con = [2]
+alg_con_full = [4]
+coordinates = (x, y, lam)
+speeds = (u, v)
+states = (x, y, u, v, lam)
+coord_idxs = (0, 1)
+speed_idxs = (2, 3)
+
+
+def test_form_1():
+ symsystem1 = SymbolicSystem(states, comb_explicit_rhs,
+ alg_con=alg_con_full, output_eqns=out_eqns,
+ coord_idxs=coord_idxs, speed_idxs=speed_idxs,
+ bodies=bodies, loads=loads)
+
+ assert symsystem1.coordinates == Matrix([x, y])
+ assert symsystem1.speeds == Matrix([u, v])
+ assert symsystem1.states == Matrix([x, y, u, v, lam])
+
+ assert symsystem1.alg_con == [4]
+
+ inter = comb_explicit_rhs
+ assert simplify(symsystem1.comb_explicit_rhs - inter) == zeros(5, 1)
+
+ assert set(symsystem1.dynamic_symbols()) == set([y, v, lam, u, x])
+ assert type(symsystem1.dynamic_symbols()) == tuple
+ assert set(symsystem1.constant_symbols()) == set([l, g, m])
+ assert type(symsystem1.constant_symbols()) == tuple
+
+ assert symsystem1.output_eqns == out_eqns
+
+ assert symsystem1.bodies == (Pa,)
+ assert symsystem1.loads == ((P, g * m * N.x),)
+
+
+def test_form_2():
+ symsystem2 = SymbolicSystem(coordinates, comb_implicit_rhs, speeds=speeds,
+ mass_matrix=comb_implicit_mat,
+ alg_con=alg_con_full, output_eqns=out_eqns,
+ bodies=bodies, loads=loads)
+
+ assert symsystem2.coordinates == Matrix([x, y, lam])
+ assert symsystem2.speeds == Matrix([u, v])
+ assert symsystem2.states == Matrix([x, y, lam, u, v])
+
+ assert symsystem2.alg_con == [4]
+
+ inter = comb_implicit_rhs
+ assert simplify(symsystem2.comb_implicit_rhs - inter) == zeros(5, 1)
+ assert simplify(symsystem2.comb_implicit_mat-comb_implicit_mat) == zeros(5)
+
+ assert set(symsystem2.dynamic_symbols()) == set([y, v, lam, u, x])
+ assert type(symsystem2.dynamic_symbols()) == tuple
+ assert set(symsystem2.constant_symbols()) == set([l, g, m])
+ assert type(symsystem2.constant_symbols()) == tuple
+
+ inter = comb_explicit_rhs
+ symsystem2.compute_explicit_form()
+ assert simplify(symsystem2.comb_explicit_rhs - inter) == zeros(5, 1)
+
+ assert symsystem2.output_eqns == out_eqns
+
+ assert symsystem2.bodies == (Pa,)
+ assert symsystem2.loads == ((P, g * m * N.x),)
+
+
+def test_form_3():
+ symsystem3 = SymbolicSystem(states, dyn_implicit_rhs,
+ mass_matrix=dyn_implicit_mat,
+ coordinate_derivatives=kin_explicit_rhs,
+ alg_con=alg_con, coord_idxs=coord_idxs,
+ speed_idxs=speed_idxs, bodies=bodies,
+ loads=loads)
+
+ assert symsystem3.coordinates == Matrix([x, y])
+ assert symsystem3.speeds == Matrix([u, v])
+ assert symsystem3.states == Matrix([x, y, u, v, lam])
+
+ assert symsystem3.alg_con == [4]
+
+ inter1 = kin_explicit_rhs
+ inter2 = dyn_implicit_rhs
+ assert simplify(symsystem3.kin_explicit_rhs - inter1) == zeros(2, 1)
+ assert simplify(symsystem3.dyn_implicit_mat - dyn_implicit_mat) == zeros(3)
+ assert simplify(symsystem3.dyn_implicit_rhs - inter2) == zeros(3, 1)
+
+ inter = comb_implicit_rhs
+ assert simplify(symsystem3.comb_implicit_rhs - inter) == zeros(5, 1)
+ assert simplify(symsystem3.comb_implicit_mat-comb_implicit_mat) == zeros(5)
+
+ inter = comb_explicit_rhs
+ symsystem3.compute_explicit_form()
+ assert simplify(symsystem3.comb_explicit_rhs - inter) == zeros(5, 1)
+
+ assert set(symsystem3.dynamic_symbols()) == set([y, v, lam, u, x])
+ assert type(symsystem3.dynamic_symbols()) == tuple
+ assert set(symsystem3.constant_symbols()) == set([l, g, m])
+ assert type(symsystem3.constant_symbols()) == tuple
+
+ assert symsystem3.output_eqns == {}
+
+ assert symsystem3.bodies == (Pa,)
+ assert symsystem3.loads == ((P, g * m * N.x),)
+
+
+def test_property_attributes():
+ symsystem = SymbolicSystem(states, comb_explicit_rhs,
+ alg_con=alg_con_full, output_eqns=out_eqns,
+ coord_idxs=coord_idxs, speed_idxs=speed_idxs,
+ bodies=bodies, loads=loads)
+
+ with raises(AttributeError):
+ symsystem.bodies = 42
+ with raises(AttributeError):
+ symsystem.coordinates = 42
+ with raises(AttributeError):
+ symsystem.dyn_implicit_rhs = 42
+ with raises(AttributeError):
+ symsystem.comb_implicit_rhs = 42
+ with raises(AttributeError):
+ symsystem.loads = 42
+ with raises(AttributeError):
+ symsystem.dyn_implicit_mat = 42
+ with raises(AttributeError):
+ symsystem.comb_implicit_mat = 42
+ with raises(AttributeError):
+ symsystem.kin_explicit_rhs = 42
+ with raises(AttributeError):
+ symsystem.comb_explicit_rhs = 42
+ with raises(AttributeError):
+ symsystem.speeds = 42
+ with raises(AttributeError):
+ symsystem.states = 42
+ with raises(AttributeError):
+ symsystem.alg_con = 42
+
+
+def test_not_specified_errors():
+ """This test will cover errors that arise from trying to access attributes
+    that were not specified upon object creation or were specified on creation
+    and the user tries to recalculate them."""
+ # Trying to access form 2 when form 1 given
+ # Trying to access form 3 when form 2 given
+
+ symsystem1 = SymbolicSystem(states, comb_explicit_rhs)
+
+ with raises(AttributeError):
+ symsystem1.comb_implicit_mat
+ with raises(AttributeError):
+ symsystem1.comb_implicit_rhs
+ with raises(AttributeError):
+ symsystem1.dyn_implicit_mat
+ with raises(AttributeError):
+ symsystem1.dyn_implicit_rhs
+ with raises(AttributeError):
+ symsystem1.kin_explicit_rhs
+ with raises(AttributeError):
+ symsystem1.compute_explicit_form()
+
+ symsystem2 = SymbolicSystem(coordinates, comb_implicit_rhs, speeds=speeds,
+ mass_matrix=comb_implicit_mat)
+
+ with raises(AttributeError):
+ symsystem2.dyn_implicit_mat
+ with raises(AttributeError):
+ symsystem2.dyn_implicit_rhs
+ with raises(AttributeError):
+ symsystem2.kin_explicit_rhs
+
+ # Attribute error when trying to access coordinates and speeds when only the
+ # states were given.
+ with raises(AttributeError):
+ symsystem1.coordinates
+ with raises(AttributeError):
+ symsystem1.speeds
+
+ # Attribute error when trying to access bodies and loads when they are not
+ # given
+ with raises(AttributeError):
+ symsystem1.bodies
+ with raises(AttributeError):
+ symsystem1.loads
+
+ # Attribute error when trying to access comb_explicit_rhs before it was
+ # calculated
+ with raises(AttributeError):
+ symsystem2.comb_explicit_rhs
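The fixtures above encode the same constraint twice: `alg_con = [2]` indexes the dynamics-only form [3], while `alg_con_full = [4]` indexes the combined form in which the two kinematical rows come first. A plausible sketch of that index adjustment (the helper name is mine, not part of the patch):

```python
def combined_alg_con(alg_con, num_kin_rows):
    """Shift algebraic-constraint row indices from the dynamics-only
    form [3] to the combined form ([2]/[1]), where the kinematical
    rows are stacked on top of the dynamical rows."""
    return [idx + num_kin_rows for idx in alg_con]


# Two kinematical equations (x' = u, y' = v) sit above the dynamics,
# so the constraint in dynamics row 2 lands in combined row 4.
print(combined_alg_con([2], 2))   # [4]
```

This matches the tests' expectation that `symsystem3.alg_con == [4]` even though `alg_con=[2]` was passed in with the form [3] inputs.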
| diff --git a/doc/src/modules/physics/mechanics/api/system.rst b/doc/src/modules/physics/mechanics/api/system.rst
new file mode 100644
index 000000000000..fb2da1898af2
--- /dev/null
+++ b/doc/src/modules/physics/mechanics/api/system.rst
@@ -0,0 +1,9 @@
+===========================
+SymbolicSystem (Docstrings)
+===========================
+
+SymbolicSystem
+==============
+
+.. automodule:: sympy.physics.mechanics.system
+ :members:
diff --git a/doc/src/modules/physics/mechanics/index.rst b/doc/src/modules/physics/mechanics/index.rst
index e6cb40f9dac9..7a40f880f09e 100644
--- a/doc/src/modules/physics/mechanics/index.rst
+++ b/doc/src/modules/physics/mechanics/index.rst
@@ -76,6 +76,7 @@ Guide to Mechanics
masses.rst
kane.rst
lagrange.rst
+ symsystem.rst
linearize.rst
examples.rst
advanced.rst
@@ -89,6 +90,7 @@ Mechanics API
api/part_bod.rst
api/kane_lagrange.rst
+ api/system.rst
api/linearize.rst
api/expr_manip.rst
api/printing.rst
diff --git a/doc/src/modules/physics/mechanics/symsystem.rst b/doc/src/modules/physics/mechanics/symsystem.rst
new file mode 100644
index 000000000000..a5ca30feb8b7
--- /dev/null
+++ b/doc/src/modules/physics/mechanics/symsystem.rst
@@ -0,0 +1,202 @@
+=====================================
+Symbolic Systems in Physics/Mechanics
+=====================================
+
+The `SymbolicSystem` class in physics/mechanics is a location for the pertinent
+information of a multibody dynamic system. In its most basic form it contains
+the equations of motion for the dynamic system; however, it can also contain
+information regarding the loads that the system is subject to, the bodies that
+the system comprises, and any additional equations the user feels are
+important for the system. The goal of this class is to provide a unified output
+format for the equations of motion that numerical analysis code can be designed
+around.
+
+SymbolicSystem Example Usage
+============================
+
+This code will go over the manual input of the equations of motion for the
+simple pendulum that uses the Cartesian location of the mass as the generalized
+coordinates into `SymbolicSystem`.
+
+The equations of motion are formed in the physics/mechanics/examples_. In that
+example the variables q1 and q2 are used in place of x and y and the reference
+frame is rotated 90 degrees.
+
+.. _examples: ../examples/lin_pend_nonmin_example.html
+
+::
+
+ >>> from sympy import atan, symbols, Matrix
+ >>> from sympy.physics.mechanics import (dynamicsymbols, ReferenceFrame,
+ ... Particle, Point)
+ >>> import sympy.physics.mechanics.system as system
+
+The first step will be to initialize all of the dynamic and constant symbols. ::
+
+ >>> x, y, u, v, lam = dynamicsymbols('x y u v lambda')
+ >>> m, l, g = symbols('m l g')
+
+Next step is to define the equations of motion in multiple forms:
+
+ [1] Explicit form where the kinematics and dynamics are combined
+ x' = F_1(x, t, r, p)
+
+ [2] Implicit form where the kinematics and dynamics are combined
+ M_2(x, p) x' = F_2(x, t, r, p)
+
+ [3] Implicit form where the kinematics and dynamics are separate
+ M_3(q, p) u' = F_3(q, u, t, r, p)
+ q' = G(q, u, t, r, p)
+
+where
+
+ x : states, e.g. [q, u]
+ t : time
+ r : specified (exogenous) inputs
+ p : constants
+ q : generalized coordinates
+ u : generalized speeds
+ F_1 : right hand side of the combined equations in explicit form
+ F_2 : right hand side of the combined equations in implicit form
+ F_3 : right hand side of the dynamical equations in implicit form
+ M_2 : mass matrix of the combined equations in implicit form
+ M_3 : mass matrix of the dynamical equations in implicit form
+ G : right hand side of the kinematical differential equations
+
+::
+
+ >>> dyn_implicit_mat = Matrix([[1, 0, -x/m],
+ ... [0, 1, -y/m],
+ ... [0, 0, l**2/m]])
+ >>> dyn_implicit_rhs = Matrix([0, 0, u**2 + v**2 - g*y])
+ >>> comb_implicit_mat = Matrix([[1, 0, 0, 0, 0],
+ ... [0, 1, 0, 0, 0],
+ ... [0, 0, 1, 0, -x/m],
+ ... [0, 0, 0, 1, -y/m],
+ ... [0, 0, 0, 0, l**2/m]])
+ >>> comb_implicit_rhs = Matrix([u, v, 0, 0, u**2 + v**2 - g*y])
+ >>> kin_explicit_rhs = Matrix([u, v])
+ >>> comb_explicit_rhs = comb_implicit_mat.LUsolve(comb_implicit_rhs)
+
+Now the reference frames, points and particles will be set up so this
+information can be passed into `system.SymbolicSystem` in the form of bodies
+and loads iterables. ::
+
+ >>> theta = atan(x/y)
+ >>> omega = dynamicsymbols('omega')
+ >>> N = ReferenceFrame('N')
+ >>> A = N.orientnew('A', 'Axis', [theta, N.z])
+ >>> A.set_ang_vel(N, omega * N.z)
+ >>> O = Point('O')
+ >>> O.set_vel(N, 0)
+ >>> P = O.locatenew('P', l * A.x)
+ >>> P.v2pt_theory(O, N, A)
+ l*omega*A.y
+ >>> Pa = Particle('Pa', P, m)
+
+Now the bodies and loads iterables need to be initialized. ::
+
+ >>> bodies = [Pa]
+ >>> loads = [(P, g * m * N.x)]
+
+The equations of motion are in the form of a differential algebraic equation
+(DAE) and DAE solvers need to know which of the equations are the algebraic
+expressions. This information is passed into `SymbolicSystem` as a list
+specifying which rows are the algebraic equations. In this example the row
+differs depending on the chosen equations of motion format. The row index
+passed in should always correspond to the mass matrix that is being input to
+the `SymbolicSystem` class, but when accessed from the `SymbolicSystem` class
+it will always correspond to the row index of the combined dynamics and
+kinematics. ::
+
+ >>> alg_con = [2]
+ >>> alg_con_full = [4]
+
+An iterable containing the states now needs to be created for the system. The
+`SymbolicSystem` class can determine which of the states are coordinates and
+which are speeds if the indexes of the coordinates and speeds are passed in.
+If these indexes are not passed in, the object will not be able to
+differentiate between coordinates and speeds. ::
+
+ >>> states = (x, y, u, v, lam)
+ >>> coord_idxs = (0, 1)
+ >>> speed_idxs = (2, 3)
+
+Now the equations of motion instances can be created using the above mentioned
+equations of motion formats. ::
+
+ >>> symsystem1 = system.SymbolicSystem(states, comb_explicit_rhs,
+ ... alg_con=alg_con_full, bodies=bodies,
+ ... loads=loads)
+ >>> symsystem2 = system.SymbolicSystem(states, comb_implicit_rhs,
+ ... mass_matrix=comb_implicit_mat,
+ ... alg_con=alg_con_full,
+ ... coord_idxs=coord_idxs)
+ >>> symsystem3 = system.SymbolicSystem(states, dyn_implicit_rhs,
+ ... mass_matrix=dyn_implicit_mat,
+ ... coordinate_derivatives=kin_explicit_rhs,
+ ... alg_con=alg_con,
+ ... coord_idxs=coord_idxs,
+ ... speed_idxs=speed_idxs)
+
+
+Like coordinates and speeds, the bodies and loads attributes can only be
+accessed if they are specified during initialization of the `SymbolicSystem`
+class. Lastly, here are some attributes accessible from the `SymbolicSystem`
+class. ::
+
+ >>> symsystem1.states
+ Matrix([
+ [ x(t)],
+ [ y(t)],
+ [ u(t)],
+ [ v(t)],
+ [lambda(t)]])
+ >>> symsystem2.coordinates
+ Matrix([
+ [x(t)],
+ [y(t)]])
+ >>> symsystem3.speeds
+ Matrix([
+ [u(t)],
+ [v(t)]])
+ >>> symsystem1.comb_explicit_rhs
+ Matrix([
+ [ u(t)],
+ [ v(t)],
+ [(-g*y(t) + u(t)**2 + v(t)**2)*x(t)/l**2],
+ [(-g*y(t) + u(t)**2 + v(t)**2)*y(t)/l**2],
+ [ m*(-g*y(t) + u(t)**2 + v(t)**2)/l**2]])
+ >>> symsystem2.comb_implicit_rhs
+ Matrix([
+ [ u(t)],
+ [ v(t)],
+ [ 0],
+ [ 0],
+ [-g*y(t) + u(t)**2 + v(t)**2]])
+ >>> symsystem2.comb_implicit_mat
+ Matrix([
+ [1, 0, 0, 0, 0],
+ [0, 1, 0, 0, 0],
+ [0, 0, 1, 0, -x(t)/m],
+ [0, 0, 0, 1, -y(t)/m],
+ [0, 0, 0, 0, l**2/m]])
+ >>> symsystem3.dyn_implicit_rhs
+ Matrix([
+ [ 0],
+ [ 0],
+ [-g*y(t) + u(t)**2 + v(t)**2]])
+ >>> symsystem3.dyn_implicit_mat
+ Matrix([
+ [1, 0, -x(t)/m],
+ [0, 1, -y(t)/m],
+ [0, 0, l**2/m]])
+ >>> symsystem3.kin_explicit_rhs
+ Matrix([
+ [u(t)],
+ [v(t)]])
+ >>> symsystem1.alg_con
+ [4]
+ >>> symsystem1.bodies
+ (Pa,)
+ >>> symsystem1.loads
+ ((P, g*m*N.x),)
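Because `dyn_implicit_mat` for this pendulum is upper triangular, the implicit forms can be reduced to the explicit form [1] by back-substitution alone. A quick numeric spot-check at an assumed state (m = 1, l = 1, g = 10, a point on the constraint circle — these numbers are mine, not from the docs above) reproduces rows 3-5 of the `comb_explicit_rhs` matrix shown above:

```python
# Assumed numeric state for the planar pendulum example above.
m, l, g = 1.0, 1.0, 10.0
x, y = 0.6, 0.8            # satisfies x**2 + y**2 == l**2
u, v = 0.8, -0.6           # velocity tangent to the circle

# Form [3]: M * [u', v', lam]^T = F with M upper triangular:
#   [1, 0, -x/m; 0, 1, -y/m; 0, 0, l**2/m]
F = [0.0, 0.0, u**2 + v**2 - g * y]
lam = F[2] / (l**2 / m)    # last row first: (l**2/m) * lam = F[2]
du = F[0] + (x / m) * lam  # row 1: u' - (x/m)*lam = F[0]
dv = F[1] + (y / m) * lam  # row 2: v' - (y/m)*lam = F[1]

# Matches (u**2 + v**2 - g*y)*x/l**2, ...*y/l**2, and m*(...)/l**2.
print(round(du, 6), round(dv, 6), round(lam, 6))   # -4.2 -5.6 -7.0
```

The same reduction is what `comb_implicit_mat.LUsolve(comb_implicit_rhs)` performs symbolically.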
| [
{
"components": [
{
"doc": "SymbolicSystem is a class that contains all the information about a\nsystem in a symbolic format such as the equations of motions and the bodies\nand loads in the system.\n\nThere are three ways that the equations of motion can be described for\nSymbolic System:\n\n\n ... | [
"test_form_1",
"test_form_2",
"test_form_3",
"test_property_attributes"
] | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Added system.py to physics/mechanics
Added a class called SymbolicSystem to physics/mechanics to act as a container class for the equations of motion and other details related to dynamic systems. I have created all of the class attributes and docstrings. This work emerged from work done in PR #11182 and in PR [#353](https://github.com/pydy/pydy/pull/353) in the pydy repository.
Would it be better to name the file `symbolicsystem`, `symsystem` or just leave it as `system`?
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in sympy/physics/mechanics/system.py]
(definition of SymbolicSystem:)
class SymbolicSystem(object):
"""SymbolicSystem is a class that contains all the information about a
system in a symbolic format such as the equations of motion and the bodies
and loads in the system.
There are three ways that the equations of motion can be described for
Symbolic System:
[1] Explicit form where the kinematics and dynamics are combined
x' = F_1(x, t, r, p)
[2] Implicit form where the kinematics and dynamics are combined
M_2(x, p) x' = F_2(x, t, r, p)
[3] Implicit form where the kinematics and dynamics are separate
M_3(q, p) u' = F_3(q, u, t, r, p)
q' = G(q, u, t, r, p)
where
x : states, e.g. [q, u]
t : time
r : specified (exogenous) inputs
p : constants
q : generalized coordinates
u : generalized speeds
F_1 : right hand side of the combined equations in explicit form
F_2 : right hand side of the combined equations in implicit form
F_3 : right hand side of the dynamical equations in implicit form
M_2 : mass matrix of the combined equations in implicit form
M_3 : mass matrix of the dynamical equations in implicit form
G : right hand side of the kinematical differential equations
Parameters
==========
coord_states : ordered iterable of functions of time
This input will either be a collection of the coordinates or states
of the system depending on whether or not the speeds are also given.
If speeds are specified this input will be assumed to be the
coordinates otherwise this input will be assumed to be the states.
right_hand_side : Matrix
This variable is the right hand side of the equations of motion in
any of the forms. The specific form will be assumed depending on
whether a mass matrix or coordinate derivatives are given.
speeds : ordered iterable of functions of time, optional
This is a collection of the generalized speeds of the system. If
given it will be assumed that the first argument (coord_states) will
represent the generalized coordinates of the system.
mass_matrix : Matrix, optional
The matrix of the implicit forms of the equations of motion (forms
[2] and [3]). The distinction between the forms is determined by
whether or not the coordinate derivatives are passed in. If they are
given form [3] will be assumed otherwise form [2] is assumed.
coordinate_derivatives : Matrix, optional
The right hand side of the kinematical equations in explicit form.
If given it will be assumed that the equations of motion are being
entered in form [3].
alg_con : Iterable, optional
The indexes of the rows in the equations of motion that contain
algebraic constraints instead of differential equations. If the
equations are input in form [3], it will be assumed the indexes are
referencing the mass_matrix/right_hand_side combination and not the
coordinate_derivatives.
output_eqns : Dictionary, optional
Any output equations that are desired to be tracked are stored in a
dictionary where the key corresponds to the name given for the
specific equation and the value is the equation itself in symbolic
form
coord_idxs : Iterable, optional
If coord_states corresponds to the states rather than the
coordinates this variable will tell SymbolicSystem which indexes of
the states correspond to generalized coordinates.
speed_idxs : Iterable, optional
If coord_states corresponds to the states rather than the
coordinates this variable will tell SymbolicSystem which indexes of
the states correspond to generalized speeds.
bodies : iterable of Body/Rigidbody objects, optional
Iterable containing the bodies of the system
loads : iterable of load instances (described below), optional
Iterable containing the loads of the system where forces are given
by (point of application, force vector) and torques are given by
(reference frame acting upon, torque vector). Ex [(point, force),
(ref_frame, torque)]
Attributes
==========
coordinates : Matrix, shape(n, 1)
This is a matrix containing the generalized coordinates of the system
speeds : Matrix, shape(m, 1)
This is a matrix containing the generalized speeds of the system
states : Matrix, shape(o, 1)
This is a matrix containing the state variables of the system
alg_con : List
This list contains the indices of the algebraic constraints in the
combined equations of motion. The presence of these constraints requires
that a DAE solver be used instead of an ODE solver. If the system is
given in form [3] the alg_con variable will be adjusted such that it is
a representation of the combined kinematics and dynamics, so make sure
it always matches the mass matrix that was entered.
dyn_implicit_mat : Matrix, shape(m, m)
This is the M matrix in form [3] of the equations of motion (the mass
matrix or generalized inertia matrix of the dynamical equations of
motion in implicit form).
dyn_implicit_rhs : Matrix, shape(m, 1)
This is the F vector in form [3] of the equations of motion (the right
hand side of the dynamical equations of motion in implicit form).
comb_implicit_mat : Matrix, shape(o, o)
This is the M matrix in form [2] of the equations of motion. This matrix
contains a block diagonal structure where the top left block (the first
rows) represent the matrix in the implicit form of the kinematical
equations and the bottom right block (the last rows) represent the
matrix in the implicit form of the dynamical equations.
comb_implicit_rhs : Matrix, shape(o, 1)
This is the F vector in form [2] of the equations of motion. The top
part of the vector represents the right hand side of the implicit form
of the kinematical equations and the bottom of the vector represents the
right hand side of the implicit form of the dynamical equations of
motion.
comb_explicit_rhs : Matrix, shape(o, 1)
This vector represents the right hand side of the combined equations of
motion in explicit form (form [1] from above).
kin_explicit_rhs : Matrix, shape(m, 1)
This is the right hand side of the explicit form of the kinematical
equations of motion as can be seen in form [3] (the G matrix).
output_eqns : Dictionary
If output equations were given they are stored in a dictionary where the
key corresponds to the name given for the specific equation and the
value is the equation itself in symbolic form
bodies : Tuple
If the bodies in the system were given they are stored in a tuple for
future access
loads : Tuple
If the loads in the system were given they are stored in a tuple for
future access. This includes forces and torques where forces are given
by (point of application, force vector) and torques are given by
(reference frame acted upon, torque vector).
Example
=======
As a simple example, the dynamics of a simple pendulum will be input into a
SymbolicSystem object manually. First some imports will be needed and then
symbols will be set up for the length of the pendulum (l), mass at the end
of the pendulum (m), and a constant for gravity (g). ::
>>> from sympy import Matrix, sin, symbols
>>> from sympy.physics.mechanics import dynamicsymbols, SymbolicSystem
>>> l, m, g = symbols('l m g')
The system will be defined by an angle of theta from the vertical and a
generalized speed of omega will be used where omega = theta_dot. ::
>>> theta, omega = dynamicsymbols('theta omega')
Now the equations of motion are ready to be formed and passed to the
SymbolicSystem object. ::
>>> kin_explicit_rhs = Matrix([omega])
>>> dyn_implicit_mat = Matrix([l**2 * m])
>>> dyn_implicit_rhs = Matrix([-g * l * m * sin(theta)])
>>> symsystem = SymbolicSystem([theta], dyn_implicit_rhs, [omega],
... dyn_implicit_mat)
Notes
=====
m : number of generalized speeds
n : number of generalized coordinates
o : number of states"""
(definition of SymbolicSystem.__init__:)
def __init__(self, coord_states, right_hand_side, speeds=None, mass_matrix=None, coordinate_derivatives=None, alg_con=None, output_eqns={}, coord_idxs=None, speed_idxs=None, bodies=None, loads=None):
"""Initializes a SymbolicSystem object"""
(definition of SymbolicSystem.coordinates:)
def coordinates(self):
"""Returns the column matrix of the generalized coordinates"""
(definition of SymbolicSystem.speeds:)
def speeds(self):
"""Returns the column matrix of generalized speeds"""
(definition of SymbolicSystem.states:)
def states(self):
"""Returns the column matrix of the state variables"""
(definition of SymbolicSystem.alg_con:)
def alg_con(self):
"""Returns a list with the indices of the rows containing algebraic
constraints in the combined form of the equations of motion"""
(definition of SymbolicSystem.dyn_implicit_mat:)
def dyn_implicit_mat(self):
"""Returns the matrix, M, corresponding to the dynamic equations in
implicit form, M x' = F, where the kinematical equations are not
included"""
(definition of SymbolicSystem.dyn_implicit_rhs:)
def dyn_implicit_rhs(self):
"""Returns the column matrix, F, corresponding to the dynamic equations
in implicit form, M x' = F, where the kinematical equations are not
included"""
(definition of SymbolicSystem.comb_implicit_mat:)
def comb_implicit_mat(self):
"""Returns the matrix, M, corresponding to the equations of motion in
implicit form (form [2]), M x' = F, where the kinematical equations are
included"""
(definition of SymbolicSystem.comb_implicit_rhs:)
def comb_implicit_rhs(self):
"""Returns the column matrix, F, corresponding to the equations of
motion in implicit form (form [2]), M x' = F, where the kinematical
equations are included"""
(definition of SymbolicSystem.compute_explicit_form:)
def compute_explicit_form(self):
"""If the explicit right hand side of the combined equations of motion
is not provided upon initialization, this method will calculate it. This
calculation can potentially take a while to compute."""
(definition of SymbolicSystem.comb_explicit_rhs:)
def comb_explicit_rhs(self):
"""Returns the right hand side of the equations of motion in explicit
form, x' = F, where the kinematical equations are included"""
(definition of SymbolicSystem.kin_explicit_rhs:)
def kin_explicit_rhs(self):
"""Returns the right hand side of the kinematical equations in explicit
form, q' = G"""
(definition of SymbolicSystem.dynamic_symbols:)
def dynamic_symbols(self):
"""Returns a column matrix containing all of the symbols in the system
that depend on time"""
(definition of SymbolicSystem.constant_symbols:)
def constant_symbols(self):
"""Returns a column matrix containing all of the symbols in the system
that do not depend on time"""
(definition of SymbolicSystem.bodies:)
def bodies(self):
"""Returns the bodies in the system"""
(definition of SymbolicSystem.loads:)
def loads(self):
"""Returns the loads in the system"""
[end of new definitions in sympy/physics/mechanics/system.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 820363f5b17cbe5809ef0911ea539e135c179c62 | |
python-hyper__h2-265 | 265 | python-hyper/h2 | null | 9f0116fa5de42de81962dbaab0abf5edc54c9801 | 2016-07-25T09:54:32Z | diff --git a/HISTORY.rst b/HISTORY.rst
index e3fcb6fe9..ef9090bab 100644
--- a/HISTORY.rst
+++ b/HISTORY.rst
@@ -4,6 +4,13 @@ Release History
2.5.0dev0
---------
+API Changes (Backward-Compatible)
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+- Added a new ``H2Configuration`` object that allows rich configuration of
+ a ``H2Connection``. This object supersedes the prior keyword arguments to the
+ ``H2Connection`` object, which are now deprecated and will be removed in 3.0.
+
Bugfixes
~~~~~~~~
diff --git a/docs/source/api.rst b/docs/source/api.rst
index d4ad2f2a4..a841ecd37 100644
--- a/docs/source/api.rst
+++ b/docs/source/api.rst
@@ -19,6 +19,13 @@ Connection
:members:
+Configuration
+-------------
+
+.. autoclass:: h2.config.H2Configuration
+ :members:
+
+
.. _h2-events-api:
Events
diff --git a/h2/config.py b/h2/config.py
new file mode 100644
index 000000000..bf2f3edcb
--- /dev/null
+++ b/h2/config.py
@@ -0,0 +1,79 @@
+# -*- coding: utf-8 -*-
+"""
+h2/config
+~~~~~~~~~
+
+Objects for controlling the configuration of the HTTP/2 stack.
+"""
+
+
+class H2Configuration(object):
+ """
+ An object that controls the way a single HTTP/2 connection behaves.
+
+ This object allows the users to customize behaviour. In particular, it
+ allows users to enable or disable optional features, or to otherwise handle
+    This object allows users to customize behaviour. In particular, it
+
+ This object has very little behaviour of its own: it mostly just ensures
+ that configuration is self-consistent.
+
+ :param client_side: Whether this object is to be used on the client side of
+ a connection, or on the server side. Affects the logic used by the
+ state machine, the default settings values, the allowable stream IDs,
+ and several other properties. Defaults to ``True``.
+ :type client_side: ``bool``
+
+ :param header_encoding: Controls whether the headers emitted by this object
+ in events are transparently decoded to ``unicode`` strings, and what
+ encoding is used to do that decoding. For historical reasons, this
+ defaults to ``'utf-8'``. To prevent the decoding of headers (that is,
+ to force them to be returned as bytestrings), this can be set to
+ ``False`` or the empty string.
+ :type header_encoding: ``str``, ``False``, or ``None``
+ """
+ def __init__(self, client_side=True, header_encoding='utf-8'):
+ self._client_side = client_side
+ self._header_encoding = header_encoding
+
+ @property
+ def client_side(self):
+ """
+ Whether this object is to be used on the client side of a connection,
+ or on the server side. Affects the logic used by the state machine, the
+ default settings values, the allowable stream IDs, and several other
+ properties. Defaults to ``True``.
+ """
+ return self._client_side
+
+ @client_side.setter
+ def client_side(self, value):
+ """
+ Enforces constraints on the client side of the connection.
+ """
+ if not isinstance(value, bool):
+ raise ValueError("client_side must be a bool")
+ self._client_side = value
+
+ @property
+ def header_encoding(self):
+ """
+ Controls whether the headers emitted by this object in events are
+ transparently decoded to ``unicode`` strings, and what encoding is used
+ to do that decoding. For historical reasons, this defaults to
+ ``'utf-8'``. To prevent the decoding of headers (that is, to force them
+ to be returned as bytestrings), this can be set to ``False`` or the
+ empty string.
+ """
+ return self._header_encoding
+
+ @header_encoding.setter
+ def header_encoding(self, value):
+ """
+ Enforces constraints on the value of header encoding.
+ """
+ if not isinstance(value, (bool, str, type(None))):
+ raise ValueError("header_encoding must be bool, string, or None")
+ if value is True:
+ raise ValueError("header_encoding cannot be True")
+ self._header_encoding = value
diff --git a/h2/connection.py b/h2/connection.py
index 0685e1739..8259dda43 100644
--- a/h2/connection.py
+++ b/h2/connection.py
@@ -18,6 +18,7 @@
from hpack.hpack import Encoder, Decoder
from hpack.exceptions import HPACKError
+from .config import H2Configuration
from .errors import PROTOCOL_ERROR, REFUSED_STREAM
from .events import (
WindowUpdated, RemoteSettingsChanged, PingAcknowledged,
@@ -252,10 +253,17 @@ class H2Connection(object):
.. versionchanged:: 2.3.0
Added the ``header_encoding`` keyword argument.
+ .. versionchanged:: 2.5.0
+ Added the ``config`` keyword argument. Deprecated the ``client_side``
+ and ``header_encoding`` parameters.
+
:param client_side: Whether this object is to be used on the client side of
a connection, or on the server side. Affects the logic used by the
state machine, the default settings values, the allowable stream IDs,
and several other properties. Defaults to ``True``.
+
+ .. deprecated:: 2.5.0
+
:type client_side: ``bool``
:param header_encoding: Controls whether the headers emitted by this object
@@ -264,7 +272,18 @@ class H2Connection(object):
defaults to ``'utf-8'``. To prevent the decoding of headers (that is,
to force them to be returned as bytestrings), this can be set to
``False`` or the empty string.
+
+ .. deprecated:: 2.5.0
+
:type header_encoding: ``str`` or ``False``
+
+ :param config: The configuration for the HTTP/2 connection. If provided,
+ supersedes the deprecated ``client_side`` and ``header_encoding``
+ values.
+
+ .. versionadded:: 2.5.0
+
+ :type config: :class:`H2Configuration <h2.config.H2Configuration>`
"""
# The initial maximum outbound frame size. This can be changed by receiving
# a settings frame.
@@ -280,14 +299,13 @@ class H2Connection(object):
# The largest acceptable window increment.
MAX_WINDOW_INCREMENT = 2**31 - 1
- def __init__(self, client_side=True, header_encoding='utf-8'):
+ def __init__(self, client_side=True, header_encoding='utf-8', config=None):
self.state_machine = H2ConnectionStateMachine()
self.streams = {}
self.highest_inbound_stream_id = 0
self.highest_outbound_stream_id = 0
self.encoder = Encoder()
self.decoder = Decoder()
- self.client_side = client_side
# Objects that store settings, including defaults.
#
@@ -320,15 +338,15 @@ def __init__(self, client_side=True, header_encoding='utf-8'):
#: bytes.
self.max_inbound_frame_size = self.local_settings.max_frame_size
- #: Controls whether the headers emitted by this object in events are
- #: transparently decoded to ``unicode`` strings, and what encoding is
- #: used to do that decoding. For historical reason, this defaults to
- #: ``'utf-8'``. To prevent the decoding of headers (that is, to force
- #: them to be returned as bytestrings), this can be set to ``False`` or
- #: the empty string.
+ #: The configuration for this HTTP/2 connection object.
#:
- #: .. versionadded:: 2.3.0
- self.header_encoding = header_encoding
+ #: .. versionadded:: 2.5.0
+ self.config = config
+ if self.config is None:
+ self.config = H2Configuration(
+ client_side=client_side,
+ header_encoding=header_encoding,
+ )
# Buffer for incoming data.
self.incoming_buffer = FrameBuffer(server=not client_side)
@@ -391,7 +409,7 @@ def open_outbound_streams(self):
"""
The current number of open outbound streams.
"""
- outbound_numbers = int(self.client_side)
+ outbound_numbers = int(self.config.client_side)
return self._open_streams(outbound_numbers)
@property
@@ -399,9 +417,46 @@ def open_inbound_streams(self):
"""
The current number of open inbound streams.
"""
- inbound_numbers = int(not self.client_side)
+ inbound_numbers = int(not self.config.client_side)
return self._open_streams(inbound_numbers)
+ @property
+ def header_encoding(self):
+ """
+ Controls whether the headers emitted by this object in events are
+ transparently decoded to ``unicode`` strings, and what encoding is used
+ to do that decoding. For historical reason, this defaults to
+ ``'utf-8'``. To prevent the decoding of headers (that is, to force them
+ to be returned as bytestrings), this can be set to ``False`` or the
+ empty string.
+
+ .. versionadded:: 2.3.0
+
+ .. deprecated:: 2.5.0
+ Use :data:`config <H2Connection.config>` instead.
+ """
+ return self.config.header_encoding
+
+ @header_encoding.setter
+ def header_encoding(self, value):
+ """
+ Setter for header encoding config value.
+ """
+ self.config.header_encoding = value
+
+ @property
+ def client_side(self):
+ """
+ Whether this object is to be used on the client side of a connection,
+ or on the server side. Affects the logic used by the state machine, the
+ default settings values, the allowable stream IDs, and several other
+ properties. Defaults to ``True``.
+
+ .. deprecated:: 2.5.0
+ Use :data:`config <H2Connection.config>` instead.
+ """
+ return self.config.client_side
+
def _begin_new_stream(self, stream_id, allowed_ids):
"""
Initiate a new stream.
@@ -449,7 +504,7 @@ def initiate_connection(self):
Must be called for both clients and servers.
"""
self.state_machine.process_input(ConnectionInputs.SEND_SETTINGS)
- if self.client_side:
+ if self.config.client_side:
preamble = b'PRI * HTTP/2.0\r\n\r\nSM\r\n\r\n'
else:
preamble = b''
@@ -495,7 +550,7 @@ def initiate_upgrade_connection(self, settings_header=None):
"""
frame_data = None
- if self.client_side:
+ if self.config.client_side:
f = SettingsFrame(0)
for setting, value in self.local_settings.items():
f.settings[setting] = value
@@ -509,14 +564,14 @@ def initiate_upgrade_connection(self, settings_header=None):
self.initiate_connection()
connection_input = (
- ConnectionInputs.SEND_HEADERS if self.client_side
+ ConnectionInputs.SEND_HEADERS if self.config.client_side
else ConnectionInputs.RECV_HEADERS
)
self.state_machine.process_input(connection_input)
# Set up stream 1.
self._begin_new_stream(stream_id=1, allowed_ids=AllowedStreamIDs.ODD)
- self.streams[1].upgrade(self.client_side)
+ self.streams[1].upgrade(self.config.client_side)
return frame_data
def _get_or_create_stream(self, stream_id, allowed_ids):
@@ -581,7 +636,7 @@ def get_next_available_stream_id(self):
# No streams have been opened yet, so return the lowest allowed stream
# ID.
if not self.highest_outbound_stream_id:
- return 1 if self.client_side else 2
+ return 1 if self.config.client_side else 2
next_stream_id = self.highest_outbound_stream_id + 2
if next_stream_id > self.HIGHEST_ALLOWED_STREAM_ID:
@@ -702,7 +757,7 @@ def send_headers(self, stream_id, headers, end_stream=False,
self.state_machine.process_input(ConnectionInputs.SEND_HEADERS)
stream = self._get_or_create_stream(
- stream_id, AllowedStreamIDs(self.client_side)
+ stream_id, AllowedStreamIDs(self.config.client_side)
)
frames = stream.send_headers(
headers, self.encoder, end_stream
@@ -716,7 +771,7 @@ def send_headers(self, stream_id, headers, end_stream=False,
)
if priority_present:
- if not self.client_side:
+ if not self.config.client_side:
raise RFC1122Error("Servers SHOULD NOT prioritize streams.")
headers_frame = frames[0]
@@ -1111,7 +1166,7 @@ def prioritize(self, stream_id, weight=None, depends_on=None,
of the new exclusively-dependent stream. Defaults to ``False``.
:type exclusive: ``bool``
"""
- if not self.client_side:
+ if not self.config.client_side:
raise RFC1122Error("Servers SHOULD NOT prioritize streams.")
self.state_machine.process_input(
@@ -1391,12 +1446,12 @@ def _receive_headers_frame(self, frame):
ConnectionInputs.RECV_HEADERS
)
stream = self._get_or_create_stream(
- frame.stream_id, AllowedStreamIDs(not self.client_side)
+ frame.stream_id, AllowedStreamIDs(not self.config.client_side)
)
frames, stream_events = stream.receive_headers(
headers,
'END_STREAM' in frame.flags,
- self.header_encoding
+ self.config.header_encoding
)
if 'PRIORITY' in frame.flags:
@@ -1450,7 +1505,7 @@ def _receive_push_promise_frame(self, frame):
frames, stream_events = stream.receive_push_promise_in_band(
frame.promised_stream_id,
pushed_headers,
- self.header_encoding,
+ self.config.header_encoding,
)
new_stream = self._begin_new_stream(
@@ -1692,7 +1747,7 @@ def _receive_alt_svc_frame(self, frame):
return frames, events
# If we're a server, we want to ignore this (RFC 7838 says so).
- if not self.client_side:
+ if not self.config.client_side:
return frames, events
event = AlternativeServiceAvailable()
@@ -1722,7 +1777,7 @@ def _stream_id_is_outbound(self, stream_id):
Returns ``True`` if the stream ID corresponds to an outbound stream
(one initiated by this peer), returns ``False`` otherwise.
"""
- return (stream_id % 2 == int(self.client_side))
+ return (stream_id % 2 == int(self.config.client_side))
def _add_frame_priority(frame, weight=None, depends_on=None, exclusive=None):
| diff --git a/test/test_basic_logic.py b/test/test_basic_logic.py
index a7778c91c..5afdf00f4 100644
--- a/test/test_basic_logic.py
+++ b/test/test_basic_logic.py
@@ -11,6 +11,7 @@
import hyperframe
import pytest
+import h2.config
import h2.connection
import h2.errors
import h2.events
@@ -65,6 +66,18 @@ def test_begin_connection(self, frame_factory):
assert not events
assert c.data_to_send() == expected_data
+ def test_deprecated_properties(self):
+ """
+ We can access the deprecated properties.
+ """
+ config = h2.config.H2Configuration(
+ client_side=False, header_encoding=False
+ )
+ c = h2.connection.H2Connection(config=config)
+
+ assert c.client_side is False
+ assert c.header_encoding is False
+
def test_sending_headers(self):
"""
Single headers frames are correctly encoded.
@@ -142,7 +155,8 @@ def test_receiving_a_response_bytes(self, frame_factory):
When receiving a response, the ResponseReceived event fires with bytes
headers if the encoding is set appropriately.
"""
- c = h2.connection.H2Connection(header_encoding=False)
+ config = h2.config.H2Configuration(header_encoding=False)
+ c = h2.connection.H2Connection(config=config)
c.initiate_connection()
c.send_headers(1, self.example_request_headers, end_stream=True)
@@ -159,6 +173,44 @@ def test_receiving_a_response_bytes(self, frame_factory):
assert event.stream_id == 1
assert event.headers == self.bytes_example_response_headers
+ def test_receiving_a_response_change_encoding(self, frame_factory):
+ """
+ When receiving a response, the ResponseReceived event fires with bytes
+ headers if the encoding is set appropriately, but if this changes then
+ the change reflects it.
+ """
+ config = h2.config.H2Configuration(header_encoding=False)
+ c = h2.connection.H2Connection(config=config)
+ c.initiate_connection()
+ c.send_headers(1, self.example_request_headers, end_stream=True)
+
+ f = frame_factory.build_headers_frame(
+ self.example_response_headers
+ )
+ events = c.receive_data(f.serialize())
+
+ assert len(events) == 1
+ event = events[0]
+
+ assert isinstance(event, h2.events.ResponseReceived)
+ assert event.stream_id == 1
+ assert event.headers == self.bytes_example_response_headers
+
+ c.send_headers(3, self.example_request_headers, end_stream=True)
+ c.header_encoding = 'utf-8'
+ f = frame_factory.build_headers_frame(
+ self.example_response_headers,
+ stream_id=3,
+ )
+ events = c.receive_data(f.serialize())
+
+ assert len(events) == 1
+ event = events[0]
+
+ assert isinstance(event, h2.events.ResponseReceived)
+ assert event.stream_id == 3
+ assert event.headers == self.example_response_headers
+
def test_end_stream_without_data(self, frame_factory):
"""
Ending a stream without data emits a zero-length DATA frame with
diff --git a/test/test_config.py b/test/test_config.py
new file mode 100644
index 000000000..eb6812ec7
--- /dev/null
+++ b/test/test_config.py
@@ -0,0 +1,62 @@
+# -*- coding: utf-8 -*-
+"""
+test_config
+~~~~~~~~~~~
+
+Test the configuration object.
+"""
+import pytest
+
+import h2.config
+
+
+class TestH2Config(object):
+ """
+ Tests of the H2 config object.
+ """
+ def test_defaults(self):
+ """
+ The default values of the HTTP/2 config object are sensible.
+ """
+ config = h2.config.H2Configuration()
+ assert config.client_side
+ assert config.header_encoding == 'utf-8'
+
+ @pytest.mark.parametrize('client_side', [None, 'False', 1])
+ def test_client_side_must_be_bool(self, client_side):
+ """
+ The value of the ``client_side`` setting must be a boolean.
+ """
+ config = h2.config.H2Configuration()
+
+ with pytest.raises(ValueError):
+ config.client_side = client_side
+
+ @pytest.mark.parametrize('client_side', [True, False])
+ def test_client_side_is_reflected(self, client_side):
+ """
+ The value of ``client_side``, when set, is reflected in the value.
+ """
+ config = h2.config.H2Configuration()
+ config.client_side = client_side
+ assert config.client_side == client_side
+
+ @pytest.mark.parametrize('header_encoding', [True, 1, object()])
+ def test_header_encoding_must_be_false_str_none(self, header_encoding):
+ """
+ The value of the ``header_encoding`` setting must be False, a string,
+ or None.
+ """
+ config = h2.config.H2Configuration()
+
+ with pytest.raises(ValueError):
+ config.header_encoding = header_encoding
+
+ @pytest.mark.parametrize('header_encoding', [False, 'ascii', None])
+ def test_header_encoding_is_reflected(self, header_encoding):
+ """
+ The value of ``header_encoding``, when set, is reflected in the value.
+ """
+ config = h2.config.H2Configuration()
+ config.header_encoding = header_encoding
+ assert config.header_encoding == header_encoding
| diff --git a/HISTORY.rst b/HISTORY.rst
index e3fcb6fe9..ef9090bab 100644
--- a/HISTORY.rst
+++ b/HISTORY.rst
@@ -4,6 +4,13 @@ Release History
2.5.0dev0
---------
+API Changes (Backward-Compatible)
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+- Added a new ``H2Configuration`` object that allows rich configuration of
+ a ``H2Connection``. This object supersedes the prior keyword arguments to the
+ ``H2Connection`` object, which are now deprecated and will be removed in 3.0.
+
Bugfixes
~~~~~~~~
diff --git a/docs/source/api.rst b/docs/source/api.rst
index d4ad2f2a4..a841ecd37 100644
--- a/docs/source/api.rst
+++ b/docs/source/api.rst
@@ -19,6 +19,13 @@ Connection
:members:
+Configuration
+-------------
+
+.. autoclass:: h2.config.H2Configuration
+ :members:
+
+
.. _h2-events-api:
Events
| [
{
"components": [
{
"doc": "An object that controls the way a single HTTP/2 connection behaves.\n\nThis object allows the users to customize behaviour. In particular, it\nallows users to enable or disable optional features, or to otherwise handle\nvarious unusual behaviours.\n\nThis object has ver... | [
"test/test_basic_logic.py::TestBasicClient::test_begin_connection",
"test/test_basic_logic.py::TestBasicClient::test_deprecated_properties",
"test/test_basic_logic.py::TestBasicClient::test_sending_headers",
"test/test_basic_logic.py::TestBasicClient::test_sending_data",
"test/test_basic_logic.py::TestBasic... | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Add configuration object.
As discussed in #246, this adds a basic configuration object. The goal here is to resist the proliferation of flags to the HTTP/2 connection object.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in h2/config.py]
(definition of H2Configuration:)
class H2Configuration(object):
"""An object that controls the way a single HTTP/2 connection behaves.
This object allows the users to customize behaviour. In particular, it
allows users to enable or disable optional features, or to otherwise handle
various unusual behaviours.
This object has very little behaviour of its own: it mostly just ensures
that configuration is self-consistent.
:param client_side: Whether this object is to be used on the client side of
a connection, or on the server side. Affects the logic used by the
state machine, the default settings values, the allowable stream IDs,
and several other properties. Defaults to ``True``.
:type client_side: ``bool``
:param header_encoding: Controls whether the headers emitted by this object
in events are transparently decoded to ``unicode`` strings, and what
encoding is used to do that decoding. For historical reasons, this
defaults to ``'utf-8'``. To prevent the decoding of headers (that is,
to force them to be returned as bytestrings), this can be set to
``False`` or the empty string.
:type header_encoding: ``str``, ``False``, or ``None``"""
(definition of H2Configuration.__init__:)
def __init__(self, client_side=True, header_encoding='utf-8'):
(definition of H2Configuration.client_side:)
def client_side(self):
"""Whether this object is to be used on the client side of a connection,
or on the server side. Affects the logic used by the state machine, the
default settings values, the allowable stream IDs, and several other
properties. Defaults to ``True``."""
(definition of H2Configuration.client_side:)
def client_side(self, value):
"""Enforces constraints on the client side of the connection."""
(definition of H2Configuration.header_encoding:)
def header_encoding(self):
"""Controls whether the headers emitted by this object in events are
transparently decoded to ``unicode`` strings, and what encoding is used
to do that decoding. For historical reasons, this defaults to
``'utf-8'``. To prevent the decoding of headers (that is, to force them
to be returned as bytestrings), this can be set to ``False`` or the
empty string."""
(definition of H2Configuration.header_encoding:)
def header_encoding(self, value):
"""Enforces constraints on the value of header encoding."""
[end of new definitions in h2/config.py]
[start of new definitions in h2/connection.py]
(definition of H2Connection.header_encoding:)
def header_encoding(self, value):
"""Setter for header encoding config value."""
(definition of H2Connection.client_side:)
def client_side(self):
"""Whether this object is to be used on the client side of a connection,
or on the server side. Affects the logic used by the state machine, the
default settings values, the allowable stream IDs, and several other
properties. Defaults to ``True``.
.. deprecated:: 2.5.0
Use :data:`config <H2Connection.config>` instead."""
[end of new definitions in h2/connection.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 9df8f94ce983d44ef57c8f332463f7b3cbe0127b | |
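The property-validation pattern described by the definitions above can be sketched end to end. The class below is an illustrative, standalone reconstruction assembled from the docstrings and the diff in this instance's patch — not the actual `h2` package:

```python
class H2Configuration(object):
    """Illustrative sketch of the validated-config pattern from h2/config.py."""

    def __init__(self, client_side=True, header_encoding='utf-8'):
        # Assign through the property setters so the defaults are validated
        # by the same code path as any later reassignment.
        self.client_side = client_side
        self.header_encoding = header_encoding

    @property
    def client_side(self):
        return self._client_side

    @client_side.setter
    def client_side(self, value):
        if not isinstance(value, bool):
            raise ValueError("client_side must be a bool")
        self._client_side = value

    @property
    def header_encoding(self):
        return self._header_encoding

    @header_encoding.setter
    def header_encoding(self, value):
        # False or the empty string disables decoding; True is meaningless.
        if not isinstance(value, (bool, str, type(None))):
            raise ValueError("header_encoding must be bool, string, or None")
        if value is True:
            raise ValueError("header_encoding cannot be True")
        self._header_encoding = value
```

Routing the constructor arguments through the setters is why the new `test_client_side_must_be_bool` and `test_header_encoding_must_be_false_str_none` tests can exercise validation by plain attribute assignment.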
sympy__sympy-11400 | 11,400 | sympy/sympy | 1.0 | 8dcb12a6cf500e8738d6729ab954a261758f49ca | 2016-07-15T21:40:49Z | diff --git a/sympy/printing/ccode.py b/sympy/printing/ccode.py
index 30a07a7b5936..71eda1fecb0c 100644
--- a/sympy/printing/ccode.py
+++ b/sympy/printing/ccode.py
@@ -231,6 +231,20 @@ def _print_Symbol(self, expr):
else:
return name
+ def _print_Relational(self, expr):
+ lhs_code = self._print(expr.lhs)
+ rhs_code = self._print(expr.rhs)
+ op = expr.rel_op
+ return ("{0} {1} {2}").format(lhs_code, op, rhs_code)
+
+ def _print_sinc(self, expr):
+ from sympy.functions.elementary.trigonometric import sin
+ from sympy.core.relational import Ne
+ from sympy.functions import Piecewise
+ _piecewise = Piecewise(
+ (sin(expr.args[0]) / expr.args[0], Ne(expr.args[0], 0)), (1, True))
+ return self._print(_piecewise)
+
def _print_AugmentedAssignment(self, expr):
lhs_code = self._print(expr.lhs)
op = expr.rel_op
| diff --git a/sympy/printing/tests/test_ccode.py b/sympy/printing/tests/test_ccode.py
index d8e449d85341..d36ae66ac7e9 100644
--- a/sympy/printing/tests/test_ccode.py
+++ b/sympy/printing/tests/test_ccode.py
@@ -120,6 +120,16 @@ def test_ccode_boolean():
assert ccode((x | y) & z) == "z && (x || y)"
+def test_ccode_Relational():
+ from sympy import Eq, Ne, Le, Lt, Gt, Ge
+ assert ccode(Eq(x, y)) == "x == y"
+ assert ccode(Ne(x, y)) == "x != y"
+ assert ccode(Le(x, y)) == "x <= y"
+ assert ccode(Lt(x, y)) == "x < y"
+ assert ccode(Gt(x, y)) == "x > y"
+ assert ccode(Ge(x, y)) == "x >= y"
+
+
def test_ccode_Piecewise():
expr = Piecewise((x, x < 1), (x**2, True))
assert ccode(expr) == (
@@ -162,6 +172,18 @@ def test_ccode_Piecewise():
raises(ValueError, lambda: ccode(expr))
+def test_ccode_sinc():
+ from sympy import sinc
+ expr = sinc(x)
+ assert ccode(expr) == (
+ "((x != 0) ? (\n"
+ " sin(x)/x\n"
+ ")\n"
+ ": (\n"
+ " 1\n"
+ "))")
+
+
def test_ccode_Piecewise_deep():
p = ccode(2*Piecewise((x, x < 1), (x + 1, x < 2), (x**2, True)))
assert p == (
| [
{
"components": [
{
"doc": "",
"lines": [
234,
238
],
"name": "CCodePrinter._print_Relational",
"signature": "def _print_Relational(self, expr):",
"type": "function"
},
{
"doc": "",
"lines": [
240,
... | [
"test_ccode_Relational",
"test_ccode_sinc"
] | [
"test_printmethod",
"test_ccode_sqrt",
"test_ccode_Pow",
"test_ccode_constants_mathh",
"test_ccode_constants_other",
"test_ccode_Rational",
"test_ccode_Integer",
"test_ccode_functions",
"test_ccode_inline_function",
"test_ccode_exceptions",
"test_ccode_user_functions",
"test_ccode_boolean",
... | This is a feature request that requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
Add sinc to C codegen [rebase #11303]
@kritkaran94 please let me know if this looks ok.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sympy/printing/ccode.py]
(definition of CCodePrinter._print_Relational:)
def _print_Relational(self, expr):
(definition of CCodePrinter._print_sinc:)
def _print_sinc(self, expr):
[end of new definitions in sympy/printing/ccode.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | Here is the discussion in the issues of the pull request.
<issues>
ccode(sinc(x)) doesn't work
```
In [30]: ccode(sinc(x))
Out[30]: '// Not supported in C:\n// sinc\nsinc(x)'
```
I don't think `math.h` has `sinc`, but it could print
```
In [38]: ccode(Piecewise((sin(theta)/theta, Ne(theta, 0)), (1, True)))
Out[38]: '((Ne(theta, 0)) ? (\n sin(theta)/theta\n)\n: (\n 1\n))'
```
----------
@asmeurer I would like to fix this issue. Should I work on the codegen.py file? If there's something else, tell me how to start.
The relevant file is sympy/printing/ccode.py
@asmeurer I am new here. I would like to work on this issue. Please tell me how to start?
Since there are two people asking, maybe one person can try #11286 which is very similar, maybe even easier.
--------------------
</issues> | 820363f5b17cbe5809ef0911ea539e135c179c62 | |
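The expected C output in `test_ccode_sinc` above can be reproduced with a small string builder. This is a hypothetical, SymPy-free sketch of the target layout only — the real work happens in `CCodePrinter._print_sinc` by printing a `Piecewise`, and SymPy's branch indentation may differ slightly:

```python
def sinc_as_c(var):
    """Render sinc(var) as the C conditional expression suggested above:
    Piecewise((sin(x)/x, Ne(x, 0)), (1, True)) printed as a ternary."""
    return ("((%s != 0) ? (\n"
            " sin(%s)/%s\n"
            ")\n"
            ": (\n"
            " 1\n"
            "))") % (var, var, var)
```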
conan-io__conan-370 | 370 | conan-io/conan | null | fe9295f800f8080bc225ac6ce4ff3a85795fee09 | 2016-07-14T17:40:35Z | diff --git a/conans/client/command.py b/conans/client/command.py
index 76748152845..250dfac04e4 100644
--- a/conans/client/command.py
+++ b/conans/client/command.py
@@ -300,6 +300,8 @@ def install(self, *args):
help="update with new upstream packages")
parser.add_argument("--scope", "-sc", nargs=1, action=Extender,
help='Define scopes for packages')
+ parser.add_argument("--generator", "-g", nargs=1, action=Extender,
+ help='Generators to use')
self._parse_args(parser)
args = parser.parse_args(*args)
@@ -332,7 +334,8 @@ def install(self, *args):
filename=args.file,
update=args.update,
integrity=args.integrity,
- scopes=scopes)
+ scopes=scopes,
+ generators=args.generator)
def info(self, *args):
""" Prints information about the requirements.
diff --git a/conans/client/configure_environment.py b/conans/client/configure_environment.py
index 282d6bcf0db..4d47f8a64e3 100644
--- a/conans/client/configure_environment.py
+++ b/conans/client/configure_environment.py
@@ -1,22 +1,51 @@
from conans.model.settings import Settings
import copy
+from conans.client.generators.virtualenv import get_setenv_variables_commands
+from conans.model.env_info import DepsEnvInfo
class ConfigureEnvironment(object):
- def __init__(self, deps_cpp_info, settings):
+ def __init__(self, *args):
+ if len(args) == 2:
+ deps_cpp_info = args[0]
+ deps_env_info = DepsEnvInfo()
+ settings = args[1]
+ elif len(args) == 1: # conanfile (new interface)
+ self.conanfile = args[0]
+ deps_cpp_info = self.conanfile.deps_cpp_info
+ deps_env_info = self.conanfile.deps_env_info
+ settings = self.conanfile.settings
+
assert isinstance(settings, Settings)
+
self._settings = settings
self._deps_cpp_info = deps_cpp_info
- self.compiler = getattr(self._settings, "compiler", None)
- self.arch = getattr(self._settings, "arch", None)
- self.os = getattr(self._settings, "os", None)
- self.build_type = getattr(self._settings, "build_type", None)
- self.libcxx = None
+ self._deps_env_info = deps_env_info
+ try:
+ self.compiler = str(self._settings.compiler)
+ except:
+ self.compiler = None
+
+ try:
+ self.arch = str(self._settings.arch)
+ except:
+ self.arch = None
+
try:
- self.libcxx = self.compiler.libcxx
+ self.os = str(self._settings.os)
except:
- pass
+ self.os = None
+
+ try:
+ self.build_type = str(self._settings.build_type)
+ except:
+ self.build_type = None
+
+ try:
+ self.libcxx = str(self.compiler.libcxx)
+ except:
+ self.libcxx = None
@property
def command_line(self):
@@ -26,7 +55,7 @@ def command_line(self):
self.run(command)
"""
command = ""
- if self.os == "Linux" or self.os == "Macos":
+ if self.os == "Linux" or self.os == "Macos" or (self.os == "Windows" and self.compiler == "gcc"):
libflags = " ".join(["-l%s" % lib for lib in self._deps_cpp_info.libs])
libs = 'LIBS="%s"' % libflags
archflag = "-m32" if self.arch == "x86" else ""
@@ -61,4 +90,7 @@ def command_line(self):
cl_args = " ".join(['/I"%s"' % lib for lib in self._deps_cpp_info.include_paths])
lib_paths = ";".join(['"%s"' % lib for lib in self._deps_cpp_info.lib_paths])
command = "SET LIB=%s;%%LIB%% && SET CL=%s" % (lib_paths, cl_args)
+
+ # Add the rest of env variables from deps_env_info
+ command += " ".join(get_setenv_variables_commands(self._deps_env_info, "" if self.os != "Windows" else "SET"))
return command
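The `try`/`except` blocks introduced in `ConfigureEnvironment.__init__` above exist because conans' `Settings` object raises when a field is undefined for the current configuration, so a plain `getattr(..., None)` is no longer enough once the value is also passed through `str(...)`. A minimal, hypothetical model of that pattern with a stand-in settings object:

```python
class FakeSettings(object):
    """Stand-in for conans.model.settings.Settings: defined fields are
    attributes, undefined fields raise on access."""

    def __init__(self, **fields):
        self._fields = fields

    def __getattr__(self, name):
        try:
            return self._fields[name]
        except KeyError:
            raise AttributeError("'%s' setting is undefined" % name)


def setting_str(settings, name):
    # Mirrors the try/except blocks in ConfigureEnvironment.__init__:
    # any failure while reading or stringifying the setting yields None.
    try:
        return str(getattr(settings, name))
    except Exception:
        return None
```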
diff --git a/conans/client/deps_builder.py b/conans/client/deps_builder.py
index b3381abea01..182fbc55c03 100644
--- a/conans/client/deps_builder.py
+++ b/conans/client/deps_builder.py
@@ -104,7 +104,7 @@ def __repr__(self):
return "\n".join(["Nodes:\n ",
"\n ".join(repr(n) for n in self.nodes)])
- def propagate_buildinfo(self):
+ def propagate_info_objects(self):
""" takes the exports from upper level and updates the imports
right now also the imports are propagated, but should be checked
E.g. Conan A, depends on B. A=>B
@@ -127,7 +127,7 @@ def propagate_buildinfo(self):
_, conanfile = node
for n in node_order:
conanfile.deps_cpp_info.update(n.conanfile.cpp_info, n.conan_ref)
-
+ conanfile.deps_env_info.update(n.conanfile.env_info, n.conan_ref)
return ordered
def propagate_info(self):
diff --git a/conans/client/generators/__init__.py b/conans/client/generators/__init__.py
index d75df62fdca..0e278adaaf8 100644
--- a/conans/client/generators/__init__.py
+++ b/conans/client/generators/__init__.py
@@ -1,4 +1,5 @@
from conans.model import registered_generators
+from conans.model.env_info import EnvInfo
from conans.util.files import save, normalize
from os.path import join
from .text import TXTGenerator
@@ -9,6 +10,7 @@
from .visualstudio import VisualStudioGenerator
from .xcode import XCodeGenerator
from .ycm import YouCompleteMeGenerator
+from .virtualenv import VirtualEnvGenerator
def _save_generator(name, klass):
@@ -23,6 +25,7 @@ def _save_generator(name, klass):
_save_generator("visual_studio", VisualStudioGenerator)
_save_generator("xcode", XCodeGenerator)
_save_generator("ycm", YouCompleteMeGenerator)
+_save_generator("virtualenv", VirtualEnvGenerator)
def write_generators(conanfile, path, output):
@@ -33,6 +36,8 @@ def write_generators(conanfile, path, output):
conanfile.cpp_info = CppInfo(path)
conanfile.cpp_info.dependencies = []
+
+ conanfile.env_info = EnvInfo(path)
conanfile.package_info()
for generator_name in conanfile.generators:
diff --git a/conans/client/generators/virtualenv.py b/conans/client/generators/virtualenv.py
new file mode 100644
index 00000000000..0f7a76d963d
--- /dev/null
+++ b/conans/client/generators/virtualenv.py
@@ -0,0 +1,85 @@
+from conans.model import Generator
+import platform
+import os
+import copy
+from conans.errors import ConanException
+
+
+def get_setenv_variables_commands(deps_env_info, command_set=None):
+ if command_set is None:
+ command_set = "SET" if platform.system() == "Windows" else "export"
+
+ multiple_to_set, simple_to_set = get_dict_values(deps_env_info)
+ ret = []
+ for name, value in multiple_to_set.items():
+ if platform.system() == "Windows":
+ ret.append(command_set + ' "' + name + '=' + value + ';%' + name + '%"')
+ else:
+ ret.append(command_set + ' ' + name + '=' + value + ':$' + name)
+ for name, value in simple_to_set.items():
+ if platform.system() == "Windows":
+ ret.append(command_set + ' "' + name + '=' + value + '"')
+ else:
+ ret.append(command_set + ' ' + name + '=' + value)
+ return ret
+
+
+def get_dict_values(deps_env_info):
+ def adjust_var_name(name):
+ return "PATH" if name.lower() == "path" else name
+ separator = ";" if platform.system() == "Windows" else ":"
+ multiple_to_set = {adjust_var_name(name): separator.join(value_list)
+ for name, value_list in deps_env_info.vars.items()
+ if isinstance(value_list, list)}
+ simple_to_set = {adjust_var_name(name): value
+ for name, value in deps_env_info.vars.items()
+ if not isinstance(value, list)}
+ return multiple_to_set, simple_to_set
+
+
+class VirtualEnvGenerator(Generator):
+
+ @property
+ def filename(self):
+ return
+
+ @property
+ def content(self):
+ old_venv = os.environ.get("_CONAN_VENV", None)
+ if old_venv:
+ raise ConanException("Deactivate the current virtual environment (or close the console) and then execute conan install again: %s" % old_venv)
+ multiple_to_set, simple_to_set = get_dict_values(self.deps_env_info)
+ all_vars = copy.copy(multiple_to_set)
+ all_vars.update(simple_to_set)
+ venv_name = os.path.basename(self.conanfile.conanfile_directory)
+ venv_dir = self.conanfile.conanfile_directory
+ deactivate_lines = ["@echo off"] if platform.system() == "Windows" else []
+ for name in all_vars.keys():
+ old_value = os.environ.get(name, "")
+ if platform.system() == "Windows":
+ deactivate_lines.append('SET "%s=%s"' % (name, old_value))
+ else:
+ deactivate_lines.append('export %s=%s' % (name, old_value))
+ if platform.system() == "Windows":
+ deactivate_lines.append("SET PROMPT=%s" % os.environ.get("PROMPT", ""))
+ deactivate_lines.append("SET _CONAN_VENV=")
+ else:
+ deactivate_lines.append(r"export PS1=\"\[\e]0;\u@\h: \w\a\]${debian_chroot:+($debian_chroot)}\u@\h:\w\$ \"")
+ deactivate_lines.append("export _CONAN_VENV=")
+
+ activate_lines = ["@echo off"] if platform.system() == "Windows" else []
+ if platform.system() == "Windows":
+ activate_lines.append('if defined _CONAN_VENV (echo Deactivate current venv first with %_CONAN_VENV%\deactivate.bat)')
+ activate_lines.append('if defined _CONAN_VENV (EXIT /B)')
+ activate_lines.append("SET PROMPT=(%s) " % venv_name + "%PROMPT%")
+ activate_lines.append("SET _CONAN_VENV=%s" % venv_dir)
+ else:
+ activate_lines.append('if [ -n "$_CONAN_VENV" ]; then echo "Deactivate current venv first with \'source $_CONAN_VENV\deactivate"; fi')
+ activate_lines.append('if [ -n "$_CONAN_VENV" ]; then exit; fi')
+ activate_lines.append("export PS1=\"(%s) " % venv_name + "$PS1\"")
+ activate_lines.append("export _CONAN_VENV=%s" % venv_dir)
+
+ activate_lines.extend(get_setenv_variables_commands(self.deps_env_info))
+ ext = "bat" if platform.system() == "Windows" else "sh"
+ return {"activate.%s" % ext: os.linesep.join(activate_lines),
+ "deactivate.%s" % ext: os.linesep.join(deactivate_lines)}
diff --git a/conans/client/installer.py b/conans/client/installer.py
index 0876d7b6dd7..9d2bd818634 100644
--- a/conans/client/installer.py
+++ b/conans/client/installer.py
@@ -14,9 +14,11 @@
from conans.model.build_info import CppInfo
from conans.client.output import ScopedOutput
from collections import Counter
+from conans.model.env_info import EnvInfo
+import six
-def init_cpp_info(deps_graph, paths):
+def init_info_objects(deps_graph, paths):
""" Made external so it is independent of installer and can called
in testing too
"""
@@ -27,7 +29,9 @@ def init_cpp_info(deps_graph, paths):
package_id = conan_file.info.package_id()
package_reference = PackageReference(conan_ref, package_id)
package_folder = paths.package(package_reference)
+ conan_file.package_folder = package_folder
conan_file.cpp_info = CppInfo(package_folder)
+ conan_file.env_info = EnvInfo(package_folder)
try:
conan_file.package_info()
except Exception as e:
@@ -66,10 +70,10 @@ def _process_buildinfo(self, deps_graph):
passes their exported build flags and included directories to the downstream
imports flags
"""
- init_cpp_info(deps_graph, self._paths)
+ init_info_objects(deps_graph, self._paths)
# order by levels and propagate exports as download imports
- nodes_by_level = deps_graph.propagate_buildinfo()
+ nodes_by_level = deps_graph.propagate_info_objects()
return nodes_by_level
def _compute_private_nodes(self, deps_graph, build_mode):
@@ -230,7 +234,10 @@ def remove_source(raise_error=False):
rmdir(src_folder)
except BaseException as e_rm:
save(dirty, "") # Creation of DIRTY flag
- output.error("Unable to remove source folder %s\n%s" % (src_folder, str(e_rm)))
+ msg = str(e_rm)
+ if six.PY2:
+ msg = str(e_rm).decode("latin1") # Windows prints some chars in latin1
+ output.error("Unable to remove source folder %s\n%s" % (src_folder, msg))
output.warn("**** Please delete it manually ****")
if raise_error or isinstance(e_rm, KeyboardInterrupt):
raise ConanException("Unable to remove source folder")
diff --git a/conans/client/loader.py b/conans/client/loader.py
index 583c120a9e4..4aab581c674 100644
--- a/conans/client/loader.py
+++ b/conans/client/loader.py
@@ -14,7 +14,6 @@
from conans.model.conan_generator import Generator
from conans.client.generators import _save_generator
from conans.model.scope import Scopes
-from conans.client.output import ScopedOutput
class ConanFileLoader(object):
@@ -115,18 +114,24 @@ def load_conan(self, conan_file_path, output, consumer=False):
except Exception as e: # re-raise with file name
raise ConanException("%s: %s" % (conan_file_path, str(e)))
- def load_conan_txt(self, conan_requirements_path, output):
+ def load_conan_txt(self, conan_txt_path, output):
- if not os.path.exists(conan_requirements_path):
+ if not os.path.exists(conan_txt_path):
raise NotFoundException("Conanfile not found!")
- conanfile = ConanFile(output, self._runner, self._settings.copy(),
- os.path.dirname(conan_requirements_path))
+ contents = load(conan_txt_path)
+ path = os.path.dirname(conan_txt_path)
+
+ conanfile = self.parse_conan_txt(contents, path, output)
+ return conanfile
+
+ def parse_conan_txt(self, contents, path, output):
+ conanfile = ConanFile(output, self._runner, self._settings.copy(), path)
try:
- parser = ConanFileTextLoader(load(conan_requirements_path))
+ parser = ConanFileTextLoader(contents)
except Exception as e:
- raise ConanException("%s:\n%s" % (conan_requirements_path, str(e)))
+ raise ConanException("%s:\n%s" % (path, str(e)))
for requirement_text in parser.requirements:
ConanFileReference.loads(requirement_text) # Raise if invalid
conanfile.requires.add(requirement_text)
@@ -143,7 +148,7 @@ def load_conan_txt(self, conan_requirements_path, output):
conanfile.scope = self._scopes.package_scope()
return conanfile
- def load_virtual(self, reference):
+ def load_virtual(self, reference, path):
fixed_options = []
# If user don't specify namespace in options, assume that it's for the reference (keep compatibility)
for option_name, option_value in self._options.as_list():
@@ -154,7 +159,7 @@ def load_virtual(self, reference):
fixed_options.append(tmp)
options = OptionsValues.from_list(fixed_options)
- conanfile = ConanFile(None, self._runner, self._settings.copy(), None)
+ conanfile = ConanFile(None, self._runner, self._settings.copy(), path)
conanfile.requires.add(str(reference)) # Convert to string necessary
# conanfile.options.values = options
diff --git a/conans/client/manager.py b/conans/client/manager.py
index 56c13c46219..04ff7b05c3a 100644
--- a/conans/client/manager.py
+++ b/conans/client/manager.py
@@ -150,7 +150,7 @@ def _get_graph(self, reference, current_path, remote, options, settings, filenam
if isinstance(reference, ConanFileReference):
project_reference = None
- conanfile = loader.load_virtual(reference)
+ conanfile = loader.load_virtual(reference, current_path)
is_txt = True
else:
conanfile_path = reference
@@ -209,7 +209,7 @@ def info(self, reference, current_path, remote=None, options=None, settings=None
def install(self, reference, current_path, remote=None, options=None, settings=None,
build_mode=False, filename=None, update=False, check_updates=False,
- integrity=False, scopes=None):
+ integrity=False, scopes=None, generators=None):
""" Fetch and build all dependencies for the given reference
@param reference: ConanFileReference or path to user space conanfile
@param current_path: where the output files will be saved
@@ -217,6 +217,7 @@ def install(self, reference, current_path, remote=None, options=None, settings=N
@param options: list of tuples: [(optionname, optionvalue), (optionname, optionvalue)...]
@param settings: list of tuples: [(settingname, settingvalue), (settingname, value)...]
"""
+ generators = generators or []
objects = self._get_graph(reference, current_path, remote, options, settings, filename,
update, check_updates, integrity, scopes)
(_, deps_graph, _, registry, conanfile, remote_proxy, loader) = objects
@@ -239,12 +240,19 @@ def install(self, reference, current_path, remote=None, options=None, settings=N
installer = ConanInstaller(self._paths, self._user_io, remote_proxy)
installer.install(deps_graph, build_mode)
+ scope_prefix = "PROJECT" if not isinstance(reference, ConanFileReference) else str(reference)
+ output = ScopedOutput(scope_prefix, self._user_io.out)
+
+ # Write generators
+ tmp = list(conanfile.generators) # Add the command line specified generators
+ tmp.extend(generators)
+ conanfile.generators = tmp
+ write_generators(conanfile, current_path, output)
+
if not isinstance(reference, ConanFileReference):
content = normalize(conanfile.info.dumps())
save(os.path.join(current_path, CONANINFO), content)
- output = ScopedOutput("PROJECT", self._user_io.out)
output.info("Generated %s" % CONANINFO)
- write_generators(conanfile, current_path, output)
local_installer = FileImporter(deps_graph, self._paths, current_path)
conanfile.copy = local_installer
conanfile.imports()
diff --git a/conans/model/conan_file.py b/conans/model/conan_file.py
index ecedd452e7d..d4a23bab0d8 100644
--- a/conans/model/conan_file.py
+++ b/conans/model/conan_file.py
@@ -3,6 +3,7 @@
from conans.model.build_info import DepsCppInfo
from conans import tools # @UnusedImport KEEP THIS! Needed for pyinstaller to copy to exe.
from conans.errors import ConanException
+from conans.model.env_info import DepsEnvInfo
def create_options(conanfile):
@@ -93,6 +94,11 @@ def __init__(self, output, runner, settings, conanfile_directory):
# needed variables to pack the project
self.cpp_info = None # Will be initialized at processing time
self.deps_cpp_info = DepsCppInfo()
+
+ # environment variables declared in the package_info
+ self.env_info = None # Will be initialized at processing time
+ self.deps_env_info = DepsEnvInfo()
+
self.copy = None # initialized at runtime
# an output stream (writeln, info, warn error)
diff --git a/conans/model/conan_generator.py b/conans/model/conan_generator.py
index 6e45d5d5a26..54d88f48437 100644
--- a/conans/model/conan_generator.py
+++ b/conans/model/conan_generator.py
@@ -9,6 +9,8 @@ def __init__(self, conanfile):
self.conanfile = conanfile
self._deps_build_info = conanfile.deps_cpp_info
self._build_info = conanfile.cpp_info
+ self._deps_env_info = conanfile.deps_env_info
+ self._env_info = conanfile.env_info
@property
def deps_build_info(self):
@@ -18,6 +20,14 @@ def deps_build_info(self):
def build_info(self):
return self._build_info
+ @property
+ def deps_env_info(self):
+ return self._deps_env_info
+
+ @property
+ def env_info(self):
+ return self._env_info
+
@abstractproperty
def content(self):
raise NotImplementedError()
diff --git a/conans/model/env_info.py b/conans/model/env_info.py
new file mode 100644
index 00000000000..b42f630da00
--- /dev/null
+++ b/conans/model/env_info.py
@@ -0,0 +1,77 @@
+from collections import OrderedDict
+from conans.util.log import logger
+
+
+class EnvInfo(object):
+ """ Object that stores all the environment variables required:
+
+ env = EnvInfo()
+ env.hola = True
+ env.Cosa.append("OTRO")
+ env.Cosa.append("MAS")
+ env.Cosa = "hello"
+ env.Cosa.append("HOLA")
+
+ """
+ def __init__(self, root_folder=None):
+ self._root_folder_ = root_folder
+ self._values_ = {}
+
+ def __getattr__(self, name):
+ if name.startswith("_") and name.endswith("_"):
+ return super(EnvInfo, self).__getattr__(name)
+ else:
+ if not self._values_.get(name):
+ self._values_[name] = []
+ elif not isinstance(self._values_[name], list):
+ tmp = self._values_[name]
+ self._values_[name] = [tmp]
+ return self._values_[name]
+
+ def __setattr__(self, name, value):
+ if (name.startswith("_") and name.endswith("_")):
+ super(EnvInfo, self).__setattr__(name, value)
+ else:
+ self._values_[name] = value
+ return
+
+ @property
+ def vars(self):
+ return self._values_
+
+
+class DepsEnvInfo(EnvInfo):
+ """ All the env info for a conanfile dependencies
+ """
+ def __init__(self):
+ super(DepsEnvInfo, self).__init__()
+ self._dependencies_ = OrderedDict()
+
+ @property
+ def dependencies(self):
+ return self._dependencies_.items()
+
+ @property
+ def deps(self):
+ return self._dependencies_.keys()
+
+ def __getitem__(self, item):
+ return self._dependencies_[item]
+
+ def update(self, dep_env_info, conan_ref=None):
+ if conan_ref is not None:
+ self._dependencies_[conan_ref.name] = dep_env_info
+ else:
+ self._dependencies_.update(dep_env_info.dependencies)
+
+ # If a var is already set, keep the existing value; lists are merged
+ for varname, value in dep_env_info.vars.items():
+ if varname not in self.vars:
+ self.vars[varname] = value
+ elif isinstance(self.vars[varname], list):
+ if isinstance(value, list):
+ self.vars[varname].extend(value)
+ else:
+ self.vars[varname].append(value)
+ else:
+ logger.warn("DISCARDED variable %s=%s from %s" % (varname, value, str(conan_ref)))
| diff --git a/conans/test/compile_helpers_test.py b/conans/test/compile_helpers_test.py
index 250ed449360..0df7e750be5 100644
--- a/conans/test/compile_helpers_test.py
+++ b/conans/test/compile_helpers_test.py
@@ -97,7 +97,16 @@ def configure_environment_test(self):
'"path/to/includes/lib2"')
# Not supported yet
env = ConfigureEnvironment(BuildInfoMock(), MockWinGccSettings())
- self.assertEquals(env.command_line, "")
+ self.assertEquals(env.command_line, 'env LIBS="-llib1 -llib2" LDFLAGS="-Lpath/to/lib1 '
+ '-Lpath/to/lib2 -llib1 -llib2 -m32" '
+ 'CFLAGS="-m32 cflag1 -s -DNDEBUG '
+ '-Ipath/to/includes/lib1 -Ipath/to/includes/lib2" '
+ 'CPPFLAGS="-m32 cppflag1 -s -DNDEBUG '
+ '-Ipath/to/includes/lib1 -Ipath/to/includes/lib2" '
+ 'C_INCLUDE_PATH="path/to/includes/lib1":'
+ '"path/to/includes/lib2" '
+ 'CPP_INCLUDE_PATH="path/to/includes/lib1":'
+ '"path/to/includes/lib2"')
def gcc_test(self):
gcc = GCC(MockLinuxSettings())
diff --git a/conans/test/integration/conan_env_test.py b/conans/test/integration/conan_env_test.py
new file mode 100644
index 00000000000..585bc2271e7
--- /dev/null
+++ b/conans/test/integration/conan_env_test.py
@@ -0,0 +1,60 @@
+import unittest
+from conans.test.tools import TestClient
+from conans.util.files import load
+import os
+import platform
+
+
+class ConanEnvTest(unittest.TestCase):
+
+ def conan_env_deps_test(self):
+ client = TestClient()
+ conanfile = '''
+from conans import ConanFile
+
+class HelloConan(ConanFile):
+ name = "Hello"
+ version = "0.1"
+ def package_info(self):
+ self.env_info.var1="bad value"
+ self.env_info.var2.append("value2")
+ self.env_info.var3="Another value"
+ self.env_info.path = "/dir"
+'''
+ files = {}
+ files["conanfile.py"] = conanfile
+ client.save(files)
+ client.run("export lasote/stable")
+ conanfile = '''
+from conans import ConanFile
+
+class HelloConan(ConanFile):
+ name = "Hello2"
+ version = "0.1"
+ def config(self):
+ self.requires("Hello/0.1@lasote/stable")
+
+ def package_info(self):
+ self.env_info.var1="good value"
+ self.env_info.var2.append("value3")
+ '''
+ files["conanfile.py"] = conanfile
+ client.save(files, clean_first=True)
+ client.run("export lasote/stable")
+ client.run("install Hello2/0.1@lasote/stable --build -g virtualenv")
+ ext = "bat" if platform.system() == "Windows" else "sh"
+ self.assertTrue(os.path.exists(os.path.join(client.current_folder, "activate.%s" % ext)))
+ self.assertTrue(os.path.exists(os.path.join(client.current_folder, "deactivate.%s" % ext)))
+ activate_contents = load(os.path.join(client.current_folder, "activate.%s" % ext))
+ deactivate_contents = load(os.path.join(client.current_folder, "deactivate.%s" % ext))
+ self.assertNotIn("bad value", activate_contents)
+ self.assertIn("var1=good value", activate_contents)
+ if platform.system() == "Windows":
+ self.assertIn("var2=value3;value2;%var2%", activate_contents)
+ else:
+ self.assertIn("var2=value3:value2:$var2", activate_contents)
+ self.assertIn("Another value", activate_contents)
+ self.assertIn("PATH=/dir", activate_contents)
+
+ self.assertIn('var1=', deactivate_contents)
+ self.assertIn('var2=', deactivate_contents)
diff --git a/conans/test/integration/conan_scopes_test.py b/conans/test/integration/conan_scopes_test.py
index 31615c2ed5d..0626abdbdfa 100644
--- a/conans/test/integration/conan_scopes_test.py
+++ b/conans/test/integration/conan_scopes_test.py
@@ -1,3 +1,4 @@
+
import unittest
from conans.test.tools import TestClient
from conans.util.files import load
diff --git a/conans/test/model/build_info_test.py b/conans/test/model/build_info_test.py
index c6599c9d140..12ecf9517cf 100644
--- a/conans/test/model/build_info_test.py
+++ b/conans/test/model/build_info_test.py
@@ -2,6 +2,7 @@
from conans.model.build_info import DepsCppInfo
from conans.client.generators import TXTGenerator
from collections import namedtuple
+from conans.model.env_info import DepsEnvInfo
class BuildInfoTest(unittest.TestCase):
@@ -12,6 +13,7 @@ def _equal(self, item1, item2):
getattr(item2, field))
def help_test(self):
+ deps_env_info = DepsEnvInfo()
deps_cpp_info = DepsCppInfo()
deps_cpp_info.includedirs.append("C:/whatever")
deps_cpp_info.includedirs.append("C:/whenever")
@@ -21,7 +23,7 @@ def help_test(self):
child.includedirs.append("F:/ChildrenPath")
child.cppflags.append("cxxmyflag")
deps_cpp_info._dependencies["Boost"] = child
- fakeconan = namedtuple("Conanfile", "deps_cpp_info cpp_info")
- output = TXTGenerator(fakeconan(deps_cpp_info, None)).content
+ fakeconan = namedtuple("Conanfile", "deps_cpp_info cpp_info deps_env_info env_info")
+ output = TXTGenerator(fakeconan(deps_cpp_info, None, deps_env_info, None)).content
deps_cpp_info2 = DepsCppInfo.loads(output)
self._equal(deps_cpp_info, deps_cpp_info2)
diff --git a/conans/test/model/env_info_test.py b/conans/test/model/env_info_test.py
new file mode 100644
index 00000000000..b738332f9ec
--- /dev/null
+++ b/conans/test/model/env_info_test.py
@@ -0,0 +1,35 @@
+import unittest
+from conans.model.env_info import DepsEnvInfo
+from conans.model.ref import ConanFileReference
+
+
+class EnvInfoTest(unittest.TestCase):
+
+ def assign_test(self):
+ env = DepsEnvInfo()
+ env.foo = "var"
+ env.foo.append("var2")
+ env.foo2 = "var3"
+ env.foo2 = "var4"
+ env.foo63 = "other"
+
+ self.assertEquals(env.vars, {"foo": ["var", "var2"], "foo2": "var4", "foo63": "other"})
+
+ def update_test(self):
+ env = DepsEnvInfo()
+ env.foo = "var"
+ env.foo.append("var2")
+ env.foo2 = "var3"
+ env.foo2 = "var4"
+ env.foo63 = "other"
+
+ env2 = DepsEnvInfo()
+ env2.foo = "new_value"
+ env2.foo2.append("not")
+ env2.foo3.append("var3")
+
+ env.update(env2, ConanFileReference.loads("pack/1.0@lasote/testing"))
+
+ self.assertEquals(env.vars, {"foo": ["var", "var2", "new_value"],
+ "foo2": "var4", "foo3": ["var3"],
+ "foo63": "other"})
diff --git a/conans/test/model/order_libs_test.py b/conans/test/model/order_libs_test.py
index 2dbc4d86323..5e70ef0ccc3 100644
--- a/conans/test/model/order_libs_test.py
+++ b/conans/test/model/order_libs_test.py
@@ -9,7 +9,7 @@
from conans.util.files import save
from conans.model.settings import Settings
from conans.test.utils.test_files import temp_folder
-from conans.client.installer import init_cpp_info
+from conans.client.installer import init_info_objects
from conans.model.scope import Scopes
@@ -86,8 +86,8 @@ def test_diamond_no_conflict(self):
self.retriever.conan("SDL2_ttf", ["freeType", "SDL2"])
root = self.retriever.root("MyProject", ["SDL2_ttf"])
deps_graph = self.builder.load(None, root)
- init_cpp_info(deps_graph, self.retriever)
- bylevel = deps_graph.propagate_buildinfo()
+ init_info_objects(deps_graph, self.retriever)
+ bylevel = deps_graph.propagate_info_objects()
E = bylevel[-1][0]
self.assertEqual(E.conanfile.deps_cpp_info.libs,
['SDL2_ttf', 'SDL2', 'rt', 'pthread', 'dl', 'freeType',
| [
{
"components": [
{
"doc": "takes the exports from upper level and updates the imports\nright now also the imports are propagated, but should be checked\nE.g. Conan A, depends on B. A=>B\nB exports an include directory \"my_dir\", with root \"/...../0123efe\"\nA imports are the exports of B, plus... | [
"conans/test/model/order_libs_test.py::ConanRequirementsTest::test_diamond_no_conflict"
] | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Feature/conan environ
I'm not happy with the install method in manager. Now it creates a conanfile.txt in a tmp file to emulate one when it receives a reference. It's necessary to process the reference as a dependency node in the graph because the generator needs the vars that the package reference declares.
About the rest, it's necessary to test virtualenv a lot on OSX and Linux, but basic usage is very good.
You can export this cmake recipe; it should work on Windows, but I've only tested it on Linux:
https://github.com/lasote/conan-cmake-installer
----------
</request>
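For context, here is a minimal sketch (assumed paths and names, not the generator's literal output) of what the activate/deactivate pair written by `VirtualEnvGenerator` does on Unix: activate captures the old values, sets the new ones plus a guard variable, and deactivate restores the originals.

```shell
# activate.sh (sketch): capture old values, set new ones, mark the shell
OLD_PATH="$PATH"
export PATH="/opt/mytool/bin:$PATH"   # prepend a package's bin dir
export _CONAN_VENV="$PWD"             # guard so a second activate is refused

# deactivate.sh (sketch): restore the captured values and clear the guard
export PATH="$OLD_PATH"
export _CONAN_VENV=
```

The guard variable is what lets `activate` refuse to nest and what `conan install` checks to ask the user to deactivate first.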
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in conans/client/deps_builder.py]
(definition of DepsGraph.propagate_info_objects:)
def propagate_info_objects(self):
"""takes the exports from upper level and updates the imports
right now also the imports are propagated, but should be checked
E.g. Conan A, depends on B. A=>B
B exports an include directory "my_dir", with root "/...../0123efe"
A imports are the exports of B, plus any other conans it depends on
A.imports.include_dirs = B.export.include_paths.
Note the difference, include_paths used to compute full paths as the user
defines export relative to its folder"""
[end of new definitions in conans/client/deps_builder.py]
[start of new definitions in conans/client/generators/virtualenv.py]
(definition of get_setenv_variables_commands:)
def get_setenv_variables_commands(deps_env_info, command_set=None):
(definition of get_dict_values:)
def get_dict_values(deps_env_info):
(definition of get_dict_values.adjust_var_name:)
def adjust_var_name(name):
(definition of VirtualEnvGenerator:)
class VirtualEnvGenerator(Generator):
(definition of VirtualEnvGenerator.filename:)
@property
def filename(self):
(definition of VirtualEnvGenerator.content:)
def content(self):
[end of new definitions in conans/client/generators/virtualenv.py]
[start of new definitions in conans/client/installer.py]
(definition of init_info_objects:)
def init_info_objects(deps_graph, paths):
"""Made external so it is independent of installer and can called
in testing too"""
[end of new definitions in conans/client/installer.py]
[start of new definitions in conans/client/loader.py]
(definition of ConanFileLoader.parse_conan_txt:)
def parse_conan_txt(self, contents, path, output):
[end of new definitions in conans/client/loader.py]
[start of new definitions in conans/model/conan_generator.py]
(definition of Generator.deps_env_info:)
def deps_env_info(self):
(definition of Generator.env_info:)
def env_info(self):
[end of new definitions in conans/model/conan_generator.py]
[start of new definitions in conans/model/env_info.py]
(definition of EnvInfo:)
class EnvInfo(object):
"""Object that stores all the environment variables required:
env = EnvInfo()
env.hola = True
env.Cosa.append("OTRO")
env.Cosa.append("MAS")
env.Cosa = "hello"
env.Cosa.append("HOLA")"""
(definition of EnvInfo.__init__:)
def __init__(self, root_folder=None):
(definition of EnvInfo.__getattr__:)
def __getattr__(self, name):
(definition of EnvInfo.__setattr__:)
def __setattr__(self, name, value):
(definition of EnvInfo.vars:)
def vars(self):
(definition of DepsEnvInfo:)
class DepsEnvInfo(EnvInfo):
"""All the env info for a conanfile dependencies
"""
(definition of DepsEnvInfo.__init__:)
def __init__(self):
(definition of DepsEnvInfo.dependencies:)
def dependencies(self):
(definition of DepsEnvInfo.deps:)
def deps(self):
(definition of DepsEnvInfo.__getitem__:)
def __getitem__(self, item):
(definition of DepsEnvInfo.update:)
def update(self, dep_env_info, conan_ref=None):
[end of new definitions in conans/model/env_info.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 4a5b19a75db9225316c8cb022a2dfb9705a2af34 | ||
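To make the `EnvInfo` semantics above concrete, here is a hypothetical, trimmed re-implementation (not the `conans` module itself) of the auto-list attribute behavior its docstring demonstrates: reading an unset attribute creates a list, assignment stores a scalar, and reading a scalar promotes it back to a list.

```python
class MiniEnvInfo(object):
    def __init__(self):
        self.__dict__["_values_"] = {}

    def __getattr__(self, name):
        # called only when normal lookup fails; auto-create a list,
        # or promote a previously assigned scalar to a list
        values = self.__dict__["_values_"]
        if not values.get(name):
            values[name] = []
        elif not isinstance(values[name], list):
            values[name] = [values[name]]
        return values[name]

    def __setattr__(self, name, value):
        # plain assignment overwrites whatever was collected before
        self.__dict__["_values_"][name] = value

    @property
    def vars(self):
        return self.__dict__["_values_"]


env = MiniEnvInfo()
env.PATH.append("/opt/a/bin")   # attribute auto-created as a list
env.PATH.append("/opt/b/bin")
env.CC = "gcc-5"                # assignment stores a scalar
env.CC.append("-m32")           # reading promotes the scalar to a list
```

Storing `_values_` directly in `__dict__` keeps `__setattr__` from recursing, mirroring the underscore-name special case in the real class.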
boto__boto3-605 | 605 | boto/boto3 | null | c0b2d38ccc9f09cbd4af5e05875b620ce2e581f7 | 2016-04-25T07:38:12Z | diff --git a/.changes/next-release/feature-DynamoDB.json b/.changes/next-release/feature-DynamoDB.json
new file mode 100644
index 0000000000..e7c6841edc
--- /dev/null
+++ b/.changes/next-release/feature-DynamoDB.json
@@ -0,0 +1,5 @@
+{
+ "category": "DynamoDB",
+ "type": "feature",
+ "description": "Add request auto de-duplication based on specified primary keys for batch_writer. (``#605 <https://github.com/boto/boto3/issues/605>`__ <https"
+}
\ No newline at end of file
diff --git a/boto3/dynamodb/table.py b/boto3/dynamodb/table.py
index a4b7f17cd2..6e32e190b9 100644
--- a/boto3/dynamodb/table.py
+++ b/boto3/dynamodb/table.py
@@ -29,7 +29,7 @@ class TableResource(object):
def __init__(self, *args, **kwargs):
super(TableResource, self).__init__(*args, **kwargs)
- def batch_writer(self):
+ def batch_writer(self, overwrite_by_pkeys=None):
"""Create a batch writer object.
This method creates a context manager for writing
@@ -39,7 +39,9 @@ def batch_writer(self):
in batches. In addition, the batch writer will also automatically
handle any unprocessed items and resend them as needed. All you need
to do is call ``put_item`` for any items you want to add, and
- ``delete_item`` for any items you want to delete.
+ ``delete_item`` for any items you want to delete. In addition, you can
+ specify ``overwrite_by_pkeys`` if the batch might contain duplicated requests
+ and you want this writer to handle de-duplication for you.
Example usage::
@@ -50,13 +52,18 @@ def batch_writer(self):
# You can also delete_items in a batch.
batch.delete_item(Key={'HashKey': 'SomeHashKey'})
+ :type overwrite_by_pkeys: list(string)
+ :param overwrite_by_pkeys: De-duplicate request items in buffer
+ if match new request item on specified primary keys. i.e
+ ``["partition_key1", "sort_key2", "sort_key3"]``
+
"""
- return BatchWriter(self.name, self.meta.client)
+ return BatchWriter(self.name, self.meta.client, overwrite_by_pkeys=overwrite_by_pkeys)
class BatchWriter(object):
"""Automatically handle batch writes to DynamoDB for a single table."""
- def __init__(self, table_name, client, flush_amount=25):
+ def __init__(self, table_name, client, flush_amount=25, overwrite_by_pkeys=None):
"""
:type table_name: str
@@ -78,21 +85,44 @@ def __init__(self, table_name, client, flush_amount=25):
a local buffer before sending a batch_write_item
request to DynamoDB.
+ :type overwrite_by_pkeys: list(string)
+ :param overwrite_by_pkeys: De-duplicate request items in buffer
+ if match new request item on specified primary keys. i.e
+ ``["partition_key1", "sort_key2", "sort_key3"]``
"""
self._table_name = table_name
self._client = client
self._items_buffer = []
self._flush_amount = flush_amount
+ self._overwrite_by_pkeys = overwrite_by_pkeys
def put_item(self, Item):
- self._items_buffer.append({'PutRequest': {'Item': Item}})
- self._flush_if_needed()
+ self._add_request_and_process({'PutRequest': {'Item': Item}})
def delete_item(self, Key):
- self._items_buffer.append({'DeleteRequest': {'Key': Key}})
+ self._add_request_and_process({'DeleteRequest': {'Key': Key}})
+
+ def _add_request_and_process(self, request):
+ if self._overwrite_by_pkeys:
+ self._remove_dup_pkeys_request_if_any(request)
+ logger.debug("With overwrite_by_pkeys enabled, de-duplicating buffered requests against: %s", request)
+ self._items_buffer.append(request)
self._flush_if_needed()
+ def _remove_dup_pkeys_request_if_any(self, request):
+ pkey_values_new = self._extract_pkey_values(request)
+ for item in self._items_buffer:
+ if self._extract_pkey_values(item) == pkey_values_new:
+ self._items_buffer.remove(item)
+
+ def _extract_pkey_values(self, request):
+ if request.get('PutRequest'):
+ return [ request['PutRequest']['Item'][key] for key in self._overwrite_by_pkeys ]
+ elif request.get('DeleteRequest'):
+ return [ request['DeleteRequest']['Key'][key] for key in self._overwrite_by_pkeys ]
+ return None
+
def _flush_if_needed(self):
if len(self._items_buffer) >= self._flush_amount:
self._flush()
diff --git a/docs/source/guide/dynamodb.rst b/docs/source/guide/dynamodb.rst
index bf58c506c7..ade7d636a2 100644
--- a/docs/source/guide/dynamodb.rst
+++ b/docs/source/guide/dynamodb.rst
@@ -274,6 +274,64 @@ table.
}
)
+The batch writer can help to de-duplicate requests by specifying ``overwrite_by_pkeys=['partition_key', 'sort_key']``
+if you want to bypass the no-duplication limitation of a single batch write request, which otherwise fails with
+``botocore.exceptions.ClientError: An error occurred (ValidationException) when calling the BatchWriteItem operation: Provided list of item keys contains duplicates``.
+
+It will drop request items already in the buffer if their (composite) primary key
+values match those of the newly added one, which is eventually consistent with a
+stream of individual put/delete operations on the same item.
+
+::
+
+ with table.batch_writer(overwrite_by_pkeys=['partition_key', 'sort_key']) as batch:
+ batch.put_item(
+ Item={
+ 'partition_key': 'p1',
+ 'sort_key': 's1',
+ 'other': '111',
+ }
+ )
+ batch.put_item(
+ Item={
+ 'partition_key': 'p1',
+ 'sort_key': 's1',
+ 'other': '222',
+ }
+ )
+ batch.delete_item(
+ Key={
+ 'partition_key': 'p1',
+ 'sort_key': 's2'
+ }
+ )
+ batch.put_item(
+ Item={
+ 'partition_key': 'p1',
+ 'sort_key': 's2',
+ 'other': '444',
+ }
+ )
+
+after de-duplicate:
+
+::
+
+ batch.put_item(
+ Item={
+ 'partition_key': 'p1',
+ 'sort_key': 's1',
+ 'other': '222',
+ }
+ )
+ batch.put_item(
+ Item={
+ 'partition_key': 'p1',
+ 'sort_key': 's2',
+ 'other': '444',
+ }
+ )
+
Querying and Scanning
---------------------
| diff --git a/tests/functional/docs/test_dynamodb.py b/tests/functional/docs/test_dynamodb.py
index 6bc22ceafe..b751e1fe4e 100644
--- a/tests/functional/docs/test_dynamodb.py
+++ b/tests/functional/docs/test_dynamodb.py
@@ -27,7 +27,7 @@ def test_batch_writer_is_documented(self):
self.assert_contains_lines_in_order([
'.. py:class:: DynamoDB.Table(name)',
' * :py:meth:`batch_writer()`',
- ' .. py:method:: batch_writer()'],
+ ' .. py:method:: batch_writer(overwrite_by_pkeys=None)'],
self.generated_contents
)
diff --git a/tests/unit/dynamodb/test_table.py b/tests/unit/dynamodb/test_table.py
index 127dce270d..66b8f335fb 100644
--- a/tests/unit/dynamodb/test_table.py
+++ b/tests/unit/dynamodb/test_table.py
@@ -285,3 +285,104 @@ def test_repeated_flushing_on_exit(self):
}
self.assert_batch_write_calls_are([first_batch, second_batch,
third_batch])
+
+ def test_auto_dedup_for_dup_requests(self):
+ with BatchWriter(self.table_name, self.client,
+ flush_amount=5, overwrite_by_pkeys=["pkey", "skey"]) as b:
+ # dup 1
+ b.put_item(Item={
+ 'pkey': 'foo1',
+ 'skey': 'bar1',
+ 'other': 'other1'
+ })
+ b.put_item(Item={
+ 'pkey': 'foo1',
+ 'skey': 'bar1',
+ 'other': 'other2'
+ })
+ # dup 2
+ b.delete_item(Key={
+ 'pkey': 'foo1',
+ 'skey': 'bar2',
+ })
+ b.put_item(Item={
+ 'pkey': 'foo1',
+ 'skey': 'bar2',
+ 'other': 'other3'
+ })
+ # dup 3
+ b.put_item(Item={
+ 'pkey': 'foo2',
+ 'skey': 'bar2',
+ 'other': 'other3'
+ })
+ b.delete_item(Key={
+ 'pkey': 'foo2',
+ 'skey': 'bar2',
+ })
+ # dup 4
+ b.delete_item(Key={
+ 'pkey': 'foo2',
+ 'skey': 'bar3',
+ })
+ b.delete_item(Key={
+ 'pkey': 'foo2',
+ 'skey': 'bar3',
+ })
+ # 5
+ b.delete_item(Key={
+ 'pkey': 'foo3',
+ 'skey': 'bar3',
+ })
+ # 2nd batch
+ b.put_item(Item={
+ 'pkey': 'foo1',
+ 'skey': 'bar1',
+ 'other': 'other1'
+ })
+ b.put_item(Item={
+ 'pkey': 'foo1',
+ 'skey': 'bar1',
+ 'other': 'other2'
+ })
+
+ first_batch = {
+ 'RequestItems': {
+ self.table_name: [
+ {'PutRequest': { 'Item': {
+ 'pkey': 'foo1',
+ 'skey': 'bar1',
+ 'other': 'other2'
+ }}},
+ {'PutRequest': { 'Item': {
+ 'pkey': 'foo1',
+ 'skey': 'bar2',
+ 'other': 'other3'
+ }}},
+ {'DeleteRequest': {'Key': {
+ 'pkey': 'foo2',
+ 'skey': 'bar2',
+ }}},
+ {'DeleteRequest': {'Key': {
+ 'pkey': 'foo2',
+ 'skey': 'bar3',
+ }}},
+ {'DeleteRequest': {'Key': {
+ 'pkey': 'foo3',
+ 'skey': 'bar3',
+ }}},
+ ]
+ }
+ }
+ second_batch = {
+ 'RequestItems': {
+ self.table_name: [
+ {'PutRequest': { 'Item': {
+ 'pkey': 'foo1',
+ 'skey': 'bar1',
+ 'other': 'other2'
+ }}},
+ ]
+ }
+ }
+ self.assert_batch_write_calls_are([first_batch, second_batch])
| diff --git a/.changes/next-release/feature-DynamoDB.json b/.changes/next-release/feature-DynamoDB.json
new file mode 100644
index 0000000000..e7c6841edc
--- /dev/null
+++ b/.changes/next-release/feature-DynamoDB.json
@@ -0,0 +1,5 @@
+{
+ "category": "DynamoDB",
+ "type": "feature",
+ "description": "Add request auto de-duplication based on specified primary keys for batch_writer. (``#605 <https://github.com/boto/boto3/issues/605>`__ <https"
+}
\ No newline at end of file
diff --git a/docs/source/guide/dynamodb.rst b/docs/source/guide/dynamodb.rst
index bf58c506c7..ade7d636a2 100644
--- a/docs/source/guide/dynamodb.rst
+++ b/docs/source/guide/dynamodb.rst
@@ -274,6 +274,64 @@ table.
}
)
+The batch writer can help to de-duplicate requests by specifying ``overwrite_by_pkeys=['partition_key', 'sort_key']``
+if you want to bypass the no-duplication limitation of a single batch write request, which otherwise fails with
+``botocore.exceptions.ClientError: An error occurred (ValidationException) when calling the BatchWriteItem operation: Provided list of item keys contains duplicates``.
+
+It will drop request items already in the buffer if their (composite) primary key
+values match those of the newly added one, which is eventually consistent with a
+stream of individual put/delete operations on the same item.
+
+::
+
+ with table.batch_writer(overwrite_by_pkeys=['partition_key', 'sort_key']) as batch:
+ batch.put_item(
+ Item={
+ 'partition_key': 'p1',
+ 'sort_key': 's1',
+ 'other': '111',
+ }
+ )
+ batch.put_item(
+ Item={
+ 'partition_key': 'p1',
+ 'sort_key': 's1',
+ 'other': '222',
+ }
+ )
+ batch.delete_item(
+ Key={
+ 'partition_key': 'p1',
+ 'sort_key': 's2'
+ }
+ )
+ batch.put_item(
+ Item={
+ 'partition_key': 'p1',
+ 'sort_key': 's2',
+ 'other': '444',
+ }
+ )
+
+after de-duplication:
+
+::
+
+ batch.put_item(
+ Item={
+ 'partition_key': 'p1',
+ 'sort_key': 's1',
+ 'other': '222',
+ }
+ )
+ batch.put_item(
+ Item={
+ 'partition_key': 'p1',
+ 'sort_key': 's2',
+ 'other': '444',
+ }
+ )
+
Querying and Scanning
---------------------
| [
{
"components": [
{
"doc": "",
"lines": [
106,
111
],
"name": "BatchWriter._add_request_and_process",
"signature": "def _add_request_and_process(self, request):",
"type": "function"
},
{
"doc": "",
"lines": [
... | [
"tests/functional/docs/test_dynamodb.py::TestDynamoDBCustomizations::test_batch_writer_is_documented",
"tests/unit/dynamodb/test_table.py::BaseTransformationTest::test_auto_dedup_for_dup_requests"
] | [
"tests/functional/docs/test_dynamodb.py::TestDynamoDBCustomizations::test_conditions_is_documented",
"tests/functional/docs/test_dynamodb.py::TestDynamoDBCustomizations::test_document_interface_is_documented",
"tests/unit/dynamodb/test_table.py::BaseTransformationTest::test_all_items_flushed_on_exit",
"tests/... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
dynamodb: add request auto de-duplication for batch_writer
# Motivation
For scenarios like parsing values from several sources (server logs, user-uploaded data) that might contain duplicates, and writing them to DynamoDB as unique values.
You want to bypass the no-duplication limitation of a single batch write request within `boto3` rather than adding another layer to deal with value duplication.
The no duplication error looks like
`botocore.exceptions.ClientError: An error occurred (ValidationException) when calling the BatchWriteItem operation: Provided list of item keys contains duplicates`.
# Solution
De-duplicate requests by specifying `auto_dedup=True`.
It will write out just a single request, since all requests in the example below are the same.
``` python
with table.batch_writer(auto_dedup=True) as batch:
for _ in range(50):
batch.put_item(
Item={
'account_type': 'anonymous',
'username': 'user',
'first_name': 'unknown',
'last_name': 'unknown'
}
)
```
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in boto3/dynamodb/table.py]
(definition of BatchWriter._add_request_and_process:)
def _add_request_and_process(self, request):
(definition of BatchWriter._remove_dup_pkeys_request_if_any:)
def _remove_dup_pkeys_request_if_any(self, request):
(definition of BatchWriter._extract_pkey_values:)
def _extract_pkey_values(self, request):
[end of new definitions in boto3/dynamodb/table.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 196a2da7490a1a661a0103b8770bd31e34e147f2 | |
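Editorial aside: the three `BatchWriter` helper definitions above can be sketched in a dependency-free way. Everything below (the `DedupBuffer` class and its attribute names) is hypothetical illustration, not boto3 code; it only mirrors the behavior the request describes, where a new request overwrites any buffered request with the same primary-key values.

```python
class DedupBuffer:
    """Hypothetical stand-in for the request buffer inside BatchWriter."""

    def __init__(self, overwrite_by_pkeys):
        self._overwrite_by_pkeys = overwrite_by_pkeys  # e.g. ['pkey', 'skey']
        self._items_buffer = []

    def _extract_pkey_values(self, request):
        # Both request shapes carry the key attributes, just under
        # different top-level names.
        if 'PutRequest' in request:
            return [request['PutRequest']['Item'][key]
                    for key in self._overwrite_by_pkeys]
        if 'DeleteRequest' in request:
            return [request['DeleteRequest']['Key'][key]
                    for key in self._overwrite_by_pkeys]
        return None

    def _remove_dup_pkeys_request_if_any(self, request):
        new_values = self._extract_pkey_values(request)
        # Keep only buffered requests whose primary-key values differ.
        self._items_buffer = [
            item for item in self._items_buffer
            if self._extract_pkey_values(item) != new_values
        ]

    def _add_request_and_process(self, request):
        if self._overwrite_by_pkeys:
            self._remove_dup_pkeys_request_if_any(request)
        self._items_buffer.append(request)


buf = DedupBuffer(overwrite_by_pkeys=['pkey', 'skey'])
buf._add_request_and_process(
    {'PutRequest': {'Item': {'pkey': 'p1', 'skey': 's1', 'other': '111'}}})
buf._add_request_and_process(
    {'PutRequest': {'Item': {'pkey': 'p1', 'skey': 's1', 'other': '222'}}})
print(len(buf._items_buffer))  # -> 1 (the first put was overwritten)
```

Rebuilding the buffer with a list comprehension avoids mutating the list while iterating over it, which would otherwise be an easy bug in this kind of remove-while-scanning logic.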
python-hyper__h2-204 | 204 | python-hyper/h2 | null | 2cdc7dde04a6ba42110841e1c843a4c7d2399eb6 | 2016-04-21T14:04:28Z | diff --git a/HISTORY.rst b/HISTORY.rst
index 3d50672d2..b171ee811 100644
--- a/HISTORY.rst
+++ b/HISTORY.rst
@@ -24,6 +24,8 @@ Bugfixes
~~~~~~~~
- Correctly forbid pseudo-headers that were not defined in RFC 7540.
+- Automatically ensure that all ``Authorization`` headers and short ``Cookie``
+ headers are prevented from being added to encoding contexts.
2.2.3 (2016-04-13)
------------------
diff --git a/h2/stream.py b/h2/stream.py
index 08296625e..db14abbb8 100644
--- a/h2/stream.py
+++ b/h2/stream.py
@@ -24,7 +24,8 @@
ProtocolError, StreamClosedError, InvalidBodyLengthError
)
from .utilities import (
- guard_increment_window, is_informational_response, authority_from_headers
+ guard_increment_window, is_informational_response, authority_from_headers,
+ secure_headers
)
@@ -988,8 +989,10 @@ def _build_headers_frames(self,
"""
Helper method to build headers or push promise frames.
"""
- # We need to lowercase the header names.
+ # We need to lowercase the header names, and to ensure that secure
+ # header fields are kept out of compression contexts.
headers = _lowercase_header_names(headers)
+ headers = secure_headers(headers)
encoded_headers = encoder.encode(headers)
# Slice into blocks of max_outbound_frame_size. Be careful with this:
diff --git a/h2/utilities.py b/h2/utilities.py
index 132eb3473..f770ad7da 100644
--- a/h2/utilities.py
+++ b/h2/utilities.py
@@ -7,6 +7,8 @@
"""
import re
+from hpack import NeverIndexedHeaderTuple
+
from .exceptions import ProtocolError, FlowControlError
UPPER_RE = re.compile(b"[A-Z]")
@@ -31,6 +33,32 @@
])
+def secure_headers(headers):
+ """
+ Certain headers are at risk of being attacked during the header compression
+ phase, and so need to be kept out of header compression contexts. This
+ function automatically transforms certain specific headers into HPACK
+ never-indexed fields to ensure they don't get added to header compression
+ contexts.
+
+ This function currently implements two rules:
+
+ - All 'authorization' headers are automatically made never-indexed.
+ - Any 'cookie' header field shorter than 20 bytes long is made
+ never-indexed.
+
+ These two fields are the most at-risk. These rules are inspired by Firefox
+ and nghttp2.
+ """
+ for header in headers:
+ if header[0] in (b'authorization', u'authorization'):
+ yield NeverIndexedHeaderTuple(*header)
+ elif header[0] in (b'cookie', u'cookie') and len(header[1]) < 20:
+ yield NeverIndexedHeaderTuple(*header)
+ else:
+ yield header
+
+
def is_informational_response(headers):
"""
Searches a header block for a :status header to confirm that a given
| diff --git a/test/test_header_indexing.py b/test/test_header_indexing.py
index ea4fa5e01..10ff42670 100644
--- a/test/test_header_indexing.py
+++ b/test/test_header_indexing.py
@@ -311,3 +311,291 @@ def test_header_tuples_are_decoded_push_promise(self,
assert isinstance(event, h2.events.PushedStreamReceived)
assert_header_blocks_actually_equal(headers, event.headers)
+
+
+class TestSecureHeaders(object):
+ """
+ Certain headers should always be transformed to their never-indexed form.
+ """
+ example_request_headers = [
+ (u':authority', u'example.com'),
+ (u':path', u'/'),
+ (u':scheme', u'https'),
+ (u':method', u'GET'),
+ ]
+ bytes_example_request_headers = [
+ (b':authority', b'example.com'),
+ (b':path', b'/'),
+ (b':scheme', b'https'),
+ (b':method', b'GET'),
+ ]
+ possible_auth_headers = [
+ (u'authorization', u'test'),
+ (u'Authorization', u'test'),
+ (u'authorization', u'really long test'),
+ HeaderTuple(u'authorization', u'test'),
+ HeaderTuple(u'Authorization', u'test'),
+ HeaderTuple(u'authorization', u'really long test'),
+ NeverIndexedHeaderTuple(u'authorization', u'test'),
+ NeverIndexedHeaderTuple(u'Authorization', u'test'),
+ NeverIndexedHeaderTuple(u'authorization', u'really long test'),
+ (b'authorization', b'test'),
+ (b'Authorization', b'test'),
+ (b'authorization', b'really long test'),
+ HeaderTuple(b'authorization', b'test'),
+ HeaderTuple(b'Authorization', b'test'),
+ HeaderTuple(b'authorization', b'really long test'),
+ NeverIndexedHeaderTuple(b'authorization', b'test'),
+ NeverIndexedHeaderTuple(b'Authorization', b'test'),
+ NeverIndexedHeaderTuple(b'authorization', b'really long test'),
+ ]
+ secured_cookie_headers = [
+ (u'cookie', u'short'),
+ (u'Cookie', u'short'),
+ (u'cookie', u'nineteen byte cooki'),
+ HeaderTuple(u'cookie', u'short'),
+ HeaderTuple(u'Cookie', u'short'),
+ HeaderTuple(u'cookie', u'nineteen byte cooki'),
+ NeverIndexedHeaderTuple(u'cookie', u'short'),
+ NeverIndexedHeaderTuple(u'Cookie', u'short'),
+ NeverIndexedHeaderTuple(u'cookie', u'nineteen byte cooki'),
+ NeverIndexedHeaderTuple(u'cookie', u'longer manually secured cookie'),
+ (b'cookie', b'short'),
+ (b'Cookie', b'short'),
+ (b'cookie', b'nineteen byte cooki'),
+ HeaderTuple(b'cookie', b'short'),
+ HeaderTuple(b'Cookie', b'short'),
+ HeaderTuple(b'cookie', b'nineteen byte cooki'),
+ NeverIndexedHeaderTuple(b'cookie', b'short'),
+ NeverIndexedHeaderTuple(b'Cookie', b'short'),
+ NeverIndexedHeaderTuple(b'cookie', b'nineteen byte cooki'),
+ NeverIndexedHeaderTuple(b'cookie', b'longer manually secured cookie'),
+ ]
+ unsecured_cookie_headers = [
+ (u'cookie', u'twenty byte cookie!!'),
+ (u'Cookie', u'twenty byte cookie!!'),
+ (u'cookie', u'substantially longer than 20 byte cookie'),
+ HeaderTuple(u'cookie', u'twenty byte cookie!!'),
+ HeaderTuple(u'cookie', u'twenty byte cookie!!'),
+ HeaderTuple(u'Cookie', u'twenty byte cookie!!'),
+ (b'cookie', b'twenty byte cookie!!'),
+ (b'Cookie', b'twenty byte cookie!!'),
+ (b'cookie', b'substantially longer than 20 byte cookie'),
+ HeaderTuple(b'cookie', b'twenty byte cookie!!'),
+ HeaderTuple(b'cookie', b'twenty byte cookie!!'),
+ HeaderTuple(b'Cookie', b'twenty byte cookie!!'),
+ ]
+
+ @pytest.mark.parametrize(
+ 'headers', (example_request_headers, bytes_example_request_headers)
+ )
+ @pytest.mark.parametrize('auth_header', possible_auth_headers)
+ def test_authorization_headers_never_indexed(self,
+ headers,
+ auth_header,
+ frame_factory):
+ """
+ Authorization headers are always forced to be never-indexed, regardless
+ of their form.
+ """
+ # Regardless of what we send, we expect it to be never indexed.
+ send_headers = headers + [auth_header]
+ expected_headers = headers + [
+ NeverIndexedHeaderTuple(auth_header[0].lower(), auth_header[1])
+ ]
+
+ c = h2.connection.H2Connection()
+ c.initiate_connection()
+
+ # Clear the data, then send headers.
+ c.clear_outbound_data_buffer()
+ c.send_headers(1, send_headers)
+
+ f = frame_factory.build_headers_frame(headers=expected_headers)
+ assert c.data_to_send() == f.serialize()
+
+ @pytest.mark.parametrize(
+ 'headers', (example_request_headers, bytes_example_request_headers)
+ )
+ @pytest.mark.parametrize('auth_header', possible_auth_headers)
+ def test_authorization_headers_never_indexed_push(self,
+ headers,
+ auth_header,
+ frame_factory):
+ """
+ Authorization headers are always forced to be never-indexed, regardless
+ of their form, when pushed by a server.
+ """
+ # Regardless of what we send, we expect it to be never indexed.
+ send_headers = headers + [auth_header]
+ expected_headers = headers + [
+ NeverIndexedHeaderTuple(auth_header[0].lower(), auth_header[1])
+ ]
+
+ c = h2.connection.H2Connection(client_side=False)
+ c.receive_data(frame_factory.preamble())
+
+ # We can use normal headers for the request.
+ f = frame_factory.build_headers_frame(
+ self.example_request_headers
+ )
+ c.receive_data(f.serialize())
+
+ frame_factory.refresh_encoder()
+ expected_frame = frame_factory.build_push_promise_frame(
+ stream_id=1,
+ promised_stream_id=2,
+ headers=expected_headers,
+ flags=['END_HEADERS'],
+ )
+
+ c.clear_outbound_data_buffer()
+ c.push_stream(
+ stream_id=1,
+ promised_stream_id=2,
+ request_headers=send_headers
+ )
+
+ assert c.data_to_send() == expected_frame.serialize()
+
+ @pytest.mark.parametrize(
+ 'headers', (example_request_headers, bytes_example_request_headers)
+ )
+ @pytest.mark.parametrize('cookie_header', secured_cookie_headers)
+ def test_short_cookie_headers_never_indexed(self,
+ headers,
+ cookie_header,
+ frame_factory):
+ """
+ Short cookie headers, and cookies provided as NeverIndexedHeaderTuple,
+ are never indexed.
+ """
+ # Regardless of what we send, we expect it to be never indexed.
+ send_headers = headers + [cookie_header]
+ expected_headers = headers + [
+ NeverIndexedHeaderTuple(cookie_header[0].lower(), cookie_header[1])
+ ]
+
+ c = h2.connection.H2Connection()
+ c.initiate_connection()
+
+ # Clear the data, then send headers.
+ c.clear_outbound_data_buffer()
+ c.send_headers(1, send_headers)
+
+ f = frame_factory.build_headers_frame(headers=expected_headers)
+ assert c.data_to_send() == f.serialize()
+
+ @pytest.mark.parametrize(
+ 'headers', (example_request_headers, bytes_example_request_headers)
+ )
+ @pytest.mark.parametrize('cookie_header', secured_cookie_headers)
+ def test_short_cookie_headers_never_indexed_push(self,
+ headers,
+ cookie_header,
+ frame_factory):
+ """
+ Short cookie headers, and cookies provided as NeverIndexedHeaderTuple,
+ are never indexed when pushed by servers.
+ """
+ # Regardless of what we send, we expect it to be never indexed.
+ send_headers = headers + [cookie_header]
+ expected_headers = headers + [
+ NeverIndexedHeaderTuple(cookie_header[0].lower(), cookie_header[1])
+ ]
+
+ c = h2.connection.H2Connection(client_side=False)
+ c.receive_data(frame_factory.preamble())
+
+ # We can use normal headers for the request.
+ f = frame_factory.build_headers_frame(
+ self.example_request_headers
+ )
+ c.receive_data(f.serialize())
+
+ frame_factory.refresh_encoder()
+ expected_frame = frame_factory.build_push_promise_frame(
+ stream_id=1,
+ promised_stream_id=2,
+ headers=expected_headers,
+ flags=['END_HEADERS'],
+ )
+
+ c.clear_outbound_data_buffer()
+ c.push_stream(
+ stream_id=1,
+ promised_stream_id=2,
+ request_headers=send_headers
+ )
+
+ assert c.data_to_send() == expected_frame.serialize()
+
+ @pytest.mark.parametrize(
+ 'headers', (example_request_headers, bytes_example_request_headers)
+ )
+ @pytest.mark.parametrize('cookie_header', unsecured_cookie_headers)
+ def test_long_cookie_headers_can_be_indexed(self,
+ headers,
+ cookie_header,
+ frame_factory):
+ """
+ Longer cookie headers can be indexed.
+ """
+ # Regardless of what we send, we expect it to be indexed.
+ send_headers = headers + [cookie_header]
+ expected_headers = headers + [
+ HeaderTuple(cookie_header[0].lower(), cookie_header[1])
+ ]
+
+ c = h2.connection.H2Connection()
+ c.initiate_connection()
+
+ # Clear the data, then send headers.
+ c.clear_outbound_data_buffer()
+ c.send_headers(1, send_headers)
+
+ f = frame_factory.build_headers_frame(headers=expected_headers)
+ assert c.data_to_send() == f.serialize()
+
+ @pytest.mark.parametrize(
+ 'headers', (example_request_headers, bytes_example_request_headers)
+ )
+ @pytest.mark.parametrize('cookie_header', unsecured_cookie_headers)
+ def test_long_cookie_headers_can_be_indexed_push(self,
+ headers,
+ cookie_header,
+ frame_factory):
+ """
+ Longer cookie headers can be indexed.
+ """
+ # Regardless of what we send, we expect it to be never indexed.
+ send_headers = headers + [cookie_header]
+ expected_headers = headers + [
+ HeaderTuple(cookie_header[0].lower(), cookie_header[1])
+ ]
+
+ c = h2.connection.H2Connection(client_side=False)
+ c.receive_data(frame_factory.preamble())
+
+ # We can use normal headers for the request.
+ f = frame_factory.build_headers_frame(
+ self.example_request_headers
+ )
+ c.receive_data(f.serialize())
+
+ frame_factory.refresh_encoder()
+ expected_frame = frame_factory.build_push_promise_frame(
+ stream_id=1,
+ promised_stream_id=2,
+ headers=expected_headers,
+ flags=['END_HEADERS'],
+ )
+
+ c.clear_outbound_data_buffer()
+ c.push_stream(
+ stream_id=1,
+ promised_stream_id=2,
+ request_headers=send_headers
+ )
+
+ assert c.data_to_send() == expected_frame.serialize()
| diff --git a/HISTORY.rst b/HISTORY.rst
index 3d50672d2..b171ee811 100644
--- a/HISTORY.rst
+++ b/HISTORY.rst
@@ -24,6 +24,8 @@ Bugfixes
~~~~~~~~
- Correctly forbid pseudo-headers that were not defined in RFC 7540.
+- Automatically ensure that all ``Authorization`` headers and short ``Cookie``
+ headers are prevented from being added to encoding contexts.
2.2.3 (2016-04-13)
------------------
| [
{
"components": [
{
"doc": "Certain headers are at risk of being attacked during the header compression\nphase, and so need to be kept out of header compression contexts. This\nfunction automatically transforms certain specific headers into HPACK\nnever-indexed fields to ensure they don't get adde... | [
"test/test_header_indexing.py::TestSecureHeaders::test_authorization_headers_never_indexed[auth_header0-headers0]",
"test/test_header_indexing.py::TestSecureHeaders::test_authorization_headers_never_indexed[auth_header0-headers1]",
"test/test_header_indexing.py::TestSecureHeaders::test_authorization_headers_nev... | [
"test/test_header_indexing.py::TestHeaderIndexing::test_sending_header_tuples[headers0]",
"test/test_header_indexing.py::TestHeaderIndexing::test_sending_header_tuples[headers1]",
"test/test_header_indexing.py::TestHeaderIndexing::test_sending_header_tuples[headers2]",
"test/test_header_indexing.py::TestHeade... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Automatically prevent some headers from being indexed.
Resolves #194.
This change implements the final part of #194: automatically preventing certain header fields from being indexed. This specifically implements the suggestion from @tatsuhiro-t: namely, all Authorization headers and short Cookie headers are forced to be never indexed.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in h2/utilities.py]
(definition of secure_headers:)
def secure_headers(headers):
"""Certain headers are at risk of being attacked during the header compression
phase, and so need to be kept out of header compression contexts. This
function automatically transforms certain specific headers into HPACK
never-indexed fields to ensure they don't get added to header compression
contexts.
This function currently implements two rules:
- All 'authorization' headers are automatically made never-indexed.
- Any 'cookie' header field shorter than 20 bytes long is made
never-indexed.
These two fields are the most at-risk. These rules are inspired by Firefox
and nghttp2."""
[end of new definitions in h2/utilities.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
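As an editorial illustration of the two rules in the docstring above, here is a dependency-free sketch. The `NeverIndexed` stand-in class is an assumption made so the example runs without `hpack` installed (the real code yields `hpack.NeverIndexedHeaderTuple`), and unlike the real `h2` code, which relies on names already being lowercased, this sketch lowercases them itself.

```python
class NeverIndexed(tuple):
    """Stand-in for hpack.NeverIndexedHeaderTuple (assumed, for a runnable demo)."""


def secure_headers(headers):
    # Mark at-risk fields so they are kept out of the HPACK compression context.
    for name, value in headers:
        if name.lower() in ('authorization', b'authorization'):
            yield NeverIndexed((name, value))
        elif name.lower() in ('cookie', b'cookie') and len(value) < 20:
            # Short cookies carry little entropy, so exact-match probing
            # against the compression table is cheap; never index them.
            yield NeverIndexed((name, value))
        else:
            yield (name, value)


out = list(secure_headers([
    ('authorization', 'Bearer abc'),
    ('cookie', 'short'),
    ('cookie', 'a cookie comfortably over twenty bytes'),
]))
print([type(h).__name__ for h in out])  # -> ['NeverIndexed', 'NeverIndexed', 'tuple']
```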
<<END>> | Here is the discussion in the issues of the pull request.
<issues>
[Discussion] What should we do about HPACK never-indexed header fields?
This is a complex issue and I may fork it out into multiple places once we've identified work items (if any), but I'd like to briefly talk about the problem.
[RFC 7541](https://tools.ietf.org/html/rfc7541) (the HPACK specification) provides support for what it calls "never indexed" header fields ([§ 6.2.3](https://tools.ietf.org/html/rfc7541#section-6.2.3)). These fields have certain restrictions, which exist to serve one specific goal:
> This representation is intended for protecting header field values that are not to be put at risk by compressing them.
The core reasoning is discussed at length in [RFC 7541 § 7.1](https://tools.ietf.org/html/rfc7541#section-7.1), but can be summarised as follows. It is possible for attackers to mount attacks similar to the [CRIME](https://en.wikipedia.org/wiki/CRIME) attack against the HPACK compression algorithm state. Put another way, if the attacker is capable of getting any entity that emits privacy-sensitive headers to emit headers of their own construction, they are potentially able to use the size of the responses to probe the compression state of the endpoint. That can expose users to the risk of having their credentials stolen: obviously very bad.
RFC 7541 points out that
> Attacks of this nature are possible any time that two mutually distrustful entities control requests or responses that are placed onto a single HTTP/2 connection.
The cases that worry me here are:
- Servers or clients that allow users to inject headers without validation.
- Intermediaries that coalesce connections in any way.
Happily, RFC 7541's "never indexed" literals exist to solve this problem. These header fields are sent in their literal form with one extra caveat: intermediaries MUST NOT translate them to any other form. That means that they never get added to the compression context of any HTTP/2 box in the network.
The purpose of this thread is to work out what hyper-h2 should do about this. The Python HPACK library has support for emitting headers in this form (since 1.1.0), and handles receiving them appropriately.
I have two questions:
1. How do we handle servers/clients needing to keep fields out of compression contexts? Do we give them explicit APIs to do it, or do we do it by default for specific fields (e.g. Authorization, Cookie)? Do we do both? If we give them explicit APIs, how should that API look?
2. What about middleboxes (e.g. mitmproxy)? Right now they aren't told about headers that are emitted with never indexed semantics, which means they aren't able to meet the requirements of RFC 7541. That clearly has to change.
I'd like to solicit answers to those questions from some people. I'm explicitly tagging the Hyper devs (@python-hyper/core), the mitmproxy devs (@kriechi, @mhils), and some other people who care about this sort of thing (@jimcarreer, @bagder, @tatsuhiro-t) to get your ideas about this.
----------
So the problem here is that it's really up to the developers to decide which headers should not be compressed for security reasons. While it is sane to assume that by default (with maybe a simple list/set that can be set by the hyper / hpack user) we should not compress Authorization or Cookie headers, really any header containing session-identifying information could be vulnerable to the attack. I've seen many a REST API, for example, using custom X- headers for storing session and authentication data. There would be no "catch all" for Hyper / HPACK as far as I can see, and since it's a library, providing that might be out of scope anyway.
Speaking as the nghttp2 maintainer: nghttp2 uses never-indexed for Authorization header fields, and for Cookie header fields whose value is less than 20 bytes. The 20-byte cookie rule comes from the Firefox codebase. Since we'd like to avoid exact matches, we care about short-entropy cookies here. Authorization is typically short, so it is always encoded as never-indexed.
nghttp2 offers the API that application can choose never indexed representation for particular header fields.
Thanks @tatsuhiro-t, that's extremely valuable feedback. It's also almost exactly what I was considering doing (except for the length limitation on cookies, which is a good idea).
This is just my 2 cents, I am a bit hesitant to support the 20 byte limit: pulling off a successful CRIME attack really comes down to how many requests can be made against the system that compresses the sensitive header. It is definitely a "brute force" guessing attack, but it is made easier by the fact that you should be able to make many simultaneous requests against the compressing host given the resources (like a large bot network). At the very very least I'd say it should be an additional option that can be changed with an API call (and in general maybe we want to extend the ability to have a maximum byte length for any header set by the user to never be indexed). I hope that makes sense?
For mitmproxy (or middleboxes in general), if we would like to follow the spec religiously, we'd need to get the metadata from the sending connection about which headers are never-indexed in order to feed that into the other connection. Ideally, we would just take something hyper-h2 produces (just the list of never-indexed headers?) and feed that as-is into another connection instance when sending headers.
@mhils @Lukasa Maybe what HPACK and eventually Hyper emits in terms of headers could also include an element specifying that it is not to be indexed? IIRC the hpack decoder takes care of determining which headers from the sender were not indexed but doesn't actually make that information available in any meaningful way. @Lukasa do you think it is something the table object could / should track?
> do you think it is something the table object could / should track?
The table object shouldn't track it: that's part of the purpose of the never indexed headers, they shouldn't reside in the compression tables in any way.
I'm beginning to wonder if we should transition to a "richer" representation of headers when generating them from HPACK, and then when working with them in hyper-h2. For example, we could use namedtuples that contain a value indicating whether they're indexed and emit those from HPACK and then work with them inside hyper-h2. That would allow us to attach richer metadata to each header (for now just the never-index flag) and provide the user with objects they can hook into to flag that appropriately.
The way I see it, we can go a few different ways with this:
1. Use named three-tuples (`namedtuple('HeaderField', ['name', 'value', 'never_index'])`) to emit all headers from HPACK. This breaks anyone expecting to be able to tuple-unpack the headers coming out of HPACK (which probably includes HPACK's test suites and possibly even HPACK itself!), but has the advantage of being isomorphic to the input HPACK takes.
2. At the HPACK level, alternate between two and three tuples (named or not) depending on whether the header field is never indexed. That's _truly_ isomorphic to the input HPACK takes but is also clearly stupid, so we'll immediately rule it out.
3. Use namedtuples that are always two-tuples and differ only by type to signal indexed/non-indexed status. E.g. `namedtuple('HeaderField', ['name', 'value'])` & `namedtuple('HeaderFieldNeverIndex', ['name', 'value'])`. That allows `isinstance` checking to determine the correct value. These would probably get passed through to hyper-h2 directly, which would emit them itself. hyper-h2 would also _accept_ these types (just safely because they're tuples).
4. Use a tuple subclass that is _not_ a namedtuple (it'd be hand-rolled to be as efficient as namedtuple though) that carries a flag on it indicating whether the header field is to be never indexed, but that does not include that header field in the tuple directly. That would meant that tuple unpacking still works, but that the flag can be accessed directly rather than having to do `isinstance` checks to get everything to work. hyper-h2 will also just pass these up in the place that the headers currently go, and will accept them internally.
Of the set of ideas I think I prefer 4, even though it requires the most work, because it enables us to keep most of the current behaviour the same and provides the nicest API for checking whether headers are safe to index. It also makes mitmproxy's job easier (it can pass the entire header block back into hyper-h2 and just automatically get the right behaviour), and allows us to provide a declarative API to consumers that want to mark headers as never indexed ("instantiate this special kind of tuple!").
Thoughts?
@Lukasa If I understand correctly, would we need to rework the interface for the encoder/decoder, or just the emissions of the decoder (HPACK), for option 4? I agree it sounds like the better solution.
The encoder would need to be reworked too, but it's fairly minor: we can enhance the encoder to replace three-tuples with these special tuples. Shouldn't take long at all. =)
So, the tuples would be implemented roughly like this:
``` python
class HeaderTuple(tuple):
__slots__ = ()
indexable = True
def __new__(_cls, *args):
return tuple.__new__(_cls, args)
class NeverIndexedHeaderTuple(HeaderTuple):
__slots__ = ()
indexable = False
```
This has a nice side-effect: because indexable isn't mutable (as no tuple fields are mutable), it becomes very difficult to accidentally break these header tuples. They also unpack exactly as one would expect, and altogether just behave properly.
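An editorial demonstration (not part of the original thread) of how the proposed tuple subclasses behave: they unpack like plain two-tuples while carrying an immutable `indexable` flag.

```python
# Mirrors the sketch in the comment above, plus a short usage demo.

class HeaderTuple(tuple):
    __slots__ = ()
    indexable = True

    def __new__(_cls, *args):
        return tuple.__new__(_cls, args)


class NeverIndexedHeaderTuple(HeaderTuple):
    __slots__ = ()
    indexable = False


h = NeverIndexedHeaderTuple('authorization', 'secret')
name, value = h           # unpacks like a plain two-tuple
print(name, h.indexable)  # -> authorization False
print(h == ('authorization', 'secret'))  # -> True, ordinary tuple equality
```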
Do you really need to define `__new__` for the `HeaderTuple`?
@sigmavirus24 To get it to behave like a tuple you do. The tuple constructor doesn't actually take multiple arguments, it takes a single iterable. That feels really awkward when what you want to write is `HeaderTuple(name, value)`.
Ok, hpack v2.2.0 has just been shipped with the brand new tuple classes. We can plumb support through hyper-h2 now.
This isn't done yet, silly GitHub.
--------------------
</issues> | 9df8f94ce983d44ef57c8f332463f7b3cbe0127b |
boto__boto3-573 | 573 | boto/boto3 | null | 9ffa82006b2f9c7eb55d57ae7ca632ed7adf3fa3 | 2016-03-29T17:32:59Z | diff --git a/boto3/session.py b/boto3/session.py
index 554958e171..fe1f79a639 100644
--- a/boto3/session.py
+++ b/boto3/session.py
@@ -164,6 +164,16 @@ def get_available_regions(self, service_name, partition_name='aws',
service_name=service_name, partition_name=partition_name,
allow_non_regional=allow_non_regional)
+ def get_credentials(self):
+ """
+ Return the :class:`botocore.credential.Credential` object
+ associated with this session. If the credentials have not
+ yet been loaded, this will attempt to load them. If they
+ have already been loaded, this will return the cached
+ credentials.
+ """
+ return self._session.get_credentials()
+
def client(self, service_name, region_name=None, api_version=None,
use_ssl=True, verify=None, endpoint_url=None,
aws_access_key_id=None, aws_secret_access_key=None,
| diff --git a/tests/unit/test_session.py b/tests/unit/test_session.py
index 9ae03d434c..bb5f116c4c 100644
--- a/tests/unit/test_session.py
+++ b/tests/unit/test_session.py
@@ -61,6 +61,29 @@ def test_credentials_can_be_set(self):
bc_session.set_credentials.assert_called_with(
'key', 'secret', 'token')
+ def test_can_get_credentials(self):
+ access_key = 'foo'
+ secret_key = 'bar'
+ token = 'baz'
+
+ creds = mock.Mock()
+ creds.access_key = access_key
+ creds.secret_key = secret_key
+ creds.token = token
+
+ bc_session = self.bc_session_cls.return_value
+ bc_session.get_credentials.return_value = creds
+
+ session = Session(
+ aws_access_key_id=access_key,
+ aws_secret_access_key=secret_key,
+ aws_session_token=token)
+
+ credentials = session.get_credentials()
+ self.assertEqual(credentials.access_key, access_key)
+ self.assertEqual(credentials.secret_key, secret_key)
+ self.assertEqual(credentials.token, token)
+
def test_profile_can_be_set(self):
bc_session = self.bc_session_cls.return_value
| [
{
"components": [
{
"doc": "Return the :class:`botocore.credential.Credential` object\nassociated with this session. If the credentials have not\nyet been loaded, this will attempt to load them. If they\nhave already been loaded, this will return the cached\ncredentials.",
"lines": [
... | [
"tests/unit/test_session.py::TestSession::test_can_get_credentials"
] | [
"tests/unit/test_session.py::TestSession::test_arguments_not_required",
"tests/unit/test_session.py::TestSession::test_bad_resource_name",
"tests/unit/test_session.py::TestSession::test_bad_resource_name_with_no_client_has_simple_err_msg",
"tests/unit/test_session.py::TestSession::test_can_access_region_name"... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Add method to get session credentials
We expose this in botocore, but not boto3.
cc @kyleknap @jamesls
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in boto3/session.py]
(definition of Session.get_credentials:)
def get_credentials(self):
"""Return the :class:`botocore.credential.Credential` object
associated with this session. If the credentials have not
yet been loaded, this will attempt to load them. If they
have already been loaded, this will return the cached
credentials."""
[end of new definitions in boto3/session.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 196a2da7490a1a661a0103b8770bd31e34e147f2 | ||
boto__boto3-554 | 554 | boto/boto3 | null | a0333f70860d76f31fddac6b7e5649f26a64dbec | 2016-03-18T23:06:44Z | diff --git a/boto3/resources/factory.py b/boto3/resources/factory.py
index 800679e672..b627855288 100644
--- a/boto3/resources/factory.py
+++ b/boto3/resources/factory.py
@@ -248,6 +248,26 @@ def _load_has_relations(self, attrs, resource_name, resource_model,
service_context=service_context
)
+ self._create_available_subresources_command(
+ attrs, resource_model.subresources)
+
+ def _create_available_subresources_command(self, attrs, subresources):
+ _subresources = [subresource.name for subresource in subresources]
+ _subresources = sorted(_subresources)
+
+ def get_available_subresources(factory_self):
+ """
+ Returns a list of all the available sub-resources for this
+ Resource.
+
+ :returns: A list containing the name of each sub-resource for this
+ resource
+ :rtype: list of str
+ """
+ return _subresources
+
+ attrs['get_available_subresources'] = get_available_subresources
+
def _load_waiters(self, attrs, resource_name, resource_model,
service_context):
"""
| diff --git a/tests/functional/test_resource.py b/tests/functional/test_resource.py
index f2656eb0ec..941bb8826f 100644
--- a/tests/functional/test_resource.py
+++ b/tests/functional/test_resource.py
@@ -39,7 +39,6 @@ def test_can_inject_method_onto_resource(self):
self.assertEqual(resource.my_method('anything'), 'anything')
-
class TestSessionErrorMessages(unittest.TestCase):
def test_has_good_error_message_when_no_resource(self):
bad_resource_name = 'doesnotexist'
@@ -48,3 +47,9 @@ def test_has_good_error_message_when_no_resource(self):
)
with self.assertRaisesRegexp(ResourceNotExistsError, err_regex):
boto3.resource(bad_resource_name)
+
+
+class TestGetAvailableSubresources(unittest.TestCase):
+ def test_s3_available_subresources_exists(self):
+ s3 = boto3.resource('s3')
+ self.assertTrue(hasattr(s3, 'get_available_subresources'))
diff --git a/tests/unit/resources/test_factory.py b/tests/unit/resources/test_factory.py
index ed2a22bc9b..98551487df 100644
--- a/tests/unit/resources/test_factory.py
+++ b/tests/unit/resources/test_factory.py
@@ -856,6 +856,13 @@ def test_contains_all_subresources(self):
self.assertIn('PriorityQueue', dir(resource))
self.assertIn('Message', dir(resource))
+ def test_get_available_subresources(self):
+ resource = self.load('test', self.model, self.defs)()
+ self.assertTrue(hasattr(resource, 'get_available_subresources'))
+ subresources = sorted(resource.get_available_subresources())
+ expected = sorted(['PriorityQueue', 'Message', 'QueueObject'])
+ self.assertEqual(subresources, expected)
+
def test_subresource_missing_all_subresources(self):
resource = self.load('test', self.model, self.defs)()
message = resource.Message('url', 'handle')
@@ -875,8 +882,9 @@ def test_event_emitted_when_class_created(self):
# Verify we send out the class attributes dict.
actual_class_attrs = sorted(call_args[1]['class_attributes'])
- self.assertEqual(actual_class_attrs,
- ['Message', 'PriorityQueue', 'QueueObject', 'meta'])
+ self.assertEqual(actual_class_attrs, [
+ 'Message', 'PriorityQueue', 'QueueObject',
+ 'get_available_subresources', 'meta'])
base_classes = sorted(call_args[1]['base_classes'])
self.assertEqual(base_classes, [ServiceResource])
| [
{
"components": [
{
"doc": "",
"lines": [
254,
269
],
"name": "ResourceFactory._create_available_subresources_command",
"signature": "def _create_available_subresources_command(self, attrs, subresources):",
"type": "function"
},
... | [
"tests/functional/test_resource.py::TestGetAvailableSubresources::test_s3_available_subresources_exists",
"tests/unit/resources/test_factory.py::TestServiceResourceSubresources::test_event_emitted_when_class_created",
"tests/unit/resources/test_factory.py::TestServiceResourceSubresources::test_get_available_sub... | [
"tests/functional/test_resource.py::TestResourceCustomization::test_can_inject_method_onto_resource",
"tests/functional/test_resource.py::TestSessionErrorMessages::test_has_good_error_message_when_no_resource",
"tests/unit/resources/test_factory.py::TestResourceFactory::test_can_instantiate_service_resource",
... | This is a feature request that requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
Add get_available_subresources to Resources
We currently have the ability to inspect what resources are available to a session, but not what subresources are available to a resource. This adds a method to do that on each resource class.
Resolves #113
This wasn't super high priority, but it was a very simple change.
cc @jamesls @kyleknap
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in boto3/resources/factory.py]
(definition of ResourceFactory._create_available_subresources_command:)
def _create_available_subresources_command(self, attrs, subresources):
(definition of ResourceFactory._create_available_subresources_command.get_available_subresources:)
def get_available_subresources(factory_self):
"""Returns a list of all the available sub-resources for this
Resource.
:returns: A list containing the name of each sub-resource for this
resource
:rtype: list of str"""
[end of new definitions in boto3/resources/factory.py]
</definitions>
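A simplified, self-contained sketch of what the factory change does: snapshot the sub-resource names, sort them, and attach a closure to the class attributes dict before the class is built. Resource models are omitted and the names below are illustrative; the real version iterates `subresource.name` over model objects and names its closure parameter `factory_self`.

```python
def create_available_subresources_command(attrs, subresource_names):
    # Sort a snapshot of the names and close over it, mirroring the
    # factory change in the diff above (simplified; no resource models).
    names = sorted(subresource_names)

    def get_available_subresources(self):
        """Return the sorted sub-resource names for this resource."""
        return names

    attrs['get_available_subresources'] = get_available_subresources


attrs = {}
create_available_subresources_command(
    attrs, ['QueueObject', 'Message', 'PriorityQueue'])
Resource = type('Resource', (), attrs)
print(Resource().get_available_subresources())
# ['Message', 'PriorityQueue', 'QueueObject']
```

Because the closure captures a pre-sorted list, every instance of the generated class reports the same stable ordering, which is what the `test_get_available_subresources` test asserts.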
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | Here is the discussion in the issues of the pull request.
<issues>
Question about introspection
I can easily find the available service resources give a session:
```
>>> import boto3.session
>>> session = boto3.session.Session(profile_name='prod')
>>> session.available_resources
['cloudformation',
'dynamodb',
'ec2',
'glacier',
'iam',
'opsworks',
's3',
'sns',
'sqs']
```
But is there a way to find the available resources for a given service resource? I.E. is there some way to get the equivalent of:
```
>>> ec2 = session.resource('ec2')
>>> ec2.available_resources
```
I've been going through the code for a while and can't really figure out an easy way to do this.
----------
Unfortunately, there is nothing equivalent to `resource.available_resources`. I think that this would be helpful. Marking as feature request.
One way you could go about figuring the available resources for now is using the `inspect` module to figure out the resources members and then look for any member that begins with a capitol letter. Those are usually the resources.
Thanks. I can also find what I'm looking for by accessing the `_loader` private attribute of the client and then loading the service model but, of course, I shouldn't be mucking around with those private attributes.
--------------------
</issues> | 196a2da7490a1a661a0103b8770bd31e34e147f2 | |
docker__docker-py-956 | 956 | docker/docker-py | null | 7befe694bd21e3c54bb1d7825270ea4bd6864c13 | 2016-02-24T14:41:06Z | diff --git a/docker/__init__.py b/docker/__init__.py
index 3844c81ac8..84d0734f51 100644
--- a/docker/__init__.py
+++ b/docker/__init__.py
@@ -17,4 +17,4 @@
__version__ = version
__title__ = 'docker-py'
-from .client import Client, AutoVersionClient # flake8: noqa
+from .client import Client, AutoVersionClient, from_env # flake8: noqa
diff --git a/docker/client.py b/docker/client.py
index 7d1f7c4669..5f60a328fd 100644
--- a/docker/client.py
+++ b/docker/client.py
@@ -28,10 +28,14 @@
from .auth import auth
from .unixconn import unixconn
from .ssladapter import ssladapter
-from .utils import utils, check_resource, update_headers
+from .utils import utils, check_resource, update_headers, kwargs_from_env
from .tls import TLSConfig
+def from_env(**kwargs):
+ return Client.from_env(**kwargs)
+
+
class Client(
requests.Session,
api.BuildApiMixin,
@@ -84,6 +88,10 @@ def __init__(self, base_url=None, version=None,
)
)
+ @classmethod
+ def from_env(cls, **kwargs):
+ return cls(**kwargs_from_env(**kwargs))
+
def _retrieve_server_version(self):
try:
return self.version(api_version=False)["ApiVersion"]
diff --git a/docs/boot2docker.md b/docs/boot2docker.md
deleted file mode 100644
index 4854e41425..0000000000
--- a/docs/boot2docker.md
+++ /dev/null
@@ -1,38 +0,0 @@
-# Using with Boot2docker
-
-For usage with boot2docker, there is a helper function in the utils package named `kwargs_from_env`, it will pass any environment variables from Boot2docker to the Client.
-
-First run boot2docker in your shell:
-```bash
-$ eval "$(boot2docker shellinit)"
-Writing /Users/you/.boot2docker/certs/boot2docker-vm/ca.pem
-Writing /Users/you/.boot2docker/certs/boot2docker-vm/cert.pem
-Writing /Users/you/.boot2docker/certs/boot2docker-vm/key.pem
-```
-
-You can then instantiate `docker.Client` like this:
-```python
-from docker.client import Client
-from docker.utils import kwargs_from_env
-
-cli = Client(**kwargs_from_env())
-print cli.version()
-```
-
-If you're encountering the following error:
-`SSLError: hostname '192.168.59.103' doesn't match 'boot2docker'`, you can:
-
-1. Add an entry to your /etc/hosts file matching boot2docker to the daemon's IP
-1. disable hostname validation (but please consider the security implications
- in doing this)
-
-```python
-from docker.client import Client
-from docker.utils import kwargs_from_env
-
-kwargs = kwargs_from_env()
-kwargs['tls'].assert_hostname = False
-
-cli = Client(**kwargs)
-print cli.version()
-```
\ No newline at end of file
diff --git a/docs/machine.md b/docs/machine.md
new file mode 100644
index 0000000000..d38316efc6
--- /dev/null
+++ b/docs/machine.md
@@ -0,0 +1,19 @@
+# Using with Docker Toolbox and Machine
+
+In development, we recommend using [Docker Toolbox](https://www.docker.com/products/docker-toolbox) to set up Docker. It includes a tool called Machine which will create a VM running Docker Engine and point your shell at it using environment variables.
+
+To configure docker-py with these environment variables
+
+First use Machine to set up the environment variables:
+```bash
+$ eval "$(docker-machine env)"
+```
+
+You can then use docker-py like this:
+```python
+import docker
+client = docker.from_env(assert_hostname=False)
+print client.version()
+```
+
+**Note:** We are disabling TLS hostname checking with `assert\_hostname=False`. Machine provides us with the exact certificate the server is using so this is safe. If you are not using Machine and verifying the host against a certificate authority, you'll want to enable this.
| diff --git a/tests/unit/client_test.py b/tests/unit/client_test.py
new file mode 100644
index 0000000000..1a173b5cc7
--- /dev/null
+++ b/tests/unit/client_test.py
@@ -0,0 +1,26 @@
+import os
+from docker.client import Client
+from .. import base
+
+TEST_CERT_DIR = os.path.join(
+ os.path.dirname(__file__),
+ 'testdata/certs',
+)
+
+
+class ClientTest(base.BaseTestCase):
+ def setUp(self):
+ self.os_environ = os.environ.copy()
+
+ def tearDown(self):
+ os.environ = self.os_environ
+
+ def test_from_env(self):
+ """Test that environment variables are passed through to
+ utils.kwargs_from_env(). KwargsFromEnvTest tests that environment
+ variables are parsed correctly."""
+ os.environ.update(DOCKER_HOST='tcp://192.168.59.103:2376',
+ DOCKER_CERT_PATH=TEST_CERT_DIR,
+ DOCKER_TLS_VERIFY='1')
+ client = Client.from_env()
+ self.assertEqual(client.base_url, "https://192.168.59.103:2376")
| diff --git a/docs/boot2docker.md b/docs/boot2docker.md
deleted file mode 100644
index 4854e41425..0000000000
--- a/docs/boot2docker.md
+++ /dev/null
@@ -1,38 +0,0 @@
-# Using with Boot2docker
-
-For usage with boot2docker, there is a helper function in the utils package named `kwargs_from_env`, it will pass any environment variables from Boot2docker to the Client.
-
-First run boot2docker in your shell:
-```bash
-$ eval "$(boot2docker shellinit)"
-Writing /Users/you/.boot2docker/certs/boot2docker-vm/ca.pem
-Writing /Users/you/.boot2docker/certs/boot2docker-vm/cert.pem
-Writing /Users/you/.boot2docker/certs/boot2docker-vm/key.pem
-```
-
-You can then instantiate `docker.Client` like this:
-```python
-from docker.client import Client
-from docker.utils import kwargs_from_env
-
-cli = Client(**kwargs_from_env())
-print cli.version()
-```
-
-If you're encountering the following error:
-`SSLError: hostname '192.168.59.103' doesn't match 'boot2docker'`, you can:
-
-1. Add an entry to your /etc/hosts file matching boot2docker to the daemon's IP
-1. disable hostname validation (but please consider the security implications
- in doing this)
-
-```python
-from docker.client import Client
-from docker.utils import kwargs_from_env
-
-kwargs = kwargs_from_env()
-kwargs['tls'].assert_hostname = False
-
-cli = Client(**kwargs)
-print cli.version()
-```
\ No newline at end of file
diff --git a/docs/machine.md b/docs/machine.md
new file mode 100644
index 0000000000..d38316efc6
--- /dev/null
+++ b/docs/machine.md
@@ -0,0 +1,19 @@
+# Using with Docker Toolbox and Machine
+
+In development, we recommend using [Docker Toolbox](https://www.docker.com/products/docker-toolbox) to set up Docker. It includes a tool called Machine which will create a VM running Docker Engine and point your shell at it using environment variables.
+
+To configure docker-py with these environment variables
+
+First use Machine to set up the environment variables:
+```bash
+$ eval "$(docker-machine env)"
+```
+
+You can then use docker-py like this:
+```python
+import docker
+client = docker.from_env(assert_hostname=False)
+print client.version()
+```
+
+**Note:** We are disabling TLS hostname checking with `assert\_hostname=False`. Machine provides us with the exact certificate the server is using so this is safe. If you are not using Machine and verifying the host against a certificate authority, you'll want to enable this.
| [
{
"components": [
{
"doc": "",
"lines": [
35,
36
],
"name": "from_env",
"signature": "def from_env(**kwargs):",
"type": "function"
},
{
"doc": "",
"lines": [
92,
93
],
"name"... | [
"tests/unit/client_test.py::ClientTest::test_from_env"
] | [] | This is a feature request that requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
Add docker.from_env() shortcut
A much neater way of getting started with docker-py. Also updated the boot2docker docs to reflect the current Toolbox/Machine world.
A question: are the security concerns of doing `assert_hostname=False` true? The client is validating the specific cert that the server has, I think, so there is no need to also check the hostname. I think we would still have to specifically say "if you are using Machine, this is safe", because in the context of using a normal CA cert it would not be safe.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in docker/client.py]
(definition of from_env:)
def from_env(**kwargs):
(definition of Client.from_env:)
def from_env(cls, **kwargs):
[end of new definitions in docker/client.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 26753c81defff28a1a38a34788e9653c8eb87c3d | |
python-hyper__h2-93 | 93 | python-hyper/h2 | null | 5a6a70caf257d55a45af76412f475151435d7d27 | 2016-01-11T10:52:42Z | diff --git a/docs/source/api.rst b/docs/source/api.rst
index 4b9ffd158..3e5f3c1c8 100644
--- a/docs/source/api.rst
+++ b/docs/source/api.rst
@@ -59,12 +59,15 @@ Exceptions
----------
.. autoclass:: h2.exceptions.H2Error
+ :show-inheritance:
:members:
.. autoclass:: h2.exceptions.NoSuchStreamError
+ :show-inheritance:
:members:
.. autoclass:: h2.exceptions.StreamClosedError
+ :show-inheritance:
:members:
@@ -72,21 +75,31 @@ Protocol Errors
~~~~~~~~~~~~~~~
.. autoclass:: h2.exceptions.ProtocolError
+ :show-inheritance:
:members:
.. autoclass:: h2.exceptions.FrameTooLargeError
+ :show-inheritance:
:members:
.. autoclass:: h2.exceptions.TooManyStreamsError
+ :show-inheritance:
:members:
.. autoclass:: h2.exceptions.FlowControlError
+ :show-inheritance:
:members:
.. autoclass:: h2.exceptions.StreamIDTooLowError
+ :show-inheritance:
:members:
.. autoclass:: h2.exceptions.InvalidSettingsValueError
+ :show-inheritance:
+ :members:
+
+.. autoclass:: h2.exceptions.NoAvailableStreamIDError
+ :show-inheritance:
:members:
diff --git a/h2/connection.py b/h2/connection.py
index 4b0b1bf23..12304c133 100644
--- a/h2/connection.py
+++ b/h2/connection.py
@@ -21,7 +21,8 @@
)
from .exceptions import (
ProtocolError, NoSuchStreamError, FlowControlError, FrameTooLargeError,
- TooManyStreamsError, StreamClosedError, StreamIDTooLowError
+ TooManyStreamsError, StreamClosedError, StreamIDTooLowError,
+ NoAvailableStreamIDError
)
from .frame_buffer import FrameBuffer
from .settings import Settings
@@ -231,6 +232,9 @@ class H2Connection(object):
# chosen.
DEFAULT_MAX_INBOUND_FRAME_SIZE = 2**24
+ # The highest acceptable stream ID.
+ HIGHEST_ALLOWED_STREAM_ID = 2**31 - 1
+
def __init__(self, client_side=True):
self.state_machine = H2ConnectionStateMachine()
self.streams = {}
@@ -412,6 +416,37 @@ def get_stream_by_id(self, stream_id):
else:
raise StreamClosedError(stream_id)
+ def get_next_available_stream_id(self):
+ """
+ Returns an integer suitable for use as the stream ID for the next
+ stream created by this endpoint. For server endpoints, this stream ID
+ will be even. For client endpoints, this stream ID will be odd. If no
+ stream IDs are available, raises :class:`NoAvailableStreamIDError
+ <h2.exceptions.NoAvailableStreamIDError>`.
+
+ .. warning: The return value from this function does not change until
+ the stream ID has actually been used by sending or pushing
+ headers on that stream. For that reason, it should be
+ called as close as possible to the actual use of the stream
+ ID.
+
+ :raises: :class:`NoAvailableStreamIDError
+ <h2.exceptions.NoAvailableStreamIDError>`
+ :returns: The next free stream ID this peer can use to initiate a
+ stream.
+ :rtype: ``int``
+ """
+ # No streams have been opened yet, so return the lowest allowed stream
+ # ID.
+ if not self.highest_outbound_stream_id:
+ return 1 if self.client_side else 2
+
+ next_stream_id = self.highest_outbound_stream_id + 2
+ if next_stream_id > self.HIGHEST_ALLOWED_STREAM_ID:
+ raise NoAvailableStreamIDError("Exhausted allowed stream IDs")
+
+ return next_stream_id
+
def send_headers(self, stream_id, headers, end_stream=False):
"""
Send headers on a given stream.
diff --git a/h2/exceptions.py b/h2/exceptions.py
index 9d89da8e3..6c0cbe473 100644
--- a/h2/exceptions.py
+++ b/h2/exceptions.py
@@ -69,6 +69,14 @@ def __str__(self):
)
+class NoAvailableStreamIDError(ProtocolError, ValueError):
+ """
+ There are no available stream IDs left to the connection. All stream IDs
+ have been exhausted.
+ """
+ pass
+
+
class NoSuchStreamError(H2Error):
"""
A stream-specific action referenced a stream that does not exist.
| diff --git a/test/test_utility_functions.py b/test/test_utility_functions.py
new file mode 100644
index 000000000..d934c88f4
--- /dev/null
+++ b/test/test_utility_functions.py
@@ -0,0 +1,153 @@
+# -*- coding: utf-8 -*-
+"""
+test_utility_functions
+~~~~~~~~~~~~~~~~~~~~~~
+
+Tests for the various utility functions provided by hyper-h2.
+"""
+import pytest
+
+import h2.connection
+import h2.errors
+import h2.events
+import h2.exceptions
+
+# These tests require a non-list-returning range function.
+try:
+ range = xrange
+except NameError:
+ range = range
+
+
+class TestGetNextAvailableStreamID(object):
+ """
+ Tests for the ``H2Connection.get_next_available_stream_id`` method.
+ """
+ example_request_headers = [
+ (':authority', 'example.com'),
+ (':path', '/'),
+ (':scheme', 'https'),
+ (':method', 'GET'),
+ ]
+ example_response_headers = [
+ (':status', '200'),
+ ('server', 'fake-serv/0.1.0')
+ ]
+
+ def test_returns_correct_sequence_for_clients(self, frame_factory):
+ """
+ For a client connection, the correct sequence of stream IDs is
+ returned.
+ """
+ # Running the exhaustive version of this test (all 1 billion available
+ # stream IDs) is too painful. For that reason, we validate that the
+ # original sequence is right for the first few thousand, and then just
+ # check that it terminates properly.
+ #
+ # Make sure that the streams get cleaned up: 8k streams floating
+ # around would make this test memory-hard, and it's not supposed to be
+ # a test of how much RAM your machine has.
+ c = h2.connection.H2Connection(client_side=True)
+ c.initiate_connection()
+ initial_sequence = range(1, 2**13, 2)
+
+ for expected_stream_id in initial_sequence:
+ stream_id = c.get_next_available_stream_id()
+ assert stream_id == expected_stream_id
+
+ c.send_headers(
+ stream_id=stream_id,
+ headers=self.example_request_headers,
+ end_stream=True
+ )
+ f = frame_factory.build_headers_frame(
+ headers=self.example_response_headers,
+ stream_id=stream_id,
+ flags=['END_STREAM'],
+ )
+ c.receive_data(f.serialize())
+ c.clear_outbound_data_buffer()
+
+ # Jump up to the last available stream ID. Don't clean up the stream
+ # here because who cares about one stream.
+ last_client_id = 2**31 - 1
+ c.send_headers(
+ stream_id=last_client_id,
+ headers=self.example_request_headers,
+ end_stream=True
+ )
+
+ with pytest.raises(h2.exceptions.NoAvailableStreamIDError):
+ c.get_next_available_stream_id()
+
+ def test_returns_correct_sequence_for_servers(self, frame_factory):
+ """
+ For a server connection, the correct sequence of stream IDs is
+ returned.
+ """
+ # Running the exhaustive version of this test (all 1 billion available
+ # stream IDs) is too painful. For that reason, we validate that the
+ # original sequence is right for the first few thousand, and then just
+ # check that it terminates properly.
+ #
+ # Make sure that the streams get cleaned up: 8k streams floating
+ # around would make this test memory-hard, and it's not supposed to be
+ # a test of how much RAM your machine has.
+ c = h2.connection.H2Connection(client_side=False)
+ c.initiate_connection()
+ c.receive_data(frame_factory.preamble())
+ f = frame_factory.build_headers_frame(
+ headers=self.example_request_headers
+ )
+ c.receive_data(f.serialize())
+
+ initial_sequence = range(2, 2**13, 2)
+
+ for expected_stream_id in initial_sequence:
+ stream_id = c.get_next_available_stream_id()
+ assert stream_id == expected_stream_id
+
+ c.push_stream(
+ stream_id=1,
+ promised_stream_id=stream_id,
+ request_headers=self.example_request_headers
+ )
+ c.send_headers(
+ stream_id=stream_id,
+ headers=self.example_response_headers,
+ end_stream=True
+ )
+ c.clear_outbound_data_buffer()
+
+ # Jump up to the last available stream ID. Don't clean up the stream
+ # here because who cares about one stream.
+ last_server_id = 2**31 - 2
+ c.push_stream(
+ stream_id=1,
+ promised_stream_id=last_server_id,
+ request_headers=self.example_request_headers,
+ )
+
+ with pytest.raises(h2.exceptions.NoAvailableStreamIDError):
+ c.get_next_available_stream_id()
+
+ def test_does_not_increment_without_stream_send(self):
+ """
+ If a new stream isn't actually created, the next stream ID doesn't
+ change.
+ """
+ c = h2.connection.H2Connection()
+ c.initiate_connection()
+
+ first_stream_id = c.get_next_available_stream_id()
+ second_stream_id = c.get_next_available_stream_id()
+
+ assert first_stream_id == second_stream_id
+
+ c.send_headers(
+ stream_id=first_stream_id,
+ headers=self.example_request_headers
+ )
+
+ third_stream_id = c.get_next_available_stream_id()
+ assert third_stream_id == (first_stream_id + 2)
| diff --git a/docs/source/api.rst b/docs/source/api.rst
index 4b9ffd158..3e5f3c1c8 100644
--- a/docs/source/api.rst
+++ b/docs/source/api.rst
@@ -59,12 +59,15 @@ Exceptions
----------
.. autoclass:: h2.exceptions.H2Error
+ :show-inheritance:
:members:
.. autoclass:: h2.exceptions.NoSuchStreamError
+ :show-inheritance:
:members:
.. autoclass:: h2.exceptions.StreamClosedError
+ :show-inheritance:
:members:
@@ -72,21 +75,31 @@ Protocol Errors
~~~~~~~~~~~~~~~
.. autoclass:: h2.exceptions.ProtocolError
+ :show-inheritance:
:members:
.. autoclass:: h2.exceptions.FrameTooLargeError
+ :show-inheritance:
:members:
.. autoclass:: h2.exceptions.TooManyStreamsError
+ :show-inheritance:
:members:
.. autoclass:: h2.exceptions.FlowControlError
+ :show-inheritance:
:members:
.. autoclass:: h2.exceptions.StreamIDTooLowError
+ :show-inheritance:
:members:
.. autoclass:: h2.exceptions.InvalidSettingsValueError
+ :show-inheritance:
+ :members:
+
+.. autoclass:: h2.exceptions.NoAvailableStreamIDError
+ :show-inheritance:
:members:
| [
{
"components": [
{
"doc": "Returns an integer suitable for use as the stream ID for the next\nstream created by this endpoint. For server endpoints, this stream ID\nwill be even. For client endpoints, this stream ID will be odd. If no\nstream IDs are available, raises :class:`NoAvailableStreamIDE... | [
"test/test_utility_functions.py::TestGetNextAvailableStreamID::test_returns_correct_sequence_for_clients",
"test/test_utility_functions.py::TestGetNextAvailableStreamID::test_returns_correct_sequence_for_servers",
"test/test_utility_functions.py::TestGetNextAvailableStreamID::test_does_not_increment_without_str... | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Add get_next_available_stream_id function.
This function returns the next available stream ID for the given endpoint type, client or server. It makes it moderately easier to write a hyper-h2 server, even though it's not strictly necessary, and it also provides useful information about overly large stream IDs.
Resolves #91.
/cc @Kriechi
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in h2/connection.py]
(definition of H2Connection.get_next_available_stream_id:)
def get_next_available_stream_id(self):
"""Returns an integer suitable for use as the stream ID for the next
stream created by this endpoint. For server endpoints, this stream ID
will be even. For client endpoints, this stream ID will be odd. If no
stream IDs are available, raises :class:`NoAvailableStreamIDError
<h2.exceptions.NoAvailableStreamIDError>`.
.. warning: The return value from this function does not change until
the stream ID has actually been used by sending or pushing
headers on that stream. For that reason, it should be
called as close as possible to the actual use of the stream
ID.
:raises: :class:`NoAvailableStreamIDError
<h2.exceptions.NoAvailableStreamIDError>`
:returns: The next free stream ID this peer can use to initiate a
stream.
:rtype: ``int``"""
[end of new definitions in h2/connection.py]
[start of new definitions in h2/exceptions.py]
(definition of NoAvailableStreamIDError:)
class NoAvailableStreamIDError(ProtocolError, ValueError):
"""There are no available stream IDs left to the connection. All stream IDs
have been exhausted."""
[end of new definitions in h2/exceptions.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | Here is the discussion in the issues of the pull request.
<issues>
Get next available stream_id to begin a new stream
Hi,
I want to create a new stream without knowing or generating a new stream_id myself.
The current problem is that:
- `begin_new_stream`
- `get_or_create_stream`
- `send_headers`
all require a `stream_id` - which I don't know at this point.
I would expect hyper-h2 to provide something like: `send_headers(headers, end_stream=True)` (note there is no `stream_id` argument!) or simply `stream_id = h2conn.get_next_stream_id()` as well, as long as it is the next valid id available.
----------
--------------------
</issues> | 9df8f94ce983d44ef57c8f332463f7b3cbe0127b |
spec-first__connexion-95 | 95 | spec-first/connexion | null | a695a3c6583398b798c2fa9e7fd0119a2b5cb4f6 | 2015-11-18T00:47:23Z | diff --git a/connexion/operation.py b/connexion/operation.py
index 8a1d2fce2..7a04852bf 100644
--- a/connexion/operation.py
+++ b/connexion/operation.py
@@ -76,7 +76,11 @@ def __init__(self, method, path, operation, app_produces, app_security,
self.validate_responses = validate_responses
self.operation = operation
- self.operation_id = operation['operationId']
+ operation_id = operation['operationId']
+
+ router_controller = operation.get('x-swagger-router-controller')
+
+ self.operation_id = self.detect_controller(operation_id, router_controller)
# todo support definition references
# todo support references to application level parameters
self.parameters = list(self.resolve_parameters(operation.get('parameters', [])))
@@ -85,6 +89,11 @@ def __init__(self, method, path, operation, app_produces, app_security,
self.security = operation.get('security', app_security)
self.__undecorated_function = resolver(self.operation_id)
+ def detect_controller(self, operation_id, router_controller):
+ if router_controller is None:
+ return operation_id
+ return router_controller + '.' + operation_id
+
def resolve_reference(self, schema):
schema = schema.copy() # avoid changing the original schema
reference = schema.get('$ref') # type: str
| diff --git a/tests/test_operation.py b/tests/test_operation.py
index 7a4396df7..6cb29b729 100644
--- a/tests/test_operation.py
+++ b/tests/test_operation.py
@@ -99,6 +99,29 @@
OPERATION5 = {'operationId': 'fakeapi.hello.post_greeting',
'parameters': [{'$ref': '/parameters/fail'}]}
+OPERATION6 = {'description': 'Adds a new stack to be created by lizzy and returns the '
+ 'information needed to keep track of deployment',
+ 'operationId': 'post_greeting',
+ 'x-swagger-router-controller': 'fakeapi.hello',
+ 'parameters': [{'in': 'body',
+ 'name': 'new_stack',
+ 'required': True,
+ 'schema': {'$ref': '#/definitions/new_stack'}}],
+ 'responses': {201: {'description': 'Stack to be created. The '
+ 'CloudFormation Stack creation can '
+ "still fail if it's rejected by senza "
+ 'or AWS CF.',
+ 'schema': {'$ref': '#/definitions/stack'}},
+ 400: {'description': 'Stack was not created because request '
+ 'was invalid',
+ 'schema': {'$ref': '#/definitions/problem'}},
+ 401: {'description': 'Stack was not created because the '
+ 'access token was not provided or was '
+ 'not valid for this operation',
+ 'schema': {'$ref': '#/definitions/problem'}}},
+ 'security': [{'oauth': ['uid']}],
+ 'summary': 'Create new stack'}
+
SECURITY_DEFINITIONS = {'oauth': {'type': 'oauth2',
'flow': 'password',
'x-tokenInfoUrl': 'https://ouath.example/token_info',
@@ -225,3 +248,15 @@ def test_resolve_invalid_reference():
exception = exc_info.value # type: InvalidSpecification
assert exception.reason == "GET endpoint '$ref' needs to start with '#/'"
+
+def test_detect_controller():
+ operation = Operation(method='GET',
+ path='endpoint',
+ operation=OPERATION6,
+ app_produces=['application/json'],
+ app_security=[],
+ security_definitions={},
+ definitions={},
+ parameter_definitions=PARAMETER_DEFINITIONS,
+ resolver=get_function_from_name)
+ assert operation.operation_id == 'fakeapi.hello.post_greeting'
\ No newline at end of file
| [
{
"components": [
{
"doc": "",
"lines": [
92,
95
],
"name": "Operation.detect_controller",
"signature": "def detect_controller(self, operation_id, router_controller):",
"type": "function"
}
],
"file": "connexion/operation.py... | [
"tests/test_operation.py::test_detect_controller"
] | [
"tests/test_operation.py::test_operation",
"tests/test_operation.py::test_non_existent_reference",
"tests/test_operation.py::test_multi_body",
"tests/test_operation.py::test_invalid_reference",
"tests/test_operation.py::test_no_token_info",
"tests/test_operation.py::test_parameter_reference",
"tests/tes... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
added logic for detecting x-swagger-router-controller
closes #94
adds logic for detecting `x-swagger-router-controller` and if set, will join the `operationId` with this value.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in connexion/operation.py]
(definition of Operation.detect_controller:)
def detect_controller(self, operation_id, router_controller):
[end of new definitions in connexion/operation.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
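For illustration, the detection rule described by the definition above can be sketched as a standalone function (a minimal sketch mirroring the `Operation.detect_controller` definition; the surrounding `Operation` class and resolver wiring are omitted):

```python
def detect_controller(operation_id, router_controller):
    # With no x-swagger-router-controller set, the operationId is
    # already the fully qualified controller path.
    if router_controller is None:
        return operation_id
    # Otherwise the router controller module prefixes the friendly
    # operationId, e.g. 'fakeapi.hello' + '.' + 'post_greeting'.
    return router_controller + '.' + operation_id
```

Applied to OPERATION6 above, `detect_controller('post_greeting', 'fakeapi.hello')` yields `'fakeapi.hello.post_greeting'`, matching the assertion in `test_detect_controller`.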
<<END>> | Here is the discussion in the issues of the pull request.
<issues>
change use of `operationId`
The spec field `operationId` is used for client generation method names. While connexion uses `operationId` for routing directly to a controller, it will make for some very ugly client SDK methods.
For example:
``` yaml
operationId: my.controller.default_controller.getUser
```
Will generate into this for ObjectiveC:
``` objc
-(NSNumber*) myControllerDefaultControllerGetUserWithCompletionBlock
```
It would be nice if connexion followed the same strategy as [swagger-inflector](https://github.com/swagger-api/swagger-inflector#locating-the-target-method) and [swagger-node](https://github.com/swagger-api/swagger-node#4-write-controller-code-in-nodejs) for locating the routing controller/method.
Specifically, I'd propose the following:
``` yaml
operationId: getUser
x-swagger-router-controller: my.controller.default_controller
```
Which allows for a friendly name for the client SDK and server controller method, and uses the vendor extensions for the controller location.
----------
--------------------
</issues> | ee399fbfe627f86c461b353132e4d710f7127440 | |
joke2k__faker-300 | 300 | joke2k/faker | null | 3da7fd3109e317025e033fb95260c0032668ae0e | 2015-11-12T06:14:19Z | diff --git a/faker/providers/internet/ja_JP/__init__.py b/faker/providers/internet/ja_JP/__init__.py
new file mode 100644
index 0000000000..0229a10aed
--- /dev/null
+++ b/faker/providers/internet/ja_JP/__init__.py
@@ -0,0 +1,18 @@
+# coding=utf-8
+from __future__ import unicode_literals
+from .. import Provider as InternetProvider
+from faker.utils.decorators import slugify
+
+
+class Provider(InternetProvider):
+ user_name_formats = (
+ '{{last_romanized_name}}.{{first_romanized_name}}',
+ '{{first_romanized_name}}.{{last_romanized_name}}',
+ '{{first_romanized_name}}##',
+ '?{{last_romanized_name}}',
+ )
+ tlds = ('com', 'com', 'com', 'net', 'org', 'jp', 'jp', 'jp')
+
+ @slugify
+ def domain_word(self):
+ return self.generator.format('last_romanized_name')
diff --git a/faker/providers/person/ja_JP/__init__.py b/faker/providers/person/ja_JP/__init__.py
index 0c4946cd28..e747721644 100644
--- a/faker/providers/person/ja_JP/__init__.py
+++ b/faker/providers/person/ja_JP/__init__.py
@@ -81,6 +81,49 @@ class Provider(PersonProvider):
'ワカマツ', 'ワタナベ',
)
+ romanized_formats = (
+ '{{first_romanized_name_female}} {{last_romanized_name}}',
+ '{{first_romanized_name_male}} {{last_romanized_name}}',
+ )
+
+ first_romanized_names_female = (
+ 'Akira', 'Akemi', 'Asuka',
+ 'Kaori', 'Kana', 'Kumiko',
+ 'Sayuri',
+ 'Chiyo', 'Tsubasa', 'Tomomi',
+ 'Naoko', 'Nanaka',
+ 'Hanako', 'Haruka',
+ 'Maaya', 'Mai', 'Miki', 'Momoko',
+ 'Yui', 'Yoko', 'Yumiko',
+ 'Rei', 'Rika',
+ )
+
+ first_romanized_names_male = (
+ 'Akira', 'Atsushi', 'Osamu',
+ 'Kyosuke', 'Kenichi',
+ 'Jun', 'Sotaro',
+ 'Taichi', 'Takuma', 'Taro', 'Tsubasa', 'Tomoya',
+ 'Naoki', 'Naoto'
+ 'Hideki', 'Hiroshi',
+ 'Manabu', 'Mituru', 'Minoru', 'Hiroki',
+ 'Yuta', 'Yasuhiro', 'Yoichi', 'Yosuke',
+ 'Ryosuke', 'Ryohei',
+ )
+
+ first_romanized_names = first_romanized_names_male + first_romanized_names_female
+
+ last_romanized_names = (
+ 'Aota', 'Aoyama', 'Ishida', 'Idaka', 'Ito', 'Uno', 'Ekoda', 'Ogaki',
+ 'Kato', 'Kano', 'Kijima', 'Kimura', 'Kiriyama', 'Kudo', 'Koizumi', 'Kobayashi', 'Kondo',
+ 'Saito', 'Sakamoto', 'Sasaki', 'Sato', 'Sasada', 'Suzuki', 'Sugiyama',
+ 'Takahashi', 'Tanaka', 'Tanabe', 'Tsuda', 'Tsuchiya',
+ 'Nakajima', 'Nakamura', 'Nagisa', 'Nakatsugawa', 'Nishinosono', 'Nomura',
+ 'Harada', 'Hamada', 'Hirokawa', 'Fujimoto',
+ 'Matsumoto', 'Miyake', 'Miyagawa', 'Murayama',
+ 'Yamagishi', 'Yamaguchi', 'Yamada', 'Yamamoto', 'Yoshida', 'Yoshimoto',
+ 'Wakamatsu', 'Watanabe',
+ )
+
def kana_name(self):
'''
@example 'アオタ アキラ'
@@ -109,3 +152,32 @@ def last_kana_name(cls):
@example 'アオタ'
'''
return cls.random_element(cls.last_kana_names)
+
+ def romanized_name(self):
+ '''
+ @example 'Akira Aota'
+ '''
+ pattern = self.random_element(self.romanized_formats)
+ return self.generator.parse(pattern)
+
+ @classmethod
+ def first_romanized_name(cls):
+ '''
+ @example 'Akira'
+ '''
+ return cls.random_element(cls.first_romanized_names)
+
+ @classmethod
+ def first_romanized_name_female(cls):
+ return cls.random_element(cls.first_romanized_names_female)
+
+ @classmethod
+ def first_romanized_name_male(cls):
+ return cls.random_element(cls.first_romanized_names_male)
+
+ @classmethod
+ def last_romanized_name(cls):
+ '''
+ @example 'Aota'
+ '''
+ return cls.random_element(cls.last_romanized_names)
| diff --git a/faker/tests/ja_JP/__init__.py b/faker/tests/ja_JP/__init__.py
index c0bd16e652..35af84fa94 100644
--- a/faker/tests/ja_JP/__init__.py
+++ b/faker/tests/ja_JP/__init__.py
@@ -6,6 +6,7 @@
import re
from faker import Factory
+from faker.utils import text
from .. import string_types
@@ -84,6 +85,23 @@ def test_ja_JP_company(self):
assert any(prefix in company for prefix in prefixes)
assert any(company.startswith(prefix) for prefix in prefixes)
+ def test_ja_JP_internet(self):
+ from faker.providers.person.ja_JP import Provider
+ last_romanized_names = Provider.last_romanized_names
+
+ domain_word = self.factory.domain_word()
+ assert domain_word
+ assert isinstance(domain_word, string_types)
+ assert any(domain_word == text.slugify(last_romanized_name) for last_romanized_name in last_romanized_names)
+
+ user_name = self.factory.user_name()
+ assert user_name
+ assert isinstance(user_name, string_types)
+
+ tld = self.factory.tld()
+ assert tld
+ assert isinstance(tld, string_types)
+
def test_ja_JP_person(self):
name = self.factory.name()
assert name
@@ -117,6 +135,26 @@ def test_ja_JP_person(self):
assert last_kana_name
assert isinstance(last_kana_name, string_types)
+ romanized_name = self.factory.romanized_name()
+ assert romanized_name
+ assert isinstance(romanized_name, string_types)
+
+ first_romanized_name = self.factory.first_romanized_name()
+ assert first_romanized_name
+ assert isinstance(first_romanized_name, string_types)
+
+ first_romanized_name_male = self.factory.first_romanized_name_male()
+ assert first_romanized_name_male
+ assert isinstance(first_romanized_name_male, string_types)
+
+ first_romanized_name_female = self.factory.first_romanized_name_female()
+ assert first_romanized_name_female
+ assert isinstance(first_romanized_name_female, string_types)
+
+ last_romanized_name = self.factory.last_romanized_name()
+ assert last_romanized_name
+ assert isinstance(last_romanized_name, string_types)
+
def test_ja_JP_phone_number(self):
pn = self.factory.phone_number()
formats = (
| [
{
"components": [
{
"doc": "",
"lines": [
7,
18
],
"name": "Provider",
"signature": "class Provider(InternetProvider):",
"type": "class"
},
{
"doc": "",
"lines": [
17,
18
],
... | [
"faker/tests/ja_JP/__init__.py::ja_JP_FactoryTestCase::test_ja_JP_internet",
"faker/tests/ja_JP/__init__.py::ja_JP_FactoryTestCase::test_ja_JP_person"
] | [
"faker/tests/ja_JP/__init__.py::ja_JP_FactoryTestCase::test_ja_JP_address",
"faker/tests/ja_JP/__init__.py::ja_JP_FactoryTestCase::test_ja_JP_company",
"faker/tests/ja_JP/__init__.py::ja_JP_FactoryTestCase::test_ja_JP_phone_number"
] | This is a feature request that requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
Update ja_JP for username and domain
In order to get username and domain in ja_JP
・Add [Romanized Japanese](https://en.wikipedia.org/wiki/Romanization_of_Japanese) person name
・Add internet provider for ja_JP
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in faker/providers/internet/ja_JP/__init__.py]
(definition of Provider:)
class Provider(InternetProvider):
(definition of Provider.domain_word:)
def domain_word(self):
[end of new definitions in faker/providers/internet/ja_JP/__init__.py]
[start of new definitions in faker/providers/person/ja_JP/__init__.py]
(definition of Provider.romanized_name:)
def romanized_name(self):
"""@example 'Akira Aota'"""
(definition of Provider.first_romanized_name:)
def first_romanized_name(cls):
"""@example 'Akira'"""
(definition of Provider.first_romanized_name_female:)
def first_romanized_name_female(cls):
(definition of Provider.first_romanized_name_male:)
def first_romanized_name_male(cls):
(definition of Provider.last_romanized_name:)
def last_romanized_name(cls):
"""@example 'Aota'"""
[end of new definitions in faker/providers/person/ja_JP/__init__.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
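The provider methods described above all follow the same pattern: pick a random element from a tuple of romanized names and, for full names, join a first and last name. A minimal standalone sketch (using `random.choice` in place of faker's `random_element`, with only a small subset of the name tuples from the patch):

```python
import random

# Small subsets of the romanized name tuples added to the ja_JP provider.
FIRST_ROMANIZED_NAMES = ('Akira', 'Kaori', 'Taro', 'Yui')
LAST_ROMANIZED_NAMES = ('Aota', 'Sato', 'Tanaka', 'Yamada')

def first_romanized_name():
    return random.choice(FIRST_ROMANIZED_NAMES)

def last_romanized_name():
    return random.choice(LAST_ROMANIZED_NAMES)

def romanized_name():
    # e.g. 'Akira Aota'
    return '{} {}'.format(first_romanized_name(), last_romanized_name())
```

The internet provider builds on the same pieces: `domain_word` slugifies a `last_romanized_name`, and the `user_name_formats` templates combine first and last romanized names.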
<<END>> | 6edfdbf6ae90b0153309e3bf066aa3b2d16494a7 | ||
falconry__falcon-640 | 640 | falconry/falcon | null | 0fefca82a3e1f6f35f147af01c7daeebb0414d87 | 2015-10-27T00:17:50Z | diff --git a/falcon/request.py b/falcon/request.py
index 0a2eb8d34..3312667f7 100644
--- a/falcon/request.py
+++ b/falcon/request.py
@@ -27,9 +27,9 @@
import mimeparse
import six
-from falcon.errors import *
+from falcon.errors import * # NOQA
from falcon import util
-from falcon.util.uri import parse_query_string, parse_host
+from falcon.util.uri import parse_query_string, parse_host, unquote_string
from falcon import request_helpers as helpers
# NOTE(tbug): In some cases, http_cookies is not a module
@@ -205,6 +205,7 @@ class Request(object):
'_wsgierrors',
'options',
'_cookies',
+ '_cached_access_route',
)
# Allow child classes to override this
@@ -257,6 +258,7 @@ def __init__(self, env, options=None):
self._cached_headers = None
self._cached_uri = None
self._cached_relative_uri = None
+ self._cached_access_route = None
try:
self.content_type = self.env['CONTENT_TYPE']
@@ -521,6 +523,64 @@ def cookies(self):
return self._cookies.copy()
+ @property
+ def access_route(self):
+ """A list of all addresses from client to the last proxy server.
+
+ Inspired by werkzeug's ``access_route``.
+
+ Note:
+ The list may contain string(s) other than IPv4 / IPv6 address. For
+ example the "unknown" identifier and obfuscated identifier defined
+ by `RFC 7239`_.
+
+ .. _RFC 7239: https://tools.ietf.org/html/rfc7239#section-6
+
+ Warning:
+ HTTP Forwarded headers can be forged by any client or proxy.
+ Use this property with caution and write your own verify function.
+ The best practice is always using :py:attr:`~.remote_addr` unless
+ your application is hosted behind some reverse proxy server(s).
+ Also only trust the **last N** addresses provided by those reverse
+ proxy servers.
+
+ This property will try to derive addresses sequentially from:
+
+ - ``Forwarded``
+ - ``X-Forwarded-For``
+ - ``X-Real-IP``
+ - **or** the IP address of the closest client/proxy
+
+ """
+ if self._cached_access_route is None:
+ access_route = []
+ if 'HTTP_FORWARDED' in self.env:
+ access_route = self._parse_rfc_forwarded()
+ if not access_route and 'HTTP_X_FORWARDED_FOR' in self.env:
+ access_route = [ip.strip() for ip in
+ self.env['HTTP_X_FORWARDED_FOR'].split(',')]
+ if not access_route and 'HTTP_X_REAL_IP' in self.env:
+ access_route = [self.env['HTTP_X_REAL_IP']]
+ if not access_route and 'REMOTE_ADDR' in self.env:
+ access_route = [self.env['REMOTE_ADDR']]
+ self._cached_access_route = access_route
+
+ return self._cached_access_route
+
+ @property
+ def remote_addr(self):
+ """String of the IP address of the closest client/proxy.
+
+ Address will only be derived from WSGI ``REMOTE_ADDR`` header, which
+ can not be modified by any client or proxy.
+
+ Note:
+ If your application is behind one or more reverse proxies, you may
+ need to use :py:obj:`~.access_route` to retrieve the real IP
+ address of the client.
+ """
+ return self.env.get('REMOTE_ADDR')
+
# ------------------------------------------------------------------------
# Methods
# ------------------------------------------------------------------------
@@ -626,7 +686,8 @@ def get_header_as_datetime(self, header, required=False, obs_date=False):
``HTTPBadRequest`` instead of returning gracefully when the
header is not found (default ``False``).
obs_date (bool, optional): Support obs-date formats according to
- RFC 7231, e.g.: "Sunday, 06-Nov-94 08:49:37 GMT" (default ``False``).
+ RFC 7231, e.g.: "Sunday, 06-Nov-94 08:49:37 GMT"
+ (default ``False``).
Returns:
datetime: The value of the specified header if it exists,
@@ -1035,6 +1096,26 @@ def _parse_form_urlencoded(self):
self._params.update(extra_params)
+ def _parse_rfc_forwarded(self):
+ """Parse RFC 7239 "Forwarded" header.
+
+ Returns:
+ list: addresses derived from "for" parameters.
+ """
+ addr = []
+ for forwarded in self.env['HTTP_FORWARDED'].split(','):
+ for param in forwarded.split(';'):
+ param = param.strip().split('=', 1)
+ if len(param) == 1:
+ continue
+ key, val = param
+ if key.lower() != 'for':
+ # we only want for params
+ continue
+ host, _ = parse_host(unquote_string(val))
+ addr.append(host)
+ return addr
+
# PERF: To avoid typos and improve storage space and speed over a dict.
class RequestOptions(object):
diff --git a/falcon/util/misc.py b/falcon/util/misc.py
index 7c396611e..67755661c 100644
--- a/falcon/util/misc.py
+++ b/falcon/util/misc.py
@@ -107,8 +107,9 @@ def http_date_to_dt(http_date, obs_date=False):
Args:
http_date (str): An RFC 1123 date string, e.g.:
"Tue, 15 Nov 1994 12:45:26 GMT".
- obs_date (bool, optional): Support obs-date formats according to
- RFC 7231, e.g.: "Sunday, 06-Nov-94 08:49:37 GMT" (default ``False``).
+ obs_date (bool, optional): Support obs-date formats according to
+ RFC 7231, e.g.:
+ "Sunday, 06-Nov-94 08:49:37 GMT" (default ``False``).
Returns:
datetime: A UTC datetime instance corresponding to the given
diff --git a/falcon/util/uri.py b/falcon/util/uri.py
index b3e33e844..2b9af610b 100644
--- a/falcon/util/uri.py
+++ b/falcon/util/uri.py
@@ -376,3 +376,35 @@ def parse_host(host, default_port=None):
# or a domain name plus a port
name, _, port = host.partition(':')
return (name, int(port))
+
+
+def unquote_string(quoted):
+ """Unquote an RFC 7320 "quoted-string".
+
+ Args:
+ quoted (str): Original quoted string
+
+ Returns:
+ str: unquoted string
+
+ Raises:
+ TypeError: `quoted` was not a ``str``.
+ """
+ tmp_quoted = quoted.strip()
+ if len(tmp_quoted) < 2:
+ return quoted
+ elif tmp_quoted[0] != '"' or tmp_quoted[-1] != '"':
+ # return original one, prevent side-effect
+ return quoted
+
+ tmp_quoted = tmp_quoted[1:-1]
+ # PERF(philiptzou): Most header strings don't contain "quoted-pair" which
+ # defined by RFC 7320. We use this little trick (quick string search) to
+ # speed up string parsing by preventing unnecessary processes if possible.
+ if '\\' not in tmp_quoted:
+ return tmp_quoted
+ elif r'\\' not in tmp_quoted:
+ return tmp_quoted.replace('\\', '')
+ else:
+ return '\\'.join([q.replace('\\', '')
+ for q in tmp_quoted.split(r'\\')])
| diff --git a/tests/test_access_route.py b/tests/test_access_route.py
new file mode 100644
index 000000000..90fa871f9
--- /dev/null
+++ b/tests/test_access_route.py
@@ -0,0 +1,74 @@
+from falcon.request import Request
+import falcon.testing as testing
+
+
+class TestAccessRoute(testing.TestBase):
+
+ def test_remote_addr_only(self):
+ req = Request(testing.create_environ(
+ host='example.com',
+ path='/access_route',
+ headers={
+ 'Forwarded': ('for=192.0.2.43, for="[2001:db8:cafe::17]:555",'
+ 'for="unknown", by=_hidden,for="\\"\\\\",'
+ 'for="198\\.51\\.100\\.17\\:1236";'
+ 'proto=https;host=example.com')
+ }))
+ self.assertEqual(req.remote_addr, '127.0.0.1')
+
+ def test_rfc_forwarded(self):
+ req = Request(testing.create_environ(
+ host='example.com',
+ path='/access_route',
+ headers={
+ 'Forwarded': ('for=192.0.2.43,for=,'
+ 'for="[2001:db8:cafe::17]:555",'
+ 'for="unknown", by=_hidden,for="\\"\\\\",'
+ 'for="_don\\\"t_\\try_this\\\\at_home_\\42",'
+ 'for="198\\.51\\.100\\.17\\:1236";'
+ 'proto=https;host=example.com')
+ }))
+ compares = ['192.0.2.43', '', '2001:db8:cafe::17',
+ 'unknown', '"\\', '_don"t_try_this\\at_home_42',
+ '198.51.100.17']
+ self.assertEqual(req.access_route, compares)
+ # test cached
+ self.assertEqual(req.access_route, compares)
+
+ def test_malformed_rfc_forwarded(self):
+ req = Request(testing.create_environ(
+ host='example.com',
+ path='/access_route',
+ headers={
+ 'Forwarded': 'for'
+ }))
+ self.assertEqual(req.access_route, ['127.0.0.1'])
+ # test cached
+ self.assertEqual(req.access_route, ['127.0.0.1'])
+
+ def test_x_forwarded_for(self):
+ req = Request(testing.create_environ(
+ host='example.com',
+ path='/access_route',
+ headers={
+ 'X-Forwarded-For': ('192.0.2.43, 2001:db8:cafe::17,'
+ 'unknown, _hidden, 203.0.113.60')
+ }))
+ self.assertEqual(req.access_route,
+ ['192.0.2.43', '2001:db8:cafe::17',
+ 'unknown', '_hidden', '203.0.113.60'])
+
+ def test_x_real_ip(self):
+ req = Request(testing.create_environ(
+ host='example.com',
+ path='/access_route',
+ headers={
+ 'X-Real-IP': '2001:db8:cafe::17'
+ }))
+ self.assertEqual(req.access_route, ['2001:db8:cafe::17'])
+
+ def test_remote_addr(self):
+ req = Request(testing.create_environ(
+ host='example.com',
+ path='/access_route'))
+ self.assertEqual(req.access_route, ['127.0.0.1'])
| [
{
"components": [
{
"doc": "A list of all addresses from client to the last proxy server.\n\nInspired by werkzeug's ``access_route``.\n\nNote:\n The list may contain string(s) other than IPv4 / IPv6 address. For\n example the \"unknown\" identifier and obfuscated identifier defined\n by `... | [
"tests/test_access_route.py::TestAccessRoute::test_malformed_rfc_forwarded",
"tests/test_access_route.py::TestAccessRoute::test_remote_addr",
"tests/test_access_route.py::TestAccessRoute::test_remote_addr_only",
"tests/test_access_route.py::TestAccessRoute::test_rfc_forwarded",
"tests/test_access_route.py::... | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Add `access_route` and `remote_addr` to Request object
which is similar to Werkzeug but more powerful
Supported:
1. fetch from "Forwarded" header (defined by RFC7239)
2. fetch from "X-Forwarded-For" header
3. fetch from "X-Read-IP" header
4. fetch from wsgi "REMOTE_ADDR" header
Related to #539 and #598
This pull-request is simply a duplicate of #599 but always rebased onto the latest origin/master. I don't want to lose any more comments due to the rebase. Please write your comment to #599.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in falcon/request.py]
(definition of Request.access_route:)
def access_route(self):
"""A list of all addresses from client to the last proxy server.
Inspired by werkzeug's ``access_route``.
Note:
The list may contain string(s) other than IPv4 / IPv6 address. For
example the "unknown" identifier and obfuscated identifier defined
by `RFC 7239`_.
.. _RFC 7239: https://tools.ietf.org/html/rfc7239#section-6
Warning:
HTTP Forwarded headers can be forged by any client or proxy.
Use this property with caution and write your own verify function.
The best practice is always using :py:attr:`~.remote_addr` unless
your application is hosted behind some reverse proxy server(s).
Also only trust the **last N** addresses provided by those reverse
proxy servers.
This property will try to derive addresses sequentially from:
- ``Forwarded``
- ``X-Forwarded-For``
- ``X-Real-IP``
- **or** the IP address of the closest client/proxy"""
(definition of Request.remote_addr:)
def remote_addr(self):
"""String of the IP address of the closest client/proxy.
Address will only be derived from WSGI ``REMOTE_ADDR`` header, which
can not be modified by any client or proxy.
Note:
If your application is behind one or more reverse proxies, you may
need to use :py:obj:`~.access_route` to retrieve the real IP
address of the client."""
(definition of Request._parse_rfc_forwarded:)
def _parse_rfc_forwarded(self):
"""Parse RFC 7239 "Forwarded" header.
Returns:
list: addresses derived from "for" parameters."""
[end of new definitions in falcon/request.py]
[start of new definitions in falcon/util/uri.py]
(definition of unquote_string:)
def unquote_string(quoted):
"""Unquote an RFC 7320 "quoted-string".
Args:
quoted (str): Original quoted string
Returns:
str: unquoted string
Raises:
TypeError: `quoted` was not a ``str``."""
[end of new definitions in falcon/util/uri.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
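The core of `_parse_rfc_forwarded` can be illustrated with a simplified standalone sketch that collects the `for` parameters from an RFC 7239 `Forwarded` header. Note this is only a sketch: the actual patch also unquotes RFC 7230 quoted-strings (including backslash escapes, via `unquote_string`) and strips port numbers with `parse_host`, both omitted here.

```python
def forwarded_addresses(header_value):
    """Collect the 'for' parameters from an RFC 7239 Forwarded header.

    Simplified: quoted values only have surrounding quotes stripped;
    quoted-pair escapes and port numbers are not handled here.
    """
    addresses = []
    # Each comma-separated element describes one hop in the chain.
    for element in header_value.split(','):
        # Each element holds semicolon-separated key=value parameters.
        for param in element.split(';'):
            parts = param.strip().split('=', 1)
            if len(parts) == 2 and parts[0].lower() == 'for':
                addresses.append(parts[1].strip('"'))
    return addresses
```

The resulting list is ordered from the original client to the last proxy, which is why the docstring warns to only trust the last N entries supplied by reverse proxies under your control.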
<<END>> | 77d5e6394a88ead151c9469494749f95f06b24bf | ||
python-hyper__h2-23 | 23 | python-hyper/h2 | null | ba8ec4633014b0e2c77211ec59233c78ca5f42c6 | 2015-10-01T16:44:55Z | diff --git a/h2/connection.py b/h2/connection.py
index ea6266a2d..484daf32c 100644
--- a/h2/connection.py
+++ b/h2/connection.py
@@ -13,7 +13,10 @@
)
from hpack.hpack import Encoder, Decoder
-from .events import WindowUpdated, RemoteSettingsChanged, PingAcknowledged
+from .events import (
+ WindowUpdated, RemoteSettingsChanged, PingAcknowledged,
+ SettingsAcknowledged,
+)
from .exceptions import ProtocolError, NoSuchStreamError, FlowControlError
from .frame_buffer import FrameBuffer
from .settings import Settings
@@ -441,6 +444,20 @@ def acknowledge_settings(self, event):
self._prepare_for_sending([f])
return []
+ def update_settings(self, new_settings):
+ """
+ Update the local settings. This will prepare and emit the appropriate
+ SETTINGS frame.
+
+ :param new_settings: A dictionary of {setting: new value}
+ """
+ self.state_machine.process_input(ConnectionInputs.SEND_SETTINGS)
+ self.local_settings.update(new_settings)
+ s = SettingsFrame(0)
+ s.settings = new_settings
+ self._prepare_for_sending([s])
+ return []
+
def flow_control_window(self, stream_id):
"""
Returns the maximum amount of data that can be sent on stream
@@ -589,7 +606,10 @@ def _receive_settings_frame(self, frame):
# This is an ack of the local settings.
if 'ACK' in frame.flags:
- self.local_settings.acknowledge()
+ changed_settings = self.local_settings.acknowledge()
+ ack_event = SettingsAcknowledged()
+ ack_event.changed_settings = changed_settings
+ events.append(ack_event)
return [], events
# Add the new settings.
diff --git a/h2/events.py b/h2/events.py
index e0e6db29b..bb33b1058 100644
--- a/h2/events.py
+++ b/h2/events.py
@@ -134,3 +134,14 @@ def __init__(self):
self.pushed_stream_id = None
self.parent_stream_id = None
self.headers = None
+
+
+class SettingsAcknowledged(object):
+ """
+ The SettingsAcknowledged event is fired whenever a settings ACK is received
+ from the remote peer. The event carries on it the settings that were
+ acknowedged, in the same format as
+ :class:`h2.events.RemoteSettingsChanged`.
+ """
+ def __init__(self):
+ self.changed_settings = {}
| diff --git a/test/test_basic_logic.py b/test/test_basic_logic.py
index 59f654902..88c72beb2 100644
--- a/test/test_basic_logic.py
+++ b/test/test_basic_logic.py
@@ -335,6 +335,48 @@ def test_can_consume_partial_data_from_connection(self):
assert len(c.data_to_send(10)) == 0
assert len(c.data_to_send()) == 0
+ def test_we_can_update_settings(self, frame_factory):
+ """
+ Updating the settings emits a SETTINGS frame.
+ """
+ c = h2.connection.H2Connection()
+ c.initiate_connection()
+ c.clear_outbound_data_buffer()
+
+ new_settings = {
+ hyperframe.frame.SettingsFrame.HEADER_TABLE_SIZE: 52,
+ hyperframe.frame.SettingsFrame.ENABLE_PUSH: 0,
+ }
+ events = c.update_settings(new_settings)
+ assert not events
+
+ f = frame_factory.build_settings_frame(new_settings)
+ assert c.data_to_send() == f.serialize()
+
+ def test_settings_get_acked_correctly(self, frame_factory):
+ """
+ When settings changes are ACKed, they contain the changed settings.
+ """
+ c = h2.connection.H2Connection()
+ c.initiate_connection()
+
+ new_settings = {
+ hyperframe.frame.SettingsFrame.HEADER_TABLE_SIZE: 52,
+ hyperframe.frame.SettingsFrame.ENABLE_PUSH: 0,
+ }
+ c.update_settings(new_settings)
+
+ f = frame_factory.build_settings_frame({}, ack=True)
+ events = c.receive_data(f.serialize())
+
+ assert len(events) == 1
+ event = events[0]
+
+ assert isinstance(event, h2.events.SettingsAcknowledged)
+ assert len(event.changed_settings) == len(new_settings)
+ for setting, value in new_settings.items():
+ assert event.changed_settings[setting].new_value == value
+
class TestBasicServer(object):
"""
@@ -538,22 +580,6 @@ def test_acknowledging_settings(self, frame_factory):
assert not events
assert c.data_to_send() == expected_data
- def test_settings_ack_is_ignored(self, frame_factory):
- """
- Receiving a SETTINGS ACK should cause no events or data to be emitted.
- """
- c = h2.connection.H2Connection(client_side=False)
- c.receive_data(frame_factory.preamble())
-
- f = frame_factory.build_settings_frame(
- settings={}, ack=True
- )
-
- events = c.receive_data(f.serialize())
-
- assert not events
- assert not c.data_to_send()
-
def test_close_connection(self, frame_factory):
"""
Closing the connection with no error code emits a GOAWAY frame with
| [
{
"components": [
{
"doc": "Update the local settings. This will prepare and emit the appropriate\nSETTINGS frame.\n\n:param new_settings: A dictionary of {setting: new value}",
"lines": [
447,
459
],
"name": "H2Connection.update_settings",
"sign... | [
"test/test_basic_logic.py::TestBasicClient::test_we_can_update_settings",
"test/test_basic_logic.py::TestBasicClient::test_settings_get_acked_correctly"
] | [
"test/test_basic_logic.py::TestBasicClient::test_begin_connection",
"test/test_basic_logic.py::TestBasicClient::test_sending_headers",
"test/test_basic_logic.py::TestBasicClient::test_sending_data",
"test/test_basic_logic.py::TestBasicClient::test_closing_stream_sending_data",
"test/test_basic_logic.py::Tes... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Add logic for handling changes to local settings.
This is another pillar in #5.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in h2/connection.py]
(definition of H2Connection.update_settings:)
def update_settings(self, new_settings):
"""Update the local settings. This will prepare and emit the appropriate
SETTINGS frame.
:param new_settings: A dictionary of {setting: new value}"""
[end of new definitions in h2/connection.py]
[start of new definitions in h2/events.py]
(definition of SettingsAcknowledged:)
class SettingsAcknowledged(object):
"""The SettingsAcknowledged event is fired whenever a settings ACK is received
from the remote peer. The event carries on it the settings that were
acknowedged, in the same format as
:class:`h2.events.RemoteSettingsChanged`."""
(definition of SettingsAcknowledged.__init__:)
def __init__(self):
[end of new definitions in h2/events.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
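The two-phase bookkeeping behind `update_settings` and the `SettingsAcknowledged` event can be sketched in isolation: values written via `update()` stay pending until the peer's SETTINGS ACK arrives and `acknowledge()` promotes them, reporting what changed. This is a hypothetical sketch, not the h2 `Settings` API (which stores `ChangedSetting` namedtuples and integrates with the frame layer):

```python
class LocalSettings:
    """Hypothetical sketch of two-phase settings bookkeeping.

    Values set via update() only become current once the peer's
    SETTINGS ACK arrives and acknowledge() is called.
    """
    def __init__(self, initial=None):
        self._current = dict(initial or {})
        self._pending = []          # FIFO of unacknowledged updates

    def update(self, new_settings):
        # Queue the change; it is not live until acknowledged.
        self._pending.append(dict(new_settings))

    def acknowledge(self):
        # Promote the oldest pending update and report
        # {setting: (old_value, new_value)} for the event.
        changed = {}
        if self._pending:
            for key, value in self._pending.pop(0).items():
                changed[key] = (self._current.get(key), value)
                self._current[key] = value
        return changed
```

In the test above, the connection queues new settings when emitting the SETTINGS frame and, on receiving the ACK, attaches the promoted changes to the `SettingsAcknowledged` event's `changed_settings`.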
<<END>> | 9df8f94ce983d44ef57c8f332463f7b3cbe0127b | ||
python-hyper__h2-22 | 22 | python-hyper/h2 | null | befef347d73af95aa68858bda6762fb1a687e059 | 2015-09-21T11:29:02Z | diff --git a/h2/connection.py b/h2/connection.py
index 9c372d9e0..ea6266a2d 100644
--- a/h2/connection.py
+++ b/h2/connection.py
@@ -16,6 +16,7 @@
from .events import WindowUpdated, RemoteSettingsChanged, PingAcknowledged
from .exceptions import ProtocolError, NoSuchStreamError, FlowControlError
from .frame_buffer import FrameBuffer
+from .settings import Settings
from .stream import H2Stream
@@ -208,12 +209,9 @@ def __init__(self, client_side=True):
# outbound side of the connection.
self.outbound_flow_control_window = 65535
- # This might want to be an extensible class that does sensible stuff
- # with defaults. For now, a dict will do.
- self.local_settings = {}
- self.remote_settings = {
- SettingsFrame.INITIAL_WINDOW_SIZE: 65535,
- }
+ # Objects that store settings, including defaults.
+ self.local_settings = Settings(client=client_side)
+ self.remote_settings = Settings(client=not client_side)
# Buffer for incoming data.
self.incoming_buffer = FrameBuffer(server=not client_side)
@@ -429,16 +427,15 @@ def acknowledge_settings(self, event):
assert isinstance(event, RemoteSettingsChanged)
self.state_machine.process_input(ConnectionInputs.SEND_SETTINGS)
- if SettingsFrame.INITIAL_WINDOW_SIZE in event.changed_settings:
- setting = event.changed_settings[SettingsFrame.INITIAL_WINDOW_SIZE]
+ changes = self.remote_settings.acknowledge()
+
+ if SettingsFrame.INITIAL_WINDOW_SIZE in changes:
+ setting = changes[SettingsFrame.INITIAL_WINDOW_SIZE]
self._flow_control_change_from_settings(
setting.original_value,
setting.new_value,
)
- self.remote_settings.update(
- (k, v.new_value) for k, v in event.changed_settings.items()
- )
f = SettingsFrame(0)
f.flags.add('ACK')
self._prepare_for_sending([f])
@@ -590,10 +587,14 @@ def _receive_settings_frame(self, frame):
ConnectionInputs.RECV_SETTINGS
)
- # This is an ack of the local settings. Right now, do nothing.
+ # This is an ack of the local settings.
if 'ACK' in frame.flags:
+ self.local_settings.acknowledge()
return [], events
+ # Add the new settings.
+ self.remote_settings.update(frame.settings)
+
events.append(RemoteSettingsChanged.from_settings(
self.remote_settings, frame.settings
))
diff --git a/h2/events.py b/h2/events.py
index 4befc6cb9..e0e6db29b 100644
--- a/h2/events.py
+++ b/h2/events.py
@@ -9,7 +9,7 @@
track of events triggered by receiving data. Each time data is provided to the
H2 state machine it processes the data and returns a list of Event objects.
"""
-from collections import namedtuple
+from .settings import ChangedSetting
class RequestReceived(object):
@@ -71,11 +71,6 @@ class RemoteSettingsChanged(object):
settings are acceptable, and then acknowledge them. If they are not
acceptable, the user should close the connection.
"""
- #: A value structure for storing changed settings.
- ChangedSetting = namedtuple(
- 'ChangedSetting', ['setting', 'original_value', 'new_value']
- )
-
def __init__(self):
self.changed_settings = {}
@@ -92,7 +87,7 @@ def from_settings(cls, old_settings, new_settings):
e = cls()
for setting, new_value in new_settings.items():
original_value = old_settings.get(setting)
- change = cls.ChangedSetting(setting, original_value, new_value)
+ change = ChangedSetting(setting, original_value, new_value)
e.changed_settings[setting] = change
return e
diff --git a/h2/settings.py b/h2/settings.py
new file mode 100644
index 000000000..50dc22c37
--- /dev/null
+++ b/h2/settings.py
@@ -0,0 +1,148 @@
+# -*- coding: utf-8 -*-
+"""
+h2/settings
+~~~~~~~~~~~
+
+This module contains an HTTP/2 settings object. This object provides a simple
+API for manipulating HTTP/2 settings, keeping track of both the current active
+state of the settings and the unacknowledged future values of the settings.
+"""
+import collections
+
+from hyperframe.frame import SettingsFrame
+
+
+#: A value structure for storing changed settings.
+ChangedSetting = collections.namedtuple(
+ 'ChangedSetting', ['setting', 'original_value', 'new_value']
+)
+
+
+class Settings(collections.MutableMapping):
+ """
+ An object that encapsulates HTTP/2 settings state.
+
+ HTTP/2 Settings are a complex beast. Each party, remote and local, has its
+ own settings and a view of the other party's settings. When a settings
+ frame is emitted by a peer it cannot assume that the new settings values
+ are in place until the remote peer acknowledges the setting. In principle,
+ multiple settings changes can be "in flight" at the same time, all with
+ different values.
+
+ This object encapsulates this mess. It provides a dict-like interface to
+ settings, which return the *current* values of the settings in question.
+ Additionally, it keeps track of the stack of proposed values: each time an
+ acknowledgement is sent/received, it updates the current values with the
+ stack of proposed values.
+
+ Finally, this object understands what the default values of the HTTP/2
+ settings are, and sets those defaults appropriately.
+ """
+ def __init__(self, client=True):
+ # Backing object for the settings. This is a dictionary of
+ # (setting: [list of values]), where the first value in the list is the
+ # current value of the setting. Strictly this doesn't use lists but
+ # instead uses collections.deque to avoid repeated memory allocations.
+ #
+ # This contains the default values for HTTP/2.
+ self._settings = {
+ SettingsFrame.HEADER_TABLE_SIZE: collections.deque([4096]),
+ SettingsFrame.ENABLE_PUSH: collections.deque([int(client)]),
+ SettingsFrame.INITIAL_WINDOW_SIZE: collections.deque([65535]),
+ SettingsFrame.SETTINGS_MAX_FRAME_SIZE: collections.deque([16384]),
+ }
+
+ def acknowledge(self):
+ """
+ The settings have been acknowledged, either by the user (remote
+ settings) or by the remote peer (local settings).
+
+ :returns: A dict of {setting: ChangedSetting} that were applied.
+ """
+ changed_settings = {}
+
+ # If there is more than one setting in the list, we have a setting
+ # value outstanding. Update them.
+ for k, v in self._settings.items():
+ if len(v) > 1:
+ old_setting = v.popleft()
+ new_setting = v[0]
+ changed_settings[k] = ChangedSetting(
+ k, old_setting, new_setting
+ )
+
+ return changed_settings
+
+ # Provide easy-access to well known settings.
+ @property
+ def header_table_size(self):
+ """
+ The current value of the SETTINGS_HEADER_TABLE_SIZE setting.
+ """
+ return self[SettingsFrame.HEADER_TABLE_SIZE]
+
+ @header_table_size.setter
+ def header_table_size(self, value):
+ self[SettingsFrame.HEADER_TABLE_SIZE] = value
+
+ @property
+ def enable_push(self):
+ """
+ The current value of the SETTINGS_ENABLE_PUSH setting.
+ """
+ return self[SettingsFrame.ENABLE_PUSH]
+
+ @enable_push.setter
+ def enable_push(self, value):
+ self[SettingsFrame.ENABLE_PUSH] = value
+
+ @property
+ def initial_window_size(self):
+ """
+ The current value of the SETTINGS_INITIAL_WINDOW_SIZE setting.
+ """
+ return self[SettingsFrame.INITIAL_WINDOW_SIZE]
+
+ @initial_window_size.setter
+ def initial_window_size(self, value):
+ self[SettingsFrame.INITIAL_WINDOW_SIZE] = value
+
+ @property
+ def max_frame_size(self):
+ """
+ The current value of the SETTINGS_MAX_FRAME_SIZE setting.
+ """
+ return self[SettingsFrame.SETTINGS_MAX_FRAME_SIZE]
+
+ @max_frame_size.setter
+ def max_frame_size(self, value):
+ self[SettingsFrame.SETTINGS_MAX_FRAME_SIZE] = value
+
+ # Implement the MutableMapping API.
+ def __getitem__(self, key):
+ val = self._settings[key][0]
+
+ # Things that were created when a setting was received should stay
+ # KeyError'd.
+ if val is None:
+ raise KeyError
+
+ return val
+
+ def __setitem__(self, key, value):
+ try:
+ items = self._settings[key]
+ except KeyError:
+ items = collections.deque([None])
+ self._settings[key] = items
+
+ items.append(value)
+
+ def __delitem__(self, key):
+ del self._settings[key]
+
+ def __iter__(self):
+ return self._settings.__iter__()
+
+ def __len__(self):
+ return len(self._settings)
| diff --git a/test/test_settings.py b/test/test_settings.py
new file mode 100644
index 000000000..fc1424aa4
--- /dev/null
+++ b/test/test_settings.py
@@ -0,0 +1,193 @@
+# -*- coding: utf-8 -*-
+"""
+test_settings
+~~~~~~~~~~~~~
+
+Test the Settings object.
+"""
+import pytest
+
+import h2.settings
+
+from hyperframe.frame import SettingsFrame
+
+
+class TestSettings(object):
+ """
+ Test the Settings object behaves as expected.
+ """
+ def test_settings_defaults_client(self):
+ """
+ The Settings object begins with the appropriate defaults for clients.
+ """
+ s = h2.settings.Settings(client=True)
+
+ assert s[SettingsFrame.HEADER_TABLE_SIZE] == 4096
+ assert s[SettingsFrame.ENABLE_PUSH] == 1
+ assert s[SettingsFrame.INITIAL_WINDOW_SIZE] == 65535
+ assert s[SettingsFrame.SETTINGS_MAX_FRAME_SIZE] == 16384
+
+ def test_settings_defaults_server(self):
+ """
+ The Settings object begins with the appropriate defaults for servers.
+ """
+ s = h2.settings.Settings(client=False)
+
+ assert s[SettingsFrame.HEADER_TABLE_SIZE] == 4096
+ assert s[SettingsFrame.ENABLE_PUSH] == 0
+ assert s[SettingsFrame.INITIAL_WINDOW_SIZE] == 65535
+ assert s[SettingsFrame.SETTINGS_MAX_FRAME_SIZE] == 16384
+
+ def test_applying_value_doesnt_take_effect_immediately(self):
+ """
+ When a value is applied to the settings object, it doesn't immediately
+ take effect.
+ """
+ s = h2.settings.Settings(client=True)
+ s[SettingsFrame.HEADER_TABLE_SIZE] = 8000
+
+ assert s[SettingsFrame.HEADER_TABLE_SIZE] == 4096
+
+ def test_acknowledging_values(self):
+ """
+ When we acknowledge settings, the values change.
+ """
+ s = h2.settings.Settings(client=True)
+ old_settings = dict(s)
+
+ new_settings = {
+ SettingsFrame.HEADER_TABLE_SIZE: 4000,
+ SettingsFrame.ENABLE_PUSH: 0,
+ SettingsFrame.INITIAL_WINDOW_SIZE: 60,
+ SettingsFrame.SETTINGS_MAX_FRAME_SIZE: 30,
+ }
+ s.update(new_settings)
+
+ assert dict(s) == old_settings
+ s.acknowledge()
+ assert dict(s) == new_settings
+
+ def test_acknowledging_returns_the_changed_settings(self):
+ """
+ Acknowledging settings returns the changes.
+ """
+ s = h2.settings.Settings(client=True)
+ s[SettingsFrame.HEADER_TABLE_SIZE] = 8000
+ s[SettingsFrame.ENABLE_PUSH] = 0
+
+ changes = s.acknowledge()
+ assert len(changes) == 2
+
+ table_size_change = changes[SettingsFrame.HEADER_TABLE_SIZE]
+ push_change = changes[SettingsFrame.ENABLE_PUSH]
+
+ assert table_size_change.setting == SettingsFrame.HEADER_TABLE_SIZE
+ assert table_size_change.original_value == 4096
+ assert table_size_change.new_value == 8000
+
+ assert push_change.setting == SettingsFrame.ENABLE_PUSH
+ assert push_change.original_value == 1
+ assert push_change.new_value == 0
+
+ def test_acknowledging_only_returns_changed_settings(self):
+ """
+ Acknowledging settings does not return unchanged settings.
+ """
+ s = h2.settings.Settings(client=True)
+ s[SettingsFrame.INITIAL_WINDOW_SIZE] = 70
+
+ changes = s.acknowledge()
+ assert len(changes) == 1
+ assert list(changes.keys()) == [SettingsFrame.INITIAL_WINDOW_SIZE]
+
+ def test_deleting_values_deletes_all_of_them(self):
+ """
+ When we delete a key we lose all state about it.
+ """
+ s = h2.settings.Settings(client=True)
+ s[SettingsFrame.HEADER_TABLE_SIZE] = 8000
+
+ del s[SettingsFrame.HEADER_TABLE_SIZE]
+
+ with pytest.raises(KeyError):
+ s[SettingsFrame.HEADER_TABLE_SIZE]
+
+ def test_length_correctly_reported(self):
+ """
+ Length is related only to the number of keys.
+ """
+ s = h2.settings.Settings(client=True)
+ assert len(s) == 4
+
+ s[SettingsFrame.HEADER_TABLE_SIZE] = 8000
+ assert len(s) == 4
+
+ s.acknowledge()
+ assert len(s) == 4
+
+ del s[SettingsFrame.HEADER_TABLE_SIZE]
+ assert len(s) == 3
+
+ def test_new_values_work(self):
+ """
+ New values initially don't appear
+ """
+ s = h2.settings.Settings(client=True)
+ s[80] = 81
+
+ with pytest.raises(KeyError):
+ s[80]
+
+ def test_new_values_follow_basic_acknowledgement_rules(self):
+ """
+ A new value properly appears when acknowledged.
+ """
+ s = h2.settings.Settings(client=True)
+ s[80] = 81
+ changed_settings = s.acknowledge()
+
+ assert s[80] == 81
+ assert len(changed_settings) == 1
+
+ changed = changed_settings[80]
+ assert changed.setting == 80
+ assert changed.original_value is None
+ assert changed.new_value == 81
+
+ def test_single_values_arent_affected_by_acknowledgement(self):
+ """
+ When acknowledged, unchanged settings remain unchanged.
+ """
+ s = h2.settings.Settings(client=True)
+ assert s[SettingsFrame.HEADER_TABLE_SIZE] == 4096
+
+ s.acknowledge()
+ assert s[SettingsFrame.HEADER_TABLE_SIZE] == 4096
+
+ def test_settings_getters(self):
+ """
+ Getters exist for well-known settings.
+ """
+ s = h2.settings.Settings(client=True)
+
+ assert s.header_table_size == s[SettingsFrame.HEADER_TABLE_SIZE]
+ assert s.enable_push == s[SettingsFrame.ENABLE_PUSH]
+ assert s.initial_window_size == s[SettingsFrame.INITIAL_WINDOW_SIZE]
+ assert s.max_frame_size == s[SettingsFrame.SETTINGS_MAX_FRAME_SIZE]
+
+ def test_settings_setters(self):
+ """
+ Setters exist for well-known settings.
+ """
+ s = h2.settings.Settings(client=True)
+
+ s.header_table_size = 0
+ s.enable_push = 1
+ s.initial_window_size = 2
+ s.max_frame_size = 3
+
+ s.acknowledge()
+ assert s[SettingsFrame.HEADER_TABLE_SIZE] == 0
+ assert s[SettingsFrame.ENABLE_PUSH] == 1
+ assert s[SettingsFrame.INITIAL_WINDOW_SIZE] == 2
+ assert s[SettingsFrame.SETTINGS_MAX_FRAME_SIZE] == 3
| [
{
"components": [
{
"doc": "An object that encapsulates HTTP/2 settings state.\n\nHTTP/2 Settings are a complex beast. Each party, remote and local, has its\nown settings and a view of the other party's settings. When a settings\nframe is emitted by a peer it cannot assume that the new settings va... | [
"test/test_settings.py::TestSettings::test_settings_defaults_client",
"test/test_settings.py::TestSettings::test_settings_defaults_server",
"test/test_settings.py::TestSettings::test_applying_value_doesnt_take_effect_immediately",
"test/test_settings.py::TestSettings::test_acknowledging_values",
"test/test_... | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Remote settings handling
This change introduces a Settings management object to handle adjustments in remote settings. It allows us to keep track of the difference between acknowledged settings and unacknowledged settings, and lets us easily manage the transitions between them.
This is part of #5. It does not add any function for remote settings management: that is yet to come.
----------
</request>
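The deferred transition the request describes — proposed values queued behind the current value until an acknowledgement arrives — can be sketched as a deque per setting. This is a simplified stand-in for illustration, not the h2 implementation; `PendingSettings` and its method names are invented here:

```python
import collections

class PendingSettings:
    """Toy model of HTTP/2 settings state: index 0 of each deque is the
    current value, later entries are unacknowledged proposed values."""

    def __init__(self, defaults):
        self._settings = {
            k: collections.deque([v]) for k, v in defaults.items()
        }

    def propose(self, key, value):
        # Queued, but invisible to current() until acknowledge() is called.
        self._settings.setdefault(key, collections.deque([None])).append(value)

    def current(self, key):
        return self._settings[key][0]

    def acknowledge(self):
        # Promote one pending value per setting; report (old, new) changes.
        changed = {}
        for key, values in self._settings.items():
            if len(values) > 1:
                old = values.popleft()
                changed[key] = (old, values[0])
        return changed
```

After `propose(k, v)`, `current(k)` still yields the old value; only `acknowledge()` promotes the pending one, mirroring the SETTINGS/ACK round trip described above.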
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in h2/settings.py]
(definition of Settings:)
class Settings(collections.MutableMapping):
"""An object that encapsulates HTTP/2 settings state.
HTTP/2 Settings are a complex beast. Each party, remote and local, has its
own settings and a view of the other party's settings. When a settings
frame is emitted by a peer it cannot assume that the new settings values
are in place until the remote peer acknowledges the setting. In principle,
multiple settings changes can be "in flight" at the same time, all with
different values.
This object encapsulates this mess. It provides a dict-like interface to
settings, which return the *current* values of the settings in question.
Additionally, it keeps track of the stack of proposed values: each time an
acknowledgement is sent/received, it updates the current values with the
stack of proposed values.
Finally, this object understands what the default values of the HTTP/2
settings are, and sets those defaults appropriately."""
(definition of Settings.__init__:)
def __init__(self, client=True):
(definition of Settings.acknowledge:)
def acknowledge(self):
"""The settings have been acknowledged, either by the user (remote
settings) or by the remote peer (local settings).
:returns: A dict of {setting: ChangedSetting} that were applied."""
(definition of Settings.header_table_size:)
def header_table_size(self):
"""The current value of the SETTINGS_HEADER_TABLE_SIZE setting."""
(definition of Settings.header_table_size:)
def header_table_size(self, value):
(definition of Settings.enable_push:)
def enable_push(self):
"""The current value of the SETTINGS_ENABLE_PUSH setting."""
(definition of Settings.enable_push:)
def enable_push(self, value):
(definition of Settings.initial_window_size:)
def initial_window_size(self):
"""The current value of the SETTINGS_INITIAL_WINDOW_SIZE setting."""
(definition of Settings.initial_window_size:)
def initial_window_size(self, value):
(definition of Settings.max_frame_size:)
def max_frame_size(self):
"""The current value of the SETTINGS_MAX_FRAME_SIZE setting."""
(definition of Settings.max_frame_size:)
def max_frame_size(self, value):
(definition of Settings.__getitem__:)
def __getitem__(self, key):
(definition of Settings.__setitem__:)
def __setitem__(self, key, value):
(definition of Settings.__delitem__:)
def __delitem__(self, key):
(definition of Settings.__iter__:)
def __iter__(self):
(definition of Settings.__len__:)
def __len__(self):
[end of new definitions in h2/settings.py]
</definitions>
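The well-known-setting accessors in the definitions above are thin wrappers over the mapping interface. That pattern can be sketched self-contained; the plain integer keys below are stand-ins for the `SettingsFrame` constants, and `SettingsView` is an invented name, not the h2 class:

```python
HEADER_TABLE_SIZE = 0x01  # stand-in for SettingsFrame.HEADER_TABLE_SIZE
ENABLE_PUSH = 0x02        # stand-in for SettingsFrame.ENABLE_PUSH

def _setting_property(key, doc):
    # Each well-known setting is a property delegating to
    # __getitem__/__setitem__, so all state lives in the mapping itself.
    def fget(self):
        return self[key]

    def fset(self, value):
        self[key] = value

    return property(fget, fset, doc=doc)

class SettingsView(dict):
    header_table_size = _setting_property(
        HEADER_TABLE_SIZE, "The current value of SETTINGS_HEADER_TABLE_SIZE.")
    enable_push = _setting_property(
        ENABLE_PUSH, "The current value of SETTINGS_ENABLE_PUSH.")
```

`SettingsView({HEADER_TABLE_SIZE: 4096}).header_table_size` reads through the property to the underlying mapping, and assigning to the property writes through it.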
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 9df8f94ce983d44ef57c8f332463f7b3cbe0127b | ||
python-hyper__h2-15 | 15 | python-hyper/h2 | null | 48d17d3d9ec51a537af845c877455e3b70b9b146 | 2015-09-18T13:16:40Z | diff --git a/h2/connection.py b/h2/connection.py
index 03bcd59de..ff42476d1 100644
--- a/h2/connection.py
+++ b/h2/connection.py
@@ -14,7 +14,7 @@
from hpack.hpack import Encoder, Decoder
from .events import WindowUpdated, RemoteSettingsChanged, PingAcknowledged
-from .exceptions import ProtocolError, NoSuchStreamError
+from .exceptions import ProtocolError, NoSuchStreamError, FlowControlError
from .frame_buffer import FrameBuffer
from .stream import H2Stream
@@ -204,6 +204,10 @@ def __init__(self, client_side=True):
self.decoder = Decoder()
self.client_side = client_side
+ # The current value of the connection flow control window on the
+ # outbound side of the connection.
+ self.outbound_flow_control_window = 65535
+
# This might want to be an extensible class that does sensible stuff
# with defaults. For now, a dict will do.
self.local_settings = {}
@@ -305,9 +309,19 @@ def send_data(self, stream_id, data, end_stream=False):
"""
Send data on a given stream.
"""
+ if len(data) > self.flow_control_window(stream_id):
+ raise FlowControlError(
+ "Cannot send %d bytes, flow control window is %d." %
+ (len(data), self.flow_control_window(stream_id))
+ )
+
self.state_machine.process_input(ConnectionInputs.SEND_DATA)
frames, events = self.streams[stream_id].send_data(data, end_stream)
self._prepare_for_sending(frames)
+
+ self.outbound_flow_control_window -= len(data)
+ assert self.outbound_flow_control_window >= 0
+
return events
def end_stream(self, stream_id):
@@ -417,6 +431,26 @@ def acknowledge_settings(self, event):
self._prepare_for_sending([f])
return []
+ def flow_control_window(self, stream_id):
+ """
+ Returns the maximum amount of data that can be sent on stream
+ ``stream_id``.
+
+ This value will never be larger than the total data that can be sent on
+ the connection: even if the given stream allows more data, the
+ connection window provides a logical maximum to the amount of data that
+ can be sent.
+
+ The maximum data that can be sent in a single data frame on a stream
+ is either this value, or the maximum frame size, whichever is
+ *smaller*.
+ """
+ stream = self.get_stream_by_id(stream_id)
+ return min(
+ self.outbound_flow_control_window,
+ stream.outbound_flow_control_window
+ )
+
def data_to_send(self, amt=None):
"""
Returns some data for sending out of the internal data buffer.
@@ -550,6 +584,9 @@ def _receive_window_update_frame(self, frame):
frame.window_increment
)
else:
+ # Increment our local flow control window.
+ self.outbound_flow_control_window += frame.window_increment
+
# FIXME: Should we split this into one event per active stream?
window_updated_event = WindowUpdated()
window_updated_event.stream_id = 0
diff --git a/h2/exceptions.py b/h2/exceptions.py
index 2d2f9868c..4b0707ed4 100644
--- a/h2/exceptions.py
+++ b/h2/exceptions.py
@@ -26,3 +26,10 @@ class NoSuchStreamError(H2Error):
"""
def __init__(self, stream_id):
self.stream_id = stream_id
+
+
+class FlowControlError(H2Error):
+ """
+ An attempted action violates flow control constraints.
+ """
+ pass
diff --git a/h2/stream.py b/h2/stream.py
index 19034a5ad..f502cb71e 100644
--- a/h2/stream.py
+++ b/h2/stream.py
@@ -380,6 +380,10 @@ def __init__(self, stream_id):
self.max_outbound_frame_size = None
self.max_inbound_frame_size = None
+ # The current value of the stream flow control window on the outbound
+ # side of the stream.
+ self.outbound_flow_control_window = 65535
+
def send_headers(self, headers, encoder, end_stream=False):
"""
Returns a list of HEADERS/CONTINUATION frames to emit as either headers
@@ -430,8 +434,9 @@ def locally_pushed(self):
def send_data(self, data, end_stream=False):
"""
Prepare some data frames. Optionally end the stream.
+
+ .. warning:: Does not perform flow control checks.
"""
- # TODO: Something something flow control.
frames = []
for offset in range(0, len(data), self.max_outbound_frame_size):
self.state_machine.process_input(StreamInputs.SEND_DATA)
@@ -443,6 +448,9 @@ def send_data(self, data, end_stream=False):
self.state_machine.process_input(StreamInputs.SEND_END_STREAM)
frames[-1].flags.add('END_STREAM')
+ self.outbound_flow_control_window -= len(data)
+ assert self.outbound_flow_control_window >= 0
+
return frames, []
def end_stream(self):
@@ -523,6 +531,7 @@ def receive_window_update(self, increment):
StreamInputs.RECV_WINDOW_UPDATE
)
events[0].delta = increment
+ self.outbound_flow_control_window += increment
return [], events
def reset_stream(self, error_code=0):
| diff --git a/test/test_flow_control_window.py b/test/test_flow_control_window.py
new file mode 100644
index 000000000..3187be23e
--- /dev/null
+++ b/test/test_flow_control_window.py
@@ -0,0 +1,115 @@
+# -*- coding: utf-8 -*-
+"""
+test_flow_control
+~~~~~~~~~~~~~~~~~
+
+Tests of the flow control management in h2
+"""
+import pytest
+
+import h2.connection
+import h2.exceptions
+
+
+class TestFlowControl(object):
+ """
+ Tests of the flow control management in the connection objects.
+ """
+ example_request_headers = [
+ (':authority', 'example.com'),
+ (':path', '/'),
+ (':scheme', 'https'),
+ (':method', 'GET'),
+ ]
+
+ DEFAULT_FLOW_WINDOW = 65535
+
+ def test_flow_control_initializes_properly(self):
+ """
+ The flow control window for a stream should initially be the default
+ flow control value.
+ """
+ c = h2.connection.H2Connection()
+ c.send_headers(1, self.example_request_headers)
+
+ assert c.flow_control_window(1) == self.DEFAULT_FLOW_WINDOW
+
+ def test_flow_control_decreases_with_sent_data(self):
+ """
+ When data is sent on a stream, the flow control window should drop.
+ """
+ c = h2.connection.H2Connection()
+ c.send_headers(1, self.example_request_headers)
+ c.send_data(1, b'some data')
+
+ remaining_length = self.DEFAULT_FLOW_WINDOW - len(b'some data')
+ assert (c.flow_control_window(1) == remaining_length)
+
+ def test_flow_control_is_limited_by_connection(self):
+ """
+ The flow control window is limited by the flow control of the
+ connection.
+ """
+ c = h2.connection.H2Connection()
+ c.send_headers(1, self.example_request_headers)
+ c.send_data(1, b'some data')
+ c.send_headers(2, self.example_request_headers)
+
+ remaining_length = self.DEFAULT_FLOW_WINDOW - len(b'some data')
+ assert (c.flow_control_window(2) == remaining_length)
+
+ def test_cannot_send_more_data_than_window(self):
+ """
+ Sending more data than the remaining flow control window raises a
+ FlowControlError.
+ """
+ c = h2.connection.H2Connection()
+ c.send_headers(1, self.example_request_headers)
+ c.outbound_flow_control_window = 5
+
+ with pytest.raises(h2.exceptions.FlowControlError):
+ c.send_data(1, b'some data')
+
+ def test_increasing_connection_window_allows_sending(self, frame_factory):
+ """
+ Confirm that sending a WindowUpdate frame on the connection frees
+ up space for further frames.
+ """
+ c = h2.connection.H2Connection()
+ c.send_headers(1, self.example_request_headers)
+ c.outbound_flow_control_window = 5
+
+ with pytest.raises(h2.exceptions.FlowControlError):
+ c.send_data(1, b'some data')
+
+ f = frame_factory.build_window_update_frame(
+ stream_id=0,
+ increment=5,
+ )
+ c.receive_data(f.serialize())
+
+ c.clear_outbound_data_buffer()
+ c.send_data(1, b'some data')
+ assert c.data_to_send()
+
+ def test_increasing_stream_window_allows_sending(self, frame_factory):
+ """
+ Confirm that sending a WindowUpdate frame on the stream frees
+ up space for further frames.
+ """
+ c = h2.connection.H2Connection()
+ c.send_headers(1, self.example_request_headers)
+ c.get_stream_by_id(1).outbound_flow_control_window = 5
+
+ with pytest.raises(h2.exceptions.FlowControlError):
+ c.send_data(1, b'some data')
+
+ f = frame_factory.build_window_update_frame(
+ stream_id=1,
+ increment=5,
+ )
+ c.receive_data(f.serialize())
+
+ c.clear_outbound_data_buffer()
+ c.send_data(1, b'some data')
+ assert c.data_to_send()
| [
{
"components": [
{
"doc": "Returns the maximum amount of data that can be sent on stream\n``stream_id``.\n\nThis value will never be larger than the total data that can be sent on\nthe connection: even if the given stream allows more data, the\nconnection window provides a logical maximum to the ... | [
"test/test_flow_control_window.py::TestFlowControl::test_flow_control_initializes_properly",
"test/test_flow_control_window.py::TestFlowControl::test_flow_control_decreases_with_sent_data",
"test/test_flow_control_window.py::TestFlowControl::test_flow_control_is_limited_by_connection",
"test/test_flow_control... | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Local flow control tracking and enforcement.
Resolves #4.
This contains a system tracking flow control for the data we _emit_. It ensures that we do not send too much data, and that we make it possible for the user to understand what data they can send and when they can send it.
----------
</request>
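The core rule this request implements is that a stream's usable window is the minimum of its own window and the shared connection window. A minimal standalone sketch of that bookkeeping (the `Window` class and its method names are invented for illustration, not the h2 API):

```python
DEFAULT_WINDOW = 65535  # HTTP/2 initial flow control window size

class Window:
    """Outbound flow control tracked at both connection and stream scope."""

    def __init__(self):
        self.connection = DEFAULT_WINDOW
        self.streams = {}

    def open_stream(self, stream_id):
        self.streams[stream_id] = DEFAULT_WINDOW

    def available(self, stream_id):
        # A stream can never send more than the connection window allows.
        return min(self.connection, self.streams[stream_id])

    def consume(self, stream_id, nbytes):
        # Sending data debits both windows at once.
        self.connection -= nbytes
        self.streams[stream_id] -= nbytes
```

Note the side effect this captures: after one stream sends, every other stream's `available()` shrinks too, because all streams share the connection-level window.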
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in h2/connection.py]
(definition of H2Connection.flow_control_window:)
def flow_control_window(self, stream_id):
"""Returns the maximum amount of data that can be sent on stream
``stream_id``.
This value will never be larger than the total data that can be sent on
the connection: even if the given stream allows more data, the
connection window provides a logical maximum to the amount of data that
can be sent.
The maximum data that can be sent in a single data frame on a stream
is either this value, or the maximum frame size, whichever is
*smaller*."""
[end of new definitions in h2/connection.py]
[start of new definitions in h2/exceptions.py]
(definition of FlowControlError:)
class FlowControlError(H2Error):
"""An attempted action violates flow control constraints."""
[end of new definitions in h2/exceptions.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | Here is the discussion in the issues of the pull request.
<issues>
Complete local flow control handling.
Right now we handle flow control locally by emitting a WindowUpdated event. That's not really enough, because we take no action to prevent users from sending more data than they should, and we don't provide the tools to allow external callers to check flow control state themselves.
We need:
- To enable users to query the flow control window for a given stream, which _must_ take the connection-level flow control window into account.
- To throw exceptions when users attempt to send more data than they can fit into the flow control window.
----------
--------------------
</issues> | 9df8f94ce983d44ef57c8f332463f7b3cbe0127b | |
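The two requirements in the issue — raise when a send would exceed the window, and replenish the window on WINDOW_UPDATE — reduce to a small guard. A hedged sketch (class and method names invented here, not the h2 API):

```python
class FlowControlError(Exception):
    """A send would violate the peer-advertised flow control window."""

class OutboundStream:
    def __init__(self, window=65535):
        self.window = window

    def send(self, data):
        # Refuse, rather than silently truncate or queue, when the
        # payload does not fit in the remaining window.
        if len(data) > self.window:
            raise FlowControlError(
                "Cannot send %d bytes, flow control window is %d."
                % (len(data), self.window))
        self.window -= len(data)

    def receive_window_update(self, increment):
        # A WINDOW_UPDATE frame from the peer frees up space again.
        self.window += increment
```

A failed `send` leaves the window untouched, so the caller can wait for a window update and retry the same payload.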
boto__boto3-246 | 246 | boto/boto3 | null | 314104da31ab09825d4d1d9289977b431ffa6048 | 2015-09-04T22:57:40Z | diff --git a/CHANGELOG.rst b/CHANGELOG.rst
index d1c775c7e8..aa74844268 100644
--- a/CHANGELOG.rst
+++ b/CHANGELOG.rst
@@ -1,6 +1,13 @@
Changelog
=========
+Next Release - (TBD)
+--------------------
+
+* bugfix:Identifier: Make resource identifiers immutable.
+ (`issue 246 <https://github.com/boto/boto3/pull/246>`__)
+
+
1.1.3 - 2015-09-03
------------------
diff --git a/boto3/resources/base.py b/boto3/resources/base.py
index f1d9b77acd..b3b902594f 100644
--- a/boto3/resources/base.py
+++ b/boto3/resources/base.py
@@ -99,7 +99,7 @@ def __init__(self, *args, **kwargs):
# Allow setting identifiers as positional arguments in the order
# in which they were defined in the ResourceJSON.
for i, value in enumerate(args):
- setattr(self, self.meta.identifiers[i], value)
+ setattr(self, '_' + self.meta.identifiers[i], value)
# Allow setting identifiers via keyword arguments. Here we need
# extra logic to ignore other keyword arguments like ``client``.
@@ -110,7 +110,7 @@ def __init__(self, *args, **kwargs):
if name not in self.meta.identifiers:
raise ValueError('Unknown keyword argument: {0}'.format(name))
- setattr(self, name, value)
+ setattr(self, '_' + name, value)
# Validate that all identifiers have been set.
for identifier in self.meta.identifiers:
diff --git a/boto3/resources/factory.py b/boto3/resources/factory.py
index 198e78d34e..57b2294025 100644
--- a/boto3/resources/factory.py
+++ b/boto3/resources/factory.py
@@ -112,7 +112,7 @@ def _load_identifiers(self, attrs, meta, model):
"""
for identifier in model.identifiers:
meta.identifiers.append(identifier.name)
- attrs[identifier.name] = None
+ attrs[identifier.name] = self._create_identifier(identifier.name)
def _load_actions(self, attrs, model, resource_defs, service_model):
"""
@@ -194,6 +194,23 @@ def _load_waiters(self, attrs, model):
for waiter in model.waiters:
attrs[waiter.name] = self._create_waiter(waiter)
+ def _create_identifier(factory_self, name):
+ """
+ Creates a read-only property for identifier attributes.
+ """
+ def identifier(self):
+ # The default value is set to ``None`` instead of
+ # raising an AttributeError because when resources are
+ # instantiated a check is made such that none of the
+ # identifiers have a value ``None``. If any are ``None``,
+ # a more informative user error than a generic AttributeError
+ # is raised.
+ return getattr(self, '_' + name, None)
+
+ identifier.__name__ = str(name)
+ identifier.__doc__ = 'TODO'
+ return property(identifier)
+
def _create_autoload_property(factory_self, name, snake_cased):
"""
Creates a new property on the resource to lazy-load its value
| diff --git a/tests/unit/resources/test_factory.py b/tests/unit/resources/test_factory.py
index 2d9578f18f..5df0585f2e 100644
--- a/tests/unit/resources/test_factory.py
+++ b/tests/unit/resources/test_factory.py
@@ -648,6 +648,13 @@ def test_dangling_resource_raises_for_unknown_arg(self):
with self.assertRaises(ValueError):
resource.Queue(url='foo', bar='baz')
+ def test_dangling_resource_identifier_is_immutable(self):
+ resource = self.load('test', 'test', self.model, self.defs, None)()
+ queue = resource.Queue('url')
+ # We should not be able to change the identifier's value
+ with self.assertRaises(AttributeError):
+ queue.url = 'foo'
+
def test_dangling_resource_equality(self):
resource = self.load('test', 'test', self.model, self.defs, None)()
| diff --git a/CHANGELOG.rst b/CHANGELOG.rst
index d1c775c7e8..aa74844268 100644
--- a/CHANGELOG.rst
+++ b/CHANGELOG.rst
@@ -1,6 +1,13 @@
Changelog
=========
+Next Release - (TBD)
+--------------------
+
+* bugfix:Identifier: Make resource identifiers immutable.
+ (`issue 246 <https://github.com/boto/boto3/pull/246>`__)
+
+
1.1.3 - 2015-09-03
------------------
| [
{
"components": [
{
"doc": "Creates a read-only property for identifier attributes.",
"lines": [
197,
212
],
"name": "ResourceFactory._create_identifier",
"signature": "def _create_identifier(factory_self, name):",
"type": "function"
... | [
"tests/unit/resources/test_factory.py::TestResourceFactoryDanglingResource::test_dangling_resource_identifier_is_immutable"
] | [
"tests/unit/resources/test_factory.py::TestResourceFactory::test_can_instantiate_service_resource",
"tests/unit/resources/test_factory.py::TestResourceFactory::test_factory_creates_dangling_resources",
"tests/unit/resources/test_factory.py::TestResourceFactory::test_factory_creates_properties",
"tests/unit/re... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Make resource identifiers immutable
Before you could switch out the identifiers on a resource, which can have unintended effects if you switch the identifier and then make subsequent calls with the resource. Now identifiers are read-only, much like how attributes are read-only:
``` py
In [1]: import boto3
In [2]: s3 = boto3.resource('s3')
In [3]: bucket = s3.Bucket('mybucketfoo')
In [4]: bucket.name = 'foo'
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<ipython-input-4-5d85c15550cd> in <module>()
----> 1 bucket.name = 'foo'
AttributeError: can't set attribute
```
cc @jamesls @mtdowling @rayluo
----------
</request>
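The patch achieves this by storing each identifier on a leading-underscore attribute and exposing it through a getter-only `property`. Reduced to a standalone sketch — the real factory generates these accessors from the resource model at class-creation time; the hand-written `Bucket` class below is just an illustration:

```python
def _create_identifier(name):
    # Read-only accessor: the backing value lives on '_' + name.
    # It defaults to None so unset identifiers can be validated elsewhere
    # with a more informative error than a bare AttributeError.
    def identifier(self):
        return getattr(self, '_' + name, None)

    identifier.__name__ = str(name)
    return property(identifier)

class Bucket:
    name = _create_identifier('name')

    def __init__(self, name):
        # Constructors write to the underscored slot, never the property.
        self._name = name
```

Because the property defines no setter, `bucket.name = 'foo'` raises `AttributeError`, matching the session shown in the request.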
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in boto3/resources/factory.py]
(definition of ResourceFactory._create_identifier:)
def _create_identifier(factory_self, name):
"""Creates a read-only property for identifier attributes."""
(definition of ResourceFactory._create_identifier.identifier:)
def identifier(self):
[end of new definitions in boto3/resources/factory.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 196a2da7490a1a661a0103b8770bd31e34e147f2 | |
boto__boto3-106 | 106 | boto/boto3 | null | fec3bf95b9a9bf8b3af37353aeee2baf03848c87 | 2015-04-29T17:47:48Z | diff --git a/boto3/dynamodb/conditions.py b/boto3/dynamodb/conditions.py
new file mode 100644
index 0000000000..1f38724767
--- /dev/null
+++ b/boto3/dynamodb/conditions.py
@@ -0,0 +1,416 @@
+# Copyright 2015 Amazon.com, Inc. or its affiliates. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License"). You
+# may not use this file except in compliance with the License. A copy of
+# the License is located at
+#
+# http://aws.amazon.com/apache2.0/
+#
+# or in the "license" file accompanying this file. This file is
+# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
+# ANY KIND, either express or implied. See the License for the specific
+# language governing permissions and limitations under the License.
+from collections import namedtuple
+import functools
+import re
+
+from boto3.exceptions import DynanmoDBOperationNotSupportedError
+from boto3.exceptions import DynamoDBNeedsConditionError
+from boto3.exceptions import DynamoDBNeedsKeyConditionError
+
+
+ATTR_NAME_REGEX = re.compile(r'[^.\[\]]+(?![^\[]*\])')
+
+
+class ConditionBase(object):
+
+ expression_format = ''
+ expression_operator = ''
+ has_grouped_values = False
+
+ def __init__(self, *values):
+ self._values = values
+
+ def __and__(self, other):
+ if not isinstance(other, ConditionBase):
+ raise DynanmoDBOperationNotSupportedError('AND', other)
+ return And(self, other)
+
+ def __or__(self, other):
+ if not isinstance(other, ConditionBase):
+ raise DynanmoDBOperationNotSupportedError('OR', other)
+ return Or(self, other)
+
+ def __invert__(self):
+ return Not(self)
+
+ def get_expression(self):
+ return {'format': self.expression_format,
+ 'operator': self.expression_operator,
+ 'values': self._values}
+
+ def __eq__(self, other):
+ if isinstance(other, type(self)):
+ if self._values == other._values:
+ return True
+ return False
+
+ def __ne__(self, other):
+ return not self.__eq__(other)
+
+
+class AttributeBase(object):
+ def __init__(self, name):
+ self.name = name
+
+ def __and__(self, value):
+ raise DynanmoDBOperationNotSupportedError('AND', self)
+
+ def __or__(self, value):
+ raise DynanmoDBOperationNotSupportedError('OR', self)
+
+ def __invert__(self):
+ raise DynanmoDBOperationNotSupportedError('NOT', self)
+
+ def eq(self, value):
+        """Creates a condition where the attribute is equal to the value.
+
+ :param value: The value that the attribute is equal to.
+ """
+ return Equals(self, value)
+
+ def lt(self, value):
+        """Creates a condition where the attribute is less than the value.
+
+ :param value: The value that the attribute is less than.
+ """
+ return LessThan(self, value)
+
+ def lte(self, value):
+        """Creates a condition where the attribute is less than or equal to the
+ value.
+
+ :param value: The value that the attribute is less than or equal to.
+ """
+ return LessThanEquals(self, value)
+
+ def gt(self, value):
+        """Creates a condition where the attribute is greater than the value.
+
+ :param value: The value that the attribute is greater than.
+ """
+ return GreaterThan(self, value)
+
+ def gte(self, value):
+        """Creates a condition where the attribute is greater than or equal to
+ the value.
+
+ :param value: The value that the attribute is greater than or equal to.
+ """
+ return GreaterThanEquals(self, value)
+
+ def begins_with(self, value):
+        """Creates a condition where the attribute begins with the value.
+
+ :param value: The value that the attribute begins with.
+ """
+ return BeginsWith(self, value)
+
+ def between(self, low_value, high_value):
+        """Creates a condition where the attribute is between the low value and
+ the high value.
+
+ :param low_value: The value that the attribute is greater than.
+ :param high_value: The value that the attribute is less than.
+ """
+ return Between(self, low_value, high_value)
+
+
+class ConditionAttributeBase(ConditionBase, AttributeBase):
+ """This base class is for conditions that can have attribute methods.
+
+ One example is the Size condition. To complete a condition, you need
+ to apply another AttributeBase method like eq().
+ """
+ def __init__(self, *values):
+ ConditionBase.__init__(self, *values)
+        # This assumes the first value to the condition is the attribute,
+        # which can be used to generate its attribute base.
+ AttributeBase.__init__(self, values[0].name)
+
+
+class ComparisonCondition(ConditionBase):
+ expression_format = '{0} {operator} {1}'
+
+
+class Equals(ComparisonCondition):
+ expression_operator = '='
+
+
+class NotEquals(ComparisonCondition):
+ expression_operator = '<>'
+
+
+class LessThan(ComparisonCondition):
+ expression_operator = '<'
+
+
+class LessThanEquals(ComparisonCondition):
+ expression_operator = '<='
+
+
+class GreaterThan(ComparisonCondition):
+ expression_operator = '>'
+
+
+class GreaterThanEquals(ComparisonCondition):
+ expression_operator = '>='
+
+
+class In(ComparisonCondition):
+ expression_operator = 'IN'
+ has_grouped_values = True
+
+
+class Between(ConditionBase):
+ expression_operator = 'BETWEEN'
+ expression_format = '{0} {operator} {1} AND {2}'
+
+
+class BeginsWith(ConditionBase):
+ expression_operator = 'begins_with'
+ expression_format = '{operator}({0}, {1})'
+
+
+class Contains(ConditionBase):
+ expression_operator = 'contains'
+ expression_format = '{operator}({0}, {1})'
+
+
+class Size(ConditionAttributeBase):
+ expression_operator = 'size'
+ expression_format = '{operator}({0})'
+
+
+class AttributeType(ConditionBase):
+ expression_operator = 'attribute_type'
+ expression_format = '{operator}({0}, {1})'
+
+
+class AttributeExists(ConditionBase):
+ expression_operator = 'attribute_exists'
+ expression_format = '{operator}({0})'
+
+
+class AttributeNotExists(ConditionBase):
+ expression_operator = 'attribute_not_exists'
+ expression_format = '{operator}({0})'
+
+
+class And(ConditionBase):
+ expression_operator = 'AND'
+ expression_format = '({0} {operator} {1})'
+
+
+class Or(ConditionBase):
+ expression_operator = 'OR'
+ expression_format = '({0} {operator} {1})'
+
+
+class Not(ConditionBase):
+ expression_operator = 'NOT'
+ expression_format = '({operator} {0})'
+
+
+class Key(AttributeBase):
+ pass
+
+
+class Attr(AttributeBase):
+    """Represents a DynamoDB item's attribute."""
+ def ne(self, value):
+        """Creates a condition where the attribute is not equal to the value.
+
+ :param value: The value that the attribute is not equal to.
+ """
+ return NotEquals(self, value)
+
+ def is_in(self, value):
+        """Creates a condition where the attribute is in the value.
+
+ :type value: list
+ :param value: The value that the attribute is in.
+ """
+ return In(self, value)
+
+ def exists(self):
+        """Creates a condition where the attribute exists."""
+ return AttributeExists(self)
+
+ def not_exists(self):
+        """Creates a condition where the attribute does not exist."""
+ return AttributeNotExists(self)
+
+ def contains(self, value):
+ """Creates a condition where the attribute contains the value.
+
+ :param value: The value the attribute contains.
+ """
+ return Contains(self, value)
+
+ def size(self):
+ """Creates a condition for the attribute size.
+
+ Note another AttributeBase method must be called on the returned
+ size condition to be a valid DynamoDB condition.
+ """
+ return Size(self)
+
+ def attribute_type(self, value):
+ """Creates a condition for the attribute type.
+
+ :param value: The type of the attribute.
+ """
+ return AttributeType(self, value)
+
+
+BuiltConditionExpression = namedtuple(
+ 'BuiltConditionExpression',
+ ['condition_expression', 'attribute_name_placeholders',
+ 'attribute_value_placeholders']
+)
+
+
+class ConditionExpressionBuilder(object):
+ """This class is used to build condition expressions with placeholders"""
+ def __init__(self):
+ self._name_count = 0
+ self._value_count = 0
+ self._name_placeholder = 'n'
+ self._value_placeholder = 'v'
+
+ def _get_name_placeholder(self):
+ return '#' + self._name_placeholder + str(self._name_count)
+
+ def _get_value_placeholder(self):
+ return ':' + self._value_placeholder + str(self._value_count)
+
+ def reset(self):
+ """Resets the placeholder name and values"""
+ self._name_count = 0
+ self._value_count = 0
+
+ def build_expression(self, condition, is_key_condition=False):
+ """Builds the condition expression and the dictionary of placeholders.
+
+ :type condition: ConditionBase
+ :param condition: A condition to be built into a condition expression
+ string with any necessary placeholders.
+
+ :type is_key_condition: Boolean
+ :param is_key_condition: True if the expression is for a
+ KeyConditionExpression. False otherwise.
+
+ :rtype: (string, dict, dict)
+ :returns: Will return a string representing the condition with
+ placeholders inserted where necessary, a dictionary of
+ placeholders for attribute names, and a dictionary of
+ placeholders for attribute values. Here is a sample return value:
+
+ ('#n0 = :v0', {'#n0': 'myattribute'}, {':v1': 'myvalue'})
+ """
+ if not isinstance(condition, ConditionBase):
+ raise DynamoDBNeedsConditionError(condition)
+ attribute_name_placeholders = {}
+ attribute_value_placeholders = {}
+ condition_expression = self._build_expression(
+ condition, attribute_name_placeholders,
+ attribute_value_placeholders, is_key_condition=is_key_condition)
+ return BuiltConditionExpression(
+ condition_expression=condition_expression,
+ attribute_name_placeholders=attribute_name_placeholders,
+ attribute_value_placeholders=attribute_value_placeholders
+ )
+
+ def _build_expression(self, condition, attribute_name_placeholders,
+ attribute_value_placeholders, is_key_condition):
+ expression_dict = condition.get_expression()
+ replaced_values = []
+ for value in expression_dict['values']:
+ # Build the necessary placeholders for that value.
+ # Placeholders are built for both attribute names and values.
+ replaced_value = self._build_expression_component(
+ value, attribute_name_placeholders,
+ attribute_value_placeholders, condition.has_grouped_values,
+ is_key_condition)
+ replaced_values.append(replaced_value)
+ # Fill out the expression using the operator and the
+ # values that have been replaced with placeholders.
+ return expression_dict['format'].format(
+ *replaced_values, operator=expression_dict['operator'])
+
+ def _build_expression_component(self, value, attribute_name_placeholders,
+ attribute_value_placeholders,
+ has_grouped_values, is_key_condition):
+ # Continue to recurse if the value is a ConditionBase in order
+ # to extract out all parts of the expression.
+ if isinstance(value, ConditionBase):
+ return self._build_expression(
+ value, attribute_name_placeholders,
+ attribute_value_placeholders, is_key_condition)
+ # If it is not a ConditionBase, we can recurse no further.
+ # So we check if it is an attribute and add placeholders for
+ # its name
+ elif isinstance(value, AttributeBase):
+ if is_key_condition and not isinstance(value, Key):
+ raise DynamoDBNeedsKeyConditionError(
+ 'Attribute object %s is of type %s. '
+ 'KeyConditionExpression only supports Attribute objects '
+ 'of type Key' % (value.name, type(value)))
+ return self._build_name_placeholder(
+ value, attribute_name_placeholders)
+ # If it is anything else, we treat it as a value and thus placeholders
+ # are needed for the value.
+ else:
+ return self._build_value_placeholder(
+ value, attribute_value_placeholders, has_grouped_values)
+
+ def _build_name_placeholder(self, value, attribute_name_placeholders):
+ attribute_name = value.name
+ # Figure out which parts of the attribute name that needs replacement.
+ attribute_name_parts = ATTR_NAME_REGEX.findall(attribute_name)
+
+ # Add a temporary placeholder for each of these parts.
+ placeholder_format = ATTR_NAME_REGEX.sub('%s', attribute_name)
+ str_format_args = []
+ for part in attribute_name_parts:
+ name_placeholder = self._get_name_placeholder()
+ self._name_count += 1
+ str_format_args.append(name_placeholder)
+ # Add the placeholder and value to dictionary of name placeholders.
+ attribute_name_placeholders[name_placeholder] = part
+ # Replace the temporary placeholders with the designated placeholders.
+ return placeholder_format % tuple(str_format_args)
+
+ def _build_value_placeholder(self, value, attribute_value_placeholders,
+ has_grouped_values=False):
+ # If the values are grouped, we need to add a placeholder for
+ # each element inside of the actual value.
+ if has_grouped_values:
+ placeholder_list = []
+ for v in value:
+ value_placeholder = self._get_value_placeholder()
+ self._value_count += 1
+ placeholder_list.append(value_placeholder)
+ attribute_value_placeholders[value_placeholder] = v
+ # Assuming the values are grouped by parenthesis.
+            # IN is currently the only one that uses this, so it may need
+            # to be changed in the future.
+ return '(' + ', '.join(placeholder_list) + ')'
+ # Otherwise, treat the value as a single value that needs only
+ # one placeholder.
+ else:
+ value_placeholder = self._get_value_placeholder()
+ self._value_count += 1
+ attribute_value_placeholders[value_placeholder] = value
+ return value_placeholder
diff --git a/boto3/exceptions.py b/boto3/exceptions.py
index 75d4a6334d..3e3e7c910e 100644
--- a/boto3/exceptions.py
+++ b/boto3/exceptions.py
@@ -31,3 +31,28 @@ class S3TransferFailedError(Exception):
class S3UploadFailedError(Exception):
pass
+
+
+class DynanmoDBOperationNotSupportedError(Exception):
+    """Raised for operations that are not supported for an operand."""
+ def __init__(self, operation, value):
+ msg = (
+ '%s operation cannot be applied to value %s of type %s directly. '
+ 'Must use AttributeBase object methods (i.e. Attr().eq()). to '
+            'Must use AttributeBase object methods (i.e. Attr().eq()) to '
+ (operation, value, type(value)))
+ Exception.__init__(self, msg)
+
+
+class DynamoDBNeedsConditionError(Exception):
+ """Raised when input is not a condition"""
+ def __init__(self, value):
+ msg = (
+ 'Expecting a ConditionBase object. Got %s of type %s. '
+            'Use AttributeBase object methods (i.e. Attr().eq()) to '
+ 'generate ConditionBase instances.' % (value, type(value)))
+ Exception.__init__(self, msg)
+
+
+class DynamoDBNeedsKeyConditionError(Exception):
+ pass
| diff --git a/tests/unit/dynamodb/test_conditions.py b/tests/unit/dynamodb/test_conditions.py
new file mode 100644
index 0000000000..f6e8c82913
--- /dev/null
+++ b/tests/unit/dynamodb/test_conditions.py
@@ -0,0 +1,473 @@
+# Copyright 2015 Amazon.com, Inc. or its affiliates. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License"). You
+# may not use this file except in compliance with the License. A copy of
+# the License is located at
+#
+# http://aws.amazon.com/apache2.0/
+#
+# or in the "license" file accompanying this file. This file is
+# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
+# ANY KIND, either express or implied. See the License for the specific
+# language governing permissions and limitations under the License.
+from tests import unittest
+
+from boto3.exceptions import DynanmoDBOperationNotSupportedError
+from boto3.exceptions import DynamoDBNeedsConditionError
+from boto3.exceptions import DynamoDBNeedsKeyConditionError
+from boto3.dynamodb.conditions import Attr, Key
+from boto3.dynamodb.conditions import And, Or, Not, Equals, LessThan
+from boto3.dynamodb.conditions import LessThanEquals, GreaterThan
+from boto3.dynamodb.conditions import GreaterThanEquals, BeginsWith, Between
+from boto3.dynamodb.conditions import NotEquals, In, AttributeExists
+from boto3.dynamodb.conditions import AttributeNotExists, Contains, Size
+from boto3.dynamodb.conditions import AttributeType
+from boto3.dynamodb.conditions import ConditionExpressionBuilder
+
+
+class TestK(unittest.TestCase):
+ def setUp(self):
+ self.attr = Key('mykey')
+ self.attr2 = Key('myotherkey')
+ self.value = 'foo'
+ self.value2 = 'foo2'
+
+ def test_and(self):
+ with self.assertRaisesRegexp(
+ DynanmoDBOperationNotSupportedError, 'AND'):
+ self.attr & self.attr2
+
+ def test_or(self):
+ with self.assertRaisesRegexp(
+ DynanmoDBOperationNotSupportedError, 'OR'):
+ self.attr | self.attr2
+
+ def test_not(self):
+ with self.assertRaisesRegexp(
+ DynanmoDBOperationNotSupportedError, 'NOT'):
+ ~self.attr
+
+ def test_eq(self):
+ self.assertEqual(
+ self.attr.eq(self.value), Equals(self.attr, self.value))
+
+ def test_lt(self):
+ self.assertEqual(
+ self.attr.lt(self.value), LessThan(self.attr, self.value))
+
+ def test_lte(self):
+ self.assertEqual(
+ self.attr.lte(self.value), LessThanEquals(self.attr, self.value))
+
+ def test_gt(self):
+ self.assertEqual(
+ self.attr.gt(self.value), GreaterThan(self.attr, self.value))
+
+ def test_gte(self):
+ self.assertEqual(
+ self.attr.gte(self.value),
+ GreaterThanEquals(self.attr, self.value))
+
+ def test_begins_with(self):
+ self.assertEqual(self.attr.begins_with(self.value),
+ BeginsWith(self.attr, self.value))
+
+ def test_between(self):
+ self.assertEqual(self.attr.between(self.value, self.value2),
+ Between(self.attr, self.value, self.value2))
+
+
+class TestA(TestK):
+ def setUp(self):
+ self.attr = Attr('mykey')
+ self.attr2 = Attr('myotherkey')
+ self.value = 'foo'
+ self.value2 = 'foo2'
+
+ def test_ne(self):
+ self.assertEqual(self.attr.ne(self.value),
+ NotEquals(self.attr, self.value))
+
+ def test_is_in(self):
+ self.assertEqual(self.attr.is_in([self.value]),
+ In(self.attr, [self.value]))
+
+ def test_exists(self):
+ self.assertEqual(self.attr.exists(), AttributeExists(self.attr))
+
+ def test_not_exists(self):
+ self.assertEqual(self.attr.not_exists(), AttributeNotExists(self.attr))
+
+ def test_contains(self):
+ self.assertEqual(self.attr.contains(self.value),
+ Contains(self.attr, self.value))
+
+ def test_size(self):
+ self.assertEqual(self.attr.size(), Size(self.attr))
+
+ def test_attribute_type(self):
+ self.assertEqual(self.attr.attribute_type(self.value),
+ AttributeType(self.attr, self.value))
+
+
+class TestConditions(unittest.TestCase):
+ def setUp(self):
+ self.value = Attr('mykey')
+ self.value2 = 'foo'
+
+ def build_and_assert_expression(self, condition,
+ reference_expression_dict):
+ expression_dict = condition.get_expression()
+ self.assertDictEqual(expression_dict, reference_expression_dict)
+
+ def test_equal_operator(self):
+ cond1 = Equals(self.value, self.value2)
+ cond2 = Equals(self.value, self.value2)
+ self.assertTrue(cond1 == cond2)
+
+ def test_equal_operator_type(self):
+ cond1 = Equals(self.value, self.value2)
+ cond2 = NotEquals(self.value, self.value2)
+ self.assertFalse(cond1 == cond2)
+
+ def test_equal_operator_value(self):
+ cond1 = Equals(self.value, self.value2)
+ cond2 = Equals(self.value, self.value)
+ self.assertFalse(cond1 == cond2)
+
+ def test_not_equal_operator(self):
+ cond1 = Equals(self.value, self.value2)
+ cond2 = NotEquals(self.value, self.value)
+ self.assertTrue(cond1 != cond2)
+
+ def test_and_operator(self):
+ cond1 = Equals(self.value, self.value2)
+ cond2 = Equals(self.value, self.value2)
+ self.assertEqual(cond1 & cond2, And(cond1, cond2))
+
+ def test_and_operator_throws_excepetion(self):
+ cond1 = Equals(self.value, self.value2)
+ with self.assertRaisesRegexp(
+ DynanmoDBOperationNotSupportedError, 'AND'):
+ cond1 & self.value2
+
+ def test_or_operator(self):
+ cond1 = Equals(self.value, self.value2)
+ cond2 = Equals(self.value, self.value2)
+ self.assertEqual(cond1 | cond2, Or(cond1, cond2))
+
+ def test_or_operator_throws_excepetion(self):
+ cond1 = Equals(self.value, self.value2)
+ with self.assertRaisesRegexp(
+ DynanmoDBOperationNotSupportedError, 'OR'):
+ cond1 | self.value2
+
+ def test_not_operator(self):
+ cond1 = Equals(self.value, self.value2)
+ self.assertEqual(~cond1, Not(cond1))
+
+ def test_eq(self):
+ self.build_and_assert_expression(
+ Equals(self.value, self.value2),
+ {'format': '{0} {operator} {1}',
+ 'operator': '=', 'values': (self.value, self.value2)})
+
+ def test_ne(self):
+ self.build_and_assert_expression(
+ NotEquals(self.value, self.value2),
+ {'format': '{0} {operator} {1}',
+ 'operator': '<>', 'values': (self.value, self.value2)})
+
+ def test_lt(self):
+ self.build_and_assert_expression(
+ LessThan(self.value, self.value2),
+ {'format': '{0} {operator} {1}',
+ 'operator': '<', 'values': (self.value, self.value2)})
+
+ def test_lte(self):
+ self.build_and_assert_expression(
+ LessThanEquals(self.value, self.value2),
+ {'format': '{0} {operator} {1}',
+ 'operator': '<=', 'values': (self.value, self.value2)})
+
+ def test_gt(self):
+ self.build_and_assert_expression(
+ GreaterThan(self.value, self.value2),
+ {'format': '{0} {operator} {1}',
+ 'operator': '>', 'values': (self.value, self.value2)})
+
+ def test_gte(self):
+ self.build_and_assert_expression(
+ GreaterThanEquals(self.value, self.value2),
+ {'format': '{0} {operator} {1}',
+ 'operator': '>=', 'values': (self.value, self.value2)})
+
+ def test_in(self):
+ cond = In(self.value, (self.value2))
+ self.build_and_assert_expression(
+ cond,
+ {'format': '{0} {operator} {1}',
+ 'operator': 'IN', 'values': (self.value, (self.value2))})
+ self.assertTrue(cond.has_grouped_values)
+
+ def test_bet(self):
+ self.build_and_assert_expression(
+ Between(self.value, self.value2, 'foo2'),
+ {'format': '{0} {operator} {1} AND {2}',
+ 'operator': 'BETWEEN',
+ 'values': (self.value, self.value2, 'foo2')})
+
+ def test_beg(self):
+ self.build_and_assert_expression(
+ BeginsWith(self.value, self.value2),
+ {'format': '{operator}({0}, {1})',
+ 'operator': 'begins_with', 'values': (self.value, self.value2)})
+
+ def test_cont(self):
+ self.build_and_assert_expression(
+ Contains(self.value, self.value2),
+ {'format': '{operator}({0}, {1})',
+ 'operator': 'contains', 'values': (self.value, self.value2)})
+
+ def test_ae(self):
+ self.build_and_assert_expression(
+ AttributeExists(self.value),
+ {'format': '{operator}({0})',
+ 'operator': 'attribute_exists', 'values': (self.value,)})
+
+ def test_ane(self):
+ self.build_and_assert_expression(
+ AttributeNotExists(self.value),
+ {'format': '{operator}({0})',
+ 'operator': 'attribute_not_exists', 'values': (self.value,)})
+
+ def test_size(self):
+ self.build_and_assert_expression(
+ Size(self.value),
+ {'format': '{operator}({0})',
+ 'operator': 'size', 'values': (self.value,)})
+
+ def test_size_can_use_attr_methods(self):
+ size = Size(self.value)
+ self.build_and_assert_expression(
+ size.eq(self.value),
+ {'format': '{0} {operator} {1}',
+ 'operator': '=', 'values': (size, self.value)})
+
+ def test_size_can_use_and(self):
+ size = Size(self.value)
+ ae = AttributeExists(self.value)
+ self.build_and_assert_expression(
+ size & ae,
+ {'format': '({0} {operator} {1})',
+ 'operator': 'AND', 'values': (size, ae)})
+
+ def test_attribute_type(self):
+ self.build_and_assert_expression(
+ AttributeType(self.value, self.value2),
+ {'format': '{operator}({0}, {1})',
+ 'operator': 'attribute_type',
+ 'values': (self.value, self.value2)})
+
+ def test_and(self):
+ cond1 = Equals(self.value, self.value2)
+ cond2 = Equals(self.value, self.value2)
+ and_cond = And(cond1, cond2)
+ self.build_and_assert_expression(
+ and_cond,
+ {'format': '({0} {operator} {1})',
+ 'operator': 'AND', 'values': (cond1, cond2)})
+
+ def test_or(self):
+ cond1 = Equals(self.value, self.value2)
+ cond2 = Equals(self.value, self.value2)
+ or_cond = Or(cond1, cond2)
+ self.build_and_assert_expression(
+ or_cond,
+ {'format': '({0} {operator} {1})',
+ 'operator': 'OR', 'values': (cond1, cond2)})
+
+ def test_not(self):
+ cond = Equals(self.value, self.value2)
+ not_cond = Not(cond)
+ self.build_and_assert_expression(
+ not_cond,
+ {'format': '({operator} {0})',
+ 'operator': 'NOT', 'values': (cond,)})
+
+
+class TestConditionExpressionBuilder(unittest.TestCase):
+ def setUp(self):
+ self.builder = ConditionExpressionBuilder()
+
+ def assert_condition_expression_build(
+ self, condition, ref_string, ref_names, ref_values,
+ is_key_condition=False):
+ exp_string, names, values = self.builder.build_expression(
+ condition, is_key_condition=is_key_condition)
+ self.assertEqual(exp_string, ref_string)
+ self.assertEqual(names, ref_names)
+ self.assertEqual(values, ref_values)
+
+ def test_bad_input(self):
+ a = Attr('myattr')
+ with self.assertRaises(DynamoDBNeedsConditionError):
+ self.builder.build_expression(a)
+
+ def test_build_expression_eq(self):
+ a = Attr('myattr')
+ self.assert_condition_expression_build(
+ a.eq('foo'), '#n0 = :v0', {'#n0': 'myattr'}, {':v0': 'foo'})
+
+ def test_reset(self):
+ a = Attr('myattr')
+ self.assert_condition_expression_build(
+ a.eq('foo'), '#n0 = :v0', {'#n0': 'myattr'}, {':v0': 'foo'})
+
+ self.assert_condition_expression_build(
+ a.eq('foo'), '#n1 = :v1', {'#n1': 'myattr'}, {':v1': 'foo'})
+
+ self.builder.reset()
+ self.assert_condition_expression_build(
+ a.eq('foo'), '#n0 = :v0', {'#n0': 'myattr'}, {':v0': 'foo'})
+
+ def test_build_expression_lt(self):
+ a = Attr('myattr')
+ self.assert_condition_expression_build(
+ a.lt('foo'), '#n0 < :v0', {'#n0': 'myattr'}, {':v0': 'foo'})
+
+ def test_build_expression_lte(self):
+ a1 = Attr('myattr')
+ self.assert_condition_expression_build(
+ a1.lte('foo'), '#n0 <= :v0', {'#n0': 'myattr'}, {':v0': 'foo'})
+
+ def test_build_expression_gt(self):
+ a = Attr('myattr')
+ self.assert_condition_expression_build(
+ a.gt('foo'), '#n0 > :v0', {'#n0': 'myattr'}, {':v0': 'foo'})
+
+ def test_build_expression_gte(self):
+ a = Attr('myattr')
+ self.assert_condition_expression_build(
+ a.gte('foo'), '#n0 >= :v0', {'#n0': 'myattr'}, {':v0': 'foo'})
+
+ def test_build_expression_begins_with(self):
+ a = Attr('myattr')
+ self.assert_condition_expression_build(
+ a.begins_with('foo'), 'begins_with(#n0, :v0)',
+ {'#n0': 'myattr'}, {':v0': 'foo'})
+
+ def test_build_expression_between(self):
+ a = Attr('myattr')
+ self.assert_condition_expression_build(
+ a.between('foo', 'foo2'), '#n0 BETWEEN :v0 AND :v1',
+ {'#n0': 'myattr'}, {':v0': 'foo', ':v1': 'foo2'})
+
+ def test_build_expression_ne(self):
+ a = Attr('myattr')
+ self.assert_condition_expression_build(
+ a.ne('foo'), '#n0 <> :v0', {'#n0': 'myattr'}, {':v0': 'foo'})
+
+ def test_build_expression_in(self):
+ a = Attr('myattr')
+ self.assert_condition_expression_build(
+ a.is_in([1, 2, 3]), '#n0 IN (:v0, :v1, :v2)',
+ {'#n0': 'myattr'}, {':v0': 1, ':v1': 2, ':v2': 3})
+
+ def test_build_expression_exists(self):
+ a = Attr('myattr')
+ self.assert_condition_expression_build(
+ a.exists(), 'attribute_exists(#n0)', {'#n0': 'myattr'}, {})
+
+ def test_build_expression_not_exists(self):
+ a = Attr('myattr')
+ self.assert_condition_expression_build(
+ a.not_exists(), 'attribute_not_exists(#n0)', {'#n0': 'myattr'}, {})
+
+ def test_build_contains(self):
+ a = Attr('myattr')
+ self.assert_condition_expression_build(
+ a.contains('foo'), 'contains(#n0, :v0)',
+ {'#n0': 'myattr'}, {':v0': 'foo'})
+
+ def test_build_size(self):
+ a = Attr('myattr')
+ self.assert_condition_expression_build(
+ a.size(), 'size(#n0)', {'#n0': 'myattr'}, {})
+
+ def test_build_size_with_other_conditons(self):
+ a = Attr('myattr')
+ self.assert_condition_expression_build(
+ a.size().eq(5), 'size(#n0) = :v0', {'#n0': 'myattr'}, {':v0': 5})
+
+ def test_build_attribute_type(self):
+ a = Attr('myattr')
+ self.assert_condition_expression_build(
+ a.attribute_type('foo'), 'attribute_type(#n0, :v0)',
+ {'#n0': 'myattr'}, {':v0': 'foo'})
+
+ def test_build_and(self):
+ a = Attr('myattr')
+ a2 = Attr('myattr2')
+ self.assert_condition_expression_build(
+ a.eq('foo') & a2.eq('bar'), '(#n0 = :v0 AND #n1 = :v1)',
+ {'#n0': 'myattr', '#n1': 'myattr2'}, {':v0': 'foo', ':v1': 'bar'})
+
+ def test_build_or(self):
+ a = Attr('myattr')
+ a2 = Attr('myattr2')
+ self.assert_condition_expression_build(
+ a.eq('foo') | a2.eq('bar'), '(#n0 = :v0 OR #n1 = :v1)',
+ {'#n0': 'myattr', '#n1': 'myattr2'}, {':v0': 'foo', ':v1': 'bar'})
+
+ def test_build_not(self):
+ a = Attr('myattr')
+ self.assert_condition_expression_build(
+ ~a.eq('foo'), '(NOT #n0 = :v0)',
+ {'#n0': 'myattr'}, {':v0': 'foo'})
+
+ def test_build_attribute_with_attr_value(self):
+ a = Attr('myattr')
+ value = Attr('myreference')
+ self.assert_condition_expression_build(
+ a.eq(value), '#n0 = #n1',
+ {'#n0': 'myattr', '#n1': 'myreference'}, {})
+
+ def test_build_with_is_key_condition(self):
+ k = Key('myattr')
+ self.assert_condition_expression_build(
+ k.eq('foo'), '#n0 = :v0',
+ {'#n0': 'myattr'}, {':v0': 'foo'}, is_key_condition=True)
+
+ def test_build_with_is_key_condition_throws_error(self):
+ a = Attr('myattr')
+ with self.assertRaises(DynamoDBNeedsKeyConditionError):
+ self.builder.build_expression(a.eq('foo'), is_key_condition=True)
+
+ def test_build_attr_map(self):
+ a = Attr('MyMap.MyKey')
+ self.assert_condition_expression_build(
+ a.eq('foo'), '#n0.#n1 = :v0', {'#n0': 'MyMap', '#n1': 'MyKey'},
+ {':v0': 'foo'})
+
+ def test_build_attr_list(self):
+ a = Attr('MyList[0]')
+ self.assert_condition_expression_build(
+ a.eq('foo'), '#n0[0] = :v0', {'#n0': 'MyList'}, {':v0': 'foo'})
+
+ def test_build_nested_attr_map_list(self):
+ a = Attr('MyMap.MyList[2].MyElement')
+ self.assert_condition_expression_build(
+ a.eq('foo'), '#n0.#n1[2].#n2 = :v0',
+ {'#n0': 'MyMap', '#n1': 'MyList', '#n2': 'MyElement'},
+ {':v0': 'foo'})
+
+ def test_build_double_nested_and_or(self):
+ a = Attr('myattr')
+ a2 = Attr('myattr2')
+ self.assert_condition_expression_build(
+ (a.eq('foo') & a2.eq('foo2')) | (a.eq('bar') & a2.eq('bar2')),
+ '((#n0 = :v0 AND #n1 = :v1) OR (#n2 = :v2 AND #n3 = :v3))',
+ {'#n0': 'myattr', '#n1': 'myattr2', '#n2': 'myattr',
+ '#n3': 'myattr2'},
+ {':v0': 'foo', ':v1': 'foo2', ':v2': 'bar', ':v3': 'bar2'})
| [
{
"components": [
{
"doc": "",
"lines": [
25,
59
],
"name": "ConditionBase",
"signature": "class ConditionBase(object):",
"type": "class"
},
{
"doc": "",
"lines": [
31,
32
],
... | [
"tests/unit/dynamodb/test_conditions.py::TestK::test_and",
"tests/unit/dynamodb/test_conditions.py::TestK::test_begins_with",
"tests/unit/dynamodb/test_conditions.py::TestK::test_between",
"tests/unit/dynamodb/test_conditions.py::TestK::test_eq",
"tests/unit/dynamodb/test_conditions.py::TestK::test_gt",
"... | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Add condition expression builder
This is an easy way to build condition expressions for operations like query, scan, etc. It helps build both the condition expression string and any placeholders that may be required for the attribute names or values. This can be applied whenever the DynamoDB shape is `ConditionExpression` or any similar shape like `KeyExpression`.
To get a sense of all of the different options, first check out the TestConditionExpressionBuilder tests to see the end output of using the builder with the various attribute objects and attribute methods.
cc @jamesls
----------
</request>
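As a rough sketch of the builder's end output, a condition such as `Attr('myattr').eq('foo')` renders to a placeholder expression plus two placeholder maps. The `Attr` and `build_expression` names below are toy stand-ins mirroring the shape of the patch, not the real boto3 API, and handle only a single equality condition:

```python
# Toy condition builder: turns one equality condition into an expression
# string with a '#n0' name placeholder and a ':v0' value placeholder,
# mirroring the ('#n0 = :v0', names, values) output described above.
class Attr:
    def __init__(self, name):
        self.name = name

    def eq(self, value):
        # Represent the condition as (operator, attribute, value).
        return ('=', self, value)

def build_expression(condition):
    op, attr, value = condition
    names = {'#n0': attr.name}    # attribute-name placeholders
    values = {':v0': value}       # attribute-value placeholders
    return '#n0 %s :v0' % op, names, values

expr, names, values = build_expression(Attr('myattr').eq('foo'))
print(expr)    # #n0 = :v0
print(names)   # {'#n0': 'myattr'}
print(values)  # {':v0': 'foo'}
```

The real builder recurses through nested `And`/`Or`/`Not` conditions and increments the placeholder counters, but each leaf is substituted in essentially this way.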
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in boto3/dynamodb/conditions.py]
(definition of ConditionBase:)
class ConditionBase(object):
(definition of ConditionBase.__init__:)
def __init__(self, *values):
(definition of ConditionBase.__and__:)
def __and__(self, other):
(definition of ConditionBase.__or__:)
def __or__(self, other):
(definition of ConditionBase.__invert__:)
def __invert__(self):
(definition of ConditionBase.get_expression:)
def get_expression(self):
(definition of ConditionBase.__eq__:)
def __eq__(self, other):
(definition of ConditionBase.__ne__:)
def __ne__(self, other):
(definition of AttributeBase:)
class AttributeBase(object):
(definition of AttributeBase.__init__:)
def __init__(self, name):
(definition of AttributeBase.__and__:)
def __and__(self, value):
(definition of AttributeBase.__or__:)
def __or__(self, value):
(definition of AttributeBase.__invert__:)
def __invert__(self):
(definition of AttributeBase.eq:)
def eq(self, value):
"""Creates a condtion where the attribute is equal to the value.
:param value: The value that the attribute is equal to."""
(definition of AttributeBase.lt:)
def lt(self, value):
"""Creates a condtion where the attribute is less than the value.
:param value: The value that the attribute is less than."""
(definition of AttributeBase.lte:)
def lte(self, value):
"""Creates a condtion where the attribute is less than or equal to the
value.
:param value: The value that the attribute is less than or equal to."""
(definition of AttributeBase.gt:)
def gt(self, value):
"""Creates a condtion where the attribute is greater than the value.
:param value: The value that the attribute is greater than."""
(definition of AttributeBase.gte:)
def gte(self, value):
"""Creates a condtion where the attribute is greater than or equal to
the value.
:param value: The value that the attribute is greater than or equal to."""
(definition of AttributeBase.begins_with:)
def begins_with(self, value):
"""Creates a condtion where the attribute begins with the value.
:param value: The value that the attribute begins with."""
(definition of AttributeBase.between:)
def between(self, low_value, high_value):
"""Creates a condtion where the attribute is between the low value and
the high value.
:param low_value: The value that the attribute is greater than.
:param high_value: The value that the attribute is less than."""
(definition of ConditionAttributeBase:)
class ConditionAttributeBase(ConditionBase, AttributeBase):
"""This base class is for conditions that can have attribute methods.
One example is the Size condition. To complete a condition, you need
to apply another AttributeBase method like eq()."""
(definition of ConditionAttributeBase.__init__:)
def __init__(self, *values):
(definition of ComparisonCondition:)
class ComparisonCondition(ConditionBase):
(definition of Equals:)
class Equals(ComparisonCondition):
(definition of NotEquals:)
class NotEquals(ComparisonCondition):
(definition of LessThan:)
class LessThan(ComparisonCondition):
(definition of LessThanEquals:)
class LessThanEquals(ComparisonCondition):
(definition of GreaterThan:)
class GreaterThan(ComparisonCondition):
(definition of GreaterThanEquals:)
class GreaterThanEquals(ComparisonCondition):
(definition of In:)
class In(ComparisonCondition):
(definition of Between:)
class Between(ConditionBase):
(definition of BeginsWith:)
class BeginsWith(ConditionBase):
(definition of Contains:)
class Contains(ConditionBase):
(definition of Size:)
class Size(ConditionAttributeBase):
(definition of AttributeType:)
class AttributeType(ConditionBase):
(definition of AttributeExists:)
class AttributeExists(ConditionBase):
(definition of AttributeNotExists:)
class AttributeNotExists(ConditionBase):
(definition of And:)
class And(ConditionBase):
(definition of Or:)
class Or(ConditionBase):
(definition of Not:)
class Not(ConditionBase):
(definition of Key:)
class Key(AttributeBase):
(definition of Attr:)
class Attr(AttributeBase):
"""Represents an DynamoDB item's attribute."""
(definition of Attr.ne:)
def ne(self, value):
"""Creates a condtion where the attribute is not equal to the value
:param value: The value that the attribute is not equal to."""
(definition of Attr.is_in:)
def is_in(self, value):
"""Creates a condtion where the attribute is in the value,
:type value: list
:param value: The value that the attribute is in."""
(definition of Attr.exists:)
def exists(self):
"""Creates a condtion where the attribute exists."""
(definition of Attr.not_exists:)
def not_exists(self):
"""Creates a condtion where the attribute does not exist."""
(definition of Attr.contains:)
def contains(self, value):
"""Creates a condition where the attribute contains the value.
:param value: The value the attribute contains."""
(definition of Attr.size:)
def size(self):
"""Creates a condition for the attribute size.
Note another AttributeBase method must be called on the returned
size condition to be a valid DynamoDB condition."""
(definition of Attr.attribute_type:)
def attribute_type(self, value):
"""Creates a condition for the attribute type.
:param value: The type of the attribute."""
(definition of ConditionExpressionBuilder:)
class ConditionExpressionBuilder(object):
"""This class is used to build condition expressions with placeholders"""
(definition of ConditionExpressionBuilder.__init__:)
def __init__(self):
(definition of ConditionExpressionBuilder._get_name_placeholder:)
def _get_name_placeholder(self):
(definition of ConditionExpressionBuilder._get_value_placeholder:)
def _get_value_placeholder(self):
(definition of ConditionExpressionBuilder.reset:)
def reset(self):
"""Resets the placeholder name and values"""
(definition of ConditionExpressionBuilder.build_expression:)
def build_expression(self, condition, is_key_condition=False):
"""Builds the condition expression and the dictionary of placeholders.
:type condition: ConditionBase
:param condition: A condition to be built into a condition expression
string with any necessary placeholders.
:type is_key_condition: Boolean
:param is_key_condition: True if the expression is for a
KeyConditionExpression. False otherwise.
:rtype: (string, dict, dict)
:returns: Will return a string representing the condition with
placeholders inserted where necessary, a dictionary of
placeholders for attribute names, and a dictionary of
placeholders for attribute values. Here is a sample return value:
('#n0 = :v0', {'#n0': 'myattribute'}, {':v1': 'myvalue'})"""
(definition of ConditionExpressionBuilder._build_expression:)
def _build_expression(self, condition, attribute_name_placeholders, attribute_value_placeholders, is_key_condition):
(definition of ConditionExpressionBuilder._build_expression_component:)
def _build_expression_component(self, value, attribute_name_placeholders, attribute_value_placeholders, has_grouped_values, is_key_condition):
(definition of ConditionExpressionBuilder._build_name_placeholder:)
def _build_name_placeholder(self, value, attribute_name_placeholders):
(definition of ConditionExpressionBuilder._build_value_placeholder:)
def _build_value_placeholder(self, value, attribute_value_placeholders, has_grouped_values=False):
[end of new definitions in boto3/dynamodb/conditions.py]
[start of new definitions in boto3/exceptions.py]
(definition of DynanmoDBOperationNotSupportedError:)
class DynanmoDBOperationNotSupportedError(Exception):
"""Raised for operantions that are not supported for an operand"""
(definition of DynanmoDBOperationNotSupportedError.__init__:)
def __init__(self, operation, value):
(definition of DynamoDBNeedsConditionError:)
class DynamoDBNeedsConditionError(Exception):
"""Raised when input is not a condition"""
(definition of DynamoDBNeedsConditionError.__init__:)
def __init__(self, value):
(definition of DynamoDBNeedsKeyConditionError:)
class DynamoDBNeedsKeyConditionError(Exception):
[end of new definitions in boto3/exceptions.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 196a2da7490a1a661a0103b8770bd31e34e147f2 | ||
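The requested builder can be illustrated with a minimal, self-contained sketch. The class names (`Attr`, `Equals`, `And`, `ConditionExpressionBuilder`) follow the definitions above, but the internals here — the `expression_format` strings, the placeholder counters, and the `_build` recursion — are illustrative assumptions, not the actual boto3 implementation:

```python
class ConditionBase:
    # Sketch only: subclasses fill in a format string and an operator.
    expression_format = ''
    expression_operator = ''

    def __init__(self, *values):
        self.values = values

    def __and__(self, other):
        return And(self, other)


class Attr:
    """A named attribute; eq() produces an Equals condition."""
    def __init__(self, name):
        self.name = name

    def eq(self, value):
        return Equals(self, value)


class Equals(ConditionBase):
    expression_format = '{0} {operator} {1}'
    expression_operator = '='


class And(ConditionBase):
    expression_format = '({0} {operator} {1})'
    expression_operator = 'AND'


class ConditionExpressionBuilder:
    """Walks a condition tree, swapping names/values for placeholders."""
    def __init__(self):
        self._name_count = 0
        self._value_count = 0

    def build_expression(self, condition):
        names, values = {}, {}
        expression = self._build(condition, names, values)
        return expression, names, values

    def _build(self, condition, names, values):
        parts = []
        for value in condition.values:
            if isinstance(value, ConditionBase):
                # Nested condition: recurse and splice in its expression.
                parts.append(self._build(value, names, values))
            elif isinstance(value, Attr):
                placeholder = '#n%d' % self._name_count
                self._name_count += 1
                names[placeholder] = value.name
                parts.append(placeholder)
            else:
                placeholder = ':v%d' % self._value_count
                self._value_count += 1
                values[placeholder] = value
                parts.append(placeholder)
        return condition.expression_format.format(
            *parts, operator=condition.expression_operator)
```

With this sketch, `ConditionExpressionBuilder().build_expression(Attr('myattr').eq('foo') & Attr('myattr2').eq('foo2'))` produces the placeholder-substituted string and the two placeholder dictionaries in the shape the tests above expect.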
joke2k__faker-234 | 234 | joke2k/faker | null | 6c389615fa5975ff9458e67510bb89ef6c6e05ff | 2015-04-26T15:44:08Z | diff --git a/faker/providers/address/ja_JP/__init__.py b/faker/providers/address/ja_JP/__init__.py
index ec12a6884c..f268df50e3 100644
--- a/faker/providers/address/ja_JP/__init__.py
+++ b/faker/providers/address/ja_JP/__init__.py
@@ -1,9 +1,17 @@
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
+import random
from .. import Provider as AddressProvider
class Provider(AddressProvider):
+ address_formats = (
+ '{{prefecture}}{{city}}{{town}}{{chome}}{{ban}}{{gou}}',
+ '{{prefecture}}{{city}}{{town}}{{chome}}{{ban}}{{gou}} {{town}}{{building_name}}{{building_number}}',
+ '{{prefecture}}{{city}}{{town}}{{chome}}{{ban}}{{gou}} {{building_name}}{{town}}{{building_number}}', )
+
+ building_number_formats = ('###', )
+
countries = (
'アフガニスタン', 'アルバニア', 'アルジェリア', 'アメリカ領サモア', 'アンドラ', 'アンゴラ', 'アンギラ', '南極大陸', 'アンティグアバーブーダ', 'アルゼンチン', 'アルメニア', 'アルバ', 'オーストラリア', 'オーストリア', 'アゼルバイジャン',
'バハマ', 'バーレーン', 'バングラデシュ', 'バルバドス', 'ベラルーシ', 'ベルギー', 'ベリーズ', 'ベナン', 'バミューダ島', 'ブータン', 'ボリビア', 'ボスニア・ヘルツェゴビナ', 'ボツワナ', 'ブーベ島', 'ブラジル', 'イギリス領インド洋地域', 'イギリス領ヴァージン諸島', 'ブルネイ', 'ブルガリア', 'ブルキナファソ', 'ブルンジ',
@@ -31,3 +39,105 @@ class Provider(AddressProvider):
'イエメン',
'ザンビア', 'ジンバブエ'
)
+
+ prefectures = (
+ '北海道', '青森県', '岩手県', '宮城県', '秋田県', '山形県', '福島県', '茨城県', '栃木県', '群馬県',
+ '埼玉県', '千葉県', '東京都', '神奈川県', '新潟県', '富山県', '石川県', '福井県', '山梨県', '長野県',
+ '岐阜県', '静岡県', '愛知県', '三重県', '滋賀県', '京都府', '大阪府', '兵庫県', '奈良県', '和歌山県',
+ '鳥取県', '島根県', '岡山県', '広島県', '山口県', '徳島県', '香川県', '愛媛県', '高知県', '福岡県',
+ '佐賀県', '長崎県', '熊本県', '大分県', '宮崎県', '鹿児島県', '沖縄県'
+ )
+
+ cities = (
+ '八千代市', '我孫子市', '鴨川市', '鎌ケ谷市', '君津市', '富津市', '浦安市', '四街道市', '袖ケ浦市',
+ '八街市', '印西市', '白井市', '富里市', '南房総市', '匝瑳市', '香取市', '山武市', 'いすみ市', '大網白里市',
+ '印旛郡酒々井町', '印旛郡印旛村', '印旛郡本埜村', '印旛郡栄町', '香取郡神崎町', '香取郡多古町', '香取郡東庄町',
+ '山武郡九十九里町', '山武郡芝山町', '山武郡横芝光町', '長生郡一宮町', '長生郡睦沢町', '長生郡長生村',
+ '長生郡白子町', '長生郡長柄町', '長生郡長南町', '夷隅郡大多喜町', '夷隅郡御宿町', '安房郡鋸南町', '千代田区',
+ '中央区', '港区', '新宿区', '文京区', '台東区', '墨田区', '江東区', '品川区', '目黒区', '大田区',
+ '世田谷区', '渋谷区', '中野区', '杉並区', '豊島区', '北区', '荒川区', '板橋区', '練馬区', '足立区',
+ '葛飾区', '江戸川区', '八王子市', '立川市', '武蔵野市', '三鷹市', '青梅市', '府中市', '昭島市', '調布市',
+ '町田市', '小金井市', '小平市', '日野市', '東村山市', '国分寺市', '国立市', '福生市', '狛江市', '東大和市',
+ '清瀬市', '東久留米市', '武蔵村山市', '多摩市', '稲城市', '羽村市', 'あきる野市', '西東京市', '西多摩郡瑞穂町',
+ '西多摩郡日の出町', '西多摩郡檜原村', '西多摩郡奥多摩町', '大島町', '利島村', '新島村', '神津島村', '三宅島三宅村',
+ '御蔵島村', '八丈島八丈町', '青ヶ島村', '小笠原村', '横浜市鶴見区', '横浜市神奈川区', '横浜市西区', '横浜市中区',
+ '横浜市南区', '横浜市保土ケ谷区', '横浜市磯子区', '横浜市金沢区', '横浜市港北区', '横浜市戸塚区', '横浜市港南区',
+ '横浜市旭区', '横浜市緑区', '横浜市瀬谷区', '横浜市栄区', '横浜市泉区', '横浜市青葉区', '横浜市都筑区',
+ '川崎市川崎区', '川崎市幸区', '川崎市中原区', '川崎市高津区', '川崎市多摩区', '川崎市宮前区'
+ )
+
+ towns = (
+ '丹勢', '中宮祠', '手岡', '東和町', '所野', '土沢', '独鈷沢', '轟', '土呂部', '中小来川', '長畑', '中鉢石町',
+ '中三依', '西小来川', '西川', '日光', '東三島', '東大和町', '蟇沼', '二つ室', '方京', '細竹', '前弥六',
+ '前弥六南町', '松浦町', '南赤田', '南郷屋', '美原町', '無栗屋', '睦', '百村', '箭坪', '山中新田', '油井',
+ '湯宮', '豊町', '湯本塩原', '横林', '四区町', '渡辺', '氏家', '氏家新田', '卯の里', '小入', '大中', '押上',
+ '柿木沢', '柿木沢新田', '鍛冶ケ沢', '上高野', '上吉羽', '木立', '権現堂', '幸手', '下宇和田', '下吉羽', '神明内',
+ '外国府間', '千塚', '天神島', '戸島', '中川崎', '長間', '西関宿', '花島', '平須賀', '細野', '松石', '太田ヶ谷',
+ '上広谷', '五味ヶ谷', '脚折', '脚折町', '鶴ヶ丘', '羽折町', '藤金', '九段南', '皇居外苑', '麹町', '猿楽町',
+ '外神田', '西神田', '隼町', '東神田', '一ツ橋', '日比谷公園', '平河町', '丸の内', '丸の内JPタワー', '四番町',
+ '六番町', '明石町', '勝どき', '京橋', '月島', '北青山', '港南', '芝浦', '芝公園', '芝大門', '白金', '白金台',
+ '台場', '高輪', '虎ノ門', '虎ノ門虎ノ門ヒルズ森タワー', '大京町', '高田馬場', '箪笥町', '津久戸町', '筑土八幡町',
+ '戸塚町', '富久町', '戸山', '秋葉原', '浅草', '浅草橋', '池之端', '今戸', '入谷', '上野公園', '上野桜木',
+ '雷門', '北上野', '蔵前', '千束', '台東', '鳥越', '西浅草', '日本堤', '橋場', '花川戸', '東浅草', '東上野',
+ '松が谷', '三筋', '三ノ輪', '元浅草', '竜泉', '吾妻橋'
+ )
+
+ building_names = (
+ 'パレス', 'ハイツ', 'コーポ', 'アーバン', 'クレスト' , 'パーク', 'シティ' , 'シャルム', 'コート'
+ )
+
+ @classmethod
+ def prefecture(cls):
+ """
+ :example '東京都'
+ """
+ return cls.random_element(cls.prefectures)
+
+ @classmethod
+ def city(cls):
+ """
+ :example '台東区'
+ """
+ return cls.random_element(cls.cities)
+
+ @classmethod
+ def town(cls):
+ """
+ :example '浅草'
+ """
+ return cls.random_element(cls.towns)
+
+ @classmethod
+ def chome(cls):
+ """
+ :example '1丁目'
+ """
+ return "%d丁目" % random.randint(1,42)
+
+ @classmethod
+ def ban(cls):
+ """
+ :example '3番'
+ """
+ return "%d番" % random.randint(1,27)
+
+ @classmethod
+ def gou(cls):
+ """
+ :example '10号'
+ """
+ return "%d号" % random.randint(1,20)
+
+ @classmethod
+ def building_name(cls):
+ """
+ :example 'コーポ芝浦'
+ """
+ return cls.random_element(cls.building_names)
+
+ @classmethod
+ def zipcode(cls):
+ """
+ :example '101-1212'
+ """
+ return "%03d-%04d" % (random.randint(0,999), random.randint(0,9999))
| diff --git a/faker/tests/ja_JP/__init__.py b/faker/tests/ja_JP/__init__.py
index b63fcb2620..c0bd16e652 100644
--- a/faker/tests/ja_JP/__init__.py
+++ b/faker/tests/ja_JP/__init__.py
@@ -3,6 +3,7 @@
from __future__ import unicode_literals
import unittest
+import re
from faker import Factory
from .. import string_types
@@ -15,12 +16,59 @@ def setUp(self):
def test_ja_JP_address(self):
from faker.providers.address.ja_JP import Provider
countries = Provider.countries
-
- country = self.factory.country()
+ country = self.factory.country()
assert country
assert isinstance(country, string_types)
assert country in countries
+ prefectures = Provider.prefectures
+ prefecture = self.factory.prefecture()
+ assert prefecture
+ assert isinstance(prefecture, string_types)
+ assert prefecture in prefectures
+
+ cities = Provider.cities
+ city = self.factory.city()
+ assert city
+ assert isinstance(city, string_types)
+ assert city in cities
+
+ towns = Provider.towns
+ town = self.factory.town()
+ assert town
+ assert isinstance(town, string_types)
+ assert town in towns
+
+ chome = self.factory.chome()
+ assert chome
+ assert isinstance(chome, string_types)
+ assert re.match("\d{1,2}丁目", chome)
+
+ ban = self.factory.ban()
+ assert ban
+ assert isinstance(ban, string_types)
+ assert re.match("\d{1,2}番", ban)
+
+ gou = self.factory.gou()
+ assert gou
+ assert isinstance(gou, string_types)
+ assert re.match("\d{1,2}号", gou)
+
+ building_names = Provider.building_names
+ building_name = self.factory.building_name()
+ assert building_name
+ assert isinstance(building_name, string_types)
+ assert building_name in building_names
+
+ zipcode = self.factory.zipcode()
+ assert zipcode
+ assert isinstance(zipcode, string_types)
+ assert re.match("\d{3}-\d{4}", zipcode)
+
+ address = self.factory.address()
+ assert address
+ assert isinstance(address, string_types)
+
def test_ja_JP_company(self):
from faker.providers.company.ja_JP import Provider
prefixes = Provider.company_prefixes
| [
{
"components": [
{
"doc": ":example '東京都'",
"lines": [
90,
94
],
"name": "Provider.prefecture",
"signature": "def prefecture(cls):",
"type": "function"
},
{
"doc": ":example '台東区'",
"lines": [
97,
... | [
"faker/tests/ja_JP/__init__.py::ja_JP_FactoryTestCase::test_ja_JP_address"
] | [
"faker/tests/ja_JP/__init__.py::ja_JP_FactoryTestCase::test_ja_JP_company",
"faker/tests/ja_JP/__init__.py::ja_JP_FactoryTestCase::test_ja_JP_person",
"faker/tests/ja_JP/__init__.py::ja_JP_FactoryTestCase::test_ja_JP_phone_number"
] | This is a feature request which requires a new feature to be added to the code repository.
<<NEW FEATURE REQUEST>>
<request>
Update ja_JP address.
Changed address formats.
Added the following method:
- prefecture, city, town, chome, ban, gou, building_name, zipcode
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in faker/providers/address/ja_JP/__init__.py]
(definition of Provider.prefecture:)
def prefecture(cls):
""":example '東京都'"""
(definition of Provider.city:)
def city(cls):
""":example '台東区'"""
(definition of Provider.town:)
def town(cls):
""":example '浅草'"""
(definition of Provider.chome:)
def chome(cls):
""":example '1丁目'"""
(definition of Provider.ban:)
def ban(cls):
""":example '3番'"""
(definition of Provider.gou:)
def gou(cls):
""":example '10号'"""
(definition of Provider.building_name:)
def building_name(cls):
""":example 'コーポ芝浦'"""
(definition of Provider.zipcode:)
def zipcode(cls):
""":example '101-1212'"""
[end of new definitions in faker/providers/address/ja_JP/__init__.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 6edfdbf6ae90b0153309e3bf066aa3b2d16494a7 | ||
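The number-based pieces of the new ja_JP address methods (`chome`, `ban`, `gou`, `zipcode`) can be sketched standalone; the functions below are a hypothetical reimplementation of just the formatting logic from the patch, outside the provider class:

```python
import random
import re


def chome():
    # e.g. '1丁目' — block number, 1..42 as in the patch
    return "%d丁目" % random.randint(1, 42)


def ban():
    # e.g. '3番'
    return "%d番" % random.randint(1, 27)


def gou():
    # e.g. '10号'
    return "%d号" % random.randint(1, 20)


def zipcode():
    # Zero-padded 3-4 digit groups, e.g. '101-1212'
    return "%03d-%04d" % (random.randint(0, 999), random.randint(0, 9999))
```

Concatenating the three block parts, e.g. `chome() + ban() + gou()`, yields strings such as `3丁目12番7号`, matching the `{{chome}}{{ban}}{{gou}}` tail of the new `address_formats`.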
boto__boto3-94 | 94 | boto/boto3 | null | cf2c2452eb0d8de58babf7e1db01a0f7e1e0a22c | 2015-04-17T20:21:53Z | diff --git a/boto3/dynamodb/__init__.py b/boto3/dynamodb/__init__.py
new file mode 100644
index 0000000000..c89416d7a5
--- /dev/null
+++ b/boto3/dynamodb/__init__.py
@@ -0,0 +1,12 @@
+# Copyright 2015 Amazon.com, Inc. or its affiliates. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License"). You
+# may not use this file except in compliance with the License. A copy of
+# the License is located at
+#
+# http://aws.amazon.com/apache2.0/
+#
+# or in the "license" file accompanying this file. This file is
+# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
+# ANY KIND, either express or implied. See the License for the specific
+# language governing permissions and limitations under the License.
diff --git a/boto3/dynamodb/types.py b/boto3/dynamodb/types.py
new file mode 100644
index 0000000000..36b9c512a3
--- /dev/null
+++ b/boto3/dynamodb/types.py
@@ -0,0 +1,297 @@
+# Copyright 2015 Amazon.com, Inc. or its affiliates. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License"). You
+# may not use this file except in compliance with the License. A copy of
+# the License is located at
+#
+# http://aws.amazon.com/apache2.0/
+#
+# or in the "license" file accompanying this file. This file is
+# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
+# ANY KIND, either express or implied. See the License for the specific
+# language governing permissions and limitations under the License.
+from collections import Mapping, Set
+from decimal import Decimal, Context, Clamped
+from decimal import Overflow, Inexact, Underflow, Rounded
+
+from botocore.compat import six
+
+
+STRING = 'S'
+NUMBER = 'N'
+BINARY = 'B'
+STRING_SET = 'SS'
+NUMBER_SET = 'NS'
+BINARY_SET = 'BS'
+NULL = 'NULL'
+BOOLEAN = 'BOOL'
+MAP = 'M'
+LIST = 'L'
+
+
+DYNAMODB_CONTEXT = Context(
+ Emin=-128, Emax=126, rounding=None, prec=38,
+ traps=[Clamped, Overflow, Inexact, Rounded, Underflow])
+
+
+BINARY_TYPES = (bytearray, six.binary_type)
+
+
+class Binary(object):
+ """A class for representing Binary in dynamodb
+
+ Especially for Python 2, use this class to explicitly specify
+ binary data for item in DynamoDB. It is essentially a wrapper around
+ binary. Unicode and Python 3 string types are not allowed.
+ """
+ def __init__(self, value):
+ if not isinstance(value, BINARY_TYPES):
+ raise TypeError('Value must be of the following types: %s.' %
+ ', '.join([str(t) for t in BINARY_TYPES]))
+ self.value = value
+
+ def __eq__(self, other):
+ if isinstance(other, Binary):
+ return self.value == other.value
+ return self.value == other
+
+ def __ne__(self, other):
+ return not self.__eq__(other)
+
+ def __repr__(self):
+ return 'Binary(%r)' % self.value
+
+ def __str__(self):
+ return self.value
+
+ def __hash__(self):
+ return hash(self.value)
+
+
+class TypeSerializer(object):
+ """This class serializes Python data types to DynamoDB types."""
+ def serialize(self, value):
+ """The method to serialize the Python data types.
+
+ :param value: A python value to be serialized to DynamoDB. Here are
+ the various conversions:
+
+ Python DynamoDB
+ ------ --------
+ None {'NULL': True}
+ True/False {'BOOL': True/False}
+ int/Decimal {'N': str(value)}
+ string {'S': string}
+ Binary/bytearray/bytes (py3 only) {'B': bytes}
+ set([int/Decimal]) {'NS': [str(value)]}
+ set([string]) {'SS': [string])
+ set([Binary/bytearray/bytes]) {'BS': [bytes]}
+ list {'L': list}
+ dict {'M': dict}
+
+ For types that involve numbers, it is recommended that ``Decimal``
+ objects are used to be able to round-trip the Python type.
+ For types that involve binary, it is recommended that ``Binary``
+ objects are used to be able to round-trip the Python type.
+
+ :rtype: dict
+ :returns: A dictionary that represents a dynamoDB data type. These
+ dictionaries can be directly passed to botocore methods.
+ """
+ dynamodb_type = self._get_dynamodb_type(value)
+ serializer = getattr(self, '_serialize_%s' % dynamodb_type.lower())
+ return {dynamodb_type: serializer(value)}
+
+ def _get_dynamodb_type(self, value):
+ dynamodb_type = None
+
+ if self._is_null(value):
+ dynamodb_type = NULL
+
+ elif self._is_boolean(value):
+ dynamodb_type = BOOLEAN
+
+ elif self._is_number(value):
+ dynamodb_type = NUMBER
+
+ elif self._is_string(value):
+ dynamodb_type = STRING
+
+ elif self._is_binary(value):
+ dynamodb_type = BINARY
+
+ elif self._is_type_set(value, self._is_number):
+ dynamodb_type = NUMBER_SET
+
+ elif self._is_type_set(value, self._is_string):
+ dynamodb_type = STRING_SET
+
+ elif self._is_type_set(value, self._is_binary):
+ dynamodb_type = BINARY_SET
+
+ elif self._is_map(value):
+ dynamodb_type = MAP
+
+ elif self._is_list(value):
+ dynamodb_type = LIST
+
+ else:
+ msg = 'Unsupported type "%s" for value "%s"' % (type(value), value)
+ raise TypeError(msg)
+
+ return dynamodb_type
+
+ def _is_null(self, value):
+ if value is None:
+ return True
+ return False
+
+ def _is_boolean(self, value):
+ if isinstance(value, bool):
+ return True
+ return False
+
+ def _is_number(self, value):
+ if isinstance(value, (six.integer_types, Decimal)):
+ return True
+ elif isinstance(value, float):
+ raise TypeError(
+ 'Float types are not supported. Use Decimal types instead.')
+ return False
+
+ def _is_string(self, value):
+ if isinstance(value, six.string_types):
+ return True
+ return False
+
+ def _is_binary(self, value):
+ if isinstance(value, Binary):
+ return True
+ elif isinstance(value, bytearray):
+ return True
+ elif six.PY3 and isinstance(value, six.binary_type):
+ return True
+ return False
+
+ def _is_set(self, value):
+ if isinstance(value, Set):
+ return True
+ return False
+
+ def _is_type_set(self, value, type_validator):
+ if self._is_set(value):
+ if False not in map(type_validator, value):
+ return True
+ return False
+
+ def _is_map(self, value):
+ if isinstance(value, Mapping):
+ return True
+ return False
+
+ def _is_list(self, value):
+ if isinstance(value, list):
+ return True
+ return False
+
+ def _serialize_null(self, value):
+ return True
+
+ def _serialize_bool(self, value):
+ return value
+
+ def _serialize_n(self, value):
+ number = str(DYNAMODB_CONTEXT.create_decimal(value))
+ if number in ['Infinity', 'NaN']:
+ raise TypeError('Infinity and NaN not supported')
+ return number
+
+ def _serialize_s(self, value):
+ return value
+
+ def _serialize_b(self, value):
+ if isinstance(value, Binary):
+ value = value.value
+ return value
+
+ def _serialize_ss(self, value):
+ return [self._serialize_s(s) for s in value]
+
+ def _serialize_ns(self, value):
+ return [self._serialize_n(n) for n in value]
+
+ def _serialize_bs(self, value):
+ return [self._serialize_b(b) for b in value]
+
+ def _serialize_l(self, value):
+ return [self.serialize(v) for v in value]
+
+ def _serialize_m(self, value):
+ return dict([(k, self.serialize(v)) for k, v in value.items()])
+
+
+class TypeDeserializer(object):
+ """This class deserializes DynamoDB types to Python types."""
+ def deserialize(self, value):
+ """The method to deserialize the DynamoDB data types.
+
+ :param value: A DynamoDB value to be deserialized to a pythonic value.
+ Here are the various conversions:
+
+ DynamoDB Python
+ -------- ------
+ {'NULL': True} None
+ {'BOOL': True/False} True/False
+ {'N': str(value)} Decimal(str(value))
+ {'S': string} string
+ {'B': bytes} Binary(bytes)
+ {'NS': [str(value)]} set([Decimal(str(value))])
+ {'SS': [string]} set([string])
+ {'BS': [bytes]} set([bytes])
+ {'L': list} list
+ {'M': dict} dict
+
+ :returns: The pythonic value of the DynamoDB type.
+ """
+
+ if not value:
+ raise TypeError('Value must be a nonempty dictionary whose key '
+ 'is a valid dynamodb type.')
+ dynamodb_type = list(value.keys())[0]
+ try:
+ deserializer = getattr(
+ self, '_deserialize_%s' % dynamodb_type.lower())
+ except AttributeError:
+ raise TypeError(
+ 'Dynamodb type %s is not supported' % dynamodb_type)
+ return deserializer(value[dynamodb_type])
+
+ def _deserialize_null(self, value):
+ return None
+
+ def _deserialize_bool(self, value):
+ return value
+
+ def _deserialize_n(self, value):
+ return DYNAMODB_CONTEXT.create_decimal(value)
+
+ def _deserialize_s(self, value):
+ return value
+
+ def _deserialize_b(self, value):
+ return Binary(value)
+
+ def _deserialize_ns(self, value):
+ return set(map(self._deserialize_n, value))
+
+ def _deserialize_ss(self, value):
+ return set(map(self._deserialize_s, value))
+
+ def _deserialize_bs(self, value):
+ return set(map(self._deserialize_b, value))
+
+ def _deserialize_l(self, value):
+ return [self.deserialize(v) for v in value]
+
+ def _deserialize_m(self, value):
+ return dict([(k, self.deserialize(v)) for k, v in value.items()])
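The conversion table in the `TypeSerializer` docstring above can be illustrated with a compact, Python 3-only sketch. This standalone `serialize` function is an assumption-laden simplification (no sets, no binary, no `six` compatibility layer), covering only a subset of the types the real class handles:

```python
from decimal import Decimal


def serialize(value):
    """Simplified sketch of the Python -> DynamoDB type conversions."""
    # bool must be checked before int: bool is a subclass of int.
    if value is None:
        return {'NULL': True}
    if isinstance(value, bool):
        return {'BOOL': value}
    if isinstance(value, (int, Decimal)):
        return {'N': str(value)}
    if isinstance(value, float):
        raise TypeError('Float types are not supported. Use Decimal instead.')
    if isinstance(value, str):
        return {'S': value}
    if isinstance(value, list):
        return {'L': [serialize(v) for v in value]}
    if isinstance(value, dict):
        return {'M': {k: serialize(v) for k, v in value.items()}}
    raise TypeError('Unsupported type: %r' % type(value))
```

For example, `serialize({'foo': 'bar', 'baz': {'biz': 1}})` nests the `'M'` wrapping recursively, matching the expectation in `test_serialize_map` above.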
| diff --git a/tests/__init__.py b/tests/__init__.py
index da171af2c6..f166ea16e4 100644
--- a/tests/__init__.py
+++ b/tests/__init__.py
@@ -15,6 +15,8 @@
import sys
import time
+from botocore.compat import six
+
# The unittest module got a significant overhaul
# in 2.7, so if we're in 2.6 we can use the backported
@@ -32,6 +34,14 @@
from unittest import mock
+# In python 3, order matters when calling assertEqual to
+# compare lists and dictionaries with lists. Therefore,
+# assertItemsEqual needs to be used but it is renamed to
+# assertCountEqual in python 3.
+if six.PY2:
+ unittest.TestCase.assertCountEqual = unittest.TestCase.assertItemsEqual
+
+
def unique_id(name):
"""
Generate a unique ID that includes the given name,
diff --git a/tests/unit/dynamodb/__init__.py b/tests/unit/dynamodb/__init__.py
new file mode 100644
index 0000000000..c89416d7a5
--- /dev/null
+++ b/tests/unit/dynamodb/__init__.py
@@ -0,0 +1,12 @@
+# Copyright 2015 Amazon.com, Inc. or its affiliates. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License"). You
+# may not use this file except in compliance with the License. A copy of
+# the License is located at
+#
+# http://aws.amazon.com/apache2.0/
+#
+# or in the "license" file accompanying this file. This file is
+# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
+# ANY KIND, either express or implied. See the License for the specific
+# language governing permissions and limitations under the License.
diff --git a/tests/unit/dynamodb/test_types.py b/tests/unit/dynamodb/test_types.py
new file mode 100644
index 0000000000..f4089e8247
--- /dev/null
+++ b/tests/unit/dynamodb/test_types.py
@@ -0,0 +1,203 @@
+# Copyright 2015 Amazon.com, Inc. or its affiliates. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the 'License'). You
+# may not use this file except in compliance with the License. A copy of
+# the License is located at
+#
+# http://aws.amazon.com/apache2.0/
+#
+# or in the 'license' file accompanying this file. This file is
+# distributed on an 'AS IS' BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
+# ANY KIND, either express or implied. See the License for the specific
+# language governing permissions and limitations under the License.
+from decimal import Decimal
+from tests import unittest
+
+from botocore.compat import six
+
+from boto3.dynamodb.types import Binary, TypeSerializer, TypeDeserializer
+
+
+class TestBinary(unittest.TestCase):
+ def test_bytes_input(self):
+ data = Binary(b'\x01')
+ self.assertEqual(b'\x01', data)
+ self.assertEqual(b'\x01', data.value)
+
+ def test_non_ascii_bytes_input(self):
+ # Binary data that is out of ASCII range
+ data = Binary(b'\x88')
+ self.assertEqual(b'\x88', data)
+ self.assertEqual(b'\x88', data.value)
+
+ def test_bytearray_input(self):
+ data = Binary(bytearray([1]))
+ self.assertEqual(b'\x01', data)
+ self.assertEqual(b'\x01', data.value)
+
+ def test_unicode_throws_error(self):
+ with self.assertRaises(TypeError):
+ Binary(u'\u00e9')
+
+ def test_integer_throws_error(self):
+ with self.assertRaises(TypeError):
+ Binary(1)
+
+ def test_not_equal(self):
+ self.assertTrue(Binary(b'\x01') != b'\x02')
+
+ def test_str(self):
+ self.assertEqual(Binary(b'\x01').__str__(), b'\x01')
+
+ def test_repr(self):
+ self.assertIn('Binary', repr(Binary(b'1')))
+
+
+class TestSerializer(unittest.TestCase):
+ def setUp(self):
+ self.serializer = TypeSerializer()
+
+ def test_serialize_unsupported_type(self):
+ with self.assertRaisesRegexp(TypeError, 'Unsupported type'):
+ self.serializer.serialize(object())
+
+ def test_serialize_null(self):
+ self.assertEqual(self.serializer.serialize(None), {'NULL': True})
+
+ def test_serialize_boolean(self):
+ self.assertEqual(self.serializer.serialize(False), {'BOOL': False})
+
+ def test_serialize_integer(self):
+ self.assertEqual(self.serializer.serialize(1), {'N': '1'})
+
+ def test_serialize_decimal(self):
+ self.assertEqual(
+ self.serializer.serialize(Decimal('1.25')), {'N': '1.25'})
+
+ def test_serialize_float_error(self):
+ with self.assertRaisesRegexp(
+ TypeError,
+ 'Float types are not supported. Use Decimal types instead'):
+ self.serializer.serialize(1.25)
+
+ def test_serialize_NaN_error(self):
+ with self.assertRaisesRegexp(
+ TypeError,
+ 'Infinity and NaN not supported'):
+ self.serializer.serialize(Decimal('NaN'))
+
+ def test_serialize_string(self):
+ self.assertEqual(self.serializer.serialize('foo'), {'S': 'foo'})
+
+ def test_serialize_binary(self):
+ self.assertEqual(self.serializer.serialize(
+ Binary(b'\x01')), {'B': b'\x01'})
+
+ def test_serialize_bytearray(self):
+ self.assertEqual(self.serializer.serialize(bytearray([1])),
+ {'B': b'\x01'})
+
+ @unittest.skipIf(six.PY2,
+ 'This is a test when using python3 version of bytes')
+ def test_serialize_bytes(self):
+ self.assertEqual(self.serializer.serialize(b'\x01'), {'B': b'\x01'})
+
+ def test_serialize_number_set(self):
+ serialized_value = self.serializer.serialize(set([1, 2, 3]))
+ self.assertEqual(len(serialized_value), 1)
+ self.assertIn('NS', serialized_value)
+ self.assertCountEqual(serialized_value['NS'], ['1', '2', '3'])
+
+ def test_serialize_string_set(self):
+ serialized_value = self.serializer.serialize(set(['foo', 'bar']))
+ self.assertEqual(len(serialized_value), 1)
+ self.assertIn('SS', serialized_value)
+ self.assertCountEqual(serialized_value['SS'], ['foo', 'bar'])
+
+ def test_serialize_binary_set(self):
+ serialized_value = self.serializer.serialize(
+ set([Binary(b'\x01'), Binary(b'\x02')]))
+ self.assertEqual(len(serialized_value), 1)
+ self.assertIn('BS', serialized_value)
+ self.assertCountEqual(serialized_value['BS'], [b'\x01', b'\x02'])
+
+ def test_serialize_list(self):
+ serialized_value = self.serializer.serialize(['foo', 1, [1]])
+ self.assertEqual(len(serialized_value), 1)
+ self.assertIn('L', serialized_value)
+ self.assertCountEqual(
+ serialized_value['L'],
+ [{'S': 'foo'}, {'N': '1'}, {'L': [{'N': '1'}]}]
+ )
+
+ def test_serialize_map(self):
+ serialized_value = self.serializer.serialize(
+ {'foo': 'bar', 'baz': {'biz': 1}})
+ self.assertEqual(
+ serialized_value,
+ {'M': {'foo': {'S': 'bar'}, 'baz': {'M': {'biz': {'N': '1'}}}}})
+
+
+class TestDeserializer(unittest.TestCase):
+ def setUp(self):
+ self.deserializer = TypeDeserializer()
+
+ def test_deserialize_invalid_type(self):
+ with self.assertRaisesRegexp(TypeError, 'FOO is not supported'):
+ self.deserializer.deserialize({'FOO': 'bar'})
+
+ def test_deserialize_empty_structure(self):
+ with self.assertRaisesRegexp(TypeError, 'Value must be a nonempty'):
+ self.assertEqual(self.deserializer.deserialize({}), {})
+
+ def test_deserialize_null(self):
+ self.assertEqual(self.deserializer.deserialize({"NULL": True}), None)
+
+ def test_deserialize_boolean(self):
+ self.assertEqual(self.deserializer.deserialize({"BOOL": False}), False)
+
+ def test_deserialize_integer(self):
+ self.assertEqual(
+ self.deserializer.deserialize({'N': '1'}), Decimal('1'))
+
+ def test_deserialize_decimal(self):
+ self.assertEqual(
+ self.deserializer.deserialize({'N': '1.25'}), Decimal('1.25'))
+
+ def test_deserialize_string(self):
+ self.assertEqual(
+ self.deserializer.deserialize({'S': 'foo'}), 'foo')
+
+ def test_deserialize_binary(self):
+ self.assertEqual(
+ self.deserializer.deserialize({'B': b'\x00'}), Binary(b'\x00'))
+
+ def test_deserialize_number_set(self):
+ self.assertEqual(
+ self.deserializer.deserialize(
+ {'NS': ['1', '1.25']}), set([Decimal('1'), Decimal('1.25')]))
+
+ def test_deserialize_string_set(self):
+ self.assertEqual(
+ self.deserializer.deserialize(
+ {'SS': ['foo', 'bar']}), set(['foo', 'bar']))
+
+ def test_deserialize_binary_set(self):
+ self.assertEqual(
+ self.deserializer.deserialize(
+ {'BS': [b'\x00', b'\x01']}),
+ set([Binary(b'\x00'), Binary(b'\x01')]))
+
+ def test_deserialize_list(self):
+ self.assertEqual(
+ self.deserializer.deserialize(
+ {'L': [{'N': '1'}, {'S': 'foo'}, {'L': [{'N': '1.25'}]}]}),
+ [Decimal('1'), 'foo', [Decimal('1.25')]])
+
+ def test_deserialize_map(self):
+ self.assertEqual(
+ self.deserializer.deserialize(
+ {'M': {'foo': {'S': 'mystring'},
+ 'bar': {'M': {'baz': {'N': '1'}}}}}),
+ {'foo': 'mystring', 'bar': {'baz': Decimal('1')}}
+ )
| [
{
"components": [
{
"doc": "A class for representing Binary in dynamodb\n\nEspecially for Python 2, use this class to explicitly specify\nbinary data for item in DynamoDB. It is essentially a wrapper around\nbinary. Unicode and Python 3 string types are not allowed.",
"lines": [
... | [
"tests/unit/dynamodb/test_types.py::TestBinary::test_bytearray_input",
"tests/unit/dynamodb/test_types.py::TestBinary::test_bytes_input",
"tests/unit/dynamodb/test_types.py::TestBinary::test_integer_throws_error",
"tests/unit/dynamodb/test_types.py::TestBinary::test_non_ascii_bytes_input",
"tests/unit/dynam... | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Dynamodb types
This PR adds type serialization/deserialization for dynamodb. It works much like the `Dynamizer` class in boto. This interface is built specifically for botocore. So, stuff like base64 encoding/decoding binary is handed off to botocore.
cc @jamesls
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in boto3/dynamodb/types.py]
(definition of Binary:)
class Binary(object):
"""A class for representing Binary in dynamodb
Especially for Python 2, use this class to explicitly specify
binary data for item in DynamoDB. It is essentially a wrapper around
binary. Unicode and Python 3 string types are not allowed."""
(definition of Binary.__init__:)
def __init__(self, value):
(definition of Binary.__eq__:)
def __eq__(self, other):
(definition of Binary.__ne__:)
def __ne__(self, other):
(definition of Binary.__repr__:)
def __repr__(self):
(definition of Binary.__str__:)
def __str__(self):
(definition of Binary.__hash__:)
def __hash__(self):
(definition of TypeSerializer:)
class TypeSerializer(object):
"""This class serializes Python data types to DynamoDB types."""
(definition of TypeSerializer.serialize:)
def serialize(self, value):
"""The method to serialize the Python data types.
:param value: A python value to be serialized to DynamoDB. Here are
the various conversions:
Python DynamoDB
------ --------
None {'NULL': True}
True/False {'BOOL': True/False}
int/Decimal {'N': str(value)}
string {'S': string}
Binary/bytearray/bytes (py3 only) {'B': bytes}
set([int/Decimal]) {'NS': [str(value)]}
set([string]) {'SS': [string]}
set([Binary/bytearray/bytes]) {'BS': [bytes]}
list {'L': list}
dict {'M': dict}
For types that involve numbers, it is recommended that ``Decimal``
objects are used to be able to round-trip the Python type.
For types that involve binary, it is recommended that ``Binary``
objects are used to be able to round-trip the Python type.
:rtype: dict
:returns: A dictionary that represents a dynamoDB data type. These
dictionaries can be directly passed to botocore methods."""
(definition of TypeSerializer._get_dynamodb_type:)
def _get_dynamodb_type(self, value):
(definition of TypeSerializer._is_null:)
def _is_null(self, value):
(definition of TypeSerializer._is_boolean:)
def _is_boolean(self, value):
(definition of TypeSerializer._is_number:)
def _is_number(self, value):
(definition of TypeSerializer._is_string:)
def _is_string(self, value):
(definition of TypeSerializer._is_binary:)
def _is_binary(self, value):
(definition of TypeSerializer._is_set:)
def _is_set(self, value):
(definition of TypeSerializer._is_type_set:)
def _is_type_set(self, value, type_validator):
(definition of TypeSerializer._is_map:)
def _is_map(self, value):
(definition of TypeSerializer._is_list:)
def _is_list(self, value):
(definition of TypeSerializer._serialize_null:)
def _serialize_null(self, value):
(definition of TypeSerializer._serialize_bool:)
def _serialize_bool(self, value):
(definition of TypeSerializer._serialize_n:)
def _serialize_n(self, value):
(definition of TypeSerializer._serialize_s:)
def _serialize_s(self, value):
(definition of TypeSerializer._serialize_b:)
def _serialize_b(self, value):
(definition of TypeSerializer._serialize_ss:)
def _serialize_ss(self, value):
(definition of TypeSerializer._serialize_ns:)
def _serialize_ns(self, value):
(definition of TypeSerializer._serialize_bs:)
def _serialize_bs(self, value):
(definition of TypeSerializer._serialize_l:)
def _serialize_l(self, value):
(definition of TypeSerializer._serialize_m:)
def _serialize_m(self, value):
(definition of TypeDeserializer:)
class TypeDeserializer(object):
"""This class deserializes DynamoDB types to Python types."""
(definition of TypeDeserializer.deserialize:)
def deserialize(self, value):
"""The method to deserialize the DynamoDB data types.
:param value: A DynamoDB value to be deserialized to a pythonic value.
Here are the various conversions:
DynamoDB Python
-------- ------
{'NULL': True} None
{'BOOL': True/False} True/False
{'N': str(value)} Decimal(str(value))
{'S': string} string
{'B': bytes} Binary(bytes)
{'NS': [str(value)]} set([Decimal(str(value))])
{'SS': [string]} set([string])
{'BS': [bytes]} set([bytes])
{'L': list} list
{'M': dict} dict
:returns: The pythonic value of the DynamoDB type."""
(definition of TypeDeserializer._deserialize_null:)
def _deserialize_null(self, value):
(definition of TypeDeserializer._deserialize_bool:)
def _deserialize_bool(self, value):
(definition of TypeDeserializer._deserialize_n:)
def _deserialize_n(self, value):
(definition of TypeDeserializer._deserialize_s:)
def _deserialize_s(self, value):
(definition of TypeDeserializer._deserialize_b:)
def _deserialize_b(self, value):
(definition of TypeDeserializer._deserialize_ns:)
def _deserialize_ns(self, value):
(definition of TypeDeserializer._deserialize_ss:)
def _deserialize_ss(self, value):
(definition of TypeDeserializer._deserialize_bs:)
def _deserialize_bs(self, value):
(definition of TypeDeserializer._deserialize_l:)
def _deserialize_l(self, value):
(definition of TypeDeserializer._deserialize_m:)
def _deserialize_m(self, value):
[end of new definitions in boto3/dynamodb/types.py]
</definitions>
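The conversion tables in the docstrings above map cleanly onto a pair of recursive functions. The sketch below is a hypothetical, stdlib-only illustration of the documented mapping — it omits the `Binary` and binary-set cases and is not the boto3 implementation:

```python
from decimal import Decimal

def serialize(value):
    """Illustrative subset of TypeSerializer.serialize's documented table."""
    if value is None:
        return {'NULL': True}
    if isinstance(value, bool):          # must precede the int check
        return {'BOOL': value}
    if isinstance(value, (int, Decimal)):
        return {'N': str(value)}
    if isinstance(value, str):
        return {'S': value}
    if isinstance(value, set):
        if all(isinstance(v, (int, Decimal)) and not isinstance(v, bool)
               for v in value):
            return {'NS': [str(v) for v in value]}
        return {'SS': sorted(value)}
    if isinstance(value, list):
        return {'L': [serialize(v) for v in value]}
    if isinstance(value, dict):
        return {'M': {k: serialize(v) for k, v in value.items()}}
    raise TypeError('%r is not supported' % type(value))

def deserialize(value):
    """Illustrative subset of TypeDeserializer.deserialize's table."""
    tag, payload = next(iter(value.items()))
    if tag == 'NULL':
        return None
    if tag in ('BOOL', 'S'):
        return payload
    if tag == 'N':
        return Decimal(payload)
    if tag == 'NS':
        return set(Decimal(v) for v in payload)
    if tag == 'SS':
        return set(payload)
    if tag == 'L':
        return [deserialize(v) for v in payload]
    if tag == 'M':
        return {k: deserialize(v) for k, v in payload.items()}
    raise TypeError('%s is not supported' % tag)

# Using Decimal for numbers, as the docstring recommends, makes the
# serialize/deserialize pair a clean round trip.
item = {'name': 'foo', 'sizes': {Decimal('1'), Decimal('2')}, 'meta': {'ok': True}}
assert deserialize(serialize(item)) == item
```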
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 196a2da7490a1a661a0103b8770bd31e34e147f2 | ||
falconry__falcon-502 | 502 | falconry/falcon | null | 73d9f4142ff9f04342f6abfe649c98dda409c477 | 2015-04-14T21:16:38Z | diff --git a/falcon/api.py b/falcon/api.py
index ff61dc163..901e9c727 100644
--- a/falcon/api.py
+++ b/falcon/api.py
@@ -17,6 +17,7 @@
from falcon import api_helpers as helpers
from falcon import DEFAULT_MEDIA_TYPE
from falcon.http_error import HTTPError
+from falcon.http_status import HTTPStatus
from falcon.request import Request, RequestOptions
from falcon.response import Response
import falcon.responders
@@ -205,6 +206,11 @@ def __call__(self, env, start_response):
self._call_resp_mw(middleware_stack, req, resp, resource)
raise
+ except HTTPStatus as ex:
+ self._compose_status_response(req, resp, ex)
+ self._call_after_hooks(req, resp, resource)
+ self._call_resp_mw(middleware_stack, req, resp, resource)
+
except HTTPError as ex:
self._compose_error_response(req, resp, ex)
self._call_after_hooks(req, resp, resource)
@@ -475,13 +481,22 @@ def _get_responder(self, req):
return (responder, params, resource)
+ def _compose_status_response(self, req, resp, http_status):
+ """Composes a response for the given HTTPStatus instance."""
+
+ resp.status = http_status.status
+
+ if http_status.headers is not None:
+ resp.set_headers(http_status.headers)
+
+ if getattr(http_status, "body", None) is not None:
+ resp.body = http_status.body
+
def _compose_error_response(self, req, resp, error):
"""Composes a response for the given HTTPError instance."""
- resp.status = error.status
-
- if error.headers is not None:
- resp.set_headers(error.headers)
+ # Use the HTTPStatus handler function to set status/headers
+ self._compose_status_response(req, resp, error)
if error.has_representation:
media_type, body = self._serialize_error(req, error)
diff --git a/falcon/http_status.py b/falcon/http_status.py
new file mode 100644
index 000000000..fb2733aba
--- /dev/null
+++ b/falcon/http_status.py
@@ -0,0 +1,45 @@
+# Copyright 2015 by Hurricane Labs LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+class HTTPStatus(Exception):
+ """Represents a generic HTTP status.
+
+ Raise this class from a hook, middleware, or a responder to stop handling
+ the request and skip to the response handling.
+
+ Attributes:
+ status (str): HTTP status line, e.g. '748 Confounded by Ponies'.
+ headers (dict): Extra headers to add to the response.
+ body (str or unicode): String representing response content. If
+ Unicode, Falcon will encode as UTF-8 in the response.
+
+ Args:
+ status (str): HTTP status code and text, such as
+ '748 Confounded by Ponies'.
+ headers (dict): Extra headers to add to the response.
+ body (str or unicode): String representing response content. If
+ Unicode, Falcon will encode as UTF-8 in the response.
+ """
+
+ __slots__ = (
+ 'status',
+ 'headers',
+ 'body'
+ )
+
+ def __init__(self, status, headers=None, body=None):
+ self.status = status
+ self.headers = headers
+ self.body = body
| diff --git a/tests/test_httpstatus.py b/tests/test_httpstatus.py
new file mode 100644
index 000000000..3569e6b26
--- /dev/null
+++ b/tests/test_httpstatus.py
@@ -0,0 +1,196 @@
+# -*- coding: utf-8
+
+import falcon.testing as testing
+import falcon
+from falcon.http_status import HTTPStatus
+
+
+def before_hook(req, resp, params):
+ raise HTTPStatus(falcon.HTTP_200,
+ headers={"X-Failed": "False"},
+ body="Pass")
+
+
+def after_hook(req, resp, resource):
+ resp.status = falcon.HTTP_200
+ resp.set_header("X-Failed", "False")
+ resp.body = "Pass"
+
+
+def noop_after_hook(req, resp, resource):
+ pass
+
+
+class TestStatusResource:
+
+ @falcon.before(before_hook)
+ def on_get(self, req, resp):
+ resp.status = falcon.HTTP_500
+ resp.set_header("X-Failed", "True")
+ resp.body = "Fail"
+
+ def on_post(self, req, resp):
+ resp.status = falcon.HTTP_500
+ resp.set_header("X-Failed", "True")
+ resp.body = "Fail"
+
+ raise HTTPStatus(falcon.HTTP_200,
+ headers={"X-Failed": "False"},
+ body="Pass")
+
+ @falcon.after(after_hook)
+ def on_put(self, req, resp):
+ resp.status = falcon.HTTP_500
+ resp.set_header("X-Failed", "True")
+ resp.body = "Fail"
+
+ def on_patch(self, req, resp):
+ raise HTTPStatus(falcon.HTTP_200,
+ body=None)
+
+ @falcon.after(noop_after_hook)
+ def on_delete(self, req, resp):
+ raise HTTPStatus(falcon.HTTP_200,
+ headers={"X-Failed": "False"},
+ body="Pass")
+
+
+class TestHookResource:
+
+ def on_get(self, req, resp):
+ resp.status = falcon.HTTP_500
+ resp.set_header("X-Failed", "True")
+ resp.body = "Fail"
+
+ def on_patch(self, req, resp):
+ raise HTTPStatus(falcon.HTTP_200,
+ body=None)
+
+ def on_delete(self, req, resp):
+ raise HTTPStatus(falcon.HTTP_200,
+ headers={"X-Failed": "False"},
+ body="Pass")
+
+
+class TestHTTPStatus(testing.TestBase):
+ def before(self):
+ self.resource = TestStatusResource()
+ self.api.add_route('/status', self.resource)
+
+ def test_raise_status_in_before_hook(self):
+ """ Make sure we get the 200 raised by before hook """
+ body = self.simulate_request('/status', method='GET', decode='utf-8')
+ self.assertEqual(self.srmock.status, falcon.HTTP_200)
+ self.assertIn(('x-failed', 'False'), self.srmock.headers)
+ self.assertEqual(body, 'Pass')
+
+ def test_raise_status_in_responder(self):
+ """ Make sure we get the 200 raised by responder """
+ body = self.simulate_request('/status', method='POST', decode='utf-8')
+ self.assertEqual(self.srmock.status, falcon.HTTP_200)
+ self.assertIn(('x-failed', 'False'), self.srmock.headers)
+ self.assertEqual(body, 'Pass')
+
+ def test_raise_status_runs_after_hooks(self):
+ """ Make sure after hooks still run """
+ body = self.simulate_request('/status', method='PUT', decode='utf-8')
+ self.assertEqual(self.srmock.status, falcon.HTTP_200)
+ self.assertIn(('x-failed', 'False'), self.srmock.headers)
+ self.assertEqual(body, 'Pass')
+
+ def test_raise_status_survives_after_hooks(self):
+ """ Make sure after hook doesn't overwrite our status """
+ body = self.simulate_request('/status', method='DELETE',
+ decode='utf-8')
+ self.assertEqual(self.srmock.status, falcon.HTTP_200)
+ self.assertIn(('x-failed', 'False'), self.srmock.headers)
+ self.assertEqual(body, 'Pass')
+
+ def test_raise_status_empty_body(self):
+ """ Make sure passing None to body results in empty body """
+ body = self.simulate_request('/status', method='PATCH', decode='utf-8')
+ self.assertEqual(body, '')
+
+
+class TestHTTPStatusWithGlobalHooks(testing.TestBase):
+ def before(self):
+ self.resource = TestHookResource()
+
+ def test_raise_status_in_before_hook(self):
+ """ Make sure we get the 200 raised by before hook """
+ self.api = falcon.API(before=[before_hook])
+ self.api.add_route('/status', self.resource)
+
+ body = self.simulate_request('/status', method='GET', decode='utf-8')
+ self.assertEqual(self.srmock.status, falcon.HTTP_200)
+ self.assertIn(('x-failed', 'False'), self.srmock.headers)
+ self.assertEqual(body, 'Pass')
+
+ def test_raise_status_runs_after_hooks(self):
+ """ Make sure we still run after hooks """
+ self.api = falcon.API(after=[after_hook])
+ self.api.add_route('/status', self.resource)
+
+ body = self.simulate_request('/status', method='GET', decode='utf-8')
+ self.assertEqual(self.srmock.status, falcon.HTTP_200)
+ self.assertIn(('x-failed', 'False'), self.srmock.headers)
+ self.assertEqual(body, 'Pass')
+
+ def test_raise_status_survives_after_hooks(self):
+ """ Make sure after hook doesn't overwrite our status """
+ self.api = falcon.API(after=[noop_after_hook])
+ self.api.add_route('/status', self.resource)
+
+ body = self.simulate_request('/status', method='DELETE',
+ decode='utf-8')
+ self.assertEqual(self.srmock.status, falcon.HTTP_200)
+ self.assertIn(('x-failed', 'False'), self.srmock.headers)
+ self.assertEqual(body, 'Pass')
+
+ def test_raise_status_in_process_request(self):
+ """ Make sure we can raise status from middleware process request """
+ class TestMiddleware:
+ def process_request(self, req, resp):
+ raise HTTPStatus(falcon.HTTP_200,
+ headers={"X-Failed": "False"},
+ body="Pass")
+
+ self.api = falcon.API(middleware=TestMiddleware())
+ self.api.add_route('/status', self.resource)
+
+ body = self.simulate_request('/status', method='GET', decode='utf-8')
+ self.assertEqual(self.srmock.status, falcon.HTTP_200)
+ self.assertIn(('x-failed', 'False'), self.srmock.headers)
+ self.assertEqual(body, 'Pass')
+
+ def test_raise_status_in_process_resource(self):
+ """ Make sure we can raise status from middleware process resource """
+ class TestMiddleware:
+ def process_resource(self, req, resp, resource):
+ raise HTTPStatus(falcon.HTTP_200,
+ headers={"X-Failed": "False"},
+ body="Pass")
+
+ self.api = falcon.API(middleware=TestMiddleware())
+ self.api.add_route('/status', self.resource)
+
+ body = self.simulate_request('/status', method='GET', decode='utf-8')
+ self.assertEqual(self.srmock.status, falcon.HTTP_200)
+ self.assertIn(('x-failed', 'False'), self.srmock.headers)
+ self.assertEqual(body, 'Pass')
+
+ def test_raise_status_runs_process_response(self):
+ """ Make sure process_response still runs """
+ class TestMiddleware:
+ def process_response(self, req, resp, response):
+ resp.status = falcon.HTTP_200
+ resp.set_header("X-Failed", "False")
+ resp.body = "Pass"
+
+ self.api = falcon.API(middleware=TestMiddleware())
+ self.api.add_route('/status', self.resource)
+
+ body = self.simulate_request('/status', method='GET', decode='utf-8')
+ self.assertEqual(self.srmock.status, falcon.HTTP_200)
+ self.assertIn(('x-failed', 'False'), self.srmock.headers)
+ self.assertEqual(body, 'Pass')
| [
{
"components": [
{
"doc": "Composes a response for the given HTTPStatus instance.",
"lines": [
484,
493
],
"name": "API._compose_status_response",
"signature": "def _compose_status_response(self, req, resp, http_status):",
"type": "funct... | [
"tests/test_httpstatus.py::TestHTTPStatus::test_raise_status_empty_body",
"tests/test_httpstatus.py::TestHTTPStatus::test_raise_status_in_before_hook",
"tests/test_httpstatus.py::TestHTTPStatus::test_raise_status_in_responder",
"tests/test_httpstatus.py::TestHTTPStatus::test_raise_status_runs_after_hooks",
... | [] | This is a feature request that requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
feat(api) Add HTTPStatus exception for immediate response
Add an HTTPStatus exception that is handled similarly to HTTPError in
terms of hooks and middleware, but does not pass through error handling.
Fixes #414
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in falcon/api.py]
(definition of API._compose_status_response:)
def _compose_status_response(self, req, resp, http_status):
"""Composes a response for the given HTTPStatus instance."""
[end of new definitions in falcon/api.py]
[start of new definitions in falcon/http_status.py]
(definition of HTTPStatus:)
class HTTPStatus(Exception):
"""Represents a generic HTTP status.
Raise this class from a hook, middleware, or a responder to stop handling
the request and skip to the response handling.
Attributes:
status (str): HTTP status line, e.g. '748 Confounded by Ponies'.
headers (dict): Extra headers to add to the response.
body (str or unicode): String representing response content. If
Unicode, Falcon will encode as UTF-8 in the response.
Args:
status (str): HTTP status code and text, such as
'748 Confounded by Ponies'.
headers (dict): Extra headers to add to the response.
body (str or unicode): String representing response content. If
Unicode, Falcon will encode as UTF-8 in the response."""
(definition of HTTPStatus.__init__:)
def __init__(self, status, headers=None, body=None):
[end of new definitions in falcon/http_status.py]
</definitions>
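To make the control flow concrete, here is a self-contained, framework-free sketch of the idea — a toy dispatcher standing in for `API.__call__`; the names and behaviour are illustrative, not Falcon's actual internals:

```python
class HTTPStatus(Exception):
    """Sketch of the proposed exception: a status/headers/body payload."""
    def __init__(self, status, headers=None, body=None):
        self.status = status
        self.headers = headers
        self.body = body

class Response:
    def __init__(self):
        self.status = '200 OK'
        self.headers = {}
        self.body = ''

def call(responder):
    # Toy stand-in for API.__call__: an HTTPStatus raised anywhere in the
    # responder skips the rest of the handler and composes the response
    # directly, without passing through error handling.
    resp = Response()
    try:
        responder(resp)
    except HTTPStatus as ex:
        resp.status = ex.status
        if ex.headers is not None:
            resp.headers.update(ex.headers)
        if ex.body is not None:
            resp.body = ex.body
    return resp

def on_post(resp):
    resp.status = '500 Internal Server Error'
    resp.body = 'Fail'
    raise HTTPStatus('200 OK', headers={'X-Failed': 'False'}, body='Pass')

resp = call(on_post)
assert (resp.status, resp.body) == ('200 OK', 'Pass')
```

The same mechanism is why the raised status survives the responder's earlier writes in the tests above: the exception handler overwrites status, headers, and body last.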
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | Here is the discussion in the issues of the pull request.
<issues>
How do I force an immediate response?
In Falcon I can force an immediate error code response to return from anywhere in my code by raising a Falcon exception with the appropriate error code.
Is there a way to force an immediate return of the response to the client for VALID (i.e. not errors) responses?
----------
@dukedougal You could register a custom error handler and then raise exceptions of that type. It is a bit of a hack, but will work with 0.2. We'll look at doing something more elegant in 0.3 or 0.4.
--------------------
</issues> | 77d5e6394a88ead151c9469494749f95f06b24bf | |
joke2k__faker-215 | 215 | joke2k/faker | null | a40a4121db15d527fad8922dee3b8f889bb55be7 | 2015-03-30T06:46:38Z | diff --git a/faker/providers/address/ja_JP/__init__.py b/faker/providers/address/ja_JP/__init__.py
new file mode 100644
index 0000000000..ec12a6884c
--- /dev/null
+++ b/faker/providers/address/ja_JP/__init__.py
@@ -0,0 +1,33 @@
+# -*- coding: utf-8 -*-
+from __future__ import unicode_literals
+from .. import Provider as AddressProvider
+
+
+class Provider(AddressProvider):
+ countries = (
+ 'アフガニスタン', 'アルバニア', 'アルジェリア', 'アメリカ領サモア', 'アンドラ', 'アンゴラ', 'アンギラ', '南極大陸', 'アンティグアバーブーダ', 'アルゼンチン', 'アルメニア', 'アルバ', 'オーストラリア', 'オーストリア', 'アゼルバイジャン',
+ 'バハマ', 'バーレーン', 'バングラデシュ', 'バルバドス', 'ベラルーシ', 'ベルギー', 'ベリーズ', 'ベナン', 'バミューダ島', 'ブータン', 'ボリビア', 'ボスニア・ヘルツェゴビナ', 'ボツワナ', 'ブーベ島', 'ブラジル', 'イギリス領インド洋地域', 'イギリス領ヴァージン諸島', 'ブルネイ', 'ブルガリア', 'ブルキナファソ', 'ブルンジ',
+ 'カンボジア', 'カメルーン', 'カナダ', 'カーボベルデ', 'ケイマン諸島', '中央アフリカ共和国', 'チャド', 'チリ', '中国', 'クリスマス島', 'ココス諸島', 'コロンビア', 'コモロ', 'コンゴ共和国', 'クック諸島', 'コスタリカ', 'コートジボワール', 'クロアチア', 'キューバ', 'キプロス共和国', 'チェコ共和国',
+ 'デンマーク', 'ジブチ共和国', 'ドミニカ国', 'ドミニカ共和国',
+ 'エクアドル', 'エジプト', 'エルサルバドル', '赤道ギニア共和国', 'エリトリア', 'エストニア', 'エチオピア',
+ 'フェロー諸島', 'フォークランド諸島', 'フィジー共和国', 'フィンランド', 'フランス', 'フランス領ギアナ', 'フランス領ポリネシア', 'フランス領極南諸島',
+ 'ガボン', 'ガンビア', 'グルジア', 'ドイツ', 'ガーナ', 'ジブラルタル', 'ギリシャ', 'グリーンランド', 'グレナダ', 'グアドループ', 'グアム', 'グアテマラ', 'ガーンジー', 'ギニア', 'ギニアビサウ', 'ガイアナ',
+ 'ハイチ', 'ハード島とマクドナルド諸島', 'バチカン市国', 'ホンジュラス', '香港', 'ハンガリー',
+ 'アイスランド', 'インド', 'インドネシア', 'イラン', 'イラク', 'アイルランド共和国', 'マン島', 'イスラエル', 'イタリア',
+ 'ジャマイカ', '日本', 'ジャージー島', 'ヨルダン',
+ 'カザフスタン', 'ケニア', 'キリバス', '朝鮮', '韓国', 'クウェート', 'キルギス共和国',
+ 'ラオス人民民主共和国', 'ラトビア', 'レバノン', 'レソト', 'リベリア', 'リビア国', 'リヒテンシュタイン', 'リトアニア', 'ルクセンブルク',
+ 'マカオ', 'マケドニア共和国', 'マダガスカル', 'マラウィ', 'マレーシア', 'モルディブ', 'マリ', 'マルタ共和国', 'マーシャル諸島', 'マルティニーク', 'モーリタニア・イスラム共和国', 'モーリシャス', 'マヨット', 'メキシコ', 'ミクロネシア連邦', 'モルドバ共和国', 'モナコ公国', 'モンゴル', 'モンテネグロ共和国', 'モントセラト', 'モロッコ', 'モザンビーク', 'ミャンマー',
+ 'ナミビア', 'ナウル', 'ネパール', 'オランダ領アンティル', 'オランダ', 'ニューカレドニア', 'ニュージーランド', 'ニカラグア', 'ニジェール', 'ナイジェリア', 'ニース', 'ノーフォーク島', '北マリアナ諸島', 'ノルウェー',
+ 'オマーン',
+ 'パキスタン', 'パラオ', 'パレスチナ自治区', 'パナマ', 'パプアニューギニア', 'パラグアイ', 'ペルー', 'フィリピン', 'ピトケアン諸島', 'ポーランド', 'ポルトガル', 'プエルトリコ',
+ 'カタール',
+ 'レユニオン', 'ルーマニア', 'ロシア', 'ルワンダ',
+ 'サン・バルテルミー島', 'セントヘレナ', 'セントクリストファー・ネイビス連邦', 'セントルシア', 'セント・マーチン島', 'サンピエール島・ミクロン島', 'セントビンセント・グレナディーン', 'サモア', 'サンマリノ', 'サントメプリンシペ', 'サウジアラビア', 'セネガル', 'セルビア', 'セイシェル', 'シエラレオネ', 'シンガポール', 'スロバキア', 'スロベニア', 'ソロモン諸島', 'ソマリア', '南アフリカ共和国', 'サウスジョージア・サウスサンドウィッチ諸島', 'スペイン', 'スリランカ', 'スーダン', 'スリナム', 'スヴァールバル諸島およびヤンマイエン島', 'スワジランド王国', 'スウェーデン', 'スイス', 'シリア',
+ '台湾', 'タジキスタン共和国', 'タンザニア', 'タイ', '東ティモール', 'トーゴ', 'トケラウ', 'トンガ', 'トリニダード・トバゴ', 'チュニジア', 'トルコ', 'トルクメニスタン', 'タークス・カイコス諸島', 'ツバル',
+ 'ウガンダ', 'ウクライナ', 'アラブ首長国連邦', 'イギリス', 'アメリカ合衆国', '合衆国領有小離島', 'アメリカ領ヴァージン諸島', 'ウルグアイ', 'ウズベキスタン',
+ 'バヌアツ', 'ベネズエラ', 'ベトナム',
+ 'ウォリス・フツナ', '西サハラ',
+ 'イエメン',
+ 'ザンビア', 'ジンバブエ'
+ )
diff --git a/faker/providers/company/ja_JP/__init__.py b/faker/providers/company/ja_JP/__init__.py
new file mode 100644
index 0000000000..c0e2b18b6a
--- /dev/null
+++ b/faker/providers/company/ja_JP/__init__.py
@@ -0,0 +1,15 @@
+# -*- coding: utf-8 -*-
+from __future__ import unicode_literals
+from .. import Provider as CompanyProvider
+
+
+class Provider(CompanyProvider):
+ formats = (
+ '{{company_prefix}} {{last_name}}',
+ )
+
+ company_prefixes = ('株式会社', '有限会社', '合同会社')
+
+ @classmethod
+ def company_prefix(cls):
+ return cls.random_element(cls.company_prefixes)
diff --git a/faker/providers/person/ja_JP/__init__.py b/faker/providers/person/ja_JP/__init__.py
new file mode 100644
index 0000000000..0c4946cd28
--- /dev/null
+++ b/faker/providers/person/ja_JP/__init__.py
@@ -0,0 +1,111 @@
+# -*- coding: utf-8 -*-
+from __future__ import unicode_literals
+from .. import Provider as PersonProvider
+
+
+class Provider(PersonProvider):
+ formats_female = (
+ '{{last_name}} {{first_name_female}}',
+ )
+
+ formats_male = (
+ '{{last_name}} {{first_name_male}}',
+ )
+
+ formats = formats_male + formats_female
+
+ first_names_female = (
+ '明美', 'あすか', '香織', '加奈', 'くみ子', 'さゆり', '知実', '千代',
+ '直子', '七夏', '花子', '春香', '真綾', '舞', '美加子', '幹', '桃子', '結衣', '裕美子', '陽子', '里佳',
+ )
+
+ first_names_male = (
+ '晃', '篤司', '治', '和也', '京助', '健一', '修平', '翔太', '淳', '聡太郎', '太一', '太郎', '拓真', '翼', '智也',
+ '直樹', '直人', '英樹', '浩', '学', '充', '稔', '裕樹', '裕太', '康弘', '陽一', '洋介', '亮介', '涼平', '零',
+ )
+
+ first_names = first_names_male + first_names_female
+
+ last_names = (
+ '青田', '青山', '石田', '井高', '伊藤', '井上', '宇野', '江古田', '大垣',
+ '加藤', '加納', '喜嶋', '木村', '桐山', '工藤', '小泉', '小林', '近藤',
+ '斉藤', '坂本', '佐々木', '佐藤', '笹田', '鈴木', '杉山',
+ '高橋', '田中', '田辺', '津田',
+ '中島', '中村', '渚', '中津川', '西之園', '野村',
+ '原田', '浜田', '廣川', '藤本',
+ '松本', '三宅', '宮沢', '村山',
+ '山岸', '山口', '山田', '山本', '吉田', '吉本',
+ '若松', '渡辺',
+ )
+
+ kana_formats = (
+ '{{last_kana_name}} {{first_kana_name_female}}',
+ '{{last_kana_name}} {{first_kana_name_male}}',
+ )
+
+ first_kana_names_female = (
+ 'アキラ', 'アケミ', 'アスカ',
+ 'カオリ', 'カナ', 'クミコ',
+ 'サユリ',
+ 'チヨ', 'ツバサ', 'トモミ',
+ 'ナオコ', 'ナナカ',
+ 'ハナコ', 'ハルカ',
+ 'マアヤ', 'マイ', 'ミキ', 'モモコ',
+ 'ユイ', 'ヨウコ', 'ユミコ',
+ 'レイ', 'リカ',
+ )
+
+ first_kana_names_male = (
+ 'アキラ', 'アツシ', 'オサム',
+ 'キョウスケ', 'ケンイチ',
+ 'ジュン', 'ソウタロウ',
+ 'タイチ', 'タクマ', 'タロウ', 'ツバサ', 'トモヤ',
+ 'ナオキ', 'ナオト',
+ 'ヒデキ', 'ヒロシ',
+ 'マナブ', 'ミツル', 'ミノル', 'ヒロキ',
+ 'ユウタ', 'ヤスヒロ', 'ヨウイチ', 'ヨウスケ',
+ 'リョウスケ', 'リョウヘイ',
+ )
+
+ first_kana_names = first_kana_names_male + first_kana_names_female
+
+ last_kana_names = (
+ 'アオタ', 'アオヤマ', 'イシダ', 'イダカ', 'イトウ', 'ウノ', 'エコダ', 'オオガキ',
+ 'カノウ', 'カノウ', 'キジマ', 'キムラ', 'キリヤマ', 'クドウ', 'コイズミ', 'コバヤシ', 'コンドウ',
+ 'サイトウ', 'サカモト', 'ササキ', 'サトウ', 'ササダ', 'スズキ', 'スギヤマ',
+ 'タカハシ', 'タナカ', 'タナベ', 'ツダ', 'ツチヤ',
+ 'ナカジマ', 'ナカムラ', 'ナギサ', 'ナカツガワ', 'ニシノソノ', 'ノムラ',
+ 'ハラダ', 'ハマダ', 'ヒロカワ', 'フジモト',
+ 'マツモト', 'ミヤケ', 'ミヤザワ', 'ムラヤマ',
+ 'ヤマギシ', 'ヤマグチ', 'ヤマダ', 'ヤマモト', 'ヨシダ', 'ヨシモト',
+ 'ワカマツ', 'ワタナベ',
+ )
+
+ def kana_name(self):
+ '''
+ @example 'アオタ アキラ'
+ '''
+ pattern = self.random_element(self.kana_formats)
+ return self.generator.parse(pattern)
+
+ @classmethod
+ def first_kana_name(cls):
+ '''
+ @example 'アキラ'
+ '''
+ return cls.random_element(cls.first_kana_names)
+
+ @classmethod
+ def first_kana_name_female(cls):
+ return cls.random_element(cls.first_kana_names_female)
+
+ @classmethod
+ def first_kana_name_male(cls):
+ return cls.random_element(cls.first_kana_names_male)
+
+ @classmethod
+ def last_kana_name(cls):
+ '''
+ @example 'アオタ'
+ '''
+ return cls.random_element(cls.last_kana_names)
diff --git a/faker/providers/phone_number/ja_JP/__init__.py b/faker/providers/phone_number/ja_JP/__init__.py
new file mode 100644
index 0000000000..59d10d4233
--- /dev/null
+++ b/faker/providers/phone_number/ja_JP/__init__.py
@@ -0,0 +1,12 @@
+# -*- coding: utf-8 -*-
+from __future__ import unicode_literals
+from .. import Provider as PhoneNumberProvider
+
+
+class Provider(PhoneNumberProvider):
+ formats = (
+ '070-####-####',
+ '080-####-####',
+ '090-####-####',
+ '##-####-####',
+ )
| diff --git a/faker/tests/ja_JP/__init__.py b/faker/tests/ja_JP/__init__.py
new file mode 100644
index 0000000000..b63fcb2620
--- /dev/null
+++ b/faker/tests/ja_JP/__init__.py
@@ -0,0 +1,96 @@
+# coding=utf-8
+
+from __future__ import unicode_literals
+
+import unittest
+
+from faker import Factory
+from .. import string_types
+
+
+class ja_JP_FactoryTestCase(unittest.TestCase):
+ def setUp(self):
+ self.factory = Factory.create('ja')
+
+ def test_ja_JP_address(self):
+ from faker.providers.address.ja_JP import Provider
+ countries = Provider.countries
+
+ country = self.factory.country()
+ assert country
+ assert isinstance(country, string_types)
+ assert country in countries
+
+ def test_ja_JP_company(self):
+ from faker.providers.company.ja_JP import Provider
+ prefixes = Provider.company_prefixes
+
+ prefix = self.factory.company_prefix()
+ assert prefix
+ assert isinstance(prefix, string_types)
+ assert prefix in prefixes
+
+ company = self.factory.company()
+ assert company
+ assert isinstance(company, string_types)
+ assert any(prefix in company for prefix in prefixes)
+ assert any(company.startswith(prefix) for prefix in prefixes)
+
+ def test_ja_JP_person(self):
+ name = self.factory.name()
+ assert name
+ assert isinstance(name, string_types)
+
+ first_name = self.factory.first_name()
+ assert first_name
+ assert isinstance(first_name, string_types)
+
+ last_name = self.factory.last_name()
+ assert last_name
+ assert isinstance(last_name, string_types)
+
+ kana_name = self.factory.kana_name()
+ assert kana_name
+ assert isinstance(kana_name, string_types)
+
+ first_kana_name = self.factory.first_kana_name()
+ assert first_kana_name
+ assert isinstance(first_kana_name, string_types)
+
+ first_kana_name_male = self.factory.first_kana_name_male()
+ assert first_kana_name_male
+ assert isinstance(first_kana_name_male, string_types)
+
+ first_kana_name_female = self.factory.first_kana_name_female()
+ assert first_kana_name_female
+ assert isinstance(first_kana_name_female, string_types)
+
+ last_kana_name = self.factory.last_kana_name()
+ assert last_kana_name
+ assert isinstance(last_kana_name, string_types)
+
+ def test_ja_JP_phone_number(self):
+ pn = self.factory.phone_number()
+ formats = (
+ '070',
+ '080',
+ '090',
+ )
+
+ assert pn
+ assert isinstance(pn, string_types)
+ first, second, third = pn.split('-')
+ assert first
+ assert first.isdigit()
+ assert second
+ assert second.isdigit()
+ assert third
+ assert third.isdigit()
+ if len(first) == 2:
+ assert len(second) == 4
+ assert len(third) == 4
+ else:
+ assert len(first) == 3
+ assert len(second) == 4
+ assert len(third) == 4
+ assert first in formats
| [
{
"components": [
{
"doc": "",
"lines": [
6,
32
],
"name": "Provider",
"signature": "class Provider(AddressProvider):",
"type": "class"
}
],
"file": "faker/providers/address/ja_JP/__init__.py"
},
{
"components": [
... | [
"faker/tests/ja_JP/__init__.py::ja_JP_FactoryTestCase::test_ja_JP_address",
"faker/tests/ja_JP/__init__.py::ja_JP_FactoryTestCase::test_ja_JP_company",
"faker/tests/ja_JP/__init__.py::ja_JP_FactoryTestCase::test_ja_JP_person",
"faker/tests/ja_JP/__init__.py::ja_JP_FactoryTestCase::test_ja_JP_phone_number"
] | [] | This is a feature request that requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
Add localized providers ja_JP
I added ja_JP(Japanese) provider, and tests.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in faker/providers/address/ja_JP/__init__.py]
(definition of Provider:)
class Provider(AddressProvider):
[end of new definitions in faker/providers/address/ja_JP/__init__.py]
[start of new definitions in faker/providers/company/ja_JP/__init__.py]
(definition of Provider:)
class Provider(CompanyProvider):
(definition of Provider.company_prefix:)
def company_prefix(cls):
[end of new definitions in faker/providers/company/ja_JP/__init__.py]
[start of new definitions in faker/providers/person/ja_JP/__init__.py]
(definition of Provider:)
class Provider(PersonProvider):
(definition of Provider.kana_name:)
def kana_name(self):
"""@example 'アオタ アキラ'"""
(definition of Provider.first_kana_name:)
def first_kana_name(cls):
"""@example 'アキラ'"""
(definition of Provider.first_kana_name_female:)
def first_kana_name_female(cls):
(definition of Provider.first_kana_name_male:)
def first_kana_name_male(cls):
(definition of Provider.last_kana_name:)
def last_kana_name(cls):
"""@example 'アオタ'"""
[end of new definitions in faker/providers/person/ja_JP/__init__.py]
[start of new definitions in faker/providers/phone_number/ja_JP/__init__.py]
(definition of Provider:)
class Provider(PhoneNumberProvider):
[end of new definitions in faker/providers/phone_number/ja_JP/__init__.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
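As a standalone illustration of the provider pattern the definitions above describe, here is a minimal sketch. It deliberately does not subclass faker's real `PersonProvider` (the actual implementation would, and would ship full name tables); the katakana samples and the class name `JaJPPersonSketch` are illustrative assumptions only:

```python
import random

# Simplified stand-in for the ja_JP person Provider described above.
# The real provider subclasses faker.providers.person.Provider; the
# name data here is a tiny illustrative sample, not the actual tables.
class JaJPPersonSketch(object):
    first_kana_names_male = ('アキラ', 'ハルト')
    first_kana_names_female = ('サクラ', 'ユイ')
    last_kana_names = ('アオタ', 'サトウ')

    def first_kana_name_male(self):
        return random.choice(self.first_kana_names_male)

    def first_kana_name_female(self):
        return random.choice(self.first_kana_names_female)

    def first_kana_name(self):
        """@example 'アキラ'"""
        return random.choice(self.first_kana_names_male +
                             self.first_kana_names_female)

    def last_kana_name(self):
        """@example 'アオタ'"""
        return random.choice(self.last_kana_names)

    def kana_name(self):
        """@example 'アオタ アキラ' -- last name, space, first name."""
        return '{0} {1}'.format(self.last_kana_name(),
                                self.first_kana_name())

provider = JaJPPersonSketch()
print(provider.kana_name())
```

The tests in the patch above then only need to assert that each method returns a non-empty string, which holds for any entry drawn from the name tables.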
<<END>> | 6edfdbf6ae90b0153309e3bf066aa3b2d16494a7 | ||
boto__boto3-74 | 74 | boto/boto3 | null | ebc0f95261025aa02c474ec8ffa3e0a0604cb3c6 | 2015-03-20T20:39:55Z | diff --git a/boto3/docs.py b/boto3/docs.py
index 9aace7848a..f5535f5e2d 100644
--- a/boto3/docs.py
+++ b/boto3/docs.py
@@ -52,6 +52,7 @@ def py_type_name(type_name):
:rtype: string
"""
return {
+ 'blob': 'bytes',
'character': 'string',
'double': 'float',
'long': 'integer',
@@ -88,7 +89,7 @@ def py_default(type_name):
}.get(type_name, '...')
-def html_to_rst(html, indent=0, indentFirst=False):
+def html_to_rst(html, indent=0, indent_first=False):
"""
Use bcdoc to convert html to rst.
@@ -96,8 +97,8 @@ def html_to_rst(html, indent=0, indentFirst=False):
:param html: Input HTML to be converted
:type indent: int
:param indent: Number of spaces to indent each line
- :type indentFirst: boolean
- :param indentFirst: Whether to indent the first line
+ :type indent_first: boolean
+ :param indent_first: Whether to indent the first line
:rtype: string
"""
doc = ReSTDocument()
@@ -113,7 +114,7 @@ def html_to_rst(html, indent=0, indentFirst=False):
if indent:
rst = '\n'.join([(' ' * indent) + line for line in rst.splitlines()])
- if not indentFirst:
+ if not indent_first:
rst = rst.strip()
return rst
@@ -563,7 +564,7 @@ def document_operation(operation_model, service_name, operation_name=None,
if description is None:
description = html_to_rst(
operation_model._operation_model.get('documentation', ''),
- indent=6, indentFirst=True)
+ indent=6, indent_first=True)
docs = ' .. py:method:: {0}({1})\n\n{2}\n\n'.format(
operation_name, param_desc, description)
@@ -591,11 +592,97 @@ def document_operation(operation_model, service_name, operation_name=None,
if key in ignore_params:
continue
param_type = py_type_name(value.type_name)
+
+ # Convert the description from HTML to RST (to later be converted
+ # into HTML... don't ask). If the parameter is a nested structure
+ # then we also describe its members.
+ param_desc = html_to_rst(
+ value.documentation, indent=9, indent_first=True)
+ if param_type in ['list', 'dict']:
+ param_desc = ('\n Structure description::\n\n' +
+ ' ' + key + ' = ' +
+ document_structure(
+ key, value, indent=12, indent_first=False) +
+ '\n' + param_desc)
required = key in required_params and 'Required' or 'Optional'
docs += (' :param {0} {1}: *{2}* - {3}\n'.format(
- param_type, key, required,
- html_to_rst(value.documentation, indent=9)))
+ param_type, key, required, param_desc))
if rtype is not None:
- docs += '\n\n :rtype: {0}\n\n'.format(rtype)
+ docs += ' :rtype: {0}\n\n'.format(rtype)
+
+ # Only document the return structure if it isn't a resource. Usually
+ # this means either a list or structure.
+ output_shape = operation_model.output_shape
+ if rtype in ['list', 'dict'] and output_shape is not None:
+ docs += (' :return:\n Structure description::\n\n' +
+ document_structure(None, output_shape, indent=12) + '\n')
+
+ return docs
+
+
+def document_structure(name, shape, indent=0, indent_first=True,
+ parent_type=None, eol='\n'):
+ """
+ Document a nested structure (list or dict) parameter or return value as
+ a snippet of Python code with dummy placeholders. For example:
+
+ {
+ 'Param1': [
+ STRING,
+ ...
+ ],
+ 'Param2': BOOLEAN,
+ 'Param3': {
+ 'Param4': FLOAT,
+ 'Param5': INTEGER
+ }
+ }
+
+ """
+ docs = ''
+
+ # Add spaces if the first line is indented.
+ if indent_first:
+ docs += (' ' * indent)
+
+ if shape.type_name == 'structure':
+ # Only include the name if the parent is also a structure.
+ if parent_type == 'structure':
+ docs += "'" + name + '\': {\n'
+ else:
+ docs += '{\n'
+
+ # Go through each member and recursively process them.
+ for i, member_name in enumerate(shape.members):
+ member_eol = '\n'
+ if i < len(shape.members) - 1:
+ member_eol = ',\n'
+ docs += document_structure(
+ member_name, shape.members[member_name],
+ indent=indent + 2, parent_type=shape.type_name,
+ eol=member_eol)
+ docs += (' ' * indent) + '}' + eol
+ elif shape.type_name == 'list':
+ # Only include the name if the parent is a structure.
+ if parent_type == 'structure':
+ docs += "'" + name + '\': [\n'
+ else:
+ docs += '[\n'
+
+ # Lists have only a single member. Here we document it, plus add
+ # an ellipsis to signify that more of the same member type can be
+ # added in a list.
+ docs += document_structure(
+ None, shape.member, indent=indent + 2, eol=',\n')
+ docs += (' ' * indent) + ' ...\n'
+ docs += (' ' * indent) + ']' + eol
+ else:
+ # It's not a structure or list, so document the type. Here we
+ # try to use the equivalent Python type name for clarity.
+ if name is not None:
+ docs += ("'" + name + '\': ' +
+ py_type_name(shape.type_name).upper() + eol)
+ else:
+ docs += py_type_name(shape.type_name).upper() + eol
return docs
diff --git a/boto3/resources/factory.py b/boto3/resources/factory.py
index 7794fb21f4..aaf2015fc1 100644
--- a/boto3/resources/factory.py
+++ b/boto3/resources/factory.py
@@ -246,16 +246,27 @@ def _create_reference(factory_self, name, reference, service_name,
# References are essentially an action with no request
# or response, so we can re-use the response handlers to
# build up resources from identifiers and data members.
- handler = ResourceHandler('', factory_self, resource_defs,
- service_model, reference.resource)
+ handler = ResourceHandler(reference.resource.path, factory_self,
+ resource_defs, service_model,
+ reference.resource)
+
+ # Are there any identifiers that need access to data members?
+ # This is important when building the resource below since
+ # it requires the data to be loaded.
+ needs_data = any(i.source == 'data' for i in
+ reference.resource.identifiers)
def get_reference(self):
# We need to lazy-evaluate the reference to handle circular
# references between resources. We do this by loading the class
# when first accessed.
- # First, though, we need to see if we have the required
- # identifiers to instantiate the resource reference.
- return handler(self, {}, {})
+ # This is using a *response handler* so we need to make sure
+ # our data is loaded (if possible) and pass that data into
+ # the handler as if it were a response. This allows references
+ # to have their data loaded properly.
+ if needs_data and self.meta.data is None and hasattr(self, 'load'):
+ self.load()
+ return handler(self, {}, self.meta.data)
get_reference.__name__ = str(reference.name)
get_reference.__doc__ = 'TODO'
| diff --git a/tests/unit/resources/test_factory.py b/tests/unit/resources/test_factory.py
index 17b38b11cd..e2aec290f2 100644
--- a/tests/unit/resources/test_factory.py
+++ b/tests/unit/resources/test_factory.py
@@ -553,7 +553,7 @@ def test_resource_loads_waiters(self):
}
}
}
-
+
defs = {
'Bucket': {}
}
@@ -686,6 +686,67 @@ def test_dangling_resource_inequality(self):
self.assertNotEqual(q1, q2)
self.assertNotEqual(q1, m)
+ def test_dangling_resource_loads_data(self):
+ # Given a loadable resource instance that contains a reference
+ # to another resource which has a resource data path, the
+ # referenced resource should be loaded with all of the data
+ # contained at that path. This allows loading references
+ # which would otherwise not be loadable (missing load method)
+ # and prevents extra load calls for others when we already
+ # have the data available.
+ self.defs = {
+ 'Instance': {
+ 'identifiers': [{'name': 'Id'}],
+ 'has': {
+ 'NetworkInterface': {
+ 'resource': {
+ 'type': 'NetworkInterface',
+ 'identifiers': [
+ {'target': 'Id', 'source': 'data',
+ 'path': 'NetworkInterface.Id'}
+ ],
+ 'path': 'NetworkInterface'
+ }
+ }
+ }
+ },
+ 'NetworkInterface': {
+ 'identifiers': [{'name': 'Id'}],
+ 'shape': 'NetworkInterfaceShape'
+ }
+ }
+ self.model = self.defs['Instance']
+ shape = DenormalizedStructureBuilder().with_members({
+ 'Id': {
+ 'type': 'string',
+ },
+ 'PublicIp': {
+ 'type': 'string'
+ }
+ }).build_model()
+ service_model = mock.Mock()
+ service_model.shape_for.return_value = shape
+
+ cls = self.load('test', 'Instance', self.model, self.defs,
+ service_model)
+ instance = cls('instance-id')
+
+ # Set some data as if we had completed a load action.
+ def set_meta_data():
+ instance.meta.data = {
+ 'NetworkInterface': {
+ 'Id': 'network-interface-id',
+ 'PublicIp': '127.0.0.1'
+ }
+ }
+ instance.load = mock.Mock(side_effect=set_meta_data)
+
+ # Now, get the reference and make sure it has its data
+ # set as expected.
+ interface = instance.network_interface
+ self.assertIsNotNone(interface.meta.data)
+ self.assertEqual(interface.public_ip, '127.0.0.1')
+
class TestServiceResourceSubresources(BaseTestResourceFactory):
def setUp(self):
diff --git a/tests/unit/test_docs.py b/tests/unit/test_docs.py
new file mode 100644
index 0000000000..f5093321a9
--- /dev/null
+++ b/tests/unit/test_docs.py
@@ -0,0 +1,86 @@
+# Copyright 2015 Amazon.com, Inc. or its affiliates. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License"). You
+# may not use this file except in compliance with the License. A copy of
+# the License is located at
+#
+# http://aws.amazon.com/apache2.0/
+#
+# or in the "license" file accompanying this file. This file is
+# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
+# ANY KIND, either express or implied. See the License for the specific
+# language governing permissions and limitations under the License.
+
+from botocore.model import DenormalizedStructureBuilder, ServiceModel
+
+from boto3.docs import py_type_name, document_structure
+from tests import mock, BaseTestCase
+
+
+class TestPythonTypeName(BaseTestCase):
+ def test_structure(self):
+ self.assertEqual('dict', py_type_name('structure'))
+
+ def test_list(self):
+ self.assertEqual('list', py_type_name('list'))
+
+ def test_map(self):
+ self.assertEqual('dict', py_type_name('map'))
+
+ def test_string(self):
+ self.assertEqual('string', py_type_name('string'))
+
+ def test_character(self):
+ self.assertEqual('string', py_type_name('character'))
+
+ def test_blob(self):
+ self.assertEqual('bytes', py_type_name('blob'))
+
+ def test_timestamp(self):
+ self.assertEqual('datetime', py_type_name('timestamp'))
+
+ def test_integer(self):
+ self.assertEqual('integer', py_type_name('integer'))
+
+ def test_long(self):
+ self.assertEqual('integer', py_type_name('long'))
+
+ def test_float(self):
+ self.assertEqual('float', py_type_name('float'))
+
+ def test_double(self):
+ self.assertEqual('float', py_type_name('double'))
+
+
+class TestDocumentStructure(BaseTestCase):
+ def test_nested_structure(self):
+ # Internally this doesn't use an OrderedDict so we can't
+ # test the full output, but we can test whether the
+ # parameters are all included as expected with the correct
+ # types.
+ shape = DenormalizedStructureBuilder().with_members({
+ 'Param1': {
+ 'type': 'list',
+ 'member': {
+ 'type': 'structure',
+ 'members': {
+ 'Param2': {
+ 'type': 'string'
+ },
+ 'Param3': {
+ 'type': 'float'
+ }
+ }
+ }
+ },
+ 'Param4': {
+ 'type': 'blob'
+ }
+ }).build_model()
+
+ doc = document_structure('Test', shape)
+
+ self.assertIn("'Param1': [", doc)
+ self.assertIn("'Param2': STRING", doc)
+ self.assertIn("'Param3': FLOAT", doc)
+ self.assertIn("'Param4': BYTES", doc)
| [
{
"components": [
{
"doc": "Document a nested structure (list or dict) parameter or return value as\na snippet of Python code with dummy placeholders. For example:\n\n {\n 'Param1': [\n STRING,\n ...\n ],\n 'Param2': BOOLEAN,\n 'Param3': {\n ... | [
"tests/unit/resources/test_factory.py::TestResourceFactory::test_can_instantiate_service_resource",
"tests/unit/resources/test_factory.py::TestResourceFactory::test_factory_creates_dangling_resources",
"tests/unit/resources/test_factory.py::TestResourceFactory::test_factory_creates_properties",
"tests/unit/re... | [] | This is a feature request that requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
Load reference data if a resource path is defined
This change allows references with a JMESPath query set on the reference
resource path attribute to be loaded with data at instantiation time if
that data is present in the parent (via `meta.data`). If the data has
not yet been loaded and the parent is loadable, then a `load` operation
is incurred.
Before:
``` python
>>> ni = ec2.NetworkInterface('abc123')
>>> ni.association.public_ip
ResourceLoadException: ec2.NetworkInterfaceAssociation has no load method
```
After:
``` python
>>> ni = ec2.NetworkInterface('abc123')
>>> ni.association.public_ip
'127.0.0.1'
```
Added a test to ensure this works as expected.
@kyleknap @jamesls please have a look.
----------
</request>
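The before/after behavior described in the request can be sketched with plain classes. This is not the boto3 implementation — `Meta`, `load`, and the property-based reference mirror the boto3 pattern, but the classes and data below are illustrative assumptions:

```python
# Standalone sketch of lazy reference loading: when a reference has a
# resource data path, pull its data out of the parent's meta.data,
# triggering the parent's load() first if the data isn't there yet.

class Meta(object):
    def __init__(self):
        self.data = None  # populated by load()

class NetworkInterfaceRef(object):
    """Referenced resource with no load method of its own."""
    def __init__(self, data):
        self.meta = Meta()
        self.meta.data = data

    @property
    def public_ip(self):
        return self.meta.data['PublicIp']

class Instance(object):
    def __init__(self, instance_id):
        self.id = instance_id
        self.meta = Meta()

    def load(self):
        # Stand-in for the real service call (e.g. DescribeInstances).
        self.meta.data = {
            'NetworkInterface': {'Id': 'eni-123', 'PublicIp': '127.0.0.1'}
        }

    @property
    def network_interface(self):
        # The reference needs data members, so load the parent if needed.
        if self.meta.data is None and hasattr(self, 'load'):
            self.load()
        # Hand the path-selected subtree to the referenced resource,
        # as if it were a response -- so it needs no load() of its own.
        return NetworkInterfaceRef(self.meta.data['NetworkInterface'])

ni = Instance('abc123').network_interface
print(ni.public_ip)  # '127.0.0.1'
```

This mirrors the "After" transcript: the reference is usable without the referenced resource ever defining a `load` method, and no extra service call is made when the parent's data is already present.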
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in boto3/docs.py]
(definition of document_structure:)
def document_structure(name, shape, indent=0, indent_first=True, parent_type=None, eol='\n'):
"""Document a nested structure (list or dict) parameter or return value as
a snippet of Python code with dummy placeholders. For example:
{
'Param1': [
STRING,
...
],
'Param2': BOOLEAN,
'Param3': {
'Param4': FLOAT,
'Param5': INTEGER
}
}"""
[end of new definitions in boto3/docs.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
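The recursion behind the `document_structure` definition above can be illustrated without botocore. In this sketch, `shape` is a plain dict rather than a `botocore.model.Shape`, and the type map is a trimmed-down assumption, so it only demonstrates the rendering pattern, not the actual implementation:

```python
# Standalone sketch of rendering a nested shape as a dummy-placeholder
# snippet, in the style shown in the document_structure docstring.
PY_TYPES = {'string': 'STRING', 'float': 'FLOAT', 'blob': 'BYTES',
            'integer': 'INTEGER', 'boolean': 'BOOLEAN',
            'timestamp': 'DATETIME'}

def sketch_structure(shape, indent=0):
    pad = ' ' * indent
    kind = shape['type']
    if kind == 'structure':
        # Render each member on its own line, comma on all but the last.
        members = list(shape['members'].items())
        lines = []
        for i, (name, member) in enumerate(members):
            eol = ',' if i < len(members) - 1 else ''
            lines.append("%s  '%s': %s%s" % (
                pad, name, sketch_structure(member, indent + 2), eol))
        return '{\n' + '\n'.join(lines) + '\n' + pad + '}'
    if kind == 'list':
        # Lists have a single member type, followed by an ellipsis.
        inner = sketch_structure(shape['member'], indent + 2)
        return '[\n%s  %s,\n%s  ...\n%s]' % (pad, inner, pad, pad)
    # Scalar: upper-cased Python-equivalent type name as a placeholder.
    return PY_TYPES.get(kind, '...')

example = {
    'type': 'structure',
    'members': {
        'Param1': {'type': 'list', 'member': {'type': 'string'}},
        'Param4': {'type': 'blob'},
    },
}
print(sketch_structure(example))
```

Run on `example`, this prints a block of the shape shown in the docstring: `Param1` as a list of `STRING` followed by `...`, and `Param4` as `BYTES`.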
<<END>> | 196a2da7490a1a661a0103b8770bd31e34e147f2 | ||
boto__boto3-67 | 67 | boto/boto3 | null | 5cd57ce1785b950105d6d87dad28f40a17d06183 | 2015-02-23T19:19:29Z | diff --git a/boto3/docs.py b/boto3/docs.py
index 02ab776d81..9aace7848a 100644
--- a/boto3/docs.py
+++ b/boto3/docs.py
@@ -177,6 +177,12 @@ def docs_for(service_name):
for name, model in sorted(data['resources'].items(),
key=lambda i:i[0]):
resource_model = ResourceModel(name, model, data['resources'])
+
+ shape = None
+ if resource_model.shape:
+ shape = service_model.shape_for(resource_model.shape)
+ resource_model.load_rename_map(shape)
+
if name not in models:
models[name] = {'type': 'resource', 'model': resource_model}
@@ -333,7 +339,8 @@ def document_resource(service_name, official_name, resource_model,
docs += ' Attributes:\n\n'
shape = service_model.shape_for(resource_model.shape)
- for name, member in sorted(shape.members.items()):
+ attributes = resource_model.get_attributes(shape)
+ for name, (orig_name, member) in sorted(attributes.items()):
docs += (' .. py:attribute:: {0}\n\n (``{1}``)'
' {2}\n\n').format(
xform_name(name), py_type_name(member.type_name),
@@ -403,7 +410,7 @@ def document_resource(service_name, official_name, resource_model,
' to reach a specific state.\n\n')
service_waiter_model = session.get_waiter_model(service_name)
for waiter in sorted(resource_model.waiters,
- key=lambda i: i.resource_waiter_name):
+ key=lambda i: i.name):
docs += document_waiter(waiter, service_name, resource_model,
service_model, service_waiter_model)
@@ -467,7 +474,7 @@ def document_waiter(waiter, service_name, resource_model, service_model,
' This method calls ``wait()`` on'
' :py:meth:`{2}.Client.get_waiter` using `{3}`_ .').format(
resource_model.name,
- xform_name(waiter.name).replace('_', ' '),
+ ' '.join(waiter.name.split('_')[2:]),
service_name,
xform_name(waiter.waiter_name))
@@ -476,7 +483,7 @@ def document_waiter(waiter, service_name, resource_model, service_model,
return document_operation(
operation_model=operation_model, service_name=service_name,
- operation_name=xform_name(waiter.resource_waiter_name),
+ operation_name=xform_name(waiter.name),
description=description,
example_instance = xform_name(resource_model.name),
ignore_params=ignore_params, rtype=None)
diff --git a/boto3/resources/factory.py b/boto3/resources/factory.py
index c637c4e8f8..7794fb21f4 100644
--- a/boto3/resources/factory.py
+++ b/boto3/resources/factory.py
@@ -14,8 +14,6 @@
import logging
from functools import partial
-from botocore import xform_name
-
from .action import ServiceAction
from .action import WaiterAction
from .base import ResourceMeta, ServiceResource
@@ -74,6 +72,11 @@ def load_from_definition(self, service_name, resource_name, model,
resource_model = ResourceModel(resource_name, model, resource_defs)
+ shape = None
+ if resource_model.shape:
+ shape = service_model.shape_for(resource_model.shape)
+ resource_model.load_rename_map(shape)
+
self._load_identifiers(attrs, meta, resource_model)
self._load_actions(attrs, resource_model, resource_defs,
service_model)
@@ -98,11 +101,8 @@ def _load_identifiers(self, attrs, meta, model):
operations on the resource.
"""
for identifier in model.identifiers:
- snake_cased = xform_name(identifier.name)
- snake_cased = self._check_allowed_name(
- attrs, snake_cased, 'identifier', model.name)
- meta.identifiers.append(snake_cased)
- attrs[snake_cased] = None
+ meta.identifiers.append(identifier.name)
+ attrs[identifier.name] = None
def _load_actions(self, attrs, model, resource_defs, service_model):
"""
@@ -112,16 +112,12 @@ def _load_actions(self, attrs, model, resource_defs, service_model):
"""
if model.load:
attrs['load'] = self._create_action(
- 'load', model.load, resource_defs, service_model,
- is_load=True)
+ model.load, resource_defs, service_model, is_load=True)
attrs['reload'] = attrs['load']
for action in model.actions:
- snake_cased = xform_name(action.name)
- snake_cased = self._check_allowed_name(
- attrs, snake_cased, 'action', model.name)
- attrs[snake_cased] = self._create_action(snake_cased,
- action, resource_defs, service_model)
+ attrs[action.name] = self._create_action(action, resource_defs,
+ service_model)
def _load_attributes(self, attrs, meta, model, service_model):
"""
@@ -133,16 +129,9 @@ def _load_attributes(self, attrs, meta, model, service_model):
if model.shape:
shape = service_model.shape_for(model.shape)
- for name, member in shape.members.items():
- snake_cased = xform_name(name)
- if snake_cased in meta.identifiers:
- # Skip identifiers, these are set through other means
- continue
-
- snake_cased = self._check_allowed_name(
- attrs, snake_cased, 'attribute', model.name)
- attrs[snake_cased] = self._create_autoload_property(name,
- snake_cased)
+ attributes = model.get_attributes(shape)
+ for name, (orig_name, member) in attributes.items():
+ attrs[name] = self._create_autoload_property(orig_name, name)
def _load_collections(self, attrs, model, resource_defs, service_model):
"""
@@ -152,12 +141,8 @@ def _load_collections(self, attrs, model, resource_defs, service_model):
through the collection's items.
"""
for collection_model in model.collections:
- snake_cased = xform_name(collection_model.name)
- snake_cased = self._check_allowed_name(
- attrs, snake_cased, 'collection', model.name)
-
- attrs[snake_cased] = self._create_collection(
- attrs['meta'].service_name, model.name, snake_cased,
+ attrs[collection_model.name] = self._create_collection(
+ attrs['meta'].service_name, model.name,
collection_model, resource_defs, service_model)
def _load_has_relations(self, attrs, service_name, resource_name,
@@ -176,11 +161,8 @@ def _load_has_relations(self, attrs, service_name, resource_name,
# This is a dangling reference, i.e. we have all
# the data we need to create the resource, so
# this instance becomes an attribute on the class.
- snake_cased = xform_name(reference.name)
- snake_cased = self._check_allowed_name(
- attrs, snake_cased, 'reference', model.name)
- attrs[snake_cased] = self._create_reference(
- reference.resource.type, snake_cased, reference,
+ attrs[reference.name] = self._create_reference(
+ reference.resource.type, reference,
service_name, resource_name, model, resource_defs,
service_model)
@@ -200,44 +182,7 @@ def _load_waiters(self, attrs, model):
of the resource.
"""
for waiter in model.waiters:
- snake_cased = xform_name(waiter.resource_waiter_name)
- snake_cased = self._check_allowed_name(
- attrs, snake_cased, 'waiter', model.name)
- attrs[snake_cased] = self._create_waiter(waiter, snake_cased)
-
- def _check_allowed_name(self, attrs, name, category, resource_name):
- """
- Determine if a given name is allowed on the instance, and if not,
- then raise an exception. This prevents public attributes of the
- class from being clobbered, e.g. since we define ``Resource.meta``,
- no identifier may be named ``meta``. Another example: no action
- named ``queue_items`` may be added after an identifier of the same
- name has been added.
-
- One attempt is made in the event of a collision to remedy the
- situation. The ``category`` is appended to the name and the
- check is performed again. For example, if an action named
- ``get_frobs`` fails the test, then we try ``get_frobs_action``
- after logging a warning.
-
- :raises: ValueError
- """
- if name in attrs:
- logger.warning('%s `%s` would clobber existing %s'
- ' resource attribute, going to try'
- ' %s instead...', category, name,
- resource_name, name + '_' + category)
- # TODO: Move this logic into the model and strictly
- # define the loading order of categories. This
- # will make documentation much simpler.
- name = name + '_' + category
-
- if name in attrs:
- raise ValueError('{0} `{1}` would clobber existing '
- '{2} resource attribute'.format(
- category, name, resource_name))
-
- return name
+ attrs[waiter.name] = self._create_waiter(waiter)
def _create_autoload_property(factory_self, name, snake_cased):
"""
@@ -262,22 +207,22 @@ def property_loader(self):
property_loader.__doc__ = 'TODO'
return property(property_loader)
- def _create_waiter(factory_self, waiter_model, snake_cased):
+ def _create_waiter(factory_self, waiter_model):
"""
Creates a new wait method for each resource where both a waiter and
resource model is defined.
"""
- waiter = WaiterAction(waiter_model, waiter_resource_name=snake_cased)
+ waiter = WaiterAction(waiter_model,
+ waiter_resource_name=waiter_model.name)
def do_waiter(self, *args, **kwargs):
waiter(self, *args, **kwargs)
- do_waiter.__name__ = str(snake_cased)
+ do_waiter.__name__ = str(waiter_model.name)
do_waiter.__doc__ = 'TODO'
return do_waiter
def _create_collection(factory_self, service_name, resource_name,
- snake_cased, collection_model,
- resource_defs, service_model):
+ collection_model, resource_defs, service_model):
"""
Creates a new property on the resource to lazy-load a collection.
"""
@@ -289,13 +234,12 @@ def get_collection(self):
return cls(collection_model, self, factory_self,
resource_defs, service_model)
- get_collection.__name__ = str(snake_cased)
+ get_collection.__name__ = str(collection_model.name)
get_collection.__doc__ = 'TODO'
return property(get_collection)
- def _create_reference(factory_self, name, snake_cased, reference,
- service_name, resource_name, model, resource_defs,
- service_model):
+ def _create_reference(factory_self, name, reference, service_name,
+ resource_name, model, resource_defs, service_model):
"""
Creates a new property on the resource to lazy-load a reference.
"""
@@ -313,7 +257,7 @@ def get_reference(self):
# identifiers to instantiate the resource reference.
return handler(self, {}, {})
- get_reference.__name__ = str(snake_cased)
+ get_reference.__name__ = str(reference.name)
get_reference.__doc__ = 'TODO'
return property(get_reference)
@@ -352,7 +296,7 @@ def create_resource(self, *args, **kwargs):
create_resource.__doc__ = 'TODO'
return create_resource
- def _create_action(factory_self, snake_cased, action_model, resource_defs,
+ def _create_action(factory_self, action_model, resource_defs,
service_model, is_load=False):
"""
Creates a new method which makes a request to the underlying
@@ -386,6 +330,6 @@ def do_action(self, *args, **kwargs):
return response
- do_action.__name__ = str(snake_cased)
+ do_action.__name__ = str(action_model.name)
do_action.__doc__ = 'TODO'
return do_action
diff --git a/boto3/resources/model.py b/boto3/resources/model.py
index 3dd60c42ce..70611831e0 100644
--- a/boto3/resources/model.py
+++ b/boto3/resources/model.py
@@ -25,6 +25,8 @@
import logging
+from botocore import xform_name
+
logger = logging.getLogger(__name__)
@@ -152,15 +154,14 @@ class Waiter(DefinitionWithParams):
:type definition: dict
:param definition: The JSON definition
"""
+ PREFIX = 'WaitUntil'
+
def __init__(self, name, definition):
super(Waiter, self).__init__(definition)
#: (``string``) The name of this waiter
self.name = name
- #: (``string``) The name of the waiter in the resource
- self.resource_waiter_name = 'WaitUntil' + name
-
#: (``string``) The name of the underlying event waiter
self.waiter_name = definition.get('waiterName')
@@ -250,12 +251,172 @@ class ResourceModel(object):
def __init__(self, name, definition, resource_defs):
self._definition = definition
self._resource_defs = resource_defs
+ self._renamed = {}
#: (``string``) The name of this resource
self.name = name
#: (``string``) The service shape name for this resource or ``None``
self.shape = definition.get('shape')
+ def load_rename_map(self, shape=None):
+ """
+ Load a name translation map given a shape. This will set
+ up renamed values for any collisions, e.g. if the shape,
+ an action, and a subresource all are all named ``foo``
+ then the resource will have an action ``foo``, a subresource
+ named ``Foo`` and a property named ``foo_attribute``.
+ This is the order of precedence, from most important to
+ least important:
+
+ * Load action (resource.load)
+ * Identifiers
+ * Actions
+ * Subresources
+ * References
+ * Collections
+ * Waiters
+ * Attributes (shape members)
+
+ Batch actions are only exposed on collections, so do not
+ get modified here. Subresources use upper camel casing, so
+ are unlikely to collide with anything but other subresources.
+
+ Creates a structure like this::
+
+ renames = {
+ ('action', 'id'): 'id_action',
+ ('collection', 'id'): 'id_collection',
+ ('attribute', 'id'): 'id_attribute'
+ }
+
+ # Get the final name for an action named 'id'
+ name = renames.get(('action', 'id'), 'id')
+
+ :type shape: botocore.model.Shape
+ :param shape: The underlying shape for this resource.
+ """
+ # Meta is a reserved name for resources
+ names = set(['meta'])
+ self._renamed = {}
+
+ if self._definition.get('load'):
+ names.add('load')
+
+ for item in self._definition.get('identifiers', []):
+ self._load_name_with_category(names, item['name'], 'identifier')
+
+ for name in self._definition.get('actions', {}):
+ self._load_name_with_category(names, name, 'action')
+
+ for name, ref in self._get_has_definition().items():
+ # Subresources require no data members, just typically
+ # identifiers and user input.
+ data_required = False
+ for identifier in ref['resource']['identifiers']:
+ if identifier['source'] == 'data':
+ data_required = True
+ break
+
+ if not data_required:
+ self._load_name_with_category(names, name, 'subresource',
+ snake_case=False)
+ else:
+ self._load_name_with_category(names, name, 'reference')
+
+ for name in self._definition.get('hasMany', {}):
+ self._load_name_with_category(names, name, 'collection')
+
+ for name in self._definition.get('waiters', {}):
+ self._load_name_with_category(names, Waiter.PREFIX + name,
+ 'waiter')
+
+ if shape is not None:
+ for name in shape.members.keys():
+ self._load_name_with_category(names, name, 'attribute')
+
+ def _load_name_with_category(self, names, name, category,
+ snake_case=True):
+ """
+ Load a name with a given category, possibly renaming it
+ if that name is already in use. The name will be stored
+ in ``names`` and possibly be set up in ``self._renamed``.
+
+ :type names: set
+ :param names: Existing names (Python attributes, properties, or
+ methods) on the resource.
+ :type name: string
+ :param name: The original name of the value.
+ :type category: string
+ :param category: The value type, such as 'identifier' or 'action'
+ :type snake_case: bool
+ :param snake_case: True (default) if the name should be snake cased.
+ """
+ if snake_case:
+ name = xform_name(name)
+
+ if name in names:
+ logger.debug('Renaming %s %s %s' % (self.name, category, name))
+ self._renamed[(category, name)] = name + '_' + category
+ name += '_' + category
+
+ if name in names:
+ # This isn't good, let's raise instead of trying to keep
+ # renaming this value.
+ raise ValueError('Problem renaming {0} {1} to {2}!'.format(
+ self.name, category, name))
+
+ names.add(name)
+
+ def _get_name(self, category, name, snake_case=True):
+ """
+ Get a possibly renamed value given a category and name. This
+ uses the rename map set up in ``load_rename_map``, so that
+ method must be called once first.
+
+ :type category: string
+ :param category: The value type, such as 'identifier' or 'action'
+ :type name: string
+ :param name: The original name of the value
+ :type snake_case: bool
+ :param snake_case: True (default) if the name should be snake cased.
+ :rtype: string
+ :return: Either the renamed value if it is set, otherwise the
+ original name.
+ """
+ if snake_case:
+ name = xform_name(name)
+
+ return self._renamed.get((category, name), name)
+
+ def get_attributes(self, shape):
+ """
+ Get a dictionary of attribute names to original name and shape
+ models that represent the attributes of this resource. Looks
+ like the following:
+
+ {
+ 'some_name': ('SomeName', <Shape...>)
+ }
+
+ :type shape: botocore.model.Shape
+ :param shape: The underlying shape for this resource.
+ :rtype: dict
+ :return: Mapping of resource attributes.
+ """
+ attributes = {}
+ identifier_names = [i.name for i in self.identifiers]
+
+ for name, member in shape.members.items():
+ snake_cased = xform_name(name)
+ if snake_cased in identifier_names:
+ # Skip identifiers, these are set through other means
+ continue
+ snake_cased = self._get_name('attribute', snake_cased,
+ snake_case=False)
+ attributes[snake_cased] = (name, member)
+
+ return attributes
+
@property
def identifiers(self):
"""
@@ -266,7 +427,8 @@ def identifiers(self):
identifiers = []
for item in self._definition.get('identifiers', []):
- identifiers.append(Identifier(item['name']))
+ name = self._get_name('identifier', item['name'])
+ identifiers.append(Identifier(name))
return identifiers
@@ -294,6 +456,7 @@ def actions(self):
actions = []
for name, item in self._definition.get('actions', {}).items():
+ name = self._get_name('action', name)
actions.append(Action(name, item, self._resource_defs))
return actions
@@ -308,6 +471,7 @@ def batch_actions(self):
actions = []
for name, item in self._definition.get('batchActions', {}).items():
+ name = self._get_name('batch_action', name)
actions.append(Action(name, item, self._resource_defs))
return actions
@@ -387,6 +551,10 @@ def _get_related_resources(self, subresources):
resources = []
for name, definition in self._get_has_definition().items():
+ if subresources:
+ name = self._get_name('subresource', name, snake_case=False)
+ else:
+ name = self._get_name('reference', name)
action = Action(name, definition, self._resource_defs)
data_required = False
@@ -430,6 +598,7 @@ def collections(self):
collections = []
for name, item in self._definition.get('hasMany', {}).items():
+ name = self._get_name('collection', name)
collections.append(Collection(name, item, self._resource_defs))
return collections
@@ -444,6 +613,7 @@ def waiters(self):
waiters = []
for name, item in self._definition.get('waiters', {}).items():
+ name = self._get_name('waiter', Waiter.PREFIX + name)
waiters.append(Waiter(name, item))
return waiters
| diff --git a/tests/unit/resources/test_factory.py b/tests/unit/resources/test_factory.py
index c320e875eb..6e4235bbd4 100644
--- a/tests/unit/resources/test_factory.py
+++ b/tests/unit/resources/test_factory.py
@@ -11,7 +11,7 @@
# ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
-from botocore.model import ServiceModel, StructureShape
+from botocore.model import DenormalizedStructureBuilder, ServiceModel
from boto3.exceptions import ResourceLoadException
from boto3.resources.base import ServiceResource
from boto3.resources.collection import CollectionManager
@@ -127,11 +127,14 @@ def test_factory_creates_properties(self):
}
}
}
- shape = mock.Mock()
- shape.members = {
- 'ETag': None,
- 'LastModified': None,
- }
+ shape = DenormalizedStructureBuilder().with_members({
+ 'ETag': {
+ 'type': 'string',
+ },
+ 'LastModified': {
+ 'type': 'string'
+ }
+ }).build_model()
service_model = mock.Mock()
service_model.shape_for.return_value = shape
@@ -358,12 +361,20 @@ def test_resource_lazy_loads_properties(self, action_cls):
}
}
}
- shape = mock.Mock()
- shape.members = {
- 'Url': None,
- 'ETag': None,
- 'LastModified': None,
- }
+ shape = DenormalizedStructureBuilder().with_members({
+ 'ETag': {
+ 'type': 'string',
+ 'shape_name': 'ETag'
+ },
+ 'LastModified': {
+ 'type': 'string',
+ 'shape_name': 'LastModified'
+ },
+ 'Url': {
+ 'type': 'string',
+ 'shape_name': 'Url'
+ }
+ }).build_model()
service_model = mock.Mock()
service_model.shape_for.return_value = shape
@@ -402,12 +413,17 @@ def test_resource_lazy_properties_missing_load(self, action_cls):
# Note the lack of a `load` method. These resources
# are usually loaded via a call on a parent resource.
}
- shape = mock.Mock()
- shape.members = {
- 'Url': None,
- 'ETag': None,
- 'LastModified': None,
- }
+ shape = DenormalizedStructureBuilder().with_members({
+ 'ETag': {
+ 'type': 'string',
+ },
+ 'LastModified': {
+ 'type': 'string'
+ },
+ 'Url': {
+ 'type': 'string'
+ }
+ }).build_model()
service_model = mock.Mock()
service_model.shape_for.return_value = shape
@@ -518,7 +534,7 @@ def test_resource_loads_collections(self, mock_model):
'Queue': {}
}
service_model = ServiceModel({})
- mock_model.return_value.name = 'Queues'
+ mock_model.return_value.name = 'queues'
resource = self.load('test', 'test', model, defs, service_model)()
diff --git a/tests/unit/resources/test_model.py b/tests/unit/resources/test_model.py
index edeea4b6c8..5936385a14 100644
--- a/tests/unit/resources/test_model.py
+++ b/tests/unit/resources/test_model.py
@@ -11,6 +11,8 @@
# ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
+from botocore.model import DenormalizedStructureBuilder
+
from boto3.resources.model import ResourceModel, Action, Collection, Waiter
from tests import BaseTestCase
@@ -182,7 +184,7 @@ def test_resource_references(self):
self.assertEqual(len(model.references), 1)
ref = model.references[0]
- self.assertEqual(ref.name, 'Frob')
+ self.assertEqual(ref.name, 'frob')
self.assertEqual(ref.resource.type, 'Frob')
self.assertEqual(ref.resource.identifiers[0].target, 'Id')
self.assertEqual(ref.resource.identifiers[0].source, 'data')
@@ -230,7 +232,194 @@ def test_waiter(self):
waiter = model.waiters[0]
self.assertIsInstance(waiter, Waiter)
- self.assertEqual(waiter.name, 'Exists')
+ self.assertEqual(waiter.name, 'wait_until_exists')
self.assertEqual(waiter.waiter_name, 'ObjectExists')
- self.assertEqual(waiter.resource_waiter_name, 'WaitUntilExists')
self.assertEqual(waiter.params[0].target, 'Bucket')
+
+class TestRenaming(BaseTestCase):
+ def test_multiple(self):
+ # This tests a bunch of different renames working together
+ model = ResourceModel('test', {
+ 'identifiers': [{'name': 'Foo'}],
+ 'actions': {
+ 'Foo': {}
+ },
+ 'has': {
+ 'Foo': {
+ 'resource': {
+ 'type': 'Frob',
+ 'identifiers': [
+ {'target':'Id', 'source':'data',
+ 'path': 'FrobId'}
+ ]
+ }
+ }
+ },
+ 'hasMany': {
+ 'Foo': {}
+ },
+ 'waiters': {
+ 'Foo': {}
+ }
+ }, {
+ 'Frob': {}
+ })
+
+ shape = DenormalizedStructureBuilder().with_members({
+ 'Foo': {
+ 'type': 'string',
+ },
+ 'Bar': {
+ 'type': 'string'
+ }
+ }).build_model()
+
+ model.load_rename_map(shape)
+
+ self.assertEqual(model.identifiers[0].name, 'foo')
+ self.assertEqual(model.actions[0].name, 'foo_action')
+ self.assertEqual(model.references[0].name, 'foo_reference')
+ self.assertEqual(model.collections[0].name, 'foo_collection')
+ self.assertEqual(model.waiters[0].name, 'wait_until_foo')
+
+ # If an identifier and an attribute share the same name, then
+ # the attribute is essentially hidden.
+ self.assertNotIn('foo_attribute', model.get_attributes(shape))
+
+ # Other attributes need to be there, though
+ self.assertIn('bar', model.get_attributes(shape))
+
+ # The rest of the tests below ensure the correct order of precedence
+ # for the various categories of attributes/properties/methods on the
+ # resource model.
+ def test_meta_beats_identifier(self):
+ model = ResourceModel('test', {
+ 'identifiers': [{'name': 'Meta'}]
+ }, {})
+
+ model.load_rename_map()
+
+ self.assertEqual(model.identifiers[0].name, 'meta_identifier')
+
+ def test_load_beats_identifier(self):
+ model = ResourceModel('test', {
+ 'identifiers': [{'name': 'Load'}],
+ 'load': {
+ 'request': {
+ 'operation': 'GetFrobs'
+ }
+ }
+ }, {})
+
+ model.load_rename_map()
+
+ self.assertTrue(model.load)
+ self.assertEqual(model.identifiers[0].name, 'load_identifier')
+
+ def test_identifier_beats_action(self):
+ model = ResourceModel('test', {
+ 'identifiers': [{'name': 'foo'}],
+ 'actions': {
+ 'Foo': {
+ 'request': {
+ 'operation': 'GetFoo'
+ }
+ }
+ }
+ }, {})
+
+ model.load_rename_map()
+
+ self.assertEqual(model.identifiers[0].name, 'foo')
+ self.assertEqual(model.actions[0].name, 'foo_action')
+
+ def test_action_beats_reference(self):
+ model = ResourceModel('test', {
+ 'actions': {
+ 'Foo': {
+ 'request': {
+ 'operation': 'GetFoo'
+ }
+ }
+ },
+ 'has': {
+ 'Foo': {
+ 'resource': {
+ 'type': 'Frob',
+ 'identifiers': [
+ {'target':'Id', 'source':'data',
+ 'path': 'FrobId'}
+ ]
+ }
+ }
+ }
+ }, {'Frob': {}})
+
+ model.load_rename_map()
+
+ self.assertEqual(model.actions[0].name, 'foo')
+ self.assertEqual(model.references[0].name, 'foo_reference')
+
+ def test_reference_beats_collection(self):
+ model = ResourceModel('test', {
+ 'has': {
+ 'Foo': {
+ 'resource': {
+ 'type': 'Frob',
+ 'identifiers': [
+ {'target':'Id', 'source':'data',
+ 'path': 'FrobId'}
+ ]
+ }
+ }
+ },
+ 'hasMany': {
+ 'Foo': {
+ 'resource': {
+ 'type': 'Frob'
+ }
+ }
+ }
+ }, {'Frob': {}})
+
+ model.load_rename_map()
+
+ self.assertEqual(model.references[0].name, 'foo')
+ self.assertEqual(model.collections[0].name, 'foo_collection')
+
+ def test_collection_beats_waiter(self):
+ model = ResourceModel('test', {
+ 'hasMany': {
+ 'WaitUntilFoo': {
+ 'resource': {
+ 'type': 'Frob'
+ }
+ }
+ },
+ 'waiters': {
+ 'Foo': {}
+ }
+ }, {'Frob': {}})
+
+ model.load_rename_map()
+
+ self.assertEqual(model.collections[0].name, 'wait_until_foo')
+ self.assertEqual(model.waiters[0].name, 'wait_until_foo_waiter')
+
+ def test_waiter_beats_attribute(self):
+ model = ResourceModel('test', {
+ 'waiters': {
+ 'Foo': {}
+ }
+ }, {'Frob': {}})
+
+ shape = DenormalizedStructureBuilder().with_members({
+ 'WaitUntilFoo': {
+ 'type': 'string',
+ }
+ }).build_model()
+
+ model.load_rename_map(shape)
+
+ self.assertEqual(model.waiters[0].name, 'wait_until_foo')
+ self.assertIn('wait_until_foo_attribute', model.get_attributes(shape))
| [
{
"components": [
{
"doc": "Load a name translation map given a shape. This will set\nup renamed values for any collisions, e.g. if the shape,\nan action, and a subresource all are all named ``foo``\nthen the resource will have an action ``foo``, a subresource\nnamed ``Foo`` and a property named `... | [
"tests/unit/resources/test_model.py::TestModels::test_resource_references",
"tests/unit/resources/test_model.py::TestModels::test_waiter",
"tests/unit/resources/test_model.py::TestRenaming::test_action_beats_reference",
"tests/unit/resources/test_model.py::TestRenaming::test_collection_beats_waiter",
"tests... | [
"tests/unit/resources/test_factory.py::TestResourceFactory::test_can_instantiate_service_resource",
"tests/unit/resources/test_factory.py::TestResourceFactory::test_factory_creates_dangling_resources",
"tests/unit/resources/test_factory.py::TestResourceFactory::test_factory_creates_properties",
"tests/unit/re... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Move name collision logic to the model
This change moves and formalizes the name collision logic from the factory
to the resource model. The following has changed:
- Documentation now better matches the factory output.
- Collision renaming order is formalized in one place
- Order: reserved names (meta), load action, identifiers, actions,
subresources, references, collections, and then attributes.
- Renaming resource model attributes/methods now happens at model loading
time rather than class creation time.
The way this works is by creating a mapping of (type, name) tuples to
the renamed value, if it exists. Typically this mapping will be very
sparse and it's fast to create / access. In practice we currently only
have one or two names that collide across all the resource models.
Tests have been updated, some of which needed to define proper Botocore
shapes as the code now looks more closely at those at model load time.
cc @kyleknap @jamesls
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in boto3/resources/model.py]
(definition of ResourceModel.load_rename_map:)
def load_rename_map(self, shape=None):
"""Load a name translation map given a shape. This will set
up renamed values for any collisions, e.g. if the shape,
an action, and a subresource are all named ``foo``
then the resource will have an action ``foo``, a subresource
named ``Foo`` and a property named ``foo_attribute``.
This is the order of precedence, from most important to
least important:
* Load action (resource.load)
* Identifiers
* Actions
* Subresources
* References
* Collections
* Waiters
* Attributes (shape members)
Batch actions are only exposed on collections, so do not
get modified here. Subresources use upper camel casing, so
are unlikely to collide with anything but other subresources.
Creates a structure like this::
renames = {
('action', 'id'): 'id_action',
('collection', 'id'): 'id_collection',
('attribute', 'id'): 'id_attribute'
}
# Get the final name for an action named 'id'
name = renames.get(('action', 'id'), 'id')
:type shape: botocore.model.Shape
:param shape: The underlying shape for this resource."""
(definition of ResourceModel._load_name_with_category:)
def _load_name_with_category(self, names, name, category, snake_case=True):
"""Load a name with a given category, possibly renaming it
if that name is already in use. The name will be stored
in ``names`` and possibly be set up in ``self._renamed``.
:type names: set
:param names: Existing names (Python attributes, properties, or
methods) on the resource.
:type name: string
:param name: The original name of the value.
:type category: string
:param category: The value type, such as 'identifier' or 'action'
:type snake_case: bool
:param snake_case: True (default) if the name should be snake cased."""
(definition of ResourceModel._get_name:)
def _get_name(self, category, name, snake_case=True):
"""Get a possibly renamed value given a category and name. This
uses the rename map set up in ``load_rename_map``, so that
method must be called once first.
:type category: string
:param category: The value type, such as 'identifier' or 'action'
:type name: string
:param name: The original name of the value
:type snake_case: bool
:param snake_case: True (default) if the name should be snake cased.
:rtype: string
:return: Either the renamed value if it is set, otherwise the
original name."""
(definition of ResourceModel.get_attributes:)
def get_attributes(self, shape):
"""Get a dictionary of attribute names to original name and shape
models that represent the attributes of this resource. Looks
like the following:
{
'some_name': ('SomeName', <Shape...>)
}
:type shape: botocore.model.Shape
:param shape: The underlying shape for this resource.
:rtype: dict
:return: Mapping of resource attributes."""
[end of new definitions in boto3/resources/model.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 196a2da7490a1a661a0103b8770bd31e34e147f2 | ||
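The rename-map mechanism laid out in the definitions above can be sketched in a few lines. This is a minimal illustration, not boto3's implementation: `xform_name` here is a simplified stand-in for botocore's helper of the same name, and the `_<category>` suffixing scheme is an assumption inferred from the docstrings.

```python
def xform_name(name):
    # Simplified stand-in for botocore's xform_name (assumption):
    # CamelCase -> snake_case.
    out = []
    for i, ch in enumerate(name):
        if ch.isupper() and i > 0:
            out.append('_')
        out.append(ch.lower())
    return ''.join(out)


def build_rename_map(categories):
    """Build a {(category, snake_name): renamed} map.

    ``categories`` is an ordered list of (category, [names]) pairs,
    highest precedence first; a later name that collides with an
    earlier one gets a ``_<category>`` suffix.
    """
    taken = set()
    renamed = {}
    for category, raw_names in categories:
        for raw in raw_names:
            name = xform_name(raw)
            if name in taken:
                renamed[(category, name)] = name + '_' + category
                taken.add(name + '_' + category)
            else:
                taken.add(name)
    return renamed


renames = build_rename_map([
    ('identifier', ['Foo']),        # highest precedence
    ('action', ['Foo']),            # collides -> foo_action
    ('attribute', ['Foo', 'Bar']),  # Foo collides -> foo_attribute
])
print(renames.get(('action', 'foo'), 'foo'))      # foo_action
print(renames.get(('identifier', 'foo'), 'foo'))  # foo
```

In the real model the map is consulted through `_get_name`, and an attribute shadowed by an identifier is dropped entirely in `get_attributes` rather than renamed, as `test_multiple` above shows.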
boto__boto3-27 | 27 | boto/boto3 | null | 65c797349e404a1a5f398463eca084caf29ce525 | 2014-11-24T21:09:26Z | diff --git a/CHANGELOG.rst b/CHANGELOG.rst
index 50b095575f..9ce3bb3106 100644
--- a/CHANGELOG.rst
+++ b/CHANGELOG.rst
@@ -1,6 +1,21 @@
Changelog
=========
+0.0.3 - 2014-11-26
+------------------
+
+* feature: Update to Botocore 0.76.0.
+
+ * Add support for using AWS Data Pipeline templates to create
+ pipelines and bind values to parameters in the pipeline
+ * Add support to Amazon Elastic Transcoder client for encryption of files
+ in Amazon S3.
+ * Fix issue where Amazon S3 requests were not being
+ resigned correctly when using Signature Version 4.
+ (`botocore issue 388 <https://github.com/boto/botocore/pull/388>`__)
+ * Add support for custom response parsing in Botocore clients.
+ (`botocore issue 387 <https://github.com/boto/botocore/pull/387>`__)
+
0.0.2 - 2014-11-20
------------------
diff --git a/boto3/__init__.py b/boto3/__init__.py
index 8ab8a3ccd5..c64a2d1ce9 100644
--- a/boto3/__init__.py
+++ b/boto3/__init__.py
@@ -17,7 +17,7 @@
__author__ = 'Amazon Web Services'
-__version__ = '0.0.2'
+__version__ = '0.0.3'
# The default Boto3 session; autoloaded when needed.
diff --git a/boto3/resources/model.py b/boto3/resources/model.py
index e5c8a6fddf..ee17783995 100644
--- a/boto3/resources/model.py
+++ b/boto3/resources/model.py
@@ -70,9 +70,12 @@ def __init__(self, name, definition, resource_defs):
self.path = definition.get('path')
-class Request(object):
+
+class DefinitionWithParams(object):
"""
- A service operation action request.
+ An item which has parameters exposed via the ``params`` property.
+ A request has an operation and parameters, while a waiter has
+ a name, a low-level waiter name and parameters.
:type definition: dict
:param definition: The JSON definition
@@ -80,9 +83,6 @@ class Request(object):
def __init__(self, definition):
self._definition = definition
- #: (``string``) The name of the low-level service operation
- self.operation = definition.get('operation')
-
@property
def params(self):
"""
@@ -121,6 +121,39 @@ def __init__(self, target, source_type, source):
self.source = source
+class Request(DefinitionWithParams):
+ """
+ A service operation action request.
+
+ :type definition: dict
+ :param definition: The JSON definition
+ """
+ def __init__(self, definition):
+ super(Request, self).__init__(definition)
+
+ #: (``string``) The name of the low-level service operation
+ self.operation = definition.get('operation')
+
+
+class Waiter(DefinitionWithParams):
+ """
+ An event waiter specification.
+
+ :type name: string
+ :param name: Name of the waiter
+ :type definition: dict
+ :param definition: The JSON definition
+ """
+ def __init__(self, name, definition):
+ super(Waiter, self).__init__(definition)
+
+ #: (``string``) The name of this waiter
+ self.name = name
+
+ #: (``string``) The name of the underlying event waiter
+ self.waiter_name = definition.get('waiterName')
+
+
class ResponseResource(object):
"""
A resource response to create after performing an action.
@@ -284,6 +317,20 @@ def actions(self):
return actions
+ @property
+ def batch_actions(self):
+ """
+ Get a list of batch actions for this resource.
+
+ :type: list(:py:class:`Action`)
+ """
+ actions = []
+
+ for name, item in self._definition.get('batchActions', {}).items():
+ actions.append(Action(name, item, self._resource_defs))
+
+ return actions
+
@property
def references(self):
"""
@@ -355,3 +402,17 @@ def collections(self):
collections.append(Collection(name, item, self._resource_defs))
return collections
+
+ @property
+ def waiters(self):
+ """
+ Get a list of waiters for this resource.
+
+ :type: list(:py:class:`Waiter`)
+ """
+ waiters = []
+
+ for name, item in self._definition.get('waiters', {}).items():
+ waiters.append(Waiter(name, item))
+
+ return waiters
diff --git a/docs/source/reference/core/resources.rst b/docs/source/reference/core/resources.rst
index f868487f02..85a9595ab8 100644
--- a/docs/source/reference/core/resources.rst
+++ b/docs/source/reference/core/resources.rst
@@ -10,6 +10,7 @@ Resource Model
.. automodule:: boto3.resources.model
:members:
:undoc-members:
+ :inherited-members:
Request Parameters
------------------
diff --git a/requirements.txt b/requirements.txt
index 4550a23621..7d82c342ae 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -1,6 +1,6 @@
-botocore==0.74.0
-bcdoc==0.12.2
-jmespath==0.5.0
+-e git://github.com/boto/botocore.git@develop#egg=botocore
+-e git://github.com/boto/bcdoc.git@develop#egg=bcdoc
+-e git://github.com/boto/jmespath.git@develop#egg=jmespath
six==1.7.3
nose==1.3.3
mock==1.0.1
diff --git a/setup.py b/setup.py
index 71babdf470..b67b080307 100644
--- a/setup.py
+++ b/setup.py
@@ -26,7 +26,7 @@ def get_version():
]
requires = [
- 'botocore==0.74.0',
+ 'botocore==0.76.0',
'bcdoc==0.12.2',
'jmespath==0.5.0',
'six==1.7.3',
| diff --git a/tests/integration/test_collections.py b/tests/integration/test_collections.py
index 15c571158d..0574acde49 100644
--- a/tests/integration/test_collections.py
+++ b/tests/integration/test_collections.py
@@ -26,6 +26,7 @@
# or are very slow to run.
BLACKLIST = {
'ec2': ['images'],
+ 'iam': ['signing_certificates'],
'sqs': ['dead_letter_source_queues']
}
diff --git a/tests/unit/resources/test_model.py b/tests/unit/resources/test_model.py
index 9b3ef1a214..7d1442f8fb 100644
--- a/tests/unit/resources/test_model.py
+++ b/tests/unit/resources/test_model.py
@@ -12,7 +12,7 @@
# language governing permissions and limitations under the License.
from boto3.resources.model import ResourceModel, Action, SubResourceList,\
- Collection
+ Collection, Waiter
from tests import BaseTestCase
@@ -101,6 +101,28 @@ def test_resource_load_action(self):
self.assertEqual(model.load.request.operation, 'GetFrobInfo')
self.assertEqual(model.load.path, '$')
+ def test_resource_batch_action(self):
+ model = ResourceModel('test', {
+ 'batchActions': {
+ 'Delete': {
+ 'request': {
+ 'operation': 'DeleteObjects',
+ 'params': [
+ {'target': 'Bucket', 'sourceType': 'identifier',
+ 'source': 'BucketName'}
+ ]
+ }
+ }
+ }
+ }, {})
+
+ self.assertIsInstance(model.batch_actions, list)
+
+ action = model.batch_actions[0]
+ self.assertIsInstance(action, Action)
+ self.assertEqual(action.request.operation, 'DeleteObjects')
+ self.assertEqual(action.request.params[0].target, 'Bucket')
+
def test_sub_resources(self):
model = ResourceModel('test', {
'subResources': {
@@ -223,3 +245,24 @@ def test_resource_collections(self):
self.assertEqual(model.collections[0].resource.type, 'Frob')
self.assertEqual(model.collections[0].resource.model.name, 'Frob')
self.assertEqual(model.collections[0].path, 'FrobList[]')
+
+ def test_waiter(self):
+ model = ResourceModel('test', {
+ 'waiters': {
+ 'Exists': {
+ 'waiterName': 'ObjectExists',
+ 'params': [
+ {'target': 'Bucket', 'sourceType': 'identifier',
+ 'source': 'BucketName'}
+ ]
+ }
+ }
+ }, {})
+
+ self.assertIsInstance(model.waiters, list)
+
+ waiter = model.waiters[0]
+ self.assertIsInstance(waiter, Waiter)
+ self.assertEqual(waiter.name, 'Exists')
+ self.assertEqual(waiter.waiter_name, 'ObjectExists')
+ self.assertEqual(waiter.params[0].target, 'Bucket')
| diff --git a/CHANGELOG.rst b/CHANGELOG.rst
index 50b095575f..9ce3bb3106 100644
--- a/CHANGELOG.rst
+++ b/CHANGELOG.rst
@@ -1,6 +1,21 @@
Changelog
=========
+0.0.3 - 2014-11-26
+------------------
+
+* feature: Update to Botocore 0.76.0.
+
+ * Add support for using AWS Data Pipeline templates to create
+ pipelines and bind values to parameters in the pipeline
+ * Add support to Amazon Elastic Transcoder client for encryption of files
+ in Amazon S3.
+ * Fix issue where Amazon S3 requests were not being
+ resigned correctly when using Signature Version 4.
+ (`botocore issue 388 <https://github.com/boto/botocore/pull/388>`__)
+ * Add support for custom response parsing in Botocore clients.
+ (`botocore issue 387 <https://github.com/boto/botocore/pull/387>`__)
+
0.0.2 - 2014-11-20
------------------
diff --git a/docs/source/reference/core/resources.rst b/docs/source/reference/core/resources.rst
index f868487f02..85a9595ab8 100644
--- a/docs/source/reference/core/resources.rst
+++ b/docs/source/reference/core/resources.rst
@@ -10,6 +10,7 @@ Resource Model
.. automodule:: boto3.resources.model
:members:
:undoc-members:
+ :inherited-members:
Request Parameters
------------------
diff --git a/requirements.txt b/requirements.txt
index 4550a23621..7d82c342ae 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -1,6 +1,6 @@
-botocore==0.74.0
-bcdoc==0.12.2
-jmespath==0.5.0
+-e git://github.com/boto/botocore.git@develop#egg=botocore
+-e git://github.com/boto/bcdoc.git@develop#egg=bcdoc
+-e git://github.com/boto/jmespath.git@develop#egg=jmespath
six==1.7.3
nose==1.3.3
mock==1.0.1
| [
{
"components": [
{
"doc": "An item which has parameters exposed via the ``params`` property.\nA request has an operation and parameters, while a waiter has\na name, a low-level waiter name and parameters.\n\n:type definition: dict\n:param definition: The JSON definition",
"lines": [
... | [
"tests/unit/resources/test_model.py::TestModels::test_resource_action_raw",
"tests/unit/resources/test_model.py::TestModels::test_resource_action_response_resource",
"tests/unit/resources/test_model.py::TestModels::test_resource_batch_action",
"tests/unit/resources/test_model.py::TestModels::test_resource_col... | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Add batch actions and waiters to resource model.
This adds definitions to the internal resource model to handle batch actions
and waiters, both of which are recent additions to the resource JSON format.
This change is a prerequisite to adding support for these features in the
resource class factory. Tests are added for the new model features.
cc @jamesls, @kyleknap
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in boto3/resources/model.py]
(definition of DefinitionWithParams:)
class DefinitionWithParams(object):
"""An item which has parameters exposed via the ``params`` property.
A request has an operation and parameters, while a waiter has
a name, a low-level waiter name and parameters.
:type definition: dict
:param definition: The JSON definition"""
(definition of DefinitionWithParams.__init__:)
def __init__(self, definition):
(definition of DefinitionWithParams.params:)
def params(self):
"""Get a list of auto-filled parameters for this request.
:type: list(:py:class:`Parameter`)"""
(definition of Waiter:)
class Waiter(DefinitionWithParams):
"""An event waiter specification.
:type name: string
:param name: Name of the waiter
:type definition: dict
:param definition: The JSON definition"""
(definition of Waiter.__init__:)
def __init__(self, name, definition):
(definition of ResourceModel.batch_actions:)
def batch_actions(self):
"""Get a list of batch actions for this resource.
:type: list(:py:class:`Action`)"""
(definition of ResourceModel.waiters:)
def waiters(self):
"""Get a list of waiters for this resource.
:type: list(:py:class:`Waiter`)"""
[end of new definitions in boto3/resources/model.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 196a2da7490a1a661a0103b8770bd31e34e147f2 | |
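The class layout described by these definitions can be sketched as follows. This is a hedged, self-contained approximation rather than the actual boto3 code; the `Parameter` class here is a simplified stand-in for the model's real parameter type.

```python
class Parameter(object):
    # Simplified stand-in for boto3's parameter model (assumption).
    def __init__(self, target, source_type, source):
        self.target = target
        self.source_type = source_type
        self.source = source


class DefinitionWithParams(object):
    # Shared base: any definition that exposes auto-filled parameters.
    def __init__(self, definition):
        self._definition = definition

    @property
    def params(self):
        return [Parameter(p['target'], p['sourceType'], p['source'])
                for p in self._definition.get('params', [])]


class Waiter(DefinitionWithParams):
    # A waiter adds a name and the low-level waiter name on top of params.
    def __init__(self, name, definition):
        super(Waiter, self).__init__(definition)
        self.name = name
        self.waiter_name = definition.get('waiterName')


waiter = Waiter('Exists', {
    'waiterName': 'ObjectExists',
    'params': [{'target': 'Bucket', 'sourceType': 'identifier',
                'source': 'BucketName'}],
})
print(waiter.waiter_name)       # ObjectExists
print(waiter.params[0].target)  # Bucket
```

A `Request` would subclass the same base and add only its `operation` attribute, which is why the shared `params` property moves into `DefinitionWithParams`.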
boto__boto3-11 | 11 | boto/boto3 | null | f0c07a858001ff38a1992d62bad589667ee49cd3 | 2014-10-31T22:07:42Z | diff --git a/boto3/resources/base.py b/boto3/resources/base.py
index 42e7f0330a..274dac68d6 100644
--- a/boto3/resources/base.py
+++ b/boto3/resources/base.py
@@ -63,3 +63,16 @@ def __repr__(self):
self.__class__.__name__,
', '.join(identifiers),
)
+
+ def __eq__(self, other):
+ # Should be instances of the same resource class
+ if other.__class__.__name__ != self.__class__.__name__:
+ return False
+
+ # Each of the identifiers should have the same value in both
+ # instances, e.g. two buckets need the same name to be equal.
+ for identifier in self.meta['identifiers']:
+ if getattr(self, identifier) != getattr(other, identifier):
+ return False
+
+ return True
diff --git a/boto3/resources/factory.py b/boto3/resources/factory.py
index 8f5595bc5f..72346a4257 100644
--- a/boto3/resources/factory.py
+++ b/boto3/resources/factory.py
@@ -412,6 +412,13 @@ def do_action(self, *args, **kwargs):
# instance via ``self``.
def do_action(self, *args, **kwargs):
response = action(self, *args, **kwargs)
+
+ if hasattr(self, 'load'):
+ # Clear cached data. It will be reloaded the next
+ # time that an attribute is accessed.
+ # TODO: Make this configurable in the future?
+ self.meta['data'] = None
+
return response
do_action.__name__ = str(snake_cased)
diff --git a/docs/source/guide/resources.rst b/docs/source/guide/resources.rst
index fd2804bfe5..ab0b19e62b 100644
--- a/docs/source/guide/resources.rst
+++ b/docs/source/guide/resources.rst
@@ -56,6 +56,25 @@ Identifiers may also be passed as positional arguments::
# Raises exception, missing key!
obj = s3.Object('boto3')
+Identifiers also play a role in resource instance equality. For two
+instances of a resource to be considered equal, their identifiers must
+be equal::
+
+ >>> bucket1 = s3.Bucket('boto3')
+ >>> bucket2 = s3.Bucket('boto3')
+ >>> bucket3 = s3.Bucket('some-other-bucket')
+
+ >>> bucket1 == bucket2
+ True
+ >>> bucket1 == bucket3
+ False
+
+.. note::
+
+ Only identifiers are taken into account for instance equality. Region,
+ account ID and other data members are not considered. When using temporary
+ credentials or multiple regions in your code please keep this in mind.
+
Resources may also have attributes, which are *lazy-loaded* properties on the
instance. They may be set at creation time from the response of an action on
another resource, or they may be set when accessed or via an explicit call to
@@ -75,6 +94,12 @@ the ``load`` or ``reload`` action. Examples of attributes::
exactly when the load action (and thus latency) is invoked. The
documentation for each resource explicitly lists its attributes.
+ Additionally, attributes may be reloaded after an action has been
+ performed on the resource. For example, if the ``last_modified``
+ attribute of an S3 object is loaded and then a ``put`` action is
+ called, then the next time you access ``last_modified`` it will
+ reload the object's metadata.
+
Actions
-------
An action is a method which makes a call to the service. Actions may return a
| diff --git a/tests/unit/resources/test_factory.py b/tests/unit/resources/test_factory.py
index e9c2c38e3e..b04ea967a1 100644
--- a/tests/unit/resources/test_factory.py
+++ b/tests/unit/resources/test_factory.py
@@ -261,6 +261,39 @@ def test_dangling_resource_raises_for_unknown_arg(self):
with self.assertRaises(ValueError):
resource.Queue(url='foo', bar='baz')
+ def test_dangling_resource_equality(self):
+ defs = {
+ 'Queue': {
+ 'identifiers': [{'name': 'Url'}]
+ }
+ }
+
+ resource = self.load('test', 'test', {}, defs, None)()
+
+ q1 = resource.Queue('url')
+ q2 = resource.Queue('url')
+
+ self.assertEqual(q1, q2)
+
+ def test_dangling_resource_inequality(self):
+ defs = {
+ 'Queue': {
+ 'identifiers': [{'name': 'Url'}]
+ },
+ 'Message': {
+ 'identifiers': [{'name': 'QueueUrl'}, {'name': 'Handle'}]
+ }
+ }
+
+ resource = self.load('test', 'test', {}, defs, None)()
+
+ q1 = resource.Queue('url')
+ q2 = resource.Queue('different')
+ m = resource.Message('url', 'handle')
+
+ self.assertNotEqual(q1, q2)
+ self.assertNotEqual(q1, m)
+
def test_non_service_resource_missing_defs(self):
# Only services should get dangling defs
defs = {
@@ -323,8 +356,6 @@ def test_resource_meta_unique(self):
queue1 = queue_cls()
queue2 = queue_cls()
- self.assertNotEqual(queue1, queue2)
-
self.assertEqual(queue1.meta, queue2.meta,
'Queue meta copies not equal after creation')
@@ -355,6 +386,59 @@ def test_resource_calls_action(self, action_cls):
action.assert_called_with(queue, 'arg1', arg2=2)
+ @mock.patch('boto3.resources.factory.ServiceAction')
+ def test_resource_action_clears_data(self, action_cls):
+ model = {
+ 'load': {
+ 'request': {
+ 'operation': 'DescribeQueue'
+ }
+ },
+ 'actions': {
+ 'GetMessageStatus': {
+ 'request': {
+ 'operation': 'DescribeMessageStatus'
+ }
+ }
+ }
+ }
+
+ queue = self.load('test', 'Queue', model, {}, None)()
+
+ # Simulate loaded data
+ queue.meta['data'] = {'some': 'data'}
+
+ # Perform a call
+ queue.get_message_status()
+
+ # Cached data should be cleared
+ self.assertIsNone(queue.meta['data'])
+
+ @mock.patch('boto3.resources.factory.ServiceAction')
+ def test_resource_action_leaves_data(self, action_cls):
+ # This model has NO load method. Cached data should
+ # never be cleared since it cannot be reloaded!
+ model = {
+ 'actions': {
+ 'GetMessageStatus': {
+ 'request': {
+ 'operation': 'DescribeMessageStatus'
+ }
+ }
+ }
+ }
+
+ queue = self.load('test', 'Queue', model, {}, None)()
+
+ # Simulate loaded data
+ queue.meta['data'] = {'some': 'data'}
+
+ # Perform a call
+ queue.get_message_status()
+
+ # Cached data should not be cleared
+ self.assertEqual(queue.meta['data'], {'some': 'data'})
+
@mock.patch('boto3.resources.factory.ServiceAction')
def test_resource_lazy_loads_properties(self, action_cls):
model = {
| diff --git a/docs/source/guide/resources.rst b/docs/source/guide/resources.rst
index fd2804bfe5..ab0b19e62b 100644
--- a/docs/source/guide/resources.rst
+++ b/docs/source/guide/resources.rst
@@ -56,6 +56,25 @@ Identifiers may also be passed as positional arguments::
# Raises exception, missing key!
obj = s3.Object('boto3')
+Identifiers also play a role in resource instance equality. For two
+instances of a resource to be considered equal, their identifiers must
+be equal::
+
+ >>> bucket1 = s3.Bucket('boto3')
+ >>> bucket2 = s3.Bucket('boto3')
+ >>> bucket3 = s3.Bucket('some-other-bucket')
+
+ >>> bucket1 == bucket2
+ True
+ >>> bucket1 == bucket3
+ False
+
+.. note::
+
+ Only identifiers are taken into account for instance equality. Region,
+ account ID and other data members are not considered. When using temporary
+ credentials or multiple regions in your code please keep this in mind.
+
Resources may also have attributes, which are *lazy-loaded* properties on the
instance. They may be set at creation time from the response of an action on
another resource, or they may be set when accessed or via an explicit call to
@@ -75,6 +94,12 @@ the ``load`` or ``reload`` action. Examples of attributes::
exactly when the load action (and thus latency) is invoked. The
documentation for each resource explicitly lists its attributes.
+ Additionally, attributes may be reloaded after an action has been
+ performed on the resource. For example, if the ``last_modified``
+ attribute of an S3 object is loaded and then a ``put`` action is
+ called, then the next time you access ``last_modified`` it will
+ reload the object's metadata.
+
Actions
-------
An action is a method which makes a call to the service. Actions may return a
| [
{
"components": [
{
"doc": "",
"lines": [
67,
78
],
"name": "ServiceResource.__eq__",
"signature": "def __eq__(self, other):",
"type": "function"
}
],
"file": "boto3/resources/base.py"
}
] | [
"tests/unit/resources/test_factory.py::TestResourceFactory::test_dangling_resource_equality",
"tests/unit/resources/test_factory.py::TestResourceFactory::test_resource_action_clears_data"
] | [
"tests/unit/resources/test_factory.py::TestResourceFactory::test_can_instantiate_service_resource",
"tests/unit/resources/test_factory.py::TestResourceFactory::test_create_resource_calls_load",
"tests/unit/resources/test_factory.py::TestResourceFactory::test_create_service_calls_load",
"tests/unit/resources/t... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Implement resource equality based on identifiers.
This pull request adds two related features. First, it adds resource instance
equality by comparing the classes and identifiers of two resources. If they
share the same class (e.g. `s3.Bucket`) and the same identifiers (e.g.
`name`), then they are considered equal.
``` python
>>> b1 = s3.Bucket('boto3')
>>> b2 = s3.Bucket('boto3')
>>> b1 == b2
True
```
Related to equality, this also implements reloading of attributes
after an action has been performed. Imagine the following scenario:
``` python
>>> o = s3.Object('boto3', 'test.txt')
>>> o.put(Body='hello')
>>> print(o.last_modified)
datetime(2014, 10, 31, 0, 0, 0)
>>> o.put(Body='hello, world')
>>> print(o.last_modified)
datetime(2014, 10, 31, 0, 0, 20)
>>> o2 = s3.Object('boto3', 'test.txt')
>>> o == o2
True
>>> o.last_modified == o2.last_modified
True
```
Updated documentation and tests to reflect these changes.
cc @jamesls, @kyleknap. Definitely open to suggestions on this one, but I think this is the most intuitive approach for people coming to use the SDK.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in boto3/resources/base.py]
(definition of ServiceResource.__eq__:)
def __eq__(self, other):
[end of new definitions in boto3/resources/base.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
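To make the requested behavior concrete, here is a minimal, self-contained sketch of how `ServiceResource.__eq__` could compare class and identifiers. The `meta['identifiers']` layout and the simplified constructor are assumptions for illustration, not boto3's actual internals:

```python
class ServiceResource:
    """Simplified stand-in for boto3's resource base class (assumed shape)."""

    def __init__(self, **identifiers):
        # Track which attribute names count as identifiers for equality.
        self.meta = {'identifiers': sorted(identifiers)}
        for name, value in identifiers.items():
            setattr(self, name, value)

    def __eq__(self, other):
        # Two resources are equal only if they are the exact same class
        # and every identifier value matches.
        return (other.__class__ == self.__class__ and
                all(getattr(self, name) == getattr(other, name)
                    for name in self.meta['identifiers']))


class Bucket(ServiceResource):
    pass


b1 = Bucket(name='boto3')
b2 = Bucket(name='boto3')
b3 = Bucket(name='some-other-bucket')
print(b1 == b2)  # True
print(b1 == b3)  # False
```

Note that region, account ID, and lazily loaded data play no role here, which matches the documentation change in the diff above.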
<<END>> | 196a2da7490a1a661a0103b8770bd31e34e147f2 | |
boto__boto3-10 | 10 | boto/boto3 | null | e9d22eff450f82192d7487eb643216ae1a014189 | 2014-10-29T23:03:46Z | diff --git a/boto3/__init__.py b/boto3/__init__.py
index 18e1be7ff9..8c9b5d16ce 100644
--- a/boto3/__init__.py
+++ b/boto3/__init__.py
@@ -62,7 +62,7 @@ def _get_default_session():
"""
Get the default session, creating one if needed.
- :rtype: boto3.session.Sesssion
+ :rtype: :py:class:`~boto3.session.Sesssion`
:return: The default session
"""
if DEFAULT_SESSION is None:
diff --git a/boto3/compat.py b/boto3/compat.py
index 395d251c94..a59ee37a88 100644
--- a/boto3/compat.py
+++ b/boto3/compat.py
@@ -12,14 +12,3 @@
# language governing permissions and limitations under the License.
import six
-import sys
-
-try:
- from collections import OrderedDict
-except ImportError:
- from ordereddict import OrderedDict
-
-if sys.version_info[:2] == (2, 6):
- import simplejson as json
-else:
- import json
diff --git a/boto3/exceptions.py b/boto3/exceptions.py
index 741365da9a..d5352b2f44 100644
--- a/boto3/exceptions.py
+++ b/boto3/exceptions.py
@@ -13,3 +13,7 @@
class ResourceLoadException(Exception):
pass
+
+
+class NoVersionFound(Exception):
+ pass
diff --git a/boto3/resources/factory.py b/boto3/resources/factory.py
index 9177a2831f..126e355097 100644
--- a/boto3/resources/factory.py
+++ b/boto3/resources/factory.py
@@ -12,7 +12,6 @@
# language governing permissions and limitations under the License.
import logging
-import os
from functools import partial
from botocore import xform_name
@@ -22,95 +21,27 @@
from .collection import CollectionManager
from .model import ResourceModel
from .response import all_not_none, build_identifiers
-from ..compat import json, OrderedDict
from ..exceptions import ResourceLoadException
-# Where to find the resource objects
-RESOURCE_ROOT = os.path.join(
- os.path.dirname(os.path.dirname(__file__)),
- 'data',
- 'resources'
-)
-
-
logger = logging.getLogger(__name__)
-def get_latest_version(name):
- """
- Get the latest version number given a service name.
-
- :type name: string
- :param name: Service name, e.g. 'sqs'
- :rtype: string
- :return: Service version, e.g. 2012-11-05
- """
- entries = os.listdir(RESOURCE_ROOT)
- entries = [i for i in entries if i.startswith(name + '-')]
- return sorted(entries, reverse=True)[0][len(name) + 1:len(name) + 11]
-
-
class ResourceFactory(object):
"""
- A factory to create new ``ServiceResource`` classes from a ResourceJSON
- description. There are two types of lookups that can be done: one on the
- service itself (e.g. an SQS resource) and another on models contained
- within the service (e.g. an SQS Queue resource).
-
- >>> factory = ResourceFactory()
- >>> S3Resource = factory.create_class('s3')
- >>> S3BucketResource = factory.create_class('s3', name='Bucket')
- >>> SQSResource = factory.create_class('sqs')
- >>> SQSQueueResource = factory.create_class('sqs', name='Queue')
-
+ A factory to create new :py:class:`~boto3.resources.base.ServiceResource`
+ classes from a :py:class:`~boto3.resources.model.ResourceModel`. There are
+ two types of lookups that can be done: one on the service itself (e.g. an
+ SQS resource) and another on models contained within the service (e.g. an
+ SQS Queue resource).
"""
- def create_class(self, service, name=None, version=None,
- service_model=None):
- """
- Create a new resource class for a service or service resource.
-
- :type service: string
- :param service: Name of the service to look up
- :type name: string
- :param name: Name of the resource to look up. If not given, then the
- service resource itself is returned.
- :type version: string
- :param version: The service version to load. A value of ``None`` will
- load the latest available version.
- :type service_model: ``botocore.model.ServiceModel``
- :param service_model: The Botocore service model, required only if the
- resource shape contains members. This is used to
- expose lazy-loaded attributes on the resource.
- :rtype: Subclass of ``ServiceResource``
- :return: The service or resource class.
- """
- if version is None:
- version = get_latest_version(service)
-
- path = os.path.join(RESOURCE_ROOT,
- '{0}-{1}.resources.json'.format(service, version))
-
- logger.info('Loading %s:%s from %s', service, name, path)
- model = json.load(open(path), object_pairs_hook=OrderedDict)
-
- resource_defs = model.get('resources', {})
-
- if name is None:
- cls = self.load_from_definition(service, service,
- model.get('service', {}), resource_defs, service_model)
- else:
- cls = self.load_from_definition(service, name,
- resource_defs.get(name, {}), resource_defs, service_model)
-
- return cls
-
def load_from_definition(self, service_name, resource_name, model,
resource_defs, service_model):
"""
- Loads a resource from a model, creating a new ServiceResource subclass
+ Loads a resource from a model, creating a new
+ :py:class:`~boto3.resources.base.ServiceResource` subclass
with the correct properties and methods, named based on the service
- and resource name, e.g. EC2InstanceResource.
+ and resource name, e.g. EC2.Instance.
:type service: string
:param service: Name of the service to look up
@@ -126,7 +57,7 @@ def load_from_definition(self, service_name, resource_name, model,
:param service_model: The Botocore service model, required only if the
resource shape contains members. This is used to
expose lazy-loaded attributes on the resource.
- :rtype: Subclass of ``ServiceResource``
+ :rtype: Subclass of :py:class:`~boto3.resources.base.ServiceResource`
:return: The service or resource class.
"""
# Set some basic info
diff --git a/boto3/session.py b/boto3/session.py
index 17c82a1e72..273a8e41cf 100644
--- a/boto3/session.py
+++ b/boto3/session.py
@@ -11,8 +11,11 @@
# ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
+import os
+
import botocore.session
+from .exceptions import NoVersionFound
from .resources.factory import ResourceFactory
@@ -50,16 +53,66 @@ def __init__(self, aws_access_key_id=None, aws_secret_access_key=None,
self._session.set_config_variable('region', region_name)
self.resource_factory = ResourceFactory()
+ self._setup_loader()
def __repr__(self):
- return '<boto3.Session({0}:{1})'.format(
- self._session.get_config_variable('region'),
- self._session.get_credentials().access_key)
+ return 'Session(region={0})'.format(
+ repr(self._session.get_config_variable('region')))
+
+ def _setup_loader(self):
+ """
+ Setup loader paths so that we can load resources.
+ """
+ self._loader = self._session.get_component('data_loader')
+ self._loader.data_path = ':'.join(
+ [self._loader.data_path,
+ os.path.join(os.path.dirname(__file__), 'data',
+ 'resources')]).strip(':')
+
+ def _get_resource_files(self):
+ """
+ This generator yields paths to resource files in the loader's
+ search paths. Specifically, it looks for files that end with
+ ``.resources.json`` in any of the search paths, but does not
+ recursively search the paths.
+ """
+ for path in self._loader.get_search_paths():
+ if not os.path.isdir(path) or not os.path.exists(path):
+ continue
+
+ items = os.listdir(path)
+ for entry in [i for i in items if i.endswith('.resources.json')]:
+ yield entry
+
+ def _find_latest_version(self, service_name):
+ """
+ Find the latest resource version of a given service if it exists,
+ otherwise raises an exception.
+
+ TODO: Merge this logic upstream into Botocore if possible. Botocore
+ depends on a different directory layout at the moment.
+
+ :rtype: string
+ :return: Version string like 'YYYY-MM-DD'
+ :raises: NoVersionFound
+ """
+ filtered = []
+ for path in self._get_resource_files():
+ if path.startswith(service_name + '-'):
+ filtered.append(path)
+
+ try:
+ # ['s3-2006-03-01.resources.json', ...] => '2006-03-01'
+ # Hard coded offsets below pull out just the date string
+ start = len(service_name)
+ return max([i[start + 1:start + 11] for i in filtered])
+ except ValueError:
+ raise NoVersionFound(service_name)
def get_available_services(self):
"""
Get a list of available services that can be loaded as low-level
- clients via ``session.client(name)``.
+ clients via :py:meth:`Session.client`.
:rtype: list
:return: List of service names
@@ -69,20 +122,25 @@ def get_available_services(self):
def get_available_resources(self):
"""
Get a list of available services that can be loaded as resource
- clients via ``session.resource(name)``.
+ clients via :py:meth:`Session.resource`.
:rtype: list
:return: List of service names
"""
- # TODO: Implement me!
- return []
+ service_names = set()
+
+ for path in self._get_resource_files():
+ # 'foo-bar-2006-03-01' => 'foo-bar'
+ service_names.add('-'.join(path.split('-')[:-3]))
+
+ return list(service_names)
def client(self, service_name, region_name=None, api_version=None,
use_ssl=True, verify=None, endpoint_url=None,
aws_access_key_id=None, aws_secret_access_key=None,
aws_session_token=None):
"""
- Create a low-level service client by name using the default session.
+ Create a low-level service client by name.
:type service_name: string
:param service_name: The name of a service, e.g. 's3' or 'ec2'. You
@@ -150,7 +208,7 @@ def resource(self, service_name, region_name=None, api_version=None,
aws_access_key_id=None, aws_secret_access_key=None,
aws_session_token=None):
"""
- Create a resource service client by name using the default session.
+ Create a resource service client by name.
:type service_name: string
:param service_name: The name of a service, e.g. 's3' or 'ec2'. You
@@ -204,8 +262,12 @@ def resource(self, service_name, region_name=None, api_version=None,
:param aws_session_token: The session token to use when creating
the client. Same semantics as aws_access_key_id above.
- :return: Resource client instance
+ :return: Subclass of :py:class:`~boto3.resources.base.ServiceResource`
"""
+ # Creating a new resource instance requires the low-level client
+ # and service model, the resource version and resource JSON data.
+ # We pass these to the factory and get back a class, which is
+ # instantiated on top of the low-level client.
client = self.client(
service_name, region_name=region_name, api_version=api_version,
use_ssl=use_ssl, verify=verify, endpoint_url=endpoint_url,
@@ -213,6 +275,10 @@ def resource(self, service_name, region_name=None, api_version=None,
aws_secret_access_key=aws_secret_access_key,
aws_session_token=aws_session_token)
service_model = self._session.get_service_model(service_name)
- cls = self.resource_factory.create_class(service_name,
- service_model=service_model)
+ version = self._find_latest_version(service_name)
+ model = self._loader.load_data(
+ '{0}-{1}.resources'.format(service_name, version))
+ cls = self.resource_factory.load_from_definition(
+ service_name, service_name, model['service'], model['resources'],
+ service_model)
return cls(client=client)
| diff --git a/tests/__init__.py b/tests/__init__.py
index 5dc56bf5d1..1f29e89d0e 100644
--- a/tests/__init__.py
+++ b/tests/__init__.py
@@ -36,9 +36,12 @@ class BaseTestCase(unittest.TestCase):
any actual calls to Botocore.
"""
def setUp(self):
- self.bc_session_patch = mock.patch('botocore.session.Session',
- autospec=True)
+ self.bc_session_patch = mock.patch('botocore.session.Session')
self.bc_session_cls = self.bc_session_patch.start()
+ loader = self.bc_session_cls.return_value.get_component.return_value
+ loader.data_path = ''
+ self.loader = loader
+
def tearDown(self):
self.bc_session_patch.stop()
diff --git a/tests/unit/resources/test_factory.py b/tests/unit/resources/test_factory.py
index 6e22fcb127..dc6e069652 100644
--- a/tests/unit/resources/test_factory.py
+++ b/tests/unit/resources/test_factory.py
@@ -24,38 +24,8 @@ def setUp(self):
self.factory = ResourceFactory()
self.load = self.factory.load_from_definition
- # Don't do version lookups on the filesystem, instead always return
- # a set date and mock calls to ``open`` when required.
- self.version_patch = mock.patch(
- 'boto3.resources.factory.get_latest_version')
- self.version_mock = self.version_patch.start()
- self.version_mock.return_value = '2014-01-01'
-
def tearDown(self):
super(TestResourceFactory, self).tearDown()
- self.version_patch.stop()
-
- def test_create_service_calls_load(self):
- self.factory.load_from_definition = mock.Mock()
- with mock.patch('boto3.resources.factory.open',
- mock.mock_open(read_data='{}'), create=True):
- self.factory.create_class('test')
-
- self.assertTrue(self.factory.load_from_definition.called,
- 'Class was not loaded from definition')
- self.factory.load_from_definition.assert_called_with(
- 'test', 'test', {}, {}, None)
-
- def test_create_resource_calls_load(self):
- self.factory.load_from_definition = mock.Mock()
- with mock.patch('boto3.resources.factory.open',
- mock.mock_open(read_data='{}'), create=True):
- self.factory.create_class('test', 'Queue')
-
- self.assertTrue(self.factory.load_from_definition.called,
- 'Class was not loaded from definition')
- self.factory.load_from_definition.assert_called_with(
- 'test', 'Queue', {}, {}, None)
def test_get_service_returns_resource_class(self):
TestResource = self.load('test', 'test', {}, {}, None)
diff --git a/tests/unit/test_session.py b/tests/unit/test_session.py
index 237dfa820b..d3fe1f76e6 100644
--- a/tests/unit/test_session.py
+++ b/tests/unit/test_session.py
@@ -11,11 +11,21 @@
# ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
+from boto3.exceptions import NoVersionFound
from boto3.session import Session
from tests import mock, BaseTestCase
class TestSession(BaseTestCase):
+ def test_repr(self):
+ bc_session = self.bc_session_cls.return_value
+ bc_session.get_credentials.return_value.access_key = 'abc123'
+ bc_session.get_config_variable.return_value = 'us-west-2'
+
+ session = Session('abc123', region_name='us-west-2')
+
+ self.assertEqual(repr(session), 'Session(region=\'us-west-2\')')
+
def test_arguments_not_required(self):
Session()
@@ -55,10 +65,16 @@ def test_get_available_services(self):
self.assertTrue(bc_session.get_available_services.called,
'Botocore session get_available_services not called')
- def test_get_available_resources(self):
+ @mock.patch('os.path.isdir', return_value=True)
+ @mock.patch('os.path.exists', return_value=True)
+ @mock.patch('os.listdir', return_value=[
+ 'sqs-2012-11-05.resources.json', 's3-2006-03-01.resources.json'])
+ def test_get_available_resources(self, list_mock, exist_mock, dir_mock):
session = Session()
- resources = session.get_available_resources()
- self.assertIsInstance(resources, list)
+ self.loader.get_search_paths.return_value = ['search-path']
+
+ names = session.get_available_resources()
+ self.assertEqual(sorted(names), ['s3', 'sqs'])
def test_create_client(self):
session = Session(region_name='us-east-1')
@@ -78,15 +94,23 @@ def test_create_client_with_args(self):
endpoint_url=None, use_ssl=True, aws_session_token=None,
verify=None, region_name='us-west-2', api_version=None)
- def test_create_resource(self):
+ @mock.patch('os.path.isdir', return_value=True)
+ @mock.patch('os.path.exists', return_value=True)
+ @mock.patch('os.listdir', return_value=['sqs-2012-11-05.resources.json'])
+ def test_create_resource(self, list_mock, exist_mock, dir_mock):
session = Session()
session.client = mock.Mock()
- session.resource_factory.create_class = mock.Mock()
- cls = session.resource_factory.create_class.return_value
+ load_mock = mock.Mock()
+ session.resource_factory.load_from_definition = load_mock
+ cls = load_mock.return_value
+
+ self.loader.get_search_paths.return_value = ['search-path']
sqs = session.resource('sqs', verify=False)
- self.assertTrue(session.resource_factory.create_class.called,
+ self.assertTrue(session.client.called,
+ 'No low-level client was created')
+ self.assertTrue(load_mock.called,
'Resource factory did not look up class')
self.assertTrue(cls.called,
'Resource instance was not created')
@@ -94,10 +118,15 @@ def test_create_resource(self):
'Returned instance is not an instance of the looked up resource '
'class from the factory')
- def test_create_resource_with_args(self):
+ @mock.patch('os.path.isdir', return_value=True)
+ @mock.patch('os.path.exists', return_value=True)
+ @mock.patch('os.listdir', return_value=['sqs-2012-11-05.resources.json'])
+ def test_create_resource_with_args(self, list_mock, exist_mock, dir_mock):
session = Session()
session.client = mock.Mock()
- session.resource_factory.create_class = mock.Mock()
+ session.resource_factory.load_from_definition = mock.Mock()
+
+ self.loader.get_search_paths.return_value = ['search-path']
session.resource('sqs', verify=False)
@@ -105,3 +134,56 @@ def test_create_resource_with_args(self):
'sqs', aws_secret_access_key=None, aws_access_key_id=None,
endpoint_url=None, use_ssl=True, aws_session_token=None,
verify=False, region_name=None, api_version=None)
+
+ @mock.patch('os.path.isdir', return_value=True)
+ @mock.patch('os.path.exists', return_value=True)
+ @mock.patch('os.listdir',
+ return_value=['sqs-2012-11-05.resources.json',
+ 'sqs-2013-11-05.resources.json',
+ 'sqs-2014-11-05.resources.json'])
+ def test_create_resource_latest_version(self, list_mock, exist_mock,
+ dir_mock):
+ session = Session()
+ session.client = mock.Mock()
+ load_mock = mock.Mock()
+ session.resource_factory.load_from_definition = load_mock
+
+ self.loader.get_search_paths.return_value = ['search-path']
+
+ session.resource('sqs')
+
+ self.loader.load_data.assert_called_with('sqs-2014-11-05.resources')
+
+ @mock.patch('os.path.isdir', return_value=True)
+ @mock.patch('os.path.exists', return_value=True)
+ @mock.patch('os.listdir', return_value=['s3-2006-03-01.resources.json'])
+ def test_bad_resource_name(self, list_mock, exist_mock, dir_mock):
+ session = Session()
+ session.client = mock.Mock()
+ load_mock = mock.Mock()
+ session.resource_factory.load_from_definition = load_mock
+
+ self.loader.get_search_paths.return_value = ['search-path']
+
+ with self.assertRaises(NoVersionFound):
+ # S3 is defined but not SQS!
+ session.resource('sqs')
+
+ # We make ``isdir`` return ``False``, then ``True`` because we
+ # want the first path to be a file, and the second a directory.
+ # This allows us to test both code paths while searching for
+ # a value, otherwise the ``exists`` check is never performed.
+ @mock.patch('os.path.isdir', side_effect=[False, True])
+ @mock.patch('os.path.exists', return_value=False)
+ def test_no_search_path_resources(self, exist_mock, dir_mock):
+ session = Session()
+ session.client = mock.Mock()
+ load_mock = mock.Mock()
+ session.resource_factory.load_from_definition = load_mock
+
+ self.loader.get_search_paths.return_value = [
+ 'search-path1', 'search-path2']
+
+ with self.assertRaises(NoVersionFound):
+ # No resources are defined anywhere
+ session.resource('sqs')
| [
{
"components": [
{
"doc": "",
"lines": [
18,
19
],
"name": "NoVersionFound",
"signature": "class NoVersionFound(Exception):",
"type": "class"
}
],
"file": "boto3/exceptions.py"
},
{
"components": [
{
"... | [
"tests/unit/resources/test_factory.py::TestResourceFactory::test_can_instantiate_service_resource",
"tests/unit/resources/test_factory.py::TestResourceFactory::test_dangling_resource_create_with_kwarg",
"tests/unit/resources/test_factory.py::TestResourceFactory::test_dangling_resource_equality",
"tests/unit/r... | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Switch to Botocore data loader.
This pull request changes Boto 3 to use the Botocore data loader for loading
resource JSON description files instead of the custom loading logic that was
in place. As a result, a couple of things have changed:
- Removed all JSON loading code from Boto 3
- Removed `ResourceFactory.create_class` in favor of `load_from_definition`
- Added Boto 3 resource data path to Botocore's loader
- Added functionality to get the latest resource version
- Implemented `Session.get_available_resources`
- Updated and removed some tests
- Updated docs
Unfortunately, Botocore requires a very specific folder structure, and at the
moment Boto 3 does not use it. Because of this, I wrote custom logic to locate
resource JSON files and determine the latest version; some of this logic should be
migrated into Botocore in the near future.
cc @jamesls, @kyleknap
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in boto3/exceptions.py]
(definition of NoVersionFound:)
class NoVersionFound(Exception):
[end of new definitions in boto3/exceptions.py]
[start of new definitions in boto3/session.py]
(definition of Session._setup_loader:)
def _setup_loader(self):
"""Setup loader paths so that we can load resources."""
(definition of Session._get_resource_files:)
def _get_resource_files(self):
"""This generator yields paths to resource files in the loader's
search paths. Specifically, it looks for files that end with
``.resources.json`` in any of the search paths, but does not
recursively search the paths."""
(definition of Session._find_latest_version:)
def _find_latest_version(self, service_name):
"""Find the latest resource version of a given service if it exists,
otherwise raises an exception.
TODO: Merge this logic upstream into Botocore if possible. Botocore
depends on a different directory layout at the moment.
:rtype: string
:return: Version string like 'YYYY-MM-DD'
:raises: NoVersionFound"""
[end of new definitions in boto3/session.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
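The date-slicing trick behind `_find_latest_version` can be illustrated in isolation. This sketch operates on plain filename lists rather than the Botocore loader's search paths, and the filenames are made up; since ISO dates sort lexicographically, `max()` picks the newest version:

```python
def find_latest_version(service_name, filenames):
    # Filenames look like 'sqs-2012-11-05.resources.json': the date
    # starts one character past the service name and is 10 chars long.
    start = len(service_name) + 1
    candidates = [f[start:start + 10] for f in filenames
                  if f.startswith(service_name + '-')]
    if not candidates:
        raise ValueError('No version found for %s' % service_name)
    # ISO 'YYYY-MM-DD' strings sort lexicographically in date order.
    return max(candidates)


files = ['sqs-2012-11-05.resources.json',
         'sqs-2014-11-05.resources.json',
         's3-2006-03-01.resources.json']
print(find_latest_version('sqs', files))  # 2014-11-05
```

Raising an exception on an empty candidate list mirrors the `NoVersionFound` behavior described in the definitions above.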
<<END>> | 196a2da7490a1a661a0103b8770bd31e34e147f2 | ||
boto__boto3-9 | 9 | boto/boto3 | null | f0c07a858001ff38a1992d62bad589667ee49cd3 | 2014-10-28T22:41:45Z | diff --git a/boto3/data/resources/sqs-2012-11-05.resources.json b/boto3/data/resources/sqs-2012-11-05.resources.json
index 08ebcb3b53..212ae1b0f7 100644
--- a/boto3/data/resources/sqs-2012-11-05.resources.json
+++ b/boto3/data/resources/sqs-2012-11-05.resources.json
@@ -169,16 +169,6 @@
]
}
}
- },
- "hasOne": {
- "Queue": {
- "resource": {
- "type":"Queue",
- "identifiers": [
- { "target":"Url", "sourceType":"identifier", "source":"QueueUrl" }
- ]
- }
- }
}
}
}
diff --git a/boto3/resources/factory.py b/boto3/resources/factory.py
index 8f5595bc5f..c8dafca889 100644
--- a/boto3/resources/factory.py
+++ b/boto3/resources/factory.py
@@ -266,6 +266,13 @@ def _load_references(self, attrs, service_name, resource_name,
reference.resource.type, snake_cased, reference, service_name,
resource_name, model, resource_defs, service_model)
+ for reference in model.reverse_references:
+ snake_cased = xform_name(reference.resource.type)
+ self._check_allowed_name(attrs, snake_cased)
+ attrs[snake_cased] = self._create_reference(
+ reference.resource.type, snake_cased, reference, service_name,
+ resource_name, model, resource_defs, service_model)
+
def _check_allowed_name(self, attrs, name):
"""
Determine if a given name is allowed on the instance, and if not,
diff --git a/boto3/resources/model.py b/boto3/resources/model.py
index f6fc405590..e5c8a6fddf 100644
--- a/boto3/resources/model.py
+++ b/boto3/resources/model.py
@@ -23,6 +23,11 @@
classes as well as by the documentation generator.
"""
+import logging
+
+
+logger = logging.getLogger(__name__)
+
class Identifier(object):
"""
@@ -295,6 +300,48 @@ def references(self):
return references
+ @property
+ def reverse_references(self):
+ """
+ Get a list of reverse reference resources. E.g. an S3 object has
+ a ``bucket_name`` identifier that can be used to instantiate a
+ bucket resource instance.
+ """
+ references = []
+
+ # First, we search for possible reverse references based on the
+ # defined sub-resources in each resource. If the name of this
+ # resource is present, then we are a child. Next, we use the
+ # identifiers to construct a reference definition, append it
+ # to the list of references and return.
+
+ for name, definition in self._resource_defs.items():
+ sub_resources = definition.get('subResources', {})
+ resource_names = sub_resources.get('resources', [])
+
+ if self.name in resource_names:
+ logger.debug('Discovered reverse reference from {0}'
+ ' to {1}'.format(self.name, name))
+
+ identifiers = sub_resources.get('identifiers', {})
+
+ has_one_def = {
+ 'resource': {
+ 'type': name,
+ 'identifiers': []
+ }
+ }
+
+ for target, source in identifiers.items():
+ has_one_def['resource']['identifiers'].append(
+ {'target': target, 'sourceType': 'identifier',
+ 'source': source})
+
+ references.append(
+ Action(name, has_one_def, self._resource_defs))
+
+ return references
+
@property
def collections(self):
"""
| diff --git a/tests/unit/resources/test_factory.py b/tests/unit/resources/test_factory.py
index e9c2c38e3e..665c23950e 100644
--- a/tests/unit/resources/test_factory.py
+++ b/tests/unit/resources/test_factory.py
@@ -432,6 +432,7 @@ def test_resource_lazy_properties_missing_load(self, action_cls):
def test_resource_loads_references(self):
model = {
'shape': 'InstanceShape',
+ 'identifiers': [{'name': 'GroupId'}],
'hasOne': {
'Subnet': {
'resource': {
@@ -445,6 +446,13 @@ def test_resource_loads_references(self):
}
}
defs = {
+ 'Group': {
+ 'identifiers': [{'name': 'Id'}],
+ 'subResources': {
+ 'identifiers': {'Id': 'GroupId'},
+ 'resources': ['Instance']
+ }
+ },
'Subnet': {
'identifiers': [{'name': 'Id'}]
}
@@ -454,6 +462,9 @@ def test_resource_loads_references(self):
'InstanceShape': {
'type': 'structure',
'members': {
+ 'GroupId': {
+ 'shape': 'String'
+ },
'SubnetId': {
'shape': 'String'
}
@@ -465,7 +476,8 @@ def test_resource_loads_references(self):
}
})
- resource = self.load('test', 'Instance', model, defs, service_model)()
+ resource = self.load('test', 'Instance', model, defs,
+ service_model)('group-id')
# Load the resource with no data
resource.meta['data'] = {}
@@ -474,6 +486,8 @@ def test_resource_loads_references(self):
'Resource should have a subnet reference')
self.assertIsNone(resource.subnet,
'Missing identifier, should return None')
+ self.assertTrue(hasattr(resource, 'group'),
+ 'Resource should have a group reverse ref')
# Load the resource with data to instantiate a reference
resource.meta['data'] = {'SubnetId': 'abc123'}
diff --git a/tests/unit/resources/test_model.py b/tests/unit/resources/test_model.py
index f3de482885..9b3ef1a214 100644
--- a/tests/unit/resources/test_model.py
+++ b/tests/unit/resources/test_model.py
@@ -162,6 +162,43 @@ def test_resource_references(self):
self.assertEqual(ref2.resource.type, 'Frob')
self.assertEqual(len(ref2.resource.identifiers), 0)
+ def test_reverse_reference(self):
+ # Here the Code resource has no explicit ``hasOne`` defined, however
+ # by accessing the model's ``reverse_references`` you can see that
+ # it provides such a relation to Frob based on the Code resource's
+ # own identifiers (FrobId in this case).
+ resource_defs = {
+ 'Frob': {
+ 'identifiers': [{'name': 'Id'}],
+ 'subResources': {
+ 'identifiers': {'FrobId': 'Id'},
+ 'resources': ['Code']
+ }
+ },
+ 'Code': {
+ 'identifiers': [
+ {'name': 'FrobId'},
+ {'name': 'Id'}
+ ]
+ }
+ }
+ model_def = resource_defs['Code']
+ model = ResourceModel('Code', model_def, resource_defs)
+
+ references = model.reverse_references
+
+ self.assertIsInstance(references, list)
+ self.assertEqual(len(references), 1,
+ 'Code should have a single reverse ref to Frob')
+
+ ref = references[0]
+ self.assertEqual(ref.name, 'Frob')
+ self.assertEqual(ref.resource.type, 'Frob')
+ self.assertEqual(ref.resource.identifiers[0].target, 'FrobId')
+ self.assertEqual(ref.resource.identifiers[0].source_type,
+ 'identifier')
+ self.assertEqual(ref.resource.identifiers[0].source, 'Id')
+
def test_resource_collections(self):
model = ResourceModel('test', {
'hasMany': {
| diff --git a/boto3/data/resources/sqs-2012-11-05.resources.json b/boto3/data/resources/sqs-2012-11-05.resources.json
index 08ebcb3b53..212ae1b0f7 100644
--- a/boto3/data/resources/sqs-2012-11-05.resources.json
+++ b/boto3/data/resources/sqs-2012-11-05.resources.json
@@ -169,16 +169,6 @@
]
}
}
- },
- "hasOne": {
- "Queue": {
- "resource": {
- "type":"Queue",
- "identifiers": [
- { "target":"Url", "sourceType":"identifier", "source":"QueueUrl" }
- ]
- }
- }
}
}
}
| [
{
"components": [
{
"doc": "Get a list of reverse reference resources. E.g. an S3 object has\na ``bucket_name`` identifier that can be used to instantiate a\nbucket resource instance.",
"lines": [
304,
343
],
"name": "ResourceModel.reverse_references",
... | [
"tests/unit/resources/test_factory.py::TestResourceFactory::test_resource_loads_references",
"tests/unit/resources/test_model.py::TestModels::test_reverse_reference"
] | [
"tests/unit/resources/test_factory.py::TestResourceFactory::test_can_instantiate_service_resource",
"tests/unit/resources/test_factory.py::TestResourceFactory::test_create_resource_calls_load",
"tests/unit/resources/test_factory.py::TestResourceFactory::test_create_service_calls_load",
"tests/unit/resources/t... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Add implicit reverse references.
This pull request adds support for resources to reference their parents,
for example:
``` python
obj = s3.Object('boto3', 'sun.jpg')
print(obj.bucket.name)
```
This works by using the `subResources` feature of the resource JSON
description format. Since `Object` is listed as a child of `Bucket`
in the bucket's sub-resources, and we know the identifier mapping
from an object's `bucket_name` to a bucket's `name`, we can create
the reverse reference just like if an explicit `hasOne` was defined
in the JSON.
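The `subResources` lookup described above can be sketched in a few lines (an illustrative sketch, not boto3's actual implementation; the `resource_defs` fixture below mirrors the shape used in the tests):

```python
def find_reverse_references(resource_name, resource_defs):
    """Find parent resources that list ``resource_name`` in their
    ``subResources``, returning (parent_name, identifier_mapping) pairs.

    The mapping goes from the child's identifier name (e.g. ``FrobId``)
    to the parent's identifier name (e.g. ``Id``).
    """
    references = []
    for parent_name, parent_def in resource_defs.items():
        sub = parent_def.get('subResources', {})
        if resource_name in sub.get('resources', []):
            references.append((parent_name, sub.get('identifiers', {})))
    return references


resource_defs = {
    'Frob': {
        'identifiers': [{'name': 'Id'}],
        'subResources': {
            'identifiers': {'FrobId': 'Id'},
            'resources': ['Code'],
        },
    },
    'Code': {
        'identifiers': [{'name': 'FrobId'}, {'name': 'Id'}],
    },
}

print(find_reverse_references('Code', resource_defs))
# [('Frob', {'FrobId': 'Id'})]
```

Only `Frob` lists `Code` among its `subResources`, so the child resolves a single reverse reference along with the `FrobId` → `Id` identifier mapping.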
Additionally this fixes the SQS resource JSON to no longer explicitly
list the Message hasOne Queue relation, since the Queue has a Message
in its sub-resources.
Adds a test for the resource model as well as modifies the references
test to ensure both normal and reverse references become attributes
on the instantiated resource.
**Note**: this is not very efficient, but caching should fix that in the near
future. I'm open to suggestions to make it better.
cc: @jamesls, @kyleknap
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in boto3/resources/model.py]
(definition of ResourceModel.reverse_references:)
def reverse_references(self):
"""Get a list of reverse reference resources. E.g. an S3 object has
a ``bucket_name`` identifier that can be used to instantiate a
bucket resource instance."""
[end of new definitions in boto3/resources/model.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 196a2da7490a1a661a0103b8770bd31e34e147f2 | |
boto__boto3-8 | 8 | boto/boto3 | null | 9e6bc37d2fb8acd8ee0d779f72276ca268f3e867 | 2014-10-23T20:09:50Z | diff --git a/boto3/resources/factory.py b/boto3/resources/factory.py
index 8dc143fb93..8f5595bc5f 100644
--- a/boto3/resources/factory.py
+++ b/boto3/resources/factory.py
@@ -21,6 +21,7 @@
from .base import ServiceResource
from .collection import CollectionManager
from .model import ResourceModel
+from .response import all_not_none, build_identifiers
from ..compat import json, OrderedDict
from ..exceptions import ResourceLoadException
@@ -138,6 +139,8 @@ def load_from_definition(self, service_name, resource_name, model,
'meta': meta,
}
+ logger.debug('Loading %s:%s', service_name, resource_name)
+
resource_model = ResourceModel(resource_name, model, resource_defs)
self._load_identifiers(attrs, meta, resource_model)
@@ -148,6 +151,8 @@ def load_from_definition(self, service_name, resource_name, model,
self._load_attributes(attrs, meta, resource_model, service_model)
self._load_collections(attrs, resource_model, resource_defs,
service_model)
+ self._load_references(attrs, service_name, resource_name,
+ resource_model, resource_defs, service_model)
# Create the name based on the requested service and resource
cls_name = resource_name
@@ -234,6 +239,12 @@ def _load_attributes(self, attrs, meta, model, service_model):
snake_cased)
def _load_collections(self, attrs, model, resource_defs, service_model):
+ """
+ Load resource collections from the model. Each collection becomes
+ a :py:class:`~boto3.resources.collection.CollectionManager` instance
+ on the resource instance, which allows you to iterate and filter
+ through the collection's items.
+ """
for collection_model in model.collections:
snake_cased = xform_name(collection_model.name)
self._check_allowed_name(attrs, snake_cased)
@@ -241,6 +252,20 @@ def _load_collections(self, attrs, model, resource_defs, service_model):
attrs[snake_cased] = self._create_collection(snake_cased,
collection_model, resource_defs, service_model)
+ def _load_references(self, attrs, service_name, resource_name,
+ model, resource_defs, service_model):
+ """
+ Load references, which are related resource instances. For example,
+ an EC2 instance would have a ``vpc`` reference, which is an instance
+ of an EC2 VPC resource.
+ """
+ for reference in model.references:
+ snake_cased = xform_name(reference.resource.type)
+ self._check_allowed_name(attrs, snake_cased)
+ attrs[snake_cased] = self._create_reference(
+ reference.resource.type, snake_cased, reference, service_name,
+ resource_name, model, resource_defs, service_model)
+
def _check_allowed_name(self, attrs, name):
"""
Determine if a given name is allowed on the instance, and if not,
@@ -292,6 +317,35 @@ def get_collection(self):
get_collection.__doc__ = 'TODO'
return property(get_collection)
+ def _create_reference(factory_self, name, snake_cased, reference,
+ service_name, resource_name, model, resource_defs,
+ service_model):
+ """
+ Creates a new property on the resource to lazy-load a reference.
+ """
+ def get_reference(self):
+ # We need to lazy-evaluate the reference to handle circular
+ # references between resources. We do this by loading the class
+ # when first accessed.
+ # First, though, we need to see if we have the required
+ # identifiers to instantiate the resource reference.
+ identifiers = build_identifiers(
+ reference.resource.identifiers, self, {}, {})
+ resource = None
+ if all_not_none(identifiers.values()):
+ # Identifiers are present, so now we can create the resource
+ # instance using them.
+ resource_type = reference.resource.type
+ cls = factory_self.load_from_definition(
+ service_name, name, resource_defs.get(resource_type),
+ resource_defs, service_model)
+ resource = cls(**identifiers)
+ return resource
+
+ get_reference.__name__ = str(snake_cased)
+ get_reference.__doc__ = 'TODO'
+ return property(get_reference)
+
def _create_class_partial(factory_self, resource_cls, identifiers=None):
"""
Creates a new method which acts as a functools.partial, passing
diff --git a/boto3/resources/model.py b/boto3/resources/model.py
index 463f61f981..f6fc405590 100644
--- a/boto3/resources/model.py
+++ b/boto3/resources/model.py
@@ -279,6 +279,22 @@ def actions(self):
return actions
+ @property
+ def references(self):
+ """
+ Get a list of reference resources.
+
+ :type: list(:py:class:`ResponseResource`)
+ """
+ references = []
+
+ for key in ['hasOne', 'hasSome']:
+ for name, definition in self._definition.get(key, {}).items():
+ references.append(
+ Action(name, definition, self._resource_defs))
+
+ return references
+
@property
def collections(self):
"""
| diff --git a/tests/unit/resources/test_factory.py b/tests/unit/resources/test_factory.py
index 89315ecd45..e9c2c38e3e 100644
--- a/tests/unit/resources/test_factory.py
+++ b/tests/unit/resources/test_factory.py
@@ -11,7 +11,7 @@
# ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
-from botocore.model import ServiceModel
+from botocore.model import ServiceModel, StructureShape
from boto3.exceptions import ResourceLoadException
from boto3.resources.base import ServiceResource
from boto3.resources.factory import ResourceFactory
@@ -429,6 +429,57 @@ def test_resource_lazy_properties_missing_load(self, action_cls):
with self.assertRaises(ResourceLoadException):
resource.last_modified
+ def test_resource_loads_references(self):
+ model = {
+ 'shape': 'InstanceShape',
+ 'hasOne': {
+ 'Subnet': {
+ 'resource': {
+ 'type': 'Subnet',
+ 'identifiers': [
+ {'target': 'Id', 'sourceType': 'dataMember',
+ 'source': 'SubnetId'}
+ ]
+ }
+ }
+ }
+ }
+ defs = {
+ 'Subnet': {
+ 'identifiers': [{'name': 'Id'}]
+ }
+ }
+ service_model = ServiceModel({
+ 'shapes': {
+ 'InstanceShape': {
+ 'type': 'structure',
+ 'members': {
+ 'SubnetId': {
+ 'shape': 'String'
+ }
+ }
+ },
+ 'String': {
+ 'type': 'string'
+ }
+ }
+ })
+
+ resource = self.load('test', 'Instance', model, defs, service_model)()
+
+ # Load the resource with no data
+ resource.meta['data'] = {}
+
+ self.assertTrue(hasattr(resource, 'subnet'),
+ 'Resource should have a subnet reference')
+ self.assertIsNone(resource.subnet,
+ 'Missing identifier, should return None')
+
+ # Load the resource with data to instantiate a reference
+ resource.meta['data'] = {'SubnetId': 'abc123'}
+ self.assertIsInstance(resource.subnet, ServiceResource)
+ self.assertEqual(resource.subnet.id, 'abc123')
+
@mock.patch('boto3.resources.factory.CollectionManager')
@mock.patch('boto3.resources.model.Collection')
def test_resource_loads_collections(self, mock_model, collection_cls):
diff --git a/tests/unit/resources/test_model.py b/tests/unit/resources/test_model.py
index 51741f4cfe..f3de482885 100644
--- a/tests/unit/resources/test_model.py
+++ b/tests/unit/resources/test_model.py
@@ -120,6 +120,48 @@ def test_sub_resources(self):
resource = model.sub_resources.resources[0]
self.assertEqual(resource.name, 'Frob')
+ def test_resource_references(self):
+ model_def = {
+ 'hasOne': {
+ 'Frob': {
+ 'resource': {
+ 'type': 'Frob',
+ 'identifiers': [
+ {'target':'Id', 'sourceType':'dataMember',
+ 'source':'FrobId'}
+ ]
+ },
+ }
+ },
+ 'hasSome': {
+ 'Frobs': {
+ 'resource': {
+ 'type': 'Frob'
+ }
+ }
+ }
+ }
+ resource_defs = {
+ 'Frob': {}
+ }
+ model = ResourceModel('test', model_def, resource_defs)
+
+ self.assertIsInstance(model.references, list)
+ self.assertEqual(len(model.references), 2)
+
+ ref = model.references[0]
+ self.assertEqual(ref.name, 'Frob')
+ self.assertEqual(ref.resource.type, 'Frob')
+ self.assertEqual(ref.resource.identifiers[0].target, 'Id')
+ self.assertEqual(ref.resource.identifiers[0].source_type,
+ 'dataMember')
+ self.assertEqual(ref.resource.identifiers[0].source, 'FrobId')
+
+ ref2 = model.references[1]
+ self.assertEqual(ref2.name, 'Frobs')
+ self.assertEqual(ref2.resource.type, 'Frob')
+ self.assertEqual(len(ref2.resource.identifiers), 0)
+
def test_resource_collections(self):
model = ResourceModel('test', {
'hasMany': {
| [
{
"components": [
{
"doc": "Load references, which are related resource instances. For example,\nan EC2 instance would have a ``vpc`` reference, which is an instance\nof an EC2 VPC resource.",
"lines": [
255,
267
],
"name": "ResourceFactory._load_referen... | [
"tests/unit/resources/test_factory.py::TestResourceFactory::test_resource_loads_references",
"tests/unit/resources/test_model.py::TestModels::test_resource_references"
] | [
"tests/unit/resources/test_factory.py::TestResourceFactory::test_can_instantiate_service_resource",
"tests/unit/resources/test_factory.py::TestResourceFactory::test_create_resource_calls_load",
"tests/unit/resources/test_factory.py::TestResourceFactory::test_create_service_calls_load",
"tests/unit/resources/t... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Add references to resources.
This change exposes `hasOne` and `hasSome` references on resources, such as
an EC2 `instance.subnet` or `instance.vpc`. Both the subnet and VPC are
resource instances created from data available on the instance, in this
case via the `subnet_id` and `vpc_id` data members. This allows code
like the following:
``` python
import boto3
ec2 = boto3.resource('ec2')
for instance in ec2.instances.all():
print(instance.vpc.tags)
```
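Under the hood each reference resolves lazily: the target resource's identifiers are built from data already loaded on the parent, and the property yields `None` when a required value is missing. A simplified sketch of that resolution step (the helper name and shapes are illustrative, not the real factory internals):

```python
def resolve_reference(parent_data, identifier_map):
    """Build the target resource's identifiers from the parent's data.

    ``identifier_map`` maps target identifier names to source data
    members on the parent (e.g. {'Id': 'SubnetId'}). Returns None if
    any required source value is missing, mirroring the behavior of
    the lazy reference property.
    """
    identifiers = {}
    for target, source in identifier_map.items():
        value = parent_data.get(source)
        if value is None:
            return None  # cannot instantiate the reference yet
        identifiers[target] = value
    return identifiers


# Parent not yet loaded: no SubnetId available, so no reference.
print(resolve_reference({}, {'Id': 'SubnetId'}))
# None

# Once loaded, the identifiers can back a Subnet instance.
print(resolve_reference({'SubnetId': 'abc123'}, {'Id': 'SubnetId'}))
# {'Id': 'abc123'}
```

This mirrors the factory's behavior in the tests: with empty data `resource.subnet` is `None`, and once `SubnetId` is present a `Subnet` instance can be constructed from the resulting identifiers.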
Updates the resource data model, updates and adds tests.
cc @jamesls, @kyleknap
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in boto3/resources/factory.py]
(definition of ResourceFactory._load_references:)
def _load_references(self, attrs, service_name, resource_name, model, resource_defs, service_model):
"""Load references, which are related resource instances. For example,
an EC2 instance would have a ``vpc`` reference, which is an instance
of an EC2 VPC resource."""
(definition of ResourceFactory._create_reference:)
def _create_reference(factory_self, name, snake_cased, reference, service_name, resource_name, model, resource_defs, service_model):
"""Creates a new property on the resource to lazy-load a reference."""
(definition of ResourceFactory._create_reference.get_reference:)
def get_reference(self):
[end of new definitions in boto3/resources/factory.py]
[start of new definitions in boto3/resources/model.py]
(definition of ResourceModel.references:)
def references(self):
"""Get a list of reference resources.
:type: list(:py:class:`ResponseResource`)"""
[end of new definitions in boto3/resources/model.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 196a2da7490a1a661a0103b8770bd31e34e147f2 | ||
boto__boto3-7 | 7 | boto/boto3 | null | e47ef425ec8c32538c36c7b6ea417f2327050ba4 | 2014-10-22T23:04:00Z | diff --git a/boto3/resources/action.py b/boto3/resources/action.py
index a4fd7252e8..22291c24d4 100644
--- a/boto3/resources/action.py
+++ b/boto3/resources/action.py
@@ -29,8 +29,8 @@ class ServiceAction(object):
The action may construct parameters from existing resource identifiers
and may return either a raw response or a new resource instance.
- :type action_def: dict
- :param action_def: The action definition.
+ :type action_model: :py:class`~boto3.resources.model.Action`
+ :param action_model: The action model.
:type factory: ResourceFactory
:param factory: The factory that created the resource class to which
this action is attached.
@@ -39,19 +39,18 @@ class ServiceAction(object):
:type service_model: :ref:`botocore.model.ServiceModel`
:param service_model: The Botocore service model
"""
- def __init__(self, action_def, factory=None, resource_defs=None,
+ def __init__(self, action_model, factory=None, resource_defs=None,
service_model=None):
- self.action_def = action_def
+ self.action_model = action_model
- search_path = action_def.get('path')
+ search_path = action_model.path
# In the simplest case we just return the response, but if a
# resource is defined, then we must create these before returning.
- response_resource_def = action_def.get('resource', {})
- if response_resource_def:
+ if action_model.resource:
self.response_handler = ResourceHandler(search_path, factory,
- resource_defs, service_model, response_resource_def,
- action_def.get('request', {}).get('operation'))
+ resource_defs, service_model, action_model.resource,
+ action_model.request.operation)
else:
self.response_handler = RawHandler(search_path)
@@ -65,13 +64,12 @@ def __call__(self, parent, *args, **kwargs):
:rtype: dict or ServiceResource or list(ServiceResource)
:return: The response, either as a raw dict or resource instance(s).
"""
- request_def = self.action_def.get('request', {})
- operation_name = xform_name(request_def.get('operation', ''))
+ operation_name = xform_name(self.action_model.request.operation)
# First, build predefined params and then update with the
# user-supplied kwargs, which allows overriding the pre-built
# params if needed.
- params = create_request_parameters(parent, request_def)
+ params = create_request_parameters(parent, self.action_model.request)
params.update(kwargs)
logger.info('Calling %s:%s with %r', parent.meta['service_name'],
diff --git a/boto3/resources/collection.py b/boto3/resources/collection.py
index f86cd2cc63..52c0613b67 100644
--- a/boto3/resources/collection.py
+++ b/boto3/resources/collection.py
@@ -32,19 +32,19 @@ class ResourceCollection(object):
See :ref:`guide_collections` for a high-level overview of collections,
including when remote service requests are performed.
- :type definition: dict
- :param definition: Collection definition
+ :type model: :py:class:`~boto3.resources.model.Collection`
+ :param model: Collection model
:type parent: :py:class:`~boto3.resources.base.ServiceResource`
:param parent: The collection's parent resource
:type handler: :py:class:`~boto3.resources.response.ResourceHandler`
:param handler: The resource response handler used to create resource
instances
"""
- def __init__(self, definition, parent, handler, **kwargs):
- self._definition = definition
+ def __init__(self, model, parent, handler, **kwargs):
+ self._model = model
self._parent = parent
self._py_operation_name = xform_name(
- definition.get('request', {}).get('operation', ''))
+ model.request.operation)
self._handler = handler
self._params = kwargs
@@ -54,7 +54,7 @@ def __repr__(self):
self._parent,
'{0}.{1}'.format(
self._parent.meta['service_name'],
- self._definition.get('resource', {}).get('type')
+ self._model.resource.type
)
)
@@ -70,7 +70,7 @@ def __iter__(self):
page_size = cleaned_params.pop('page_size', None)
params = create_request_parameters(
- self._parent, self._definition.get('request', {}))
+ self._parent, self._model.request)
params.update(cleaned_params)
# Is this a paginated operation? If so, we need to get an
@@ -123,7 +123,7 @@ def _clone(self, **kwargs):
"""
params = copy.deepcopy(self._params)
params.update(kwargs)
- clone = self.__class__(self._definition, self._parent,
+ clone = self.__class__(self._model, self._parent,
self._handler, **params)
return clone
@@ -224,8 +224,8 @@ class CollectionManager(object):
See :ref:`guide_collections` for a high-level overview of collections,
including when remote service requests are performed.
- :type collection_def: dict
- :param collection_def: Collection definition
+ :type model: :py:class:`~boto3.resources.model.Collection`
+ :param model: Collection model
:type parent: :py:class:`~boto3.resources.base.ServiceResource`
:param parent: The collection's parent resource
:type factory: :py:class:`~boto3.resources.factory.ResourceFactory`
@@ -235,16 +235,15 @@ class CollectionManager(object):
:type service_model: :ref:`botocore.model.ServiceModel`
:param service_model: The Botocore service model
"""
- def __init__(self, collection_def, parent, factory, resource_defs,
+ def __init__(self, model, parent, factory, resource_defs,
service_model):
- self._definition = collection_def
- operation_name = self._definition.get('request', {}).get('operation')
+ self._model = model
+ operation_name = self._model.request.operation
self._parent = parent
- search_path = collection_def.get('path', '')
- response_resource_def = collection_def.get('resource')
+ search_path = model.path
self._handler = ResourceHandler(search_path, factory, resource_defs,
- service_model, response_resource_def, operation_name)
+ service_model, model.resource, operation_name)
def __repr__(self):
return '{0}({1}, {2})'.format(
@@ -252,7 +251,7 @@ def __repr__(self):
self._parent,
'{0}.{1}'.format(
self._parent.meta['service_name'],
- self._definition.get('resource', {}).get('type')
+ self._model.resource.type
)
)
@@ -263,7 +262,7 @@ def iterator(self, **kwargs):
:rtype: :py:class:`ResourceCollection`
:return: An iterable representing the collection of resources
"""
- return ResourceCollection(self._definition, self._parent,
+ return ResourceCollection(self._model, self._parent,
self._handler, **kwargs)
# Set up some methods to proxy ResourceCollection methods
diff --git a/boto3/resources/factory.py b/boto3/resources/factory.py
index fbaf55e587..8dc143fb93 100644
--- a/boto3/resources/factory.py
+++ b/boto3/resources/factory.py
@@ -20,6 +20,7 @@
from .action import ServiceAction
from .base import ServiceResource
from .collection import CollectionManager
+from .model import ResourceModel
from ..compat import json, OrderedDict
from ..exceptions import ResourceLoadException
@@ -137,12 +138,16 @@ def load_from_definition(self, service_name, resource_name, model,
'meta': meta,
}
- self._load_identifiers(attrs, meta, model)
- self._load_subresources(attrs, service_name, resource_name, model,
- resource_defs, service_model)
- self._load_actions(attrs, model, resource_defs, service_model)
- self._load_attributes(attrs, meta, model, service_model)
- self._load_collections(attrs, model, resource_defs, service_model)
+ resource_model = ResourceModel(resource_name, model, resource_defs)
+
+ self._load_identifiers(attrs, meta, resource_model)
+ self._load_subresources(attrs, service_name, resource_name,
+ resource_model, resource_defs, service_model)
+ self._load_actions(attrs, resource_model, resource_defs,
+ service_model)
+ self._load_attributes(attrs, meta, resource_model, service_model)
+ self._load_collections(attrs, resource_model, resource_defs,
+ service_model)
# Create the name based on the requested service and resource
cls_name = resource_name
@@ -157,8 +162,8 @@ def _load_identifiers(self, attrs, meta, model):
the resource cannot be used. Identifiers become arguments for
operations on the resource.
"""
- for identifier in model.get('identifiers', []):
- snake_cased = xform_name(identifier['name'])
+ for identifier in model.identifiers:
+ snake_cased = xform_name(identifier.name)
self._check_allowed_name(attrs, snake_cased)
meta['identifiers'].append(snake_cased)
attrs[snake_cased] = None
@@ -175,19 +180,20 @@ def _load_subresources(self, attrs, service_name, resource_name,
# This is a service, so dangle all the resource_defs as if
# they were subresources of the service itself.
for name, resource_def in resource_defs.items():
- cls = self.load_from_definition(service_name, name,
- resource_defs.get(name), resource_defs, service_model)
+ cls = self.load_from_definition(
+ service_name, name, resource_defs.get(name, {}),
+ resource_defs, service_model)
attrs[name] = self._create_class_partial(cls)
# For non-services, subresources are explicitly listed
- sub_resources = model.get('subResources', {})
- if sub_resources:
- identifiers = sub_resources.get('identifiers', {})
- for name in sub_resources.get('resources'):
- klass = self.load_from_definition(service_name, name,
- resource_defs.get(name), resource_defs, service_model)
- attrs[name] = self._create_class_partial(klass,
- identifiers=identifiers)
+ if model.sub_resources:
+ identifiers = model.sub_resources.identifiers
+ for name in model.sub_resources.resource_names:
+ cls = self.load_from_definition(
+ service_name, name, resource_defs.get(name, {}),
+ resource_defs, service_model)
+ attrs[name] = self._create_class_partial(
+ cls, identifiers=identifiers)
def _load_actions(self, attrs, model, resource_defs, service_model):
"""
@@ -195,15 +201,14 @@ def _load_actions(self, attrs, model, resource_defs, service_model):
being a special case which sets internal data for attributes, and
``reload`` is an alias for ``load``.
"""
- if 'load' in model:
- load_def = model.get('load')
-
- attrs['load'] = self._create_action('load',
- load_def, resource_defs, service_model, is_load=True)
+ if model.load:
+ attrs['load'] = self._create_action(
+ 'load', model.load, resource_defs, service_model,
+ is_load=True)
attrs['reload'] = attrs['load']
- for name, action in model.get('actions', {}).items():
- snake_cased = xform_name(name)
+ for action in model.actions:
+ snake_cased = xform_name(action.name)
self._check_allowed_name(attrs, snake_cased)
attrs[snake_cased] = self._create_action(snake_cased,
action, resource_defs, service_model)
@@ -215,8 +220,8 @@ def _load_attributes(self, attrs, meta, model, service_model):
is defined in the Botocore service JSON, hence the need for
access to the ``service_model``.
"""
- if 'shape' in model:
- shape = service_model.shape_for(model.get('shape'))
+ if model.shape:
+ shape = service_model.shape_for(model.shape)
for name, member in shape.members.items():
snake_cased = xform_name(name)
@@ -229,12 +234,12 @@ def _load_attributes(self, attrs, meta, model, service_model):
snake_cased)
def _load_collections(self, attrs, model, resource_defs, service_model):
- for name, definition in model.get('hasMany', {}).items():
- snake_cased = xform_name(name)
+ for collection_model in model.collections:
+ snake_cased = xform_name(collection_model.name)
self._check_allowed_name(attrs, snake_cased)
attrs[snake_cased] = self._create_collection(snake_cased,
- definition, resource_defs, service_model)
+ collection_model, resource_defs, service_model)
def _check_allowed_name(self, attrs, name):
"""
@@ -274,13 +279,13 @@ def property_loader(self):
property_loader.__doc__ = 'TODO'
return property(property_loader)
- def _create_collection(factory_self, snake_cased, collection_def,
+ def _create_collection(factory_self, snake_cased, collection_model,
resource_defs, service_model):
"""
Creates a new property on the resource to lazy-load a collection.
"""
def get_collection(self):
- return CollectionManager(collection_def,
+ return CollectionManager(collection_model,
self, factory_self, resource_defs, service_model)
get_collection.__name__ = str(snake_cased)
@@ -328,7 +333,7 @@ def create_resource(self, *args, **kwargs):
create_resource.__doc__ = doc.format(resource_cls)
return create_resource
- def _create_action(factory_self, snake_cased, action_def, resource_defs,
+ def _create_action(factory_self, snake_cased, action_model, resource_defs,
service_model, is_load=False):
"""
Creates a new method which makes a request to the underlying
@@ -337,7 +342,7 @@ def _create_action(factory_self, snake_cased, action_def, resource_defs,
# Create the action in in this closure but before the ``do_action``
# method below is invoked, which allows instances of the resource
# to share the ServiceAction instance.
- action = ServiceAction(action_def, factory=factory_self,
+ action = ServiceAction(action_model, factory=factory_self,
resource_defs=resource_defs, service_model=service_model)
# A resource's ``load`` method is special because it sets
diff --git a/boto3/resources/model.py b/boto3/resources/model.py
new file mode 100644
index 0000000000..463f61f981
--- /dev/null
+++ b/boto3/resources/model.py
@@ -0,0 +1,294 @@
+# Copyright 2014 Amazon.com, Inc. or its affiliates. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License"). You
+# may not use this file except in compliance with the License. A copy of
+# the License is located at
+#
+# http://aws.amazon.com/apache2.0/
+#
+# or in the "license" file accompanying this file. This file is
+# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
+# ANY KIND, either express or implied. See the License for the specific
+# language governing permissions and limitations under the License.
+
+"""
+The models defined in this file represent the resource JSON description
+format and provide a layer of abstraction from the raw JSON. The advantages
+of this are:
+
+* Pythonic interface (e.g. ``action.request.operation``)
+* Consumers need not change for minor JSON changes (e.g. renamed field)
+
+These models are used both by the resource factory to generate resource
+classes as well as by the documentation generator.
+"""
+
+
+class Identifier(object):
+ """
+ A resource identifier, given by its name.
+
+ :type name: string
+ :param name: The name of the identifier
+ """
+ def __init__(self, name):
+ #: (``string``) The name of the identifier
+ self.name = name
+
+
+class Action(object):
+ """
+ A service operation action.
+
+ :type name: string
+ :param name: The name of the action
+ :type definition: dict
+ :param definition: The JSON definition
+ :type resource_defs: dict
+ :param resource_defs: All resources defined in the service
+ """
+ def __init__(self, name, definition, resource_defs):
+ self._definition = definition
+
+ #: (``string``) The name of the action
+ self.name = name
+ #: (:py:class:`Request`) This action's request or ``None``
+ self.request = None
+ if 'request' in definition:
+ self.request = Request(definition.get('request', {}))
+ #: (:py:class:`ResponseResource`) This action's resource or ``None``
+ self.resource = None
+ if 'resource' in definition:
+ self.resource = ResponseResource(definition.get('resource', {}),
+ resource_defs)
+ #: (``string``) The JMESPath search path or ``None``
+ self.path = definition.get('path')
+
+
+class Request(object):
+ """
+ A service operation action request.
+
+ :type definition: dict
+ :param definition: The JSON definition
+ """
+ def __init__(self, definition):
+ self._definition = definition
+
+ #: (``string``) The name of the low-level service operation
+ self.operation = definition.get('operation')
+
+ @property
+ def params(self):
+ """
+ Get a list of auto-filled parameters for this request.
+
+ :type: list(:py:class:`Parameter`)
+ """
+ params = []
+
+ for item in self._definition.get('params', []):
+ params.append(
+ Parameter(item['target'], item['sourceType'], item['source']))
+
+ return params
+
+
+class Parameter(object):
+ """
+ An auto-filled parameter which has a source and target. For example,
+ the ``QueueUrl`` may be auto-filled from a resource's ``url`` identifier
+ when making calls to ``queue.receive_messages``.
+
+ :type target: string
+ :param target: The destination parameter name, e.g. ``QueueUrl``
+ :type source_type: string
+ :param source_type: Where the source is defined.
+ :type source: string
+ :param source: The source name, e.g. ``Url``
+ """
+ def __init__(self, target, source_type, source):
+ #: (``string``) The destination parameter name
+ self.target = target
+ #: (``string``) Where the source is defined
+ self.source_type = source_type
+ #: (``string``) The source name
+ self.source = source
+
+
+class ResponseResource(object):
+ """
+ A resource response to create after performing an action.
+
+ :type definition: dict
+ :param definition: The JSON definition
+ :type resource_defs: dict
+ :param resource_defs: All resources defined in the service
+ """
+ def __init__(self, definition, resource_defs):
+ self._definition = definition
+ self._resource_defs = resource_defs
+
+ #: (``string``) The name of the response resource type
+ self.type = definition.get('type')
+
+ @property
+ def identifiers(self):
+ """
+ A list of resource identifiers.
+
+ :type: list(:py:class:`Parameter`)
+ """
+ identifiers = []
+
+ for item in self._definition.get('identifiers', []):
+ identifiers.append(
+ Parameter(item['target'], item['sourceType'], item['source']))
+
+ return identifiers
+
+ @property
+ def model(self):
+ """
+ Get the resource model for the response resource.
+
+ :type: :py:class:`ResourceModel`
+ """
+ return ResourceModel(self.type, self._resource_defs[self.type],
+ self._resource_defs)
+
+
+class Collection(Action):
+ """
+ A group of resources. See :py:class:`Action`.
+
+ :type name: string
+ :param name: The name of the collection
+ :type definition: dict
+ :param definition: The JSON definition
+ :type resource_defs: dict
+ :param resource_defs: All resources defined in the service
+ """
+ pass
+
+
+class SubResourceList(object):
+ """
+ A list of information about sub-resources. It includes access
+ to identifiers as well as resource names and models.
+
+ :type definition: dict
+ :param definition: The JSON definition
+ :type resource_defs: dict
+ :param resource_defs: All resources defined in the service
+ """
+ def __init__(self, definition, resource_defs):
+ self._definition = definition
+ self._resource_defs = resource_defs
+
+ #: (``dict``) Identifier key:value pairs
+ self.identifiers = definition.get('identifiers', {})
+ #: (``list``) A list of resource names
+ self.resource_names = definition.get('resources', [])
+
+ @property
+ def resources(self):
+ """
+ Get a list of resource models contained in this sub-resource
+ entry.
+
+ :type: list(:py:class:`ResourceModel`)
+ """
+ resources = []
+
+ for name in self.resource_names:
+ resources.append(
+ ResourceModel(name, self._resource_defs.get(name, {}),
+ self._resource_defs))
+
+ return resources
+
+
+class ResourceModel(object):
+ """
+ A model representing a resource, defined via a JSON description
+ format. A resource has identifiers, attributes, actions,
+ sub-resources, references and collections. For more information
+ on resources, see :ref:`guide_resources`.
+
+ :type name: string
+ :param name: The name of this resource, e.g. ``sqs`` or ``Queue``
+ :type definition: dict
+ :param definition: The JSON definition
+ :type resource_defs: dict
+ :param resource_defs: All resources defined in the service
+ """
+ def __init__(self, name, definition, resource_defs):
+ self._definition = definition
+ self._resource_defs = resource_defs
+
+ #: (``string``) The name of this resource
+ self.name = name
+ #: (``string``) The service shape name for this resource or ``None``
+ self.shape = definition.get('shape')
+ #: (:py:class:`SubResourceList`) Sub-resource information or ``None``
+ self.sub_resources = None
+ if 'subResources' in definition:
+ self.sub_resources = SubResourceList(
+ definition.get('subResources', {}), resource_defs)
+
+ @property
+ def identifiers(self):
+ """
+ Get a list of resource identifiers.
+
+ :type: list(:py:class:`Identifier`)
+ """
+ identifiers = []
+
+ for item in self._definition.get('identifiers', []):
+ identifiers.append(Identifier(item['name']))
+
+ return identifiers
+
+ @property
+ def load(self):
+ """
+ Get the load action for this resource, if it is defined.
+
+ :type: :py:class:`Action` or ``None``
+ """
+ action = self._definition.get('load')
+
+ if action is not None:
+ action = Action('load', action, self._resource_defs)
+
+ return action
+
+ @property
+ def actions(self):
+ """
+ Get a list of actions for this resource.
+
+ :type: list(:py:class:`Action`)
+ """
+ actions = []
+
+ for name, item in self._definition.get('actions', {}).items():
+ actions.append(Action(name, item, self._resource_defs))
+
+ return actions
+
+ @property
+ def collections(self):
+ """
+ Get a list of collections for this resource.
+
+ :type: list(:py:class:`Collection`)
+ """
+ collections = []
+
+ for name, item in self._definition.get('hasMany', {}).items():
+ collections.append(Collection(name, item, self._resource_defs))
+
+ return collections
diff --git a/boto3/resources/params.py b/boto3/resources/params.py
index 1c692187ee..1e9cac1065 100644
--- a/boto3/resources/params.py
+++ b/boto3/resources/params.py
@@ -14,24 +14,24 @@
from botocore import xform_name
-def create_request_parameters(parent, request_def):
+def create_request_parameters(parent, request_model):
"""
Handle request parameters that can be filled in from identifiers,
resource data members or constants.
:type parent: ServiceResource
:param parent: The resource instance to which this action is attached.
- :type request_def: dict
- :param request_def: The action request definition.
+ :type request_model: :py:class:`~boto3.resources.model.Request`
+ :param request_model: The action request model.
:rtype: dict
:return: Pre-filled parameters to be sent to the request operation.
"""
params = {}
- for param in request_def.get('params', []):
- source = param.get('source', '')
- source_type = param.get('sourceType', '')
- target = param.get('target', '')
+ for param in request_model.params:
+ source = param.source
+ source_type = param.source_type
+ target = param.target
if source_type in ['identifier', 'dataMember']:
# Resource identifier, e.g. queue.url
diff --git a/boto3/resources/response.py b/boto3/resources/response.py
index 0ad8748872..fa3a610c42 100644
--- a/boto3/resources/response.py
+++ b/boto3/resources/response.py
@@ -27,15 +27,16 @@ def all_not_none(iterable):
return True
-def build_identifiers(identifiers_def, parent, params, raw_response):
+def build_identifiers(identifiers, parent, params, raw_response):
"""
Builds a mapping of identifier names to values based on the
identifier source location, type, and target. Identifier
values may be scalars or lists depending on the source type
and location.
- :type identifiers_def: list
- :param identifiers_def: List of identifier definitions
+ :type identifiers: list
+ :param identifiers: List of :py:class:`~boto3.resources.model.Parameter`
+ definitions
:type parent: ServiceResource
:param parent: The resource instance to which this action is attached.
:type params: dict
@@ -45,10 +46,10 @@ def build_identifiers(identifiers_def, parent, params, raw_response):
"""
results = {}
- for identifier in identifiers_def:
- source = identifier.get('source', '')
- source_type = identifier.get('sourceType')
- target = identifier.get('target')
+ for identifier in identifiers:
+ source = identifier.source
+ source_type = identifier.source_type
+ target = identifier.target
if source_type == 'responsePath':
value = jmespath.search(source, raw_response)
@@ -159,20 +160,20 @@ class ResourceHandler(object):
:param resource_defs: Service resource definitions.
:type service_model: :ref:`botocore.model.ServiceModel`
:param service_model: The Botocore service model
- :type resource_def: dict
- :param resource_def: Response resource definition.
+ :type resource_model: :py:class:`~boto3.resources.model.ResponseResource`
+ :param resource_model: Response resource model.
:type operation_name: string
:param operation_name: Name of the underlying service operation
:rtype: ServiceResource or list
:return: New resource instance(s).
"""
def __init__(self, search_path, factory, resource_defs, service_model,
- resource_def, operation_name):
+ resource_model, operation_name):
self.search_path = search_path
self.factory = factory
self.resource_defs = resource_defs
self.service_model = service_model
- self.resource_def = resource_def
+ self.resource_model = resource_model
self.operation_name = operation_name
def __call__(self, parent, params, response):
@@ -184,7 +185,7 @@ def __call__(self, parent, params, response):
:type response: dict
:param response: Low-level operation response.
"""
- resource_name = self.resource_def.get('type', '')
+ resource_name = self.resource_model.type
resource_cls = self.factory.load_from_definition(
parent.meta['service_name'], resource_name,
self.resource_defs.get(resource_name), self.resource_defs,
@@ -206,7 +207,7 @@ def __call__(self, parent, params, response):
# resource that is instantiated. Items which are not a list will
# be set as the same value on each new resource instance.
identifiers = build_identifiers(
- self.resource_def.get('identifiers', []), parent, params,
+ self.resource_model.identifiers, parent, params,
raw_response)
# If any of the identifiers is a list, then the response is plural
diff --git a/docs/source/reference/core/resources.rst b/docs/source/reference/core/resources.rst
index 4fc0cea325..f868487f02 100644
--- a/docs/source/reference/core/resources.rst
+++ b/docs/source/reference/core/resources.rst
@@ -4,6 +4,13 @@
Resources Reference
===================
+Resource Model
+--------------
+
+.. automodule:: boto3.resources.model
+ :members:
+ :undoc-members:
+
Request Parameters
------------------
| diff --git a/tests/unit/resources/test_action.py b/tests/unit/resources/test_action.py
index 5ba82a1ef6..05b6322edb 100644
--- a/tests/unit/resources/test_action.py
+++ b/tests/unit/resources/test_action.py
@@ -12,6 +12,7 @@
# language governing permissions and limitations under the License.
from boto3.resources.action import ServiceAction
+from boto3.resources.model import Action
from tests import BaseTestCase, mock
@@ -26,6 +27,10 @@ def setUp(self):
}
}
+ @property
+ def action(self):
+ return Action('test', self.action_def, {})
+
@mock.patch('boto3.resources.action.create_request_parameters',
return_value={})
def test_service_action_creates_params(self, params_mock):
@@ -35,7 +40,7 @@ def test_service_action_creates_params(self, params_mock):
'client': mock.Mock(),
}
- action = ServiceAction(self.action_def)
+ action = ServiceAction(self.action)
action(resource, foo=1)
@@ -53,7 +58,7 @@ def test_service_action_calls_operation(self, params_mock):
operation = resource.meta['client'].get_frobs
operation.return_value = 'response'
- action = ServiceAction(self.action_def)
+ action = ServiceAction(self.action)
response = action(resource, foo=1)
@@ -73,7 +78,7 @@ def test_service_action_calls_raw_handler(self, handler_mock, params_mock):
operation = resource.meta['client'].get_frobs
operation.return_value = 'response'
- action = ServiceAction(self.action_def)
+ action = ServiceAction(self.action)
handler_mock.return_value.return_value = 'response'
@@ -103,7 +108,9 @@ def test_service_action_calls_resource_handler(self, handler_mock, params_mock):
resource_defs = {}
service_model = mock.Mock()
- action = ServiceAction(self.action_def, factory=factory,
+ action_model = self.action
+
+ action = ServiceAction(action_model, factory=factory,
resource_defs=resource_defs, service_model=service_model)
handler_mock.return_value.return_value = 'response'
@@ -111,6 +118,6 @@ def test_service_action_calls_resource_handler(self, handler_mock, params_mock):
action(resource)
handler_mock.assert_called_with('Container', factory, resource_defs,
- service_model, self.action_def['resource'],
+ service_model, action_model.resource,
self.action_def['request']['operation'])
handler_mock.return_value.assert_called_with(resource, {}, 'response')
diff --git a/tests/unit/resources/test_collection.py b/tests/unit/resources/test_collection.py
index a59f375746..f87768abd0 100644
--- a/tests/unit/resources/test_collection.py
+++ b/tests/unit/resources/test_collection.py
@@ -14,6 +14,7 @@
from botocore.model import ServiceModel
from boto3.resources.collection import CollectionManager
from boto3.resources.factory import ResourceFactory
+from boto3.resources.model import Collection
from tests import BaseTestCase, mock
@@ -21,7 +22,15 @@ class TestResourceCollection(BaseTestCase):
def setUp(self):
super(TestResourceCollection, self).setUp()
- self.collection_def = {}
+ # Minimal definition so things like repr work
+ self.collection_def = {
+ 'request': {
+ 'operation': 'TestOperation'
+ },
+ 'resource': {
+ 'type': 'Frob'
+ }
+ }
self.client = mock.Mock()
self.client.can_paginate.return_value = False
meta = {
@@ -51,8 +60,11 @@ def get_collection(self):
resource_defs['Frob']['identifiers'].append(
{'name': identifier['target']})
+ collection_model = Collection(
+ 'test', self.collection_def, resource_defs)
+
collection = CollectionManager(
- self.collection_def, self.parent, self.factory,
+ collection_model, self.parent, self.factory,
resource_defs, self.service_model)
return collection
diff --git a/tests/unit/resources/test_factory.py b/tests/unit/resources/test_factory.py
index 4523521d60..89315ecd45 100644
--- a/tests/unit/resources/test_factory.py
+++ b/tests/unit/resources/test_factory.py
@@ -430,7 +430,8 @@ def test_resource_lazy_properties_missing_load(self, action_cls):
resource.last_modified
@mock.patch('boto3.resources.factory.CollectionManager')
- def test_resource_loads_collections(self, collection_cls):
+ @mock.patch('boto3.resources.model.Collection')
+ def test_resource_loads_collections(self, mock_model, collection_cls):
model = {
'hasMany': {
u'Queues': {
@@ -447,6 +448,7 @@ def test_resource_loads_collections(self, collection_cls):
'Queue': {}
}
service_model = ServiceModel({})
+ mock_model.return_value.name = 'Queues'
resource = self.load('test', 'test', model, defs, service_model)()
@@ -455,5 +457,5 @@ def test_resource_loads_collections(self, collection_cls):
self.assertEqual(resource.queues, collection_cls.return_value,
'Queues collection should be a collection manager')
- collection_cls.assert_called_with(model['hasMany']['Queues'],
+ collection_cls.assert_called_with(mock_model.return_value,
resource, self.factory, defs, service_model)
diff --git a/tests/unit/resources/test_model.py b/tests/unit/resources/test_model.py
new file mode 100644
index 0000000000..51741f4cfe
--- /dev/null
+++ b/tests/unit/resources/test_model.py
@@ -0,0 +1,146 @@
+# Copyright 2014 Amazon.com, Inc. or its affiliates. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the 'License'). You
+# may not use this file except in compliance with the License. A copy of
+# the License is located at
+#
+# http://aws.amazon.com/apache2.0/
+#
+# or in the 'license' file accompanying this file. This file is
+# distributed on an 'AS IS' BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
+# ANY KIND, either express or implied. See the License for the specific
+# language governing permissions and limitations under the License.
+
+from boto3.resources.model import ResourceModel, Action, SubResourceList,\
+ Collection
+from tests import BaseTestCase
+
+
+class TestModels(BaseTestCase):
+ def test_resource_name(self):
+ model = ResourceModel('test', {}, {})
+
+ self.assertEqual(model.name, 'test')
+
+ def test_resource_shape(self):
+ model = ResourceModel('test', {
+ 'shape': 'Frob'
+ }, {})
+
+ self.assertEqual(model.shape, 'Frob')
+
+ def test_resource_identifiers(self):
+ model = ResourceModel('test', {
+ 'identifiers': [
+ {'name': 'one'},
+ {'name': 'two'}
+ ]
+ }, {})
+
+ self.assertEqual(model.identifiers[0].name, 'one')
+ self.assertEqual(model.identifiers[1].name, 'two')
+
+ def test_resource_action_raw(self):
+ model = ResourceModel('test', {
+ 'actions': {
+ 'GetFrobs': {
+ 'request': {
+ 'operation': 'GetFrobsOperation',
+ 'params': [
+ {'target': 'FrobId', 'sourceType': 'identifier',
+ 'source': 'Id'}
+ ]
+ },
+ 'path': 'Container.Frobs[]'
+ }
+ }
+ }, {})
+
+ self.assertIsInstance(model.actions, list)
+ self.assertEqual(len(model.actions), 1)
+
+ action = model.actions[0]
+ self.assertIsInstance(action, Action)
+ self.assertEqual(action.request.operation, 'GetFrobsOperation')
+ self.assertIsInstance(action.request.params, list)
+ self.assertEqual(len(action.request.params), 1)
+ self.assertEqual(action.request.params[0].target, 'FrobId')
+ self.assertEqual(action.request.params[0].source_type, 'identifier')
+ self.assertEqual(action.request.params[0].source, 'Id')
+ self.assertEqual(action.path, 'Container.Frobs[]')
+
+ def test_resource_action_response_resource(self):
+ model = ResourceModel('test', {
+ 'actions': {
+ 'GetFrobs': {
+ 'resource': {
+ 'type': 'Frob'
+ }
+ }
+ }
+ }, {
+ 'Frob': {}
+ })
+
+ action = model.actions[0]
+ self.assertEqual(action.resource.type, 'Frob')
+ self.assertIsInstance(action.resource.model, ResourceModel)
+ self.assertEqual(action.resource.model.name, 'Frob')
+
+ def test_resource_load_action(self):
+ model = ResourceModel('test', {
+ 'load': {
+ 'request': {
+ 'operation': 'GetFrobInfo'
+ },
+ 'path': '$'
+ }
+ }, {})
+
+ self.assertIsInstance(model.load, Action)
+ self.assertEqual(model.load.request.operation, 'GetFrobInfo')
+ self.assertEqual(model.load.path, '$')
+
+ def test_sub_resources(self):
+ model = ResourceModel('test', {
+ 'subResources': {
+ 'identifiers': {
+ 'FrobId': 'Id'
+ },
+ 'resources': ['Frob']
+ }
+ }, {
+ 'Frob': {}
+ })
+
+ self.assertIsInstance(model.sub_resources, SubResourceList)
+ self.assertEqual(model.sub_resources.identifiers['FrobId'], 'Id')
+ self.assertEqual(model.sub_resources.resource_names[0], 'Frob')
+
+ resource = model.sub_resources.resources[0]
+ self.assertEqual(resource.name, 'Frob')
+
+ def test_resource_collections(self):
+ model = ResourceModel('test', {
+ 'hasMany': {
+ 'Frobs': {
+ 'request': {
+ 'operation': 'GetFrobList'
+ },
+ 'resource': {
+ 'type': 'Frob'
+ },
+ 'path': 'FrobList[]'
+ }
+ }
+ }, {
+ 'Frob': {}
+ })
+
+ self.assertIsInstance(model.collections, list)
+ self.assertEqual(len(model.collections), 1)
+ self.assertIsInstance(model.collections[0], Collection)
+ self.assertEqual(model.collections[0].request.operation, 'GetFrobList')
+ self.assertEqual(model.collections[0].resource.type, 'Frob')
+ self.assertEqual(model.collections[0].resource.model.name, 'Frob')
+ self.assertEqual(model.collections[0].path, 'FrobList[]')
diff --git a/tests/unit/resources/test_params.py b/tests/unit/resources/test_params.py
index 9ed74f6527..c7dcfe01ba 100644
--- a/tests/unit/resources/test_params.py
+++ b/tests/unit/resources/test_params.py
@@ -11,79 +11,74 @@
# ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
+from boto3.resources.model import Request
from boto3.resources.params import create_request_parameters
from tests import BaseTestCase, mock
class TestServiceActionParams(BaseTestCase):
def test_service_action_params_identifier(self):
- action_def = {
- 'request': {
- 'operation': 'GetFrobs',
- 'params': [
- {
- 'target': 'WarehouseUrl',
- 'sourceType': 'identifier',
- 'source': 'Url'
- }
- ]
- }
- }
+ request_model = Request({
+ 'operation': 'GetFrobs',
+ 'params': [
+ {
+ 'target': 'WarehouseUrl',
+ 'sourceType': 'identifier',
+ 'source': 'Url'
+ }
+ ]
+ })
parent = mock.Mock()
parent.url = 'w-url'
- params = create_request_parameters(parent, action_def['request'])
+ params = create_request_parameters(parent, request_model)
self.assertEqual(params['WarehouseUrl'], 'w-url',
'Parameter not set from resource identifier')
def test_service_action_params_data_member(self):
- action_def = {
- 'request': {
- 'operation': 'GetFrobs',
- 'params': [
- {
- 'target': 'WarehouseUrl',
- 'sourceType': 'dataMember',
- 'source': 'some_member'
- }
- ]
- }
- }
+ request_model = Request({
+ 'operation': 'GetFrobs',
+ 'params': [
+ {
+ 'target': 'WarehouseUrl',
+ 'sourceType': 'dataMember',
+ 'source': 'some_member'
+ }
+ ]
+ })
parent = mock.Mock()
parent.some_member = 'w-url'
- params = create_request_parameters(parent, action_def['request'])
+ params = create_request_parameters(parent, request_model)
self.assertEqual(params['WarehouseUrl'], 'w-url',
'Parameter not set from resource property')
def test_service_action_params_constants(self):
- action_def = {
- 'request': {
- 'operation': 'GetFrobs',
- 'params': [
- {
- 'target': 'Param1',
- 'sourceType': 'string',
- 'source': 'param1'
- },
- {
- 'target': 'Param2',
- 'sourceType': 'integer',
- 'source': 123
- },
- {
- 'target': 'Param3',
- 'sourceType': 'boolean',
- 'source': True
- }
- ]
- }
- }
-
- params = create_request_parameters(None, action_def['request'])
+ request_model = Request({
+ 'operation': 'GetFrobs',
+ 'params': [
+ {
+ 'target': 'Param1',
+ 'sourceType': 'string',
+ 'source': 'param1'
+ },
+ {
+ 'target': 'Param2',
+ 'sourceType': 'integer',
+ 'source': 123
+ },
+ {
+ 'target': 'Param3',
+ 'sourceType': 'boolean',
+ 'source': True
+ }
+ ]
+ })
+
+ params = create_request_parameters(None, request_model)
self.assertEqual(params['Param1'], 'param1',
'Parameter not set from string constant')
@@ -93,37 +88,33 @@ def test_service_action_params_constants(self):
'Parameter not set from boolean constant')
def test_service_action_params_invalid(self):
- action_def = {
- 'request': {
- 'operation': 'GetFrobs',
- 'params': [
- {
- 'target': 'Param1',
- 'sourceType': 'invalid',
- 'source': 'param1'
- }
- ]
- }
- }
+ request_model = Request({
+ 'operation': 'GetFrobs',
+ 'params': [
+ {
+ 'target': 'Param1',
+ 'sourceType': 'invalid',
+ 'source': 'param1'
+ }
+ ]
+ })
with self.assertRaises(NotImplementedError):
- create_request_parameters(None, action_def['request'])
+ create_request_parameters(None, request_model)
def test_action_params_list(self):
- action_def = {
- 'request': {
- 'operation': 'GetFrobs',
- 'params': [
- {
- 'target': 'WarehouseUrls[0]',
- 'sourceType': 'string',
- 'source': 'w-url'
- }
- ]
- }
- }
-
- params = create_request_parameters(None, action_def['request'])
+ request_model = Request({
+ 'operation': 'GetFrobs',
+ 'params': [
+ {
+ 'target': 'WarehouseUrls[0]',
+ 'sourceType': 'string',
+ 'source': 'w-url'
+ }
+ ]
+ })
+
+ params = create_request_parameters(None, request_model)
self.assertIsInstance(params['WarehouseUrls'], list,
'Parameter did not create a list')
diff --git a/tests/unit/resources/test_response.py b/tests/unit/resources/test_response.py
index e916579aa3..369b6a36fd 100644
--- a/tests/unit/resources/test_response.py
+++ b/tests/unit/resources/test_response.py
@@ -13,6 +13,7 @@
from tests import BaseTestCase, mock
from boto3.resources.base import ServiceResource
+from boto3.resources.model import ResponseResource, Parameter
from boto3.resources.factory import ResourceFactory
from boto3.resources.response import build_identifiers, build_empty_response,\
RawHandler, ResourceHandler
@@ -20,11 +21,8 @@
class TestBuildIdentifiers(BaseTestCase):
def test_build_identifier_from_res_path_scalar(self):
- identifier_defs = [{
- 'target': 'Id',
- 'sourceType': 'responsePath',
- 'source': 'Container.Frob.Id'
- }]
+ identifiers = [Parameter(target='Id', source_type='responsePath',
+ source='Container.Frob.Id')]
parent = mock.Mock()
params = {}
@@ -36,17 +34,14 @@ def test_build_identifier_from_res_path_scalar(self):
}
}
- values = build_identifiers(identifier_defs, parent, params, response)
+ values = build_identifiers(identifiers, parent, params, response)
self.assertEqual(values['id'], 'response-path',
'Identifier loaded from responsePath scalar not set')
def test_build_identifier_from_res_path_list(self):
- identifier_defs = [{
- 'target': 'Id',
- 'sourceType': 'responsePath',
- 'source': 'Container.Frobs[].Id'
- }]
+ identifiers = [Parameter(target='Id', source_type='responsePath',
+ source='Container.Frobs[].Id')]
parent = mock.Mock()
params = {}
@@ -60,17 +55,14 @@ def test_build_identifier_from_res_path_list(self):
}
}
- values = build_identifiers(identifier_defs, parent, params, response)
+ values = build_identifiers(identifiers, parent, params, response)
self.assertEqual(values['id'], ['response-path'],
'Identifier loaded from responsePath list not set')
def test_build_identifier_from_parent_identifier(self):
- identifier_defs = [{
- 'target': 'Id',
- 'sourceType': 'identifier',
- 'source': 'Id'
- }]
+ identifiers = [Parameter(target='Id', source_type='identifier',
+ source='Id')]
parent = mock.Mock()
parent.id = 'identifier'
@@ -81,17 +73,14 @@ def test_build_identifier_from_parent_identifier(self):
}
}
- values = build_identifiers(identifier_defs, parent, params, response)
+ values = build_identifiers(identifiers, parent, params, response)
self.assertEqual(values['id'], 'identifier',
'Identifier loaded from parent identifier not set')
def test_build_identifier_from_parent_data_member(self):
- identifier_defs = [{
- 'target': 'Id',
- 'sourceType': 'dataMember',
- 'source': 'Member'
- }]
+ identifiers = [Parameter(target='Id', source_type='dataMember',
+ source='Member')]
parent = mock.Mock()
parent.member = 'data-member'
@@ -102,17 +91,14 @@ def test_build_identifier_from_parent_data_member(self):
}
}
- values = build_identifiers(identifier_defs, parent, params, response)
+ values = build_identifiers(identifiers, parent, params, response)
self.assertEqual(values['id'], 'data-member',
'Identifier loaded from parent data member not set')
def test_build_identifier_from_req_param(self):
- identifier_defs = [{
- 'target': 'Id',
- 'sourceType': 'requestParameter',
- 'source': 'Param'
- }]
+ identifiers = [Parameter(target='Id', source_type='requestParameter',
+ source='Param')]
parent = mock.Mock()
params = {
@@ -124,17 +110,14 @@ def test_build_identifier_from_req_param(self):
}
}
- values = build_identifiers(identifier_defs, parent, params, response)
+ values = build_identifiers(identifiers, parent, params, response)
self.assertEqual(values['id'], 'request-param',
'Identifier loaded from request parameter not set')
def test_build_identifier_from_invalid_source_type(self):
- identifier_defs = [{
- 'target': 'Id',
- 'sourceType': 'invalid',
- 'source': 'abc'
- }]
+ identifiers = [Parameter(target='Id', source_type='invalid',
+ source='abc')]
parent = mock.Mock()
params = {}
@@ -145,7 +128,7 @@ def test_build_identifier_from_invalid_source_type(self):
}
with self.assertRaises(NotImplementedError):
- build_identifiers(identifier_defs, parent, params, response)
+ build_identifiers(identifiers, parent, params, response)
class TestBuildEmptyResponse(BaseTestCase):
@@ -358,9 +341,11 @@ def get_resource(self, search_path, response):
'source': self.identifier_source},
]
}
+ resource_model = ResponseResource(
+ request_resource_def, self.resource_defs)
handler = ResourceHandler(search_path, self.factory,
- self.resource_defs, self.service_model, request_resource_def,
+ self.resource_defs, self.service_model, resource_model,
'GetFrobs')
return handler(self.parent, self.params, response)
| diff --git a/docs/source/reference/core/resources.rst b/docs/source/reference/core/resources.rst
index 4fc0cea325..f868487f02 100644
--- a/docs/source/reference/core/resources.rst
+++ b/docs/source/reference/core/resources.rst
@@ -4,6 +4,13 @@
Resources Reference
===================
+Resource Model
+--------------
+
+.. automodule:: boto3.resources.model
+ :members:
+ :undoc-members:
+
Request Parameters
------------------
| [
{
"components": [
{
"doc": "A resource identifier, given by its name.\n\n:type name: string\n:param name: The name of the identifier",
"lines": [
27,
36
],
"name": "Identifier",
"signature": "class Identifier(object):",
"type": "class"
... | [
"tests/unit/resources/test_action.py::TestServiceActionCall::test_service_action_calls_operation",
"tests/unit/resources/test_action.py::TestServiceActionCall::test_service_action_calls_raw_handler",
"tests/unit/resources/test_action.py::TestServiceActionCall::test_service_action_calls_resource_handler",
    "tes... | [] | This is a feature request which requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
Add a resource model abstraction.
This change adds an abstraction over the raw JSON resource descriptions,
similar to how Botocore has a `ServiceModel` which abstracts its JSON
service descriptions. Advantages of this approach:
- Pythonic interface (e.g. `model.actions[0].request.operation`)
- Encapsulation of minor JSON changes (e.g. field name change)
All of the factory code is updated to use it and documentation code
will use it in the future. Tests have been updated accordingly
and the model has its own tests as well.
cc: @jamesls, @kyleknap
----------
</request>
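The "encapsulation of minor JSON changes" advantage described in the request can be sketched as follows. This is a hypothetical illustration, not the actual boto3 source: a thin model class hides raw JSON key names such as ``sourceType`` behind a Pythonic ``source_type`` attribute, so a later rename in the JSON description only touches the model layer.

```python
# Hypothetical sketch (class name mirrors the request above, not boto3
# itself): the model layer translates raw JSON keys into attributes.
class Parameter(object):
    def __init__(self, target, source_type, source):
        self.target = target
        self.source_type = source_type
        self.source = source

# Raw JSON uses camelCase 'sourceType'; callers never see that key.
raw = {'target': 'QueueUrl', 'sourceType': 'identifier', 'source': 'Url'}
param = Parameter(raw['target'], raw['sourceType'], raw['source'])
print(param.source_type)  # identifier
```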
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in boto3/resources/model.py]
(definition of Identifier:)
class Identifier(object):
"""A resource identifier, given by its name.
:type name: string
:param name: The name of the identifier"""
(definition of Identifier.__init__:)
def __init__(self, name):
(definition of Action:)
class Action(object):
"""A service operation action.
:type name: string
:param name: The name of the action
:type definition: dict
:param definition: The JSON definition
:type resource_defs: dict
:param resource_defs: All resources defined in the service"""
(definition of Action.__init__:)
def __init__(self, name, definition, resource_defs):
(definition of Request:)
class Request(object):
"""A service operation action request.
:type definition: dict
:param definition: The JSON definition"""
(definition of Request.__init__:)
def __init__(self, definition):
(definition of Request.params:)
def params(self):
"""Get a list of auto-filled parameters for this request.
:type: list(:py:class:`Parameter`)"""
(definition of Parameter:)
class Parameter(object):
"""An auto-filled parameter which has a source and target. For example,
the ``QueueUrl`` may be auto-filled from a resource's ``url`` identifier
when making calls to ``queue.receive_messages``.
:type target: string
:param target: The destination parameter name, e.g. ``QueueUrl``
:type source_type: string
:param source_type: Where the source is defined.
:type source: string
:param source: The source name, e.g. ``Url``"""
(definition of Parameter.__init__:)
def __init__(self, target, source_type, source):
(definition of ResponseResource:)
class ResponseResource(object):
"""A resource response to create after performing an action.
:type definition: dict
:param definition: The JSON definition
:type resource_defs: dict
:param resource_defs: All resources defined in the service"""
(definition of ResponseResource.__init__:)
def __init__(self, definition, resource_defs):
(definition of ResponseResource.identifiers:)
def identifiers(self):
"""A list of resource identifiers.
:type: list(:py:class:`Identifier`)"""
(definition of ResponseResource.model:)
def model(self):
"""Get the resource model for the response resource.
:type: :py:class:`ResourceModel`"""
(definition of Collection:)
class Collection(Action):
"""A group of resources. See :py:class:`Action`.
:type name: string
:param name: The name of the collection
:type definition: dict
:param definition: The JSON definition
:type resource_defs: dict
:param resource_defs: All resources defined in the service"""
(definition of SubResourceList:)
class SubResourceList(object):
"""A list of information about sub-resources. It includes access
to identifiers as well as resource names and models.
:type definition: dict
:param definition: The JSON definition
:type resource_defs: dict
:param resource_defs: All resources defined in the service"""
(definition of SubResourceList.__init__:)
def __init__(self, definition, resource_defs):
(definition of SubResourceList.resources:)
def resources(self):
"""Get a list of resource models contained in this sub-resource
entry.
:type: list(:py:class:`ResourceModel`)"""
(definition of ResourceModel:)
class ResourceModel(object):
"""A model representing a resource, defined via a JSON description
format. A resource has identifiers, attributes, actions,
sub-resources, references and collections. For more information
on resources, see :ref:`guide_resources`.
:type name: string
:param name: The name of this resource, e.g. ``sqs`` or ``Queue``
:type definition: dict
:param definition: The JSON definition
:type resource_defs: dict
:param resource_defs: All resources defined in the service"""
(definition of ResourceModel.__init__:)
def __init__(self, name, definition, resource_defs):
(definition of ResourceModel.identifiers:)
def identifiers(self):
"""Get a list of resource identifiers.
:type: list(:py:class:`Identifier`)"""
(definition of ResourceModel.load:)
def load(self):
"""Get the load action for this resource, if it is defined.
:type: :py:class:`Action` or ``None``"""
(definition of ResourceModel.actions:)
def actions(self):
"""Get a list of actions for this resource.
:type: list(:py:class:`Action`)"""
(definition of ResourceModel.collections:)
def collections(self):
"""Get a list of collections for this resource.
:type: list(:py:class:`Collection`)"""
[end of new definitions in boto3/resources/model.py]
</definitions>
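The definitions above can be wired together as in this minimal sketch. Only the pieces needed to evaluate ``model.actions[0].request.operation`` are filled in, and the bodies are assumptions rather than the reference implementation:

```python
# Minimal sketch of how the defined classes compose; signatures follow
# the definitions block above, bodies are illustrative assumptions.
class Request(object):
    def __init__(self, definition):
        self._definition = definition
        self.operation = definition.get('operation')

class Action(object):
    def __init__(self, name, definition, resource_defs):
        self.name = name
        self.request = Request(definition.get('request', {}))
        self.path = definition.get('path')

class ResourceModel(object):
    def __init__(self, name, definition, resource_defs):
        self.name = name
        self._definition = definition
        self._resource_defs = resource_defs

    @property
    def actions(self):
        # Build an Action per entry in the raw 'actions' JSON mapping
        return [Action(name, item, self._resource_defs)
                for name, item in self._definition.get('actions', {}).items()]

model = ResourceModel('test', {
    'actions': {
        'GetFrobs': {
            'request': {'operation': 'GetFrobsOperation'},
            'path': 'Container.Frobs[]',
        }
    }
}, {})
action = model.actions[0]
print(action.request.operation, action.path)
# GetFrobsOperation Container.Frobs[]
```

This gives the Pythonic interface the request asks for (``model.actions[0].request.operation``) while keeping the raw JSON confined to the model module.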
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 196a2da7490a1a661a0103b8770bd31e34e147f2 | |
falconry__falcon-329 | 329 | falconry/falcon | null | 22ae1fa1e15b3d6630386f1e76783f1c0450edf5 | 2014-10-06T10:28:00Z | diff --git a/falcon/api.py b/falcon/api.py
index 122ddfca8..2ac328103 100644
--- a/falcon/api.py
+++ b/falcon/api.py
@@ -41,6 +41,21 @@ class API(object):
after (callable, optional): A global action hook (or list of hooks)
to call after each on_* responder, for all resources. Similar to
the ``after`` decorator, but applies to the entire API.
+        middleware (object or list, optional): A global middleware
+            component (or list of components) executed around the responder.
+            A middleware object can define process_request and
+            process_response, which are executed as concentric layers
+            around the responder:
+            process_request methods run in the order of definition,
+            process_response methods in the reverse order.
+            If you define middleware=[OutsideMw, InsideMw],
+ the order will be:
+ OutsideMw.process_request
+ InsideMw.process_request
+ responder
+ InsideMw.process_response
+            OutsideMw.process_response
+            Any exception triggers process_response of the already-executed mw.
request_type (Request, optional): Request-alike class to use instead
of Falcon's default class. Useful if you wish to extend
``falcon.request.Request`` with a custom ``context_type``.
@@ -53,10 +68,11 @@ class API(object):
__slots__ = ('_after', '_before', '_request_type', '_response_type',
'_error_handlers', '_media_type', '_routes', '_sinks',
- '_serialize_error', 'req_options')
+ '_serialize_error', 'req_options', '_middleware')
def __init__(self, media_type=DEFAULT_MEDIA_TYPE, before=None, after=None,
- request_type=Request, response_type=Response):
+ request_type=Request, response_type=Response,
+ middleware=None):
self._routes = []
self._sinks = []
self._media_type = media_type
@@ -64,6 +80,9 @@ def __init__(self, media_type=DEFAULT_MEDIA_TYPE, before=None, after=None,
self._before = helpers.prepare_global_hooks(before)
self._after = helpers.prepare_global_hooks(after)
+ # set middleware
+ self._middleware = helpers.prepare_mw(middleware)
+
self._request_type = request_type
self._response_type = response_type
@@ -90,6 +109,7 @@ def __call__(self, env, start_response):
req = self._request_type(env, options=self.req_options)
resp = self._response_type()
resource = None
+ stack_mw = [] # Keep track of executed mw
try:
# NOTE(warsaw): Moved this to inside the try except because it's
@@ -106,12 +126,18 @@ def __call__(self, env, start_response):
# so disabled on relevant lines. All paths are tested
# afaict.
try:
+ # Run request middlewares and fill stack_mw
+ self._call_req_mw(stack_mw, req, resp, params)
+
responder(req, resp, **params) # pragma: no cover
+ # Run middlewares for response
+ self._call_resp_mw(stack_mw, req, resp)
except Exception as ex:
for err_type, err_handler in self._error_handlers:
if isinstance(ex, err_type):
err_handler(ex, req, resp, params)
self._call_after_hooks(req, resp, resource)
+ self._call_resp_mw(stack_mw, req, resp)
break # pragma: no cover
else:
@@ -122,11 +148,17 @@ def __call__(self, env, start_response):
# indeed, should perhaps be slower to create
# backpressure on clients that are issuing bad
# requests.
+                # NOTE(ealogar): This executes the remaining
+                # process_response methods when no error_handler is
+                # given, for any exception. If an HTTPError is raised,
+                # the remaining process_response will be executed later.
+ self._call_resp_mw(stack_mw, req, resp)
raise
except HTTPError as ex:
self._compose_error_response(req, resp, ex)
self._call_after_hooks(req, resp, resource)
+ self._call_resp_mw(stack_mw, req, resp)
#
# Set status and headers
@@ -383,6 +415,28 @@ def _compose_error_response(self, req, resp, error):
# it was mistakenly set by the app.
resp.content_type = media_type
+ def _call_req_mw(self, stack_mw, req, resp, params):
+ """Runs the process_request middleware and tracks"""
+ for mw in self._middleware:
+
+ try:
+ getattr(mw, 'process_request')(req, resp, params)
+ except AttributeError:
+ pass
+ # Put executed mw in top of stack
+ stack_mw.insert(0, mw) # keep track from outside
+
+ def _call_resp_mw(self, stack_mw, req, resp):
+ """Runs the process_response middleware and tracks"""
+ # Make copy of stack_mw
+ for mw in list(stack_mw):
+ # Remove mw about to be executed
+ stack_mw.remove(mw)
+ try:
+ getattr(mw, 'process_response')(req, resp)
+ except AttributeError:
+ pass
+
def _call_after_hooks(self, req, resp, resource):
"""Executes each of the global "after" hooks, in turn."""
diff --git a/falcon/api_helpers.py b/falcon/api_helpers.py
index 44a7a487c..f2a26aab1 100644
--- a/falcon/api_helpers.py
+++ b/falcon/api_helpers.py
@@ -41,6 +41,42 @@ def prepare_global_hooks(hooks):
return hooks
+def prepare_mw(middleware=None):
+ """Check middleware interface and prepare it to iterate.
+
+ Args:
+ middleware: list (or object) of input middleware
+
+ Returns:
+ A middleware list
+ """
+ if middleware is None:
+ middleware = []
+ else:
+ if not isinstance(middleware, list):
+ middleware = [middleware]
+
+ # check basic interface of middleware objects
+ for mw in middleware:
+ if not hasattr(mw, 'process_request') and not\
+ hasattr(mw, 'process_response'):
+
+            raise TypeError('{0} is not a valid middleware'.format(str(mw)))
+
+        # Check that process_request and process_response are bound methods
+ for mw_method in ('process_request', 'process_response'):
+ method_mw_bound = getattr(mw, mw_method, None)
+
+ if method_mw_bound is not None:
+
+ if six.get_method_self(method_mw_bound) is None:
+ raise AttributeError(
+ '{0} must be a bound method'.format(method_mw_bound))\
+ # pragma: no cover
+
+ return middleware
+
+
def should_ignore_body(status, method):
"""Return True if the status or method indicates no body, per RFC 2616
| diff --git a/tests/test_middlewares.py b/tests/test_middlewares.py
new file mode 100644
index 000000000..6591d6115
--- /dev/null
+++ b/tests/test_middlewares.py
@@ -0,0 +1,294 @@
+import falcon
+import falcon.testing as testing
+from datetime import datetime
+
+context = {'executed_methods': []}
+
+
+class RequestTimeMiddleware(object):
+
+ def process_request(self, req, resp, params):
+ global context
+ context['start_time'] = datetime.utcnow()
+
+ def process_response(self, req, resp):
+ global context
+ context['end_time'] = datetime.utcnow()
+
+
+class TransactionIdMiddleware(object):
+
+ def process_request(self, req, resp, params):
+ global context
+ context['transaction_id'] = 'unique-req-id'
+
+
+class ExecutedFirstMiddleware(object):
+
+ def process_request(self, req, resp, params):
+ global context
+ context['executed_methods'].append(
+ '{0}.{1}'.format(self.__class__.__name__, 'process_request'))
+
+ def process_response(self, req, resp):
+ global context
+ context['executed_methods'].append(
+ '{0}.{1}'.format(self.__class__.__name__, 'process_response'))
+
+
+class ExecutedLastMiddleware(ExecutedFirstMiddleware):
+ pass
+
+
+class MiddlewareClassResource(object):
+
+ def on_get(self, req, resp):
+ resp.status = falcon.HTTP_200
+ resp.body = {'status': 'ok'}
+
+
+class TestMiddleware(testing.TestBase):
+
+ def setUp(self):
+ # Clear context
+ global context
+ context = {'executed_methods': []}
+ testing.TestBase.setUp(self)
+
+
+class TestRequestTimeMiddleware(TestMiddleware):
+
+ def test_add_invalid_middleware(self):
+ """Test than an invalid class can not be added as middleware"""
+ class InvalidMiddleware():
+ def process_request(self, *args):
+ pass
+
+ mw_list = [RequestTimeMiddleware(), InvalidMiddleware]
+ self.assertRaises(AttributeError, falcon.API, middleware=mw_list)
+ mw_list = [RequestTimeMiddleware(), "InvalidMiddleware"]
+ self.assertRaises(TypeError, falcon.API, middleware=mw_list)
+ mw_list = [{'process_request': 90}]
+ self.assertRaises(TypeError, falcon.API, middleware=mw_list)
+
+ def test_response_middleware_raises_exception(self):
+ """Test that error in response middleware is propagated up"""
+ class RaiseErrorMiddleware(object):
+
+ def process_response(self, req, resp, params):
+ raise Exception("Always fail")
+
+ self.api = falcon.API(middleware=[RaiseErrorMiddleware()])
+
+ self.api.add_route(self.test_route, MiddlewareClassResource())
+
+ self.assertRaises(Exception, self.simulate_request, self.test_route)
+
+ def test_log_get_request(self):
+ """Test that Log middleware is executed"""
+ global context
+ self.api = falcon.API(middleware=[RequestTimeMiddleware()])
+
+ self.api.add_route(self.test_route, MiddlewareClassResource())
+
+ body = self.simulate_request(self.test_route)
+ self.assertEqual([{'status': 'ok'}], body)
+ self.assertEqual(self.srmock.status, falcon.HTTP_200)
+ self.assertIn("start_time", context)
+ self.assertIn("end_time", context)
+ self.assertTrue(context['end_time'] > context['start_time'],
+ "process_response not executed after request")
+
+
+class TestTransactionIdMiddleware(TestMiddleware):
+
+ def test_generate_trans_id_with_request(self):
+ """Test that TransactionIdmiddleware is executed"""
+ global context
+ self.api = falcon.API(middleware=TransactionIdMiddleware())
+
+ self.api.add_route(self.test_route, MiddlewareClassResource())
+
+ body = self.simulate_request(self.test_route)
+ self.assertEqual([{'status': 'ok'}], body)
+ self.assertEqual(self.srmock.status, falcon.HTTP_200)
+ self.assertIn("transaction_id", context)
+ self.assertEqual("unique-req-id", context['transaction_id'])
+
+
+class TestSeveralMiddlewares(TestMiddleware):
+
+ def test_generate_trans_id_and_time_with_request(self):
+ global context
+ self.api = falcon.API(middleware=[TransactionIdMiddleware(),
+ RequestTimeMiddleware()])
+
+ self.api.add_route(self.test_route, MiddlewareClassResource())
+
+ body = self.simulate_request(self.test_route)
+ self.assertEqual([{'status': 'ok'}], body)
+ self.assertEqual(self.srmock.status, falcon.HTTP_200)
+ self.assertIn("transaction_id", context)
+ self.assertEqual("unique-req-id", context['transaction_id'])
+ self.assertIn("start_time", context)
+ self.assertIn("end_time", context)
+ self.assertTrue(context['end_time'] > context['start_time'],
+ "process_response not executed after request")
+
+ def test_middleware_execution_order(self):
+ global context
+ self.api = falcon.API(middleware=[ExecutedFirstMiddleware(),
+ ExecutedLastMiddleware()])
+
+ self.api.add_route(self.test_route, MiddlewareClassResource())
+
+ body = self.simulate_request(self.test_route)
+ self.assertEqual([{'status': 'ok'}], body)
+ self.assertEqual(self.srmock.status, falcon.HTTP_200)
+ # as the method registration is in a list, the order also is
+ # tested
+ expectedExecutedMethods = [
+ "ExecutedFirstMiddleware.process_request",
+ "ExecutedLastMiddleware.process_request",
+ "ExecutedLastMiddleware.process_response",
+ "ExecutedFirstMiddleware.process_response"
+ ]
+ self.assertEqual(expectedExecutedMethods, context['executed_methods'])
+
+ def test_inner_mw_throw_exception(self):
+ """Test that error in inner middleware leaves"""
+ global context
+
+ class RaiseErrorMiddleware(object):
+
+ def process_request(self, req, resp, params):
+ raise Exception("Always fail")
+
+ self.api = falcon.API(middleware=[TransactionIdMiddleware(),
+ RequestTimeMiddleware(),
+ RaiseErrorMiddleware()])
+
+ self.api.add_route(self.test_route, MiddlewareClassResource())
+
+ self.assertRaises(Exception, self.simulate_request, self.test_route)
+
+ # RequestTimeMiddleware process_response should be executed
+ self.assertIn("transaction_id", context)
+ self.assertIn("start_time", context)
+ self.assertIn("end_time", context)
+
+ def test_inner_mw_with_ex_handler_throw_exception(self):
+ """Test that error in inner middleware leaves"""
+ global context
+
+ class RaiseErrorMiddleware(object):
+
+ def process_request(self, req, resp, params):
+ raise Exception("Always fail")
+
+ self.api = falcon.API(middleware=[TransactionIdMiddleware(),
+ RequestTimeMiddleware(),
+ RaiseErrorMiddleware()])
+
+ def handler(ex, req, resp, params):
+ context['error_handler'] = True
+
+ self.api.add_error_handler(Exception, handler)
+
+ self.api.add_route(self.test_route, MiddlewareClassResource())
+
+ self.simulate_request(self.test_route)
+
+ # RequestTimeMiddleware process_response should be executed
+ self.assertIn("transaction_id", context)
+ self.assertIn("start_time", context)
+ self.assertIn("end_time", context)
+ self.assertIn("error_handler", context)
+
+ def test_outer_mw_with_ex_handler_throw_exception(self):
+ """Test that error in inner middleware leaves"""
+ global context
+
+ class RaiseErrorMiddleware(object):
+
+ def process_request(self, req, resp, params):
+ raise Exception("Always fail")
+
+ self.api = falcon.API(middleware=[TransactionIdMiddleware(),
+ RaiseErrorMiddleware(),
+ RequestTimeMiddleware()])
+
+ def handler(ex, req, resp, params):
+ context['error_handler'] = True
+
+ self.api.add_error_handler(Exception, handler)
+
+ self.api.add_route(self.test_route, MiddlewareClassResource())
+
+ self.simulate_request(self.test_route)
+
+        # Only middleware before the failing one was executed
+ self.assertIn("transaction_id", context)
+ self.assertNotIn("start_time", context)
+ self.assertNotIn("end_time", context)
+ self.assertIn("error_handler", context)
+
+ def test_order_mw_executed_when_exception_in_resp(self):
+ """Test that error in inner middleware leaves"""
+ global context
+
+ class RaiseErrorMiddleware(object):
+
+ def process_response(self, req, resp):
+ raise Exception("Always fail")
+
+ self.api = falcon.API(middleware=[ExecutedFirstMiddleware(),
+ RaiseErrorMiddleware(),
+ ExecutedLastMiddleware()])
+
+ def handler(ex, req, resp, params):
+ context['error_handler'] = True
+
+ self.api.add_error_handler(Exception, handler)
+
+ self.api.add_route(self.test_route, MiddlewareClassResource())
+
+ self.simulate_request(self.test_route)
+
+        # All tracked middleware methods were still executed
+ expectedExecutedMethods = [
+ "ExecutedFirstMiddleware.process_request",
+ "ExecutedLastMiddleware.process_request",
+ "ExecutedLastMiddleware.process_response",
+ "ExecutedFirstMiddleware.process_response"
+ ]
+ self.assertEqual(expectedExecutedMethods, context['executed_methods'])
+
+ def test_order_mw_executed_when_exception_in_req(self):
+ """Test that error in inner middleware leaves"""
+ global context
+
+ class RaiseErrorMiddleware(object):
+
+ def process_request(self, req, resp):
+ raise Exception("Always fail")
+
+ self.api = falcon.API(middleware=[ExecutedFirstMiddleware(),
+ RaiseErrorMiddleware(),
+ ExecutedLastMiddleware()])
+
+ def handler(ex, req, resp, params):
+ context['error_handler'] = True
+
+ self.api.add_error_handler(Exception, handler)
+
+ self.api.add_route(self.test_route, MiddlewareClassResource())
+
+ self.simulate_request(self.test_route)
+
+        # Only the first middleware ran before the error
+ expectedExecutedMethods = [
+ "ExecutedFirstMiddleware.process_request",
+ "ExecutedFirstMiddleware.process_response"
+ ]
+ self.assertEqual(expectedExecutedMethods, context['executed_methods'])
| [
{
"components": [
{
"doc": "Runs the process_request middleware and tracks",
"lines": [
418,
427
],
"name": "API._call_req_mw",
"signature": "def _call_req_mw(self, stack_mw, req, resp, params):",
"type": "function"
},
{
... | [
"tests/test_middlewares.py::TestRequestTimeMiddleware::test_add_invalid_middleware",
"tests/test_middlewares.py::TestRequestTimeMiddleware::test_log_get_request",
"tests/test_middlewares.py::TestRequestTimeMiddleware::test_response_middleware_raises_exception",
"tests/test_middlewares.py::TestTransactionIdMid... | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
feat(API): Add middlewares to falcon API
Currently falcon provides hooks as a way to add logic to all "routed" requests. With this approach it's not possible to execute code when a request is not routed (e.g., method not allowed, path not found ...)
I would like to include the concept of middlewares (same as django) in falcon as it may be useful for many projects. Typical middlewares:
- Generate unique transaction for all requests
- Log all requests and include time ellapsed
- Authentication (imo this should happend before any logic of routing, aka find responder).
A list of Middleware classes can be added to falcon WSGI in this way:
api.add_middlewares([MiddClass1,MidClass2])
A middleware will be executed always with every request/response dispatched, no matters routing and errors.
A middleware can define process_request or process_response or both; internally the code will be executed when corresponds.
Arguments are the same as hooks and error_handlers. In unit tests, some examples can be seen.
----------
@ealogar Perhaps it would be more elegant to simply always execute global hooks, rather than only executing when a route is matched? Technically, doing so would be a breaking change, but I would surprised if anyone were relying on the fact that global hooks only run on matched routes.
Thoughts?
I think this feature is actually required. There are a lot of use cases when middlewares are to be executed without any routing.
I understand that global hooks should not be associated to any routing (and it makes sense). A hook associated to a resource class should be executed if the resource path is valid (although there is no method in the resource class; i.e. if I only implement GET in a resource, then the hook should be executed for a POST request to the resource).
@kgriffs I didn't want to break the design of existing hooks, that's why I propose this. That said, defining middleware as global hooks with before and after would be easy to do:
- Change the creation of middlewares to import time
- Maybe not use classes and just callables as now
- The middleware logic in API would remain quite similar
IMO we shouldn't remove the falcon.before decorator, since we may still want to use it with individual resources.
I can add another commit using before and after global hooks....
Cool, what you guys are saying makes sense to me. My main concern was about having two things that are fairly similar in the framework (global hooks and middleware). This could result in confusion among the community and induce an extra maintenance burden going forward. Therefore, I'd like to see if we can unify the two concepts.
What do you think about this plan?
- Modify global hooks so they are no longer attached to the call chain, but are iterated over and executed by the API explicitly.
- Remove support from global hooks for the new `resource` param (since that has a precondition on routing). This shouldn't be a problem since the param is new for 0.2, which hasn't been released yet, so no one should be relying on it yet.
- Rename "global hooks" in code and docs to "middleware"
- Add a new kwarg, named "middleware", to API.__init__. This will take an instance of a class with a well-defined interface (see example below).
- Unify the code for executing the callables passed via the "before" and "after" with the middleware class passed via "middleware". Note that
- Note in the docs that the "before" and "after" kwargs are deprecated, and will be removed in the next version of the framework.
``` python
class Tracer(object):
# Optional; if present, will be executed before the request is routed
def process_request(self, req, resp, params):
pass
# Optional; if present, will be executed after the request is routed
def process_response(self, req, resp):
pass
tracer = Tracer()
# Deprecated
api = falcon.API(before=[tracer.process_request], after=[tracer.process_response])
# New
api = falcon.API(middleware=[tracer])
```
P.S. - I just realized that this change means global "hooks" / middleware will be able to rewrite paths and thus affect routing. That could come in handy...
@kgriffs I like your proposal! Take into account when deprecating that process_request is a bounded method to the object Tracer, so, when using before you will need to do something extra. IMO just defining a simple function.
I also like your proposal although I would rename "middleware" as "middlewares" because it is actually an array.
@ealogar wrote:
> I like your proposal! Take into account when deprecating that process_request is a bounded method to the object Tracer, so, when using before you will need to do something extra. IMO just defining a simple function.
If I understand your comment correctly, I think we are actually OK. Thanks to some Python magic, you don't have to do anything special to make this pattern work. Since the reference to the method is taken from an instance of the middleware class, it is bound such that the caller can treat it as a free function (i.e., no need to pass "self"). I tried the following with the latest code from master to make sure:
``` python
import falcon
import falcon.testing
class Auth(object):
def process_request(self, req, resp, params):
msg = ('The provided auth token is invalid or has expired. '
'Please reauthenticate and try again.')
raise falcon.HTTPUnauthorized('Invalid Auth Token', msg)
class MyResource(object):
def on_get(self, req, resp):
pass
auth = Auth()
api = falcon.API(before=[auth.process_request])
api.add_route('/my-resource', MyResource())
srmock = falcon.testing.StartResponseMock()
env = falcon.testing.create_environ('/my-resource')
api(env, srmock)
```
@jlorgal wrote:
> I also like your proposal although I would rename "middleware" as "middlewares" because it is actually an array.
To be honest, I would prefer sticking with "middleware". Anecdotally, "middlewares" sounds awkward to me as a native English speaker, and I can't remember the last time I heard someone use that form in regular conversation.
At the risk of bikeshedding on this topic, if we [treat the word as a mass noun](https://github.com/rack/rack/issues/332) (which makes sense to me), then in the code below, "middleware" works as both a modifier and a category:
``` python
api = falcon.API(middleware=[tracer])
```
As a modifier, the noun is implied by the context (i.e., "pass a list of middleware components" ). Or, as a category: "pass a list of components that can be categorized as middleware because they implement a specific interface contract."
@kgriffs are you going to do the changes yourself or do you want I change this PR ?
If you would like to do it, that would be great. Thanks!
On Thursday, October 16, 2014, Eduardo notifications@github.com wrote:
> @kgriffs https://github.com/kgriffs are you going to do the changes
> yourself or do you want I change this PR ?
>
> —
> Reply to this email directly or view it on GitHub
> https://github.com/racker/falcon/pull/329#issuecomment-59394909.
@ealogar Have you had a chance to work on this any further? I would like to land this before cutting the 0.2 beta release (hopefully) later this week.
@kgriffs Yes, I am doing something right now
@kgriffs I have included some commits with a code proposal. Some of them are needed for fixing merging issues. I do not more history rewrite for avoiding lose something...
@kgriffs I have removed the compatibility with resource for middleware, Leave in falcon.after and before decorarators
@kgriffs Thanks for comments, I work on them tonight!
BTW, thanks for working on this! I'm really looking forward to having this feature in the framework.
@kgriffs Added a commit with your feedback and also with a process_exception method ... check what you think of it!
@kgriffs I just added a commit with:
- Implementation of middleware
- Leave before and after global hooks same before
- When raising an exception in a middleware, run the process_response methods that have not yet executed, depending on the call chain:
- if an exception happens in process_request, only process_response of already-executed mw will be called.
- If an exception happens in responder, all process_response will be executed
- if an exception happens in process_response, the rest of process_response will be called but not those process_response already executed
In all cases the exception handler is performed first
For developing this I just add a stack_mw of executed mw.
Cool, this looks like it is on the right track. I still need to double-check the tests - will do that ASAP. Thanks!
[](https://coveralls.io/builds/1402335)
Coverage increased (+0.01%) when pulling **9eb61ddc129996f26822d7d6d42f37dc58da2fd8 on ealogar:feature/middlewares** into **3a696c38f3d1507fb2a579f7615e88a9532e03eb on racker:master**.
@kgriffs I just added a commit with your feedback and fixed merge conflicts with master. It also seems that coverage has increased!
Cool, I think this looks good! As a last step, would you mind squashing all your commits down to a single one? When you do this, please remember to follow the commit message style specified in [CONTRIBUTING.md](https://github.com/racker/falcon/blob/master/CONTRIBUTING.md#commit-message-format).
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in falcon/api.py]
(definition of API._call_req_mw:)
def _call_req_mw(self, stack_mw, req, resp, params):
"""Runs the process_request middleware and tracks"""
(definition of API._call_resp_mw:)
def _call_resp_mw(self, stack_mw, req, resp):
"""Runs the process_response middleware and tracks"""
[end of new definitions in falcon/api.py]
[start of new definitions in falcon/api_helpers.py]
(definition of prepare_mw:)
def prepare_mw(middleware=None):
"""Check middleware interface and prepare it to iterate.
Args:
middleware: list (or object) of input middleware
Returns:
A middleware list"""
[end of new definitions in falcon/api_helpers.py]
</definitions>
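The stacked, concentric execution order described in the request above can be sketched with a minimal framework-free example. This is only an illustration of the stack-based tracking idea (the class and function names here are hypothetical, not Falcon's actual API):

```python
trace = []


class TraceMiddleware:
    """Records when each hook runs; illustrative only."""

    def __init__(self, name):
        self.name = name

    def process_request(self):
        trace.append(self.name + '.process_request')

    def process_response(self):
        trace.append(self.name + '.process_response')


def run(middleware, responder):
    # process_request in declaration order, tracking executed
    # components on a stack (most recently executed on top)
    stack = []
    for mw in middleware:
        mw.process_request()
        stack.insert(0, mw)
    responder()
    # process_response in reverse (stack) order
    for mw in list(stack):
        stack.remove(mw)
        mw.process_response()


run([TraceMiddleware('outer'), TraceMiddleware('inner')],
    lambda: trace.append('responder'))
print(trace)
# → ['outer.process_request', 'inner.process_request', 'responder',
#    'inner.process_response', 'outer.process_response']
```

Keeping the stack of executed components is what lets the real patch call `process_response` only for middleware whose `process_request` already ran when an exception interrupts the chain.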
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 77d5e6394a88ead151c9469494749f95f06b24bf | ||
falconry__falcon-285 | 285 | falconry/falcon | null | 040cea7564c6bfc293a2a2f9638a76fe1b030fcc | 2014-06-30T16:51:38Z | diff --git a/falcon/api.py b/falcon/api.py
index 02abc71b2..61bb7375b 100644
--- a/falcon/api.py
+++ b/falcon/api.py
@@ -88,7 +88,7 @@ def __call__(self, env, start_response):
req = self._request_type(env)
resp = self._response_type()
- responder, params = self._get_responder(
+ responder, params, resource = self._get_responder(
req.path, req.method)
try:
@@ -104,6 +104,7 @@ def __call__(self, env, start_response):
for err_type, err_handler in self._error_handlers:
if isinstance(ex, err_type):
err_handler(ex, req, resp, params)
+ self._call_after_hooks(req, resp, resource)
break # pragma: no cover
else:
@@ -118,6 +119,7 @@ def __call__(self, env, start_response):
except HTTPError as ex:
helpers.compose_error_response(req, resp, ex)
+ self._call_after_hooks(req, resp, resource)
#
# Set status and headers
@@ -194,7 +196,7 @@ def on_put(self, req, resp, thing):
# Insert at the head of the list in case we get duplicate
# adds (will cause the last one to win).
- self._routes.insert(0, (path_template, method_map))
+ self._routes.insert(0, (path_template, method_map, resource))
def add_sink(self, sink, prefix=r'/'):
"""Adds a "sink" responder to the API.
@@ -267,7 +269,7 @@ def add_error_handler(self, exception, handler=None):
'member of the given exception class.')
# Insert at the head of the list in case we get duplicate
- # adds (will cause the last one to win).
+ # adds (will cause the most recently added one to win).
self._error_handlers.insert(0, (exception, handler))
# ------------------------------------------------------------------------
@@ -282,9 +284,10 @@ def _get_responder(self, path, method):
method: HTTP method (uppercase) requested
Returns:
- A 2-member tuple consisting of a responder callable and
+ A 3-member tuple consisting of a responder callable,
a dict containing parsed path fields (if any were specified in
- the matching route's URI template).
+ the matching route's URI template), and a reference to the
+ responder's resource instance.
Note:
If a responder was matched to the given URI, but the HTTP
@@ -298,7 +301,7 @@ def _get_responder(self, path, method):
"""
for route in self._routes:
- path_template, method_map = route
+ path_template, method_map, resource = route
m = path_template.match(path)
if m:
params = m.groupdict()
@@ -311,6 +314,7 @@ def _get_responder(self, path, method):
break
else:
params = {}
+ resource = None
for pattern, sink in self._sinks:
m = pattern.match(path)
@@ -322,4 +326,18 @@ def _get_responder(self, path, method):
else:
responder = falcon.responders.path_not_found
- return (responder, params)
+ return (responder, params, resource)
+
+ def _call_after_hooks(self, req, resp, resource):
+ """Executes each of the global "after" hooks, in turn."""
+
+ if not self._after:
+ return
+
+ for hook in self._after:
+ try:
+ hook(req, resp, resource)
+ except TypeError:
+ # NOTE(kgriffs): Catching the TypeError is a heuristic to
+ # detect old hooks that do not accept the "resource" param
+ hook(req, resp)
diff --git a/falcon/hooks.py b/falcon/hooks.py
index 06fd02152..cf8ecf6ab 100644
--- a/falcon/hooks.py
+++ b/falcon/hooks.py
@@ -24,20 +24,23 @@ def before(action):
"""Decorator to execute the given action function *before* the responder.
Args:
- action (callable): A function of the form ``func(req, resp, params)``,
- where params is a dict of URI Template field names, if any,
+ action (callable): A function of the form
+ ``func(req, resp, resource, params)``, where `resource` is a
+ reference to the resource class associated with the request,
+ and `params` is a dict of URI Template field names, if any,
that will be passed into the resource responder as *kwargs*.
- Hooks may inject extra params as needed. For example::
+ Note:
+ Hooks may inject extra params as needed. For example::
- def do_something(req, resp, params):
- try:
- params['id'] = int(params['id'])
- except ValueError:
- raise falcon.HTTPBadRequest('Invalid ID',
- 'ID was not valid.')
+ def do_something(req, resp, params):
+ try:
+ params['id'] = int(params['id'])
+ except ValueError:
+ raise falcon.HTTPBadRequest('Invalid ID',
+ 'ID was not valid.')
- params['answer'] = 42
+ params['answer'] = 42
"""
@@ -84,7 +87,9 @@ def after(action):
"""Decorator to execute the given action function *after* the responder.
Args:
- action (callable): A function of the form ``func(req, resp)``
+ action (callable): A function of the form
+ ``func(req, resp, resource)``, where `resource` is a
+ reference to the resource class associated with the request
"""
| diff --git a/tests/test_after_hooks.py b/tests/test_after_hooks.py
index d64c8965b..0f89c6584 100644
--- a/tests/test_after_hooks.py
+++ b/tests/test_after_hooks.py
@@ -143,6 +143,12 @@ def on_options(self, req, resp):
resp.status = falcon.HTTP_501
+class FaultyResource(object):
+
+ def on_get(self, req, resp):
+ raise falcon.HTTPError(falcon.HTTP_743, 'Query failed')
+
+
class TestHooks(testing.TestBase):
def before(self):
@@ -237,6 +243,17 @@ def test_multiple_global_hooks_wrap_default_405(self):
self.assertEqual('fluffy', self.srmock.headers_dict['X-Fluffiness'])
self.assertEqual('cute', self.srmock.headers_dict['X-Cuteness'])
+ def test_global_after_hooks_run_after_exception(self):
+ self.api = falcon.API(after=[fluffiness,
+ resource_aware_cuteness,
+ Smartness()])
+
+ self.api.add_route(self.test_route, FaultyResource())
+
+ actual_body = self.simulate_request(self.test_route, decode='utf-8')
+ self.assertEqual(falcon.HTTP_743, self.srmock.status)
+ self.assertEqual(actual_body, u'fluffy and cute and smart')
+
def test_output_validator(self):
self.simulate_request(self.test_route)
self.assertEqual(falcon.HTTP_723, self.srmock.status)
| [
{
"components": [
{
"doc": "Executes each of the global \"after\" hooks, in turn.",
"lines": [
331,
343
],
"name": "API._call_after_hooks",
"signature": "def _call_after_hooks(self, req, resp, resource):",
"type": "function"
}
]... | [
"tests/test_after_hooks.py::TestHooks::test_global_after_hooks_run_after_exception"
] | [
"tests/test_after_hooks.py::TestHooks::test_customized_options",
"tests/test_after_hooks.py::TestHooks::test_global_hook",
"tests/test_after_hooks.py::TestHooks::test_global_hook_is_resource_aware",
"tests/test_after_hooks.py::TestHooks::test_global_hook_wrap_default_405",
"tests/test_after_hooks.py::TestHo... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
feat(hooks): Run global "after" hooks even when an exception is raised
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in falcon/api.py]
(definition of API._call_after_hooks:)
def _call_after_hooks(self, req, resp, resource):
"""Executes each of the global "after" hooks, in turn."""
[end of new definitions in falcon/api.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | Here is the discussion in the issues of the pull request.
<issues>
Run "after" hooks even when an error is raised
I suspect that most "after" hooks are used for formatting response bodies to fit a specific API structure. In my case, this also applies to any exceptions raised, so it would be helpful to have the "after" hooks also be run against any `resp.body` generated as part of my custom error handler.
----------
Nice idea. I think this would apply for any instance of HTTPError and for any error type for which a custom error handler is defined.
In addition to affording response formatting, this would also allow timing/tracing hooks and such.
In my case, I need to catch all errors/exceptions so the server can format things appropriately. E.g. all of my encoding is done as a hook so if a request asked for json or xml the error handler needs to return an error in the correct format -- much easier to just keep all of that logic in a single hook rather than re-implementing it inside of each error handler.
Moving this to 0.2 milestone since I think it is unlikely to break existing apps, and is a sorely-needed enhancement.
--------------------
</issues> | 77d5e6394a88ead151c9469494749f95f06b24bf | |
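The behavior requested in the issues above, global "after" hooks still running when the responder raises, can be sketched without Falcon itself. Everything below (the `Response` holder, the `handle_request` driver, the hook names, the placeholder status strings) is an illustrative stand-in, not Falcon's real API:

```python
class HTTPError(Exception):
    """Stand-in for falcon.HTTPError: carries the status to report."""
    def __init__(self, status):
        self.status = status


class Response:
    def __init__(self):
        self.status = '200 OK'
        self.body = ''


def handle_request(responder, after_hooks, resource=None):
    """Run the responder; on HTTPError record its status, but invoke
    every global 'after' hook on the error path as well, so response
    formatting (or timing/tracing) always happens."""
    req, resp = object(), Response()
    try:
        responder(req, resp)
    except HTTPError as ex:
        resp.status = ex.status
    for hook in after_hooks:          # runs on success *and* error paths
        hook(req, resp, resource)
    return resp


def fluffiness(req, resp, resource):
    resp.body += 'fluffy'


def cuteness(req, resp, resource):
    resp.body += ' and cute'


def faulty(req, resp):
    raise HTTPError('743')            # placeholder status string
```

With these pieces a faulty responder still produces a hook-formatted body, mirroring what `test_global_after_hooks_run_after_exception` exercises in the test patch.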
sympy__sympy-2622 | 2,622 | sympy/sympy | 0.7 | 5e822eff91d14c188a3b50f9ead44c170432b9fc | 2013-11-22T11:16:45Z | diff --git a/sympy/matrices/dense.py b/sympy/matrices/dense.py
index 066c120fb4e9..6c767105af90 100644
--- a/sympy/matrices/dense.py
+++ b/sympy/matrices/dense.py
@@ -68,16 +68,28 @@ def __getitem__(self, key):
>>> m[::2]
[1, 3]
"""
- if type(key) is tuple:
+ if isinstance(key, tuple):
i, j = key
- if type(i) is slice or type(j) is slice:
- return self.submatrix(key)
- else:
+ try:
i, j = self.key2ij(key)
return self._mat[i*self.cols + j]
+ except (TypeError, IndexError):
+ if isinstance(i, slice):
+ i = range(self.rows)[i]
+ elif is_sequence(i):
+ pass
+ else:
+ i = [i]
+ if isinstance(j, slice):
+ j = range(self.cols)[j]
+ elif is_sequence(j):
+ pass
+ else:
+ j = [j]
+ return self.extract(i, j)
else:
# row-wise decomposition of matrix
- if type(key) is slice:
+ if isinstance(key, slice):
return self._mat[key]
return self._mat[a2idx(key)]
diff --git a/sympy/matrices/matrices.py b/sympy/matrices/matrices.py
index 6f7ed3db55bf..cac58f8d3721 100644
--- a/sympy/matrices/matrices.py
+++ b/sympy/matrices/matrices.py
@@ -999,7 +999,10 @@ def __mathml__(self):
def submatrix(self, keys):
"""
- Get a slice/submatrix of the matrix using the given slice.
+ Get a slice/submatrix of the matrix using the given slice. This is
+ used by the __getitem__ method and its functionality should
+ generally be accessed that way, e.g. M[:2, :3] is equivalent to
+ M.submatrix((slice(None, 2), slice(None, 3))).
Examples
========
@@ -1012,13 +1015,13 @@ def submatrix(self, keys):
[1, 2, 3, 4],
[2, 3, 4, 5],
[3, 4, 5, 6]])
- >>> m[:1, 1]
+ >>> m.submatrix((slice(None, 1), 1))
Matrix([[1]])
- >>> m[:2, :1]
+ >>> m.submatrix((slice(None, 2), slice(None, 1)))
Matrix([
[0],
[1]])
- >>> m[2:4, 2:4]
+ >>> m.submatrix((slice(2, 4), slice(2, 4)))
Matrix([
[4, 5],
[5, 6]])
@@ -4150,8 +4153,6 @@ def classof(A, B):
def a2idx(j, n=None):
"""Return integer after making positive and validating against n."""
- if isinstance(j, slice):
- return j
if type(j) is not int:
try:
j = j.__index__()
diff --git a/sympy/matrices/sparse.py b/sympy/matrices/sparse.py
index 325b5c030fb8..ad148a108bff 100644
--- a/sympy/matrices/sparse.py
+++ b/sympy/matrices/sparse.py
@@ -7,6 +7,7 @@
from sympy.core.compatibility import is_sequence, as_int
from sympy.core.singleton import S
from sympy.functions.elementary.miscellaneous import sqrt
+from sympy.utilities.iterables import uniq
from sympy.utilities.exceptions import SymPyDeprecationWarning
from .matrices import MatrixBase, ShapeError, a2idx
@@ -87,14 +88,25 @@ def __init__(self, *args):
def __getitem__(self, key):
- if type(key) is tuple:
+ if isinstance(key, tuple):
i, j = key
- if isinstance(i, int) and isinstance(j, int):
+ try:
i, j = self.key2ij(key)
- rv = self._smat.get((i, j), S.Zero)
- return rv
- elif isinstance(i, slice) or isinstance(j, slice):
- return self.submatrix(key)
+ return self._smat.get((i, j), S.Zero)
+ except (TypeError, IndexError):
+ if isinstance(i, slice):
+ i = range(self.rows)[i]
+ elif is_sequence(i):
+ pass
+ else:
+ i = [i]
+ if isinstance(j, slice):
+ j = range(self.cols)[j]
+ elif is_sequence(j):
+ pass
+ else:
+ j = [j]
+ return self.extract(i, j)
# check for single arg, like M[:] or M[3]
if isinstance(key, slice):
@@ -491,6 +503,38 @@ def add(self, other):
M._smat.pop(i, None)
return M
+ def extract(self, rowsList, colsList):
+ urow = list(uniq(rowsList))
+ ucol = list(uniq(colsList))
+ smat = {}
+ if len(urow)*len(ucol) < len(self._smat):
+ # there are fewer elements requested than there are elements in the matrix
+ for i, r in enumerate(urow):
+ for j, c in enumerate(ucol):
+ smat[i, j] = self._smat.get((r, c), 0)
+ else:
+ # most of the request will be zeros so check all of self's entries,
+ # keeping only the ones that are desired
+ for rk, ck in self._smat:
+ if rk in urow and ck in ucol:
+ smat[(urow.index(rk), ucol.index(ck))] = self._smat[(rk, ck)]
+
+ rv = self._new(len(urow), len(ucol), smat)
+ # rv is nominally correct but there might be rows/cols
+ # which require duplication
+ if len(rowsList) != len(urow):
+ for i, r in enumerate(rowsList):
+ i_previous = rowsList.index(r)
+ if i_previous != i:
+ rv = rv.row_insert(i, rv.row(i_previous))
+ if len(colsList) != len(ucol):
+ for i, c in enumerate(colsList):
+ i_previous = colsList.index(c)
+ if i_previous != i:
+ rv = rv.col_insert(i, rv.col(i_previous))
+ return rv
+ extract.__doc__ = MatrixBase.extract.__doc__
+
def submatrix(self, keys):
rlo, rhi, clo, chi = self.key2bounds(keys)
r, c = rhi - rlo, chi - clo
@@ -510,6 +554,7 @@ def submatrix(self, keys):
if rlo <= rk < rhi and clo <= ck < chi:
smat[(rk-rlo, ck-clo)] = self._smat[(rk, ck)]
return self._new(r, c, smat)
+ submatrix.__doc__ = MatrixBase.submatrix.__doc__
def is_symmetric(self, simplify=True):
"""Return True if self is symmetric.
diff --git a/sympy/mpmath/matrices/matrices.py b/sympy/mpmath/matrices/matrices.py
index eb363c2354c4..7e29b6ab55a6 100644
--- a/sympy/mpmath/matrices/matrices.py
+++ b/sympy/mpmath/matrices/matrices.py
@@ -405,9 +405,9 @@ def __repr__(self):
def __get_element(self, key):
'''
Fast extraction of the i,j element from the matrix
- This function is for private use only because is unsafe:
- 1. Does not check on the value of key it expects key to be a integer tuple (i,j)
- 2. Does not check bounds
+ This function is for private use only because it is unsafe:
+ 1. it assumes that ``key`` is an integer tuple (i,j)
+ 2. it does not check bounds
'''
if key in self.__data:
return self.__data[key]
@@ -418,9 +418,9 @@ def __set_element(self, key, value):
'''
Fast assignment of the i,j element in the matrix
This function is unsafe:
- 1. Does not check on the value of key it expects key to be a integer tuple (i,j)
- 2. Does not check bounds
- 3. Does not check the value type
+ 1. it assumes that ``key`` is an integer tuple (i,j)
+ 2. it does not check bounds
+ 3. it does not check the value type
'''
if value: # only store non-zeros
self.__data[key] = value
| diff --git a/sympy/matrices/tests/test_matrices.py b/sympy/matrices/tests/test_matrices.py
index c44561a41375..aa0bb46e7097 100644
--- a/sympy/matrices/tests/test_matrices.py
+++ b/sympy/matrices/tests/test_matrices.py
@@ -62,6 +62,50 @@ def test_addition():
assert a + b == a.add(b) == Matrix([[2, 4], [6, 1]])
+def test_fancy_index_matrix():
+ for M in (Matrix, SparseMatrix):
+ a = M(3, 3, range(9))
+ assert a == a[:, :]
+ assert a[1, :] == Matrix(1, 3, [3, 4, 5])
+ assert a[:, 1] == Matrix([1, 4, 7])
+ assert a[[0, 1], :] == Matrix([[0, 1, 2], [3, 4, 5]])
+ assert a[[0, 1], 2] == a[[0, 1], [2]]
+ assert a[2, [0, 1]] == a[[2], [0, 1]]
+ assert a[:, [0, 1]] == Matrix([[0, 1], [3, 4], [6, 7]])
+ assert a[0, 0] == 0
+ assert a[0:2, :] == Matrix([[0, 1, 2], [3, 4, 5]])
+ assert a[:, 0:2] == Matrix([[0, 1], [3, 4], [6, 7]])
+ assert a[::2, 1] == a[[0, 2], 1]
+ assert a[1, ::2] == a[1, [0, 2]]
+ a = M(3, 3, range(9))
+ assert a[[0, 2, 1, 2, 1], :] == Matrix([
+ [0, 1, 2],
+ [6, 7, 8],
+ [3, 4, 5],
+ [6, 7, 8],
+ [3, 4, 5]])
+ assert a[:, [0,2,1,2,1]] == Matrix([
+ [0, 2, 1, 2, 1],
+ [3, 5, 4, 5, 4],
+ [6, 8, 7, 8, 7]])
+
+ a = SparseMatrix.zeros(3)
+ a[1, 2] = 2
+ a[0, 1] = 3
+ a[2, 0] = 4
+ assert a.extract([1, 1], [2]) == Matrix([
+ [2],
+ [2]])
+ assert a.extract([1, 0], [2, 2, 2]) == Matrix([
+ [2, 2, 2],
+ [0, 0, 0]])
+ assert a.extract([1, 0, 1, 2], [2, 0, 1, 0]) == Matrix([
+ [2, 0, 0, 0],
+ [0, 0, 3, 0],
+ [2, 0, 0, 0],
+ [0, 4, 0, 4]])
+
+
def test_multiplication():
a = Matrix((
(1, 2),
| [
{
"components": [
{
"doc": "",
"lines": [
506,
535
],
"name": "SparseMatrix.extract",
"signature": "def extract(self, rowsList, colsList):",
"type": "function"
}
],
"file": "sympy/matrices/sparse.py"
}
] | [
"test_fancy_index_matrix"
] | [
"test_args",
"test_division",
"test_sum",
"test_addition",
"test_multiplication",
"test_power",
"test_creation",
"test_tolist",
"test_as_mutable",
"test_determinant",
"test_det_LU_decomposition",
"test_berkowitz_minors",
"test_submatrix",
"test_submatrix_assignment",
"test_extract",
"t... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Issue 3656: Implementation of fancy indexing in matrix
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in sympy/matrices/sparse.py]
(definition of SparseMatrix.extract:)
def extract(self, rowsList, colsList):
[end of new definitions in sympy/matrices/sparse.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 3555a90ee3ac3c7082df19d96a922630100c69cc | ||
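The indexing dispatch the patch above adds, slices expanded with `range(n)[key]`, sequences passed through, scalars wrapped in a one-element list, then everything routed to `extract`, can be sketched on plain nested lists. The helper names here are illustrative, not the SymPy method names:

```python
def _normalize(key, n):
    """Turn one index (int, slice, or sequence) into a list of
    positions, mirroring the dispatch in Matrix.__getitem__."""
    if isinstance(key, slice):
        return list(range(n))[key]
    if isinstance(key, (list, tuple)):
        return list(key)
    return [key]                      # single integer: one-element selection


def extract(mat, rows_list, cols_list):
    """Build the submatrix picked out by the (possibly repeating)
    index lists; this is the core of fancy indexing."""
    return [[mat[r][c] for c in cols_list] for r in rows_list]


def fancy_getitem(mat, i, j):
    rows, cols = len(mat), len(mat[0])
    return extract(mat, _normalize(i, rows), _normalize(j, cols))
```

Because `extract` simply iterates the index lists, repeated indices duplicate rows and columns, which is exactly what the `a[[0, 2, 1, 2, 1], :]` case in the test patch checks.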
sympy__sympy-2616 | 2,616 | sympy/sympy | 0.7 | ced4df31693f33d4ee9df4a569c77f59745e17c6 | 2013-11-20T10:58:43Z | diff --git a/sympy/series/order.py b/sympy/series/order.py
index a981f487f1f4..7b99f639c6a9 100644
--- a/sympy/series/order.py
+++ b/sympy/series/order.py
@@ -281,6 +281,12 @@ def contains(self, expr):
obj = Order(expr, *newargs)
return self.contains(obj)
+ def __contains__(self, other):
+ result = self.contains(other)
+ if result is None:
+ raise TypeError('contains did not evaluate to a bool')
+ return result
+
def _eval_subs(self, old, new):
if old.is_Symbol and old in self.variables:
i = self.variables.index(old)
| diff --git a/sympy/series/tests/test_order.py b/sympy/series/tests/test_order.py
index bd1b2dc03ba8..32ea8ab9c6ef 100644
--- a/sympy/series/tests/test_order.py
+++ b/sympy/series/tests/test_order.py
@@ -123,6 +123,12 @@ def test_contains_3():
assert Order(x**2*y).contains(Order(x*y**2)) is None
+def test_contains():
+ assert Order(1, x) not in Order(1)
+ assert Order(1) in Order(1, x)
+ raises(TypeError, lambda: Order(x*y**2) in Order(x**2*y))
+
+
def test_add_1():
assert Order(x + x) == Order(x)
assert Order(3*x - 2*x**2) == Order(x)
| [
{
"components": [
{
"doc": "",
"lines": [
284,
288
],
"name": "Order.__contains__",
"signature": "def __contains__(self, other):",
"type": "function"
}
],
"file": "sympy/series/order.py"
}
] | [
"test_contains"
] | [
"test_caching_bug",
"test_simple_1",
"test_simple_2",
"test_simple_3",
"test_simple_4",
"test_simple_5",
"test_simple_6",
"test_simple_7",
"test_simple_8",
"test_as_expr_variables",
"test_contains_0",
"test_contains_1",
"test_contains_2",
"test_contains_3",
"test_add_1",
"test_ln_args"... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Implement membership test operator (method __contains__) for Order, just like Set
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in sympy/series/order.py]
(definition of Order.__contains__:)
def __contains__(self, other):
[end of new definitions in sympy/series/order.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 3555a90ee3ac3c7082df19d96a922630100c69cc | ||
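The point of the new `__contains__` is that `Order.contains` is three-valued: it may return `True`, `False`, or `None` when the comparison is undecidable, while Python's `in` operator must produce a bool. A toy model (exponent tuples standing in for SymPy's real growth comparison, not its actual implementation) shows the wrapper's contract:

```python
class Order:
    """Toy stand-in for sympy's Order of x0**e0 * x1**e1 * ... as the
    variables go to 0; only the exponent tuple is kept."""

    def __init__(self, *exponents):
        self.exponents = exponents

    def contains(self, other):
        """Three-valued: True / False / None for incomparable orders."""
        pairs = list(zip(other.exponents, self.exponents))
        if all(a >= b for a, b in pairs):
            return True
        if all(a <= b for a, b in pairs):
            return False
        return None

    def __contains__(self, other):
        result = self.contains(other)
        if result is None:
            raise TypeError('contains did not evaluate to a bool')
        return result
```

This mirrors the test patch: `Order(x*y**2) in Order(x**2*y)` raises TypeError because neither order dominates the other.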
falconry__falcon-202 | 202 | falconry/falcon | null | 4961e6a1d22d9649feb8cb0ce7b2a99e38898c3c | 2013-11-14T23:11:14Z | diff --git a/falcon/api.py b/falcon/api.py
index 14acadd5a..14666e2ab 100644
--- a/falcon/api.py
+++ b/falcon/api.py
@@ -34,8 +34,8 @@ class API(object):
"""
- __slots__ = ('_after', '_before', '_media_type', '_routes',
- '_default_route')
+ __slots__ = ('_after', '_before', '_error_handlers', '_media_type',
+ '_routes', '_default_route')
def __init__(self, media_type=DEFAULT_MEDIA_TYPE, before=None, after=None):
"""Initialize a new Falcon API instances
@@ -61,6 +61,8 @@ def __init__(self, media_type=DEFAULT_MEDIA_TYPE, before=None, after=None):
self._before = helpers.prepare_global_hooks(before)
self._after = helpers.prepare_global_hooks(after)
+ self._error_handlers = []
+
def __call__(self, env, start_response):
"""WSGI "app" method
@@ -90,6 +92,14 @@ def __call__(self, env, start_response):
if req.client_accepts('application/json'):
resp.body = ex.json()
+ except Exception as e:
+ for err_type, err_handler in self._error_handlers:
+ if isinstance(e, err_type):
+ err_handler(e, req, resp, params)
+ break
+ else:
+ raise
+
#
# Set status and headers
#
@@ -190,6 +200,21 @@ def set_default_route(self, default_resource):
self._default_route = helpers.create_http_method_map(
default_resource, set(), self._before, self._after)
+ def add_error_handler(self, exception, handler):
+ """Adds a handler for a given exception type
+
+ Args:
+ exception: Whenever an exception occurs when handling a request
+ that is an instance of this exception class, the given handler
+ callable will be used to handle the exception.
+ handler: Callable that gets called with (ex, req, resp, params)
+ when there is a matching exception when handling a request.
+
+ """
+ # Insert at the head of the list in case we get duplicate
+ # adds (will cause the last one to win).
+ self._error_handlers.insert(0, (exception, handler))
+
#----------------------------------------------------------------------------
# Helpers
#----------------------------------------------------------------------------
| diff --git a/falcon/tests/test_error_handlers.py b/falcon/tests/test_error_handlers.py
new file mode 100644
index 000000000..b9712de34
--- /dev/null
+++ b/falcon/tests/test_error_handlers.py
@@ -0,0 +1,72 @@
+import falcon
+import falcon.testing as testing
+
+
+def capture_error(e, req, resp, params):
+ resp.status = falcon.HTTP_723
+ resp.body = 'error: %s' % str(e)
+
+
+def handle_error_first(e, req, resp, params):
+ resp.status = falcon.HTTP_200
+ resp.body = 'first error handler'
+
+
+class CustomBaseException(Exception):
+ pass
+
+
+class CustomException(CustomBaseException):
+ pass
+
+
+class ErroredClassResource(object):
+ def on_get(self, req, resp):
+ raise Exception('Plain Exception')
+
+ def on_head(self, req, resp):
+ raise CustomBaseException('CustomBaseException')
+
+ def on_delete(self, req, resp):
+ raise CustomException('CustomException')
+
+
+class TestErrorHandler(testing.TestBase):
+
+ def test_caught_error(self):
+ self.api.add_error_handler(Exception, capture_error)
+
+ self.api.add_route(self.test_route, ErroredClassResource())
+
+ body = self.simulate_request(self.test_route)
+ self.assertEqual([b'error: Plain Exception'], body)
+
+ body = self.simulate_request(self.test_route, method='HEAD')
+ self.assertEqual(falcon.HTTP_723, self.srmock.status)
+ self.assertEqual([], body)
+
+ def test_uncaught_error(self):
+ self.api.add_error_handler(CustomException, capture_error)
+
+ self.api.add_route(self.test_route, ErroredClassResource())
+
+ self.assertRaises(Exception,
+ self.simulate_request, self.test_route)
+
+ def test_subclass_error(self):
+ self.api.add_error_handler(CustomBaseException, capture_error)
+
+ self.api.add_route(self.test_route, ErroredClassResource())
+
+ body = self.simulate_request(self.test_route, method='DELETE')
+ self.assertEqual(falcon.HTTP_723, self.srmock.status)
+ self.assertEqual([b'error: CustomException'], body)
+
+ def test_error_order(self):
+ self.api.add_error_handler(Exception, capture_error)
+ self.api.add_error_handler(Exception, handle_error_first)
+
+ self.api.add_route(self.test_route, ErroredClassResource())
+
+ body = self.simulate_request(self.test_route)
+ self.assertEqual([b'first error handler'], body)
| [
{
"components": [
{
"doc": "Adds a handler for a given exception type\n\nArgs:\n exception: Whenever an exception occurs when handling a request\n that is an instance of this exception class, the given handler\n callable will be used to handle the exception.\n handler: Callable... | [
"falcon/tests/test_error_handlers.py::TestErrorHandler::test_caught_error",
"falcon/tests/test_error_handlers.py::TestErrorHandler::test_error_order",
"falcon/tests/test_error_handlers.py::TestErrorHandler::test_subclass_error",
"falcon/tests/test_error_handlers.py::TestErrorHandler::test_uncaught_error"
] | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
feat(api): Adds an Exception Handler Mechanism
An attempt at implementing #55. This currently fails the flake8 tests due to too much complexity in `falcon.api.API.__call__()`.
How to use:
``` python
def handler(ex, req, resp, params):
log_error(req, ex)
resp.status = falcon.HTTP_500
resp.body = "I regret to inform you that an unfortunate error has occured."
app.add_error_handler(ValueError, handler)
```
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in falcon/api.py]
(definition of API.add_error_handler:)
def add_error_handler(self, exception, handler):
"""Adds a handler for a given exception type
Args:
exception: Whenever an exception occurs when handling a request
that is an instance of this exception class, the given handler
callable will be used to handle the exception.
handler: Callable that gets called with (ex, req, resp, params)
when there is a matching exception when handling a request."""
[end of new definitions in falcon/api.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 77d5e6394a88ead151c9469494749f95f06b24bf | ||
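A framework-free sketch of the registry this PR adds: handlers dispatch by `isinstance`, are inserted at the head of the list so the most recently added handler wins, and unmatched exceptions propagate via the `for ... else` re-raise. The `call` driver and the `record` dict below are illustrative simplifications, not Falcon's real handler signature:

```python
class CustomBase(Exception):
    pass


class Custom(CustomBase):
    pass


class API:
    def __init__(self):
        self._error_handlers = []

    def add_error_handler(self, exception, handler):
        # Head insertion: duplicate registrations let the last one win.
        self._error_handlers.insert(0, (exception, handler))

    def call(self, responder):
        record = {}
        try:
            responder()
        except Exception as e:
            for err_type, err_handler in self._error_handlers:
                if isinstance(e, err_type):
                    err_handler(e, record)
                    break
            else:                      # no handler matched: propagate
                raise
        return record
```

The `isinstance` test is what lets a handler registered for a base exception class also catch subclasses, the case `test_subclass_error` covers.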
sympy__sympy-2577 | 2,577 | sympy/sympy | 0.7 | a0e62084297c9427d75c9fd06e57f79e68147520 | 2013-11-09T13:34:00Z | diff --git a/sympy/functions/__init__.py b/sympy/functions/__init__.py
index a4c872fee964..cb995f00d727 100644
--- a/sympy/functions/__init__.py
+++ b/sympy/functions/__init__.py
@@ -10,7 +10,7 @@
from sympy.functions.combinatorial.numbers import (fibonacci, lucas, harmonic,
bernoulli, bell, euler, catalan)
from sympy.functions.elementary.miscellaneous import (sqrt, root, Min, Max,
- Id, real_root)
+ Id, real_root, cbrt)
from sympy.functions.elementary.complexes import (re, im, sign, Abs,
conjugate, arg, polar_lift, periodic_argument, unbranched_argument,
principal_branch, transpose, adjoint)
diff --git a/sympy/functions/elementary/miscellaneous.py b/sympy/functions/elementary/miscellaneous.py
index 4a1db16677cd..3e9ef14f391a 100644
--- a/sympy/functions/elementary/miscellaneous.py
+++ b/sympy/functions/elementary/miscellaneous.py
@@ -66,10 +66,9 @@ def sqrt(arg):
This is because the two are not equal to each other in general.
For example, consider x == -1:
- >>> sqrt(x**2).subs(x, -1)
- 1
- >>> x.subs(x, -1)
- -1
+ >>> from sympy import Eq
+ >>> Eq(sqrt(x**2), x).subs(x, -1)
+ False
This is because sqrt computes the principal square root, so the square may
put the argument in a different branch. This identity does hold if x is
@@ -98,7 +97,7 @@ def sqrt(arg):
See Also
========
- sympy.polys.rootoftools.RootOf, root
+ sympy.polys.rootoftools.RootOf, root, real_root
References
==========
@@ -111,6 +110,57 @@ def sqrt(arg):
return C.Pow(arg, S.Half)
+
+def cbrt(arg):
+ """This function computes the principial cube root of `arg`, so
+ it's just a shortcut for `arg**Rational(1, 3)`.
+
+ Examples
+ ========
+
+ >>> from sympy import cbrt, Symbol
+ >>> x = Symbol('x')
+
+ >>> cbrt(x)
+ x**(1/3)
+
+ >>> cbrt(x)**3
+ x
+
+ Note that cbrt(x**3) does not simplify to x.
+
+ >>> cbrt(x**3)
+ (x**3)**(1/3)
+
+ This is because the two are not equal to each other in general.
+ For example, consider `x == -1`:
+
+ >>> from sympy import Eq
+ >>> Eq(cbrt(x**3), x).subs(x, -1)
+ False
+
+ This is because cbrt computes the principal cube root, this
+ identity does hold if `x` is positive:
+
+ >>> y = Symbol('y', positive=True)
+ >>> cbrt(y**3)
+ y
+
+ See Also
+ ========
+
+ sympy.polys.rootoftools.RootOf, root, real_root
+
+ References
+ ==========
+
+ * http://en.wikipedia.org/wiki/Cube_root
+ * http://en.wikipedia.org/wiki/Principal_value
+
+ """
+ return C.Pow(arg, C.Rational(1, 3))
+
+
def root(arg, n):
"""The n-th root function (a shortcut for ``arg**(1/n)``)
| diff --git a/sympy/functions/elementary/tests/test_miscellaneous.py b/sympy/functions/elementary/tests/test_miscellaneous.py
index b217151aa454..c4099431a005 100644
--- a/sympy/functions/elementary/tests/test_miscellaneous.py
+++ b/sympy/functions/elementary/tests/test_miscellaneous.py
@@ -1,7 +1,7 @@
from sympy.core.symbol import Symbol
from sympy.core.numbers import Rational
from sympy.utilities.pytest import raises
-from sympy.functions.elementary.miscellaneous import sqrt, root, Min, Max, real_root
+from sympy.functions.elementary.miscellaneous import sqrt, cbrt, root, Min, Max, real_root
from sympy import S, Float, I, cos, sin, oo, pi, Add
@@ -159,6 +159,7 @@ def test_root():
assert root(2, 2) == sqrt(2)
assert root(2, 1) == 2
assert root(2, 3) == 2**Rational(1, 3)
+ assert root(2, 3) == cbrt(2)
assert root(2, -5) == 2**Rational(4, 5)/2
assert root(-2, 1) == -2
@@ -169,6 +170,7 @@ def test_root():
assert root(x, 2) == sqrt(x)
assert root(x, 1) == x
assert root(x, 3) == x**Rational(1, 3)
+ assert root(x, 3) == cbrt(x)
assert root(x, -5) == x**Rational(-1, 5)
assert root(x, n) == x**(1/n)
| [
{
"components": [
{
"doc": "This function computes the principial cube root of `arg`, so\nit's just a shortcut for `arg**Rational(1, 3)`.\n\nExamples\n========\n\n>>> from sympy import cbrt, Symbol\n>>> x = Symbol('x')\n\n>>> cbrt(x)\nx**(1/3)\n\n>>> cbrt(x)**3\nx\n\nNote that cbrt(x**3) does not ... | [
"test_Min",
"test_Max",
"test_root"
] | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Issue 4071: add cbrt function
http://code.google.com/p/sympy/issues/detail?id=4071
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in sympy/functions/elementary/miscellaneous.py]
(definition of cbrt:)
def cbrt(arg):
"""This function computes the principial cube root of `arg`, so
it's just a shortcut for `arg**Rational(1, 3)`.
Examples
========
>>> from sympy import cbrt, Symbol
>>> x = Symbol('x')
>>> cbrt(x)
x**(1/3)
>>> cbrt(x)**3
x
Note that cbrt(x**3) does not simplify to x.
>>> cbrt(x**3)
(x**3)**(1/3)
This is because the two are not equal to each other in general.
For example, consider `x == -1`:
>>> from sympy import Eq
>>> Eq(cbrt(x**3), x).subs(x, -1)
False
This is because cbrt computes the principal cube root, this
identity does hold if `x` is positive:
>>> y = Symbol('y', positive=True)
>>> cbrt(y**3)
y
See Also
========
sympy.polys.rootoftools.RootOf, root, real_root
References
==========
* http://en.wikipedia.org/wiki/Cube_root
* http://en.wikipedia.org/wiki/Principal_value"""
[end of new definitions in sympy/functions/elementary/miscellaneous.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 3555a90ee3ac3c7082df19d96a922630100c69cc | ||
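The docstring's caveat, that `cbrt(x**3)` does not simplify back to `x` because the principal branch is taken, can be seen numerically with Python's own complex power, which uses the same principal-value convention:

```python
import cmath

def cbrt(z):
    """Principal cube root: z ** (1/3) on the principal branch (the
    convention behind the arg**Rational(1, 3) shortcut)."""
    return complex(z) ** (1 / 3)

x = -1
w = cbrt(x ** 3)                          # principal cube root of -1
principal = cmath.exp(1j * cmath.pi / 3)  # 1/2 + sqrt(3)/2 * i, not -1
```

For positive arguments the identity does hold, which is the `Symbol('y', positive=True)` case in the docstring.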
sympy__sympy-2571 | 2,571 | sympy/sympy | 0.7 | a5c88ab56e4ca06c3a1e83e1503a53c121cb42d1 | 2013-11-04T02:33:36Z | diff --git a/sympy/matrices/matrices.py b/sympy/matrices/matrices.py
index ab16975bc0f8..690996b6986b 100644
--- a/sympy/matrices/matrices.py
+++ b/sympy/matrices/matrices.py
@@ -6,7 +6,7 @@
from sympy.core.expr import Expr
from sympy.core.function import count_ops
from sympy.core.power import Pow
-from sympy.core.symbol import Symbol, Dummy
+from sympy.core.symbol import Symbol, Dummy, symbols
from sympy.core.numbers import Integer, ilcm, Rational, Float
from sympy.core.singleton import S
from sympy.core.sympify import sympify
@@ -783,6 +783,7 @@ def lower_triangular_solve(self, rhs):
LDLsolve
LUsolve
QRsolve
+ pinv_solve
"""
if not self.is_square:
@@ -805,6 +806,7 @@ def upper_triangular_solve(self, rhs):
LDLsolve
LUsolve
QRsolve
+ pinv_solve
"""
if not self.is_square:
raise NonSquareMatrixError("Matrix must be square.")
@@ -829,6 +831,7 @@ def cholesky_solve(self, rhs):
LDLsolve
LUsolve
QRsolve
+ pinv_solve
"""
if self.is_symmetric():
L = self._cholesky()
@@ -862,6 +865,7 @@ def diagonal_solve(self, rhs):
LDLsolve
LUsolve
QRsolve
+ pinv_solve
"""
if not self.is_diagonal:
raise TypeError("Matrix should be diagonal")
@@ -895,6 +899,7 @@ def LDLsolve(self, rhs):
diagonal_solve
LUsolve
QRsolve
+ pinv_solve
"""
if self.is_symmetric():
L, D = self.LDLdecomposition()
@@ -1262,6 +1267,7 @@ def LUsolve(self, rhs, iszerofunc=_iszero):
diagonal_solve
LDLsolve
QRsolve
+ pinv_solve
LUdecomposition
"""
if rhs.rows != self.rows:
@@ -1638,6 +1644,7 @@ def QRsolve(self, b):
diagonal_solve
LDLsolve
LUsolve
+ pinv_solve
QRdecomposition
"""
@@ -3975,6 +3982,135 @@ def replace(self, F, G, map=False):
return M.applyfunc(lambda x: x.replace(F, G, map))
+ def pinv(self):
+ """Calculate the Moore-Penrose pseudoinverse of the matrix.
+
+ The Moore-Penrose pseudoinverse exists and is unique for any matrix.
+ If the matrix is invertible, the pseudoinverse is the same as the
+ inverse.
+
+ Examples
+ ========
+
+ >>> from sympy import Matrix
+ >>> Matrix([[1, 2, 3], [4, 5, 6]]).pinv()
+ Matrix([
+ [-17/18, 4/9],
+ [ -1/9, 1/9],
+ [ 13/18, -2/9]])
+
+ See Also
+ ========
+
+ inv
+ pinv_solve
+
+ References
+ ==========
+
+ .. [1] https://en.wikipedia.org/wiki/Moore-Penrose_pseudoinverse
+
+ """
+ A = self
+ AH = self.H
+ # Trivial case: pseudoinverse of all-zero matrix is its transpose.
+ if A.is_zero:
+ return AH
+ try:
+ if self.rows >= self.cols:
+ return (AH * A).inv() * AH
+ else:
+ return AH * (A * AH).inv()
+ except ValueError:
+ # Matrix is not full rank, so A*AH cannot be inverted.
+ raise NotImplementedError('Rank-deficient matrices are not yet '
+ 'supported.')
+
+ def pinv_solve(self, B, arbitrary_matrix=None):
+ """Solve Ax = B using the Moore-Penrose pseudoinverse.
+
+ There may be zero, one, or infinite solutions. If one solution
+ exists, it will be returned. If infinite solutions exist, one will
+ be returned based on the value of arbitrary_matrix. If no solutions
+ exist, the least-squares solution is returned.
+
+ Parameters
+ ==========
+
+ B : Matrix
+ The right hand side of the equation to be solved for. Must have
+ the same number of rows as matrix A.
+ arbitrary_matrix : Matrix
+ If the system is underdetermined (e.g. A has more columns than
+ rows), infinite solutions are possible, in terms of an arbitrary
+ matrix. This parameter may be set to a specific matrix to use
+ for that purpose; if so, it must be the same shape as x, with as
+ many rows as matrix A has columns, and as many columns as matrix
+ B. If left as None, an appropriate matrix containing dummy
+ symbols in the form of ``wn_m`` will be used, with n and m being
+ row and column position of each symbol.
+
+ Returns
+ =======
+
+ x : Matrix
+ The matrix that will satisfy Ax = B. Will have as many rows as
+ matrix A has columns, and as many columns as matrix B.
+
+ Examples
+ ========
+
+ >>> from sympy import Matrix
+ >>> A = Matrix([[1, 2, 3], [4, 5, 6]])
+ >>> B = Matrix([7, 8])
+ >>> A.pinv_solve(B)
+ Matrix([
+ [ _w0_0/6 - _w1_0/3 + _w2_0/6 - 55/18],
+ [-_w0_0/3 + 2*_w1_0/3 - _w2_0/3 + 1/9],
+ [ _w0_0/6 - _w1_0/3 + _w2_0/6 + 59/18]])
+ >>> A.pinv_solve(B, arbitrary_matrix=Matrix([0, 0, 0]))
+ Matrix([
+ [-55/18],
+ [ 1/9],
+ [ 59/18]])
+
+ See Also
+ ========
+
+ lower_triangular_solve
+ upper_triangular_solve
+ cholesky_solve
+ diagonal_solve
+ LDLsolve
+ LUsolve
+ QRsolve
+ pinv
+
+ Notes
+ =====
+
+ This may return either exact solutions or least squares solutions.
+ To determine which, check ``A * A.pinv() * B == B``. It will be
+ True if exact solutions exist, and False if only a least-squares
+ solution exists. Be aware that the left hand side of that equation
+ may need to be simplified to correctly compare to the right hand
+ side.
+
+ References
+ ==========
+
+ .. [1] https://en.wikipedia.org/wiki/Moore-Penrose_pseudoinverse#Obtaining_all_solutions_of_a_linear_system
+
+ """
+ from sympy.matrices import eye
+ A = self
+ A_pinv = self.pinv()
+ if arbitrary_matrix is None:
+ rows, cols = A.cols, B.cols
+ w = symbols('w:{0}_:{1}'.format(rows, cols), cls=Dummy)
+ arbitrary_matrix = self.__class__(cols, rows, w).T
+ return A_pinv * B + (eye(A.cols) - A_pinv*A) * arbitrary_matrix
+
def classof(A, B):
"""
Get the type of the result when combining matrices of different types.
| diff --git a/sympy/matrices/tests/test_matrices.py b/sympy/matrices/tests/test_matrices.py
index 5f000b087e18..33cb0c325380 100644
--- a/sympy/matrices/tests/test_matrices.py
+++ b/sympy/matrices/tests/test_matrices.py
@@ -13,7 +13,7 @@
rot_axis3, wronskian, zeros)
from sympy.core.compatibility import long
from sympy.utilities.iterables import flatten, capture
-from sympy.utilities.pytest import raises, XFAIL
+from sympy.utilities.pytest import raises, XFAIL, slow
from sympy.abc import x, y, z
@@ -2229,3 +2229,59 @@ def test_atoms():
m = Matrix([[1, 2], [x, 1 - 1/x]])
assert m.atoms() == set([S(1),S(2),S(-1), x])
assert m.atoms(Symbol) == set([x])
+
+@slow
+def test_pinv():
+ from sympy.abc import a, b, c, d, e, f
+ # Pseudoinverse of an invertible matrix is the inverse.
+ A1 = Matrix([[a, b], [c, d]])
+ assert simplify(A1.pinv()) == simplify(A1.inv())
+ # Test the four properties of the pseudoinverse for various matrices.
+ As = [Matrix([[13, 104], [2212, 3], [-3, 5]]),
+ Matrix([[1, 7, 9], [11, 17, 19]]),
+ Matrix([a, b])]
+ for A in As:
+ A_pinv = A.pinv()
+ AAp = A * A_pinv
+ ApA = A_pinv * A
+ assert simplify(AAp * A) == A
+ assert simplify(ApA * A_pinv) == A_pinv
+ assert AAp.H == AAp
+ assert ApA.H == ApA
+
+def test_pinv_solve():
+ # Fully determined system (unique result, identical to other solvers).
+ A = Matrix([[1, 5], [7, 9]])
+ B = Matrix([12, 13])
+ assert A.pinv_solve(B) == A.cholesky_solve(B)
+ assert A.pinv_solve(B) == A.LDLsolve(B)
+ assert A.pinv_solve(B) == Matrix([sympify('-43/26'), sympify('71/26')])
+ # Fully determined, with two-dimensional B matrix.
+ B = Matrix([[12, 13, 14], [15, 16, 17]])
+ assert A.pinv_solve(B) == A.cholesky_solve(B)
+ assert A.pinv_solve(B) == A.LDLsolve(B)
+ assert A.pinv_solve(B) == Matrix([[-33, -37, -41], [69, 75, 81]]) / 26
+ # Underdetermined system (infinite results).
+ A = Matrix([[1, 0, 1], [0, 1, 1]])
+ B = Matrix([5, 7])
+ solution = A.pinv_solve(B)
+ w = {}
+ for s in solution.atoms(Symbol):
+ # Extract dummy symbols used in the solution.
+ w[s.name] = s
+ assert solution == Matrix([[w['w0_0']/3 + w['w1_0']/3 - w['w2_0']/3 + 1],
+ [w['w0_0']/3 + w['w1_0']/3 - w['w2_0']/3 + 3],
+ [-w['w0_0']/3 - w['w1_0']/3 + w['w2_0']/3 + 4]])
+ # Overdetermined system (least squares results).
+ A = Matrix([[1, 0], [0, 0], [0, 1]])
+ B = Matrix([3, 2, 1])
+ assert A.pinv_solve(B) == Matrix([3, 1])
+ # Proof the solution is not exact.
+ assert A * A.pinv() * B != B
+
+@XFAIL
+def test_pinv_solve_rank_deficient():
+ A = Matrix([[1, 0], [0, 0]])
+ B = Matrix([3, 0])
+ w1 = symbols('w1')
+ assert A.pinv_solve(B) == Matrix([3, w1])
| [
{
"components": [
{
"doc": "Calculate the Moore-Penrose pseudoinverse of the matrix.\n\nThe Moore-Penrose pseudoinverse exists and is unique for any matrix.\nIf the matrix is invertible, the pseudoinverse is the same as the\ninverse.\n\nExamples\n========\n\n>>> from sympy import Matrix\n>>> Matri... | [
"test_pinv_solve"
] | [
"test_args",
"test_division",
"test_sum",
"test_addition",
"test_multiplication",
"test_power",
"test_creation",
"test_tolist",
"test_as_mutable",
"test_determinant",
"test_det_LU_decomposition",
"test_berkowitz_minors",
"test_submatrix",
"test_submatrix_assignment",
"test_extract",
t... | This is a feature request which requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
Matrices: partial implementation of the Moore-Penrose pseudoinverse.
Added two methods: one to calculate the pseudoinverse, and another to apply it
in the solution of Ax = B. The pseudoinverse calculation works only for
full-rank matrices and zero matrices; rank-deficient matrices will be more
complicated, possibly requiring a complete symbolic implementation of singular
value decomposition. Until then, this should work well enough (at least for
me!).
See https://en.wikipedia.org/wiki/Moore-Penrose_pseudoinverse
----------
</request>
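The full-rank case the request describes reduces to the standard closed forms: A⁺ = (AᴴA)⁻¹Aᴴ for full column rank, or A⁺ = Aᴴ(AAᴴ)⁻¹ for full row rank. A minimal sketch, computing the docstring's example directly from that formula rather than through the proposed `pinv` method:

```python
from sympy import Matrix, Rational

# Full row rank (2x3 with rank 2), so A+ = A.H * (A * A.H)**-1.
A = Matrix([[1, 2, 3], [4, 5, 6]])
A_pinv = A.H * (A * A.H).inv()

# Matches the value quoted in the pinv docstring above.
assert A_pinv == Matrix([[Rational(-17, 18), Rational(4, 9)],
                         [Rational(-1, 9),  Rational(1, 9)],
                         [Rational(13, 18), Rational(-2, 9)]])
```

The same two Penrose identities that `test_pinv` checks (A A⁺ A = A and A⁺ A A⁺ = A⁺) hold exactly here since all entries are rational.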
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sympy/matrices/matrices.py]
(definition of MatrixBase.pinv:)
def pinv(self):
"""Calculate the Moore-Penrose pseudoinverse of the matrix.
The Moore-Penrose pseudoinverse exists and is unique for any matrix.
If the matrix is invertible, the pseudoinverse is the same as the
inverse.
Examples
========
>>> from sympy import Matrix
>>> Matrix([[1, 2, 3], [4, 5, 6]]).pinv()
Matrix([
[-17/18, 4/9],
[ -1/9, 1/9],
[ 13/18, -2/9]])
See Also
========
inv
pinv_solve
References
==========
.. [1] https://en.wikipedia.org/wiki/Moore-Penrose_pseudoinverse"""
(definition of MatrixBase.pinv_solve:)
def pinv_solve(self, B, arbitrary_matrix=None):
"""Solve Ax = B using the Moore-Penrose pseudoinverse.
There may be zero, one, or infinite solutions. If one solution
exists, it will be returned. If infinite solutions exist, one will
be returned based on the value of arbitrary_matrix. If no solutions
exist, the least-squares solution is returned.
Parameters
==========
B : Matrix
The right hand side of the equation to be solved for. Must have
the same number of rows as matrix A.
arbitrary_matrix : Matrix
If the system is underdetermined (e.g. A has more columns than
rows), infinite solutions are possible, in terms of an arbitrary
matrix. This parameter may be set to a specific matrix to use
for that purpose; if so, it must be the same shape as x, with as
many rows as matrix A has columns, and as many columns as matrix
B. If left as None, an appropriate matrix containing dummy
symbols in the form of ``wn_m`` will be used, with n and m being
row and column position of each symbol.
Returns
=======
x : Matrix
The matrix that will satisfy Ax = B. Will have as many rows as
matrix A has columns, and as many columns as matrix B.
Examples
========
>>> from sympy import Matrix
>>> A = Matrix([[1, 2, 3], [4, 5, 6]])
>>> B = Matrix([7, 8])
>>> A.pinv_solve(B)
Matrix([
[ _w0_0/6 - _w1_0/3 + _w2_0/6 - 55/18],
[-_w0_0/3 + 2*_w1_0/3 - _w2_0/3 + 1/9],
[ _w0_0/6 - _w1_0/3 + _w2_0/6 + 59/18]])
>>> A.pinv_solve(B, arbitrary_matrix=Matrix([0, 0, 0]))
Matrix([
[-55/18],
[ 1/9],
[ 59/18]])
See Also
========
lower_triangular_solve
upper_triangular_solve
cholesky_solve
diagonal_solve
LDLsolve
LUsolve
QRsolve
pinv
Notes
=====
This may return either exact solutions or least squares solutions.
To determine which, check ``A * A.pinv() * B == B``. It will be
True if exact solutions exist, and False if only a least-squares
solution exists. Be aware that the left hand side of that equation
may need to be simplified to correctly compare to the right hand
side.
References
==========
.. [1] https://en.wikipedia.org/wiki/Moore-Penrose_pseudoinverse#Obtaining_all_solutions_of_a_linear_system"""
[end of new definitions in sympy/matrices/matrices.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 3555a90ee3ac3c7082df19d96a922630100c69cc | ||
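The `pinv_solve` behavior defined above can be cross-checked against the closed form without the new method. This sketch reproduces the docstring's `arbitrary_matrix=Matrix([0, 0, 0])` result; the full-row-rank formula used for A⁺ is an assumption that holds for this example, not the general implementation:

```python
from sympy import Matrix, Rational

A = Matrix([[1, 2, 3], [4, 5, 6]])
B = Matrix([7, 8])
A_pinv = A.H * (A * A.H).inv()   # full-row-rank pseudoinverse

# With the arbitrary matrix w = 0: x = A+ * B + (I - A+ * A) * 0 = A+ * B.
x = A_pinv * B
assert x == Matrix([Rational(-55, 18), Rational(1, 9), Rational(59, 18)])
assert A * x == B                # an exact solution exists for this system
```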
sympy__sympy-2556 | 2,556 | sympy/sympy | 0.7 | 3c450eee18dfe7bb97dfa5ef35a6db3e47e337eb | 2013-10-28T20:36:57Z | diff --git a/sympy/printing/ccode.py b/sympy/printing/ccode.py
index 0417d8325452..8e34538a63c5 100644
--- a/sympy/printing/ccode.py
+++ b/sympy/printing/ccode.py
@@ -15,7 +15,6 @@
from sympy.core.compatibility import string_types
from sympy.printing.codeprinter import CodePrinter
from sympy.printing.precedence import precedence
-from sympy.core.compatibility import default_sort_key
# dictionary mapping sympy function to (argument_conditions, C_function).
# Used in CCodePrinter._print_Function(self)
@@ -177,20 +176,6 @@ def _print_Piecewise(self, expr):
code = "%s" + last_line
return code % ": ".join(ecpairs) + " )"
- def _print_And(self, expr):
- PREC = precedence(expr)
- return ' && '.join(self.parenthesize(a, PREC)
- for a in sorted(expr.args, key=default_sort_key))
-
- def _print_Or(self, expr):
- PREC = precedence(expr)
- return ' || '.join(self.parenthesize(a, PREC)
- for a in sorted(expr.args, key=default_sort_key))
-
- def _print_Not(self, expr):
- PREC = precedence(expr)
- return '!' + self.parenthesize(expr.args[0], PREC)
-
def _print_Function(self, expr):
if expr.func.__name__ in self.known_functions:
cond_cfunc = self.known_functions[expr.func.__name__]
diff --git a/sympy/printing/codeprinter.py b/sympy/printing/codeprinter.py
index 49336af4172e..39d9dee576d7 100644
--- a/sympy/printing/codeprinter.py
+++ b/sympy/printing/codeprinter.py
@@ -1,6 +1,7 @@
from __future__ import print_function, division
from sympy.core import C, Add, Mul, Pow, S
+from sympy.core.compatibility import default_sort_key
from sympy.core.mul import _keep_coeff
from sympy.printing.str import StrPrinter
from sympy.printing.precedence import precedence
@@ -18,6 +19,12 @@ class CodePrinter(StrPrinter):
The base class for code-printing subclasses.
"""
+ _operators = {
+ 'and': '&&',
+ 'or': '||',
+ 'not': '!',
+ }
+
def _doprint_a_piece(self, expr, assign_to=None):
# Here we print an expression that may contain Indexed objects, they
# correspond to arrays in the generated code. The low-level implementation
@@ -144,6 +151,34 @@ def _print_Dummy(self, expr):
_print_EulerGamma = _print_NumberSymbol
_print_GoldenRatio = _print_NumberSymbol
+ def _print_And(self, expr):
+ PREC = precedence(expr)
+ return (" %s " % self._operators['and']).join(self.parenthesize(a, PREC)
+ for a in sorted(expr.args, key=default_sort_key))
+
+ def _print_Or(self, expr):
+ PREC = precedence(expr)
+ return (" %s " % self._operators['or']).join(self.parenthesize(a, PREC)
+ for a in sorted(expr.args, key=default_sort_key))
+
+ def _print_Xor(self, expr):
+ if self._operators.get('xor') is None:
+ return self._print_not_supported(expr)
+ PREC = precedence(expr)
+ return (" %s " % self._operators['xor']).join(self.parenthesize(a, PREC)
+ for a in sorted(expr.args, key=default_sort_key))
+
+ def _print_Equivalent(self, expr):
+ if self._operators.get('equivalent') is None:
+ return self._print_not_supported(expr)
+ PREC = precedence(expr)
+ return (" %s " % self._operators['equivalent']).join(self.parenthesize(a, PREC)
+ for a in sorted(expr.args, key=default_sort_key))
+
+ def _print_Not(self, expr):
+ PREC = precedence(expr)
+ return self._operators['not'] + self.parenthesize(expr.args[0], PREC)
+
def _print_Mul(self, expr):
prec = precedence(expr)
diff --git a/sympy/printing/fcode.py b/sympy/printing/fcode.py
index 12efa9f09c59..8b7f5556fd07 100644
--- a/sympy/printing/fcode.py
+++ b/sympy/printing/fcode.py
@@ -45,6 +45,18 @@ class FCodePrinter(CodePrinter):
"cosh", "tanh", "sqrt", "log", "exp", "erf", "Abs", "sign", "conjugate",
])
+ _operators = {
+ 'and': '.and.',
+ 'or': '.or.',
+ 'xor': '.neqv.',
+ 'equivalent': '.eqv.',
+ 'not': '.not. ',
+ }
+
+ _relationals = {
+ '!=': '/=',
+ }
+
def __init__(self, settings=None):
CodePrinter.__init__(self, settings)
self._init_leading_padding()
diff --git a/sympy/printing/jscode.py b/sympy/printing/jscode.py
index da0c75eada3b..64d5254e0f98 100644
--- a/sympy/printing/jscode.py
+++ b/sympy/printing/jscode.py
@@ -188,20 +188,6 @@ def _print_Piecewise(self, expr):
code = "if %s" + last_line
return code % "else if ".join(ecpairs)
- def _print_And(self, expr):
- PREC = precedence(expr)
- return ' && '.join(self.parenthesize(a, PREC)
- for a in sorted(expr.args, key=default_sort_key))
-
- def _print_Or(self, expr):
- PREC = precedence(expr)
- return ' || '.join(self.parenthesize(a, PREC)
- for a in sorted(expr.args, key=default_sort_key))
-
- def _print_Not(self, expr):
- PREC = precedence(expr)
- return '!' + self.parenthesize(expr.args[0], PREC)
-
def _print_Function(self, expr):
if expr.func.__name__ in self.known_functions:
cond_cfunc = self.known_functions[expr.func.__name__]
diff --git a/sympy/printing/precedence.py b/sympy/printing/precedence.py
index b0d3541b0dfc..06c30081ab87 100644
--- a/sympy/printing/precedence.py
+++ b/sympy/printing/precedence.py
@@ -7,9 +7,10 @@
# Default precedence values for some basic types
PRECEDENCE = {
"Lambda": 1,
- "Relational": 20,
+ "Xor": 10,
"Or": 20,
"And": 30,
+ "Relational": 35,
"Add": 40,
"Mul": 50,
"Pow": 60,
@@ -21,6 +22,8 @@
# treated like they were inherited, so not every single class has to be named
# here.
PRECEDENCE_VALUES = {
+ "Equivalent": PRECEDENCE["Xor"],
+ "Xor": PRECEDENCE["Xor"],
"Or": PRECEDENCE["Or"],
"And": PRECEDENCE["And"],
"Add": PRECEDENCE["Add"],
diff --git a/sympy/printing/str.py b/sympy/printing/str.py
index 927db21bf3bc..dd9c41b90e46 100644
--- a/sympy/printing/str.py
+++ b/sympy/printing/str.py
@@ -23,6 +23,8 @@ class StrPrinter(Printer):
"full_prec": "auto",
}
+ _relationals = dict()
+
def parenthesize(self, item, level):
if precedence(item) <= level:
return "(%s)" % self._print(item)
@@ -547,7 +549,7 @@ def _print_Float(self, expr):
def _print_Relational(self, expr):
return '%s %s %s' % (self.parenthesize(expr.lhs, precedence(expr)),
- expr.rel_op,
+ self._relationals.get(expr.rel_op) or expr.rel_op,
self.parenthesize(expr.rhs, precedence(expr)))
def _print_RootOf(self, expr):
| diff --git a/sympy/printing/tests/test_fcode.py b/sympy/printing/tests/test_fcode.py
index 96cb01b21e03..bb9a69570965 100644
--- a/sympy/printing/tests/test_fcode.py
+++ b/sympy/printing/tests/test_fcode.py
@@ -3,6 +3,8 @@
from sympy import Catalan, EulerGamma, E, GoldenRatio, I, pi
from sympy import Function, Rational, Integer, Lambda
+from sympy.core.relational import Relational
+from sympy.logic.boolalg import And, Or, Not, Equivalent, Xor
from sympy.printing.fcode import fcode, FCodePrinter
from sympy.tensor import IndexedBase, Idx
from sympy.utilities.lambdify import implemented_function
@@ -180,6 +182,176 @@ def test_line_wrapping():
)
+def test_fcode_precedence():
+ x, y = symbols("x y")
+ assert fcode(And(x < y, y < x + 1), source_format="free") == \
+ "x < y .and. y < x + 1"
+ assert fcode(Or(x < y, y < x + 1), source_format="free") == \
+ "x < y .or. y < x + 1"
+ assert fcode(Xor(x < y, y < x + 1, evaluate=False),
+ source_format="free") == "x < y .neqv. y < x + 1"
+ assert fcode(Equivalent(x < y, y < x + 1), source_format="free") == \
+ "x < y .eqv. y < x + 1"
+
+
+def test_fcode_Logical():
+ x, y, z = symbols("x y z")
+ # unary Not
+ assert fcode(Not(x), source_format="free") == ".not. x"
+ # binary And
+ assert fcode(And(x, y), source_format="free") == "x .and. y"
+ assert fcode(And(x, Not(y)), source_format="free") == "x .and. .not. y"
+ assert fcode(And(Not(x), y), source_format="free") == "y .and. .not. x"
+ assert fcode(And(Not(x), Not(y)), source_format="free") == \
+ ".not. x .and. .not. y"
+ assert fcode(Not(And(x, y), evaluate=False), source_format="free") == \
+ ".not. (x .and. y)"
+ # binary Or
+ assert fcode(Or(x, y), source_format="free") == "x .or. y"
+ assert fcode(Or(x, Not(y)), source_format="free") == "x .or. .not. y"
+ assert fcode(Or(Not(x), y), source_format="free") == "y .or. .not. x"
+ assert fcode(Or(Not(x), Not(y)), source_format="free") == \
+ ".not. x .or. .not. y"
+ assert fcode(Not(Or(x, y), evaluate=False), source_format="free") == \
+ ".not. (x .or. y)"
+ # mixed And/Or
+ assert fcode(And(Or(y, z), x), source_format="free") == "x .and. (y .or. z)"
+ assert fcode(And(Or(z, x), y), source_format="free") == "y .and. (x .or. z)"
+ assert fcode(And(Or(x, y), z), source_format="free") == "z .and. (x .or. y)"
+ assert fcode(Or(And(y, z), x), source_format="free") == "x .or. y .and. z"
+ assert fcode(Or(And(z, x), y), source_format="free") == "y .or. x .and. z"
+ assert fcode(Or(And(x, y), z), source_format="free") == "z .or. x .and. y"
+ # trinary And
+ assert fcode(And(x, y, z), source_format="free") == "x .and. y .and. z"
+ assert fcode(And(x, y, Not(z)), source_format="free") == \
+ "x .and. y .and. .not. z"
+ assert fcode(And(x, Not(y), z), source_format="free") == \
+ "x .and. z .and. .not. y"
+ assert fcode(And(Not(x), y, z), source_format="free") == \
+ "y .and. z .and. .not. x"
+ assert fcode(Not(And(x, y, z), evaluate=False), source_format="free") == \
+ ".not. (x .and. y .and. z)"
+ # trinary Or
+ assert fcode(Or(x, y, z), source_format="free") == "x .or. y .or. z"
+ assert fcode(Or(x, y, Not(z)), source_format="free") == \
+ "x .or. y .or. .not. z"
+ assert fcode(Or(x, Not(y), z), source_format="free") == \
+ "x .or. z .or. .not. y"
+ assert fcode(Or(Not(x), y, z), source_format="free") == \
+ "y .or. z .or. .not. x"
+ assert fcode(Not(Or(x, y, z), evaluate=False), source_format="free") == \
+ ".not. (x .or. y .or. z)"
+
+
+def test_fcode_Xlogical():
+ x, y, z = symbols("x y z")
+ # binary Xor
+ assert fcode(Xor(x, y, evaluate=False), source_format="free") == \
+ "x .neqv. y"
+ assert fcode(Xor(x, Not(y), evaluate=False), source_format="free") == \
+ "x .neqv. .not. y"
+ assert fcode(Xor(Not(x), y, evaluate=False), source_format="free") == \
+ "y .neqv. .not. x"
+ assert fcode(Xor(Not(x), Not(y), evaluate=False),
+ source_format="free") == ".not. x .neqv. .not. y"
+ assert fcode(Not(Xor(x, y, evaluate=False), evaluate=False),
+ source_format="free") == ".not. (x .neqv. y)"
+ # binary Equivalent
+ assert fcode(Equivalent(x, y), source_format="free") == "x .eqv. y"
+ assert fcode(Equivalent(x, Not(y)), source_format="free") == \
+ "x .eqv. .not. y"
+ assert fcode(Equivalent(Not(x), y), source_format="free") == \
+ "y .eqv. .not. x"
+ assert fcode(Equivalent(Not(x), Not(y)), source_format="free") == \
+ ".not. x .eqv. .not. y"
+ assert fcode(Not(Equivalent(x, y), evaluate=False),
+ source_format="free") == ".not. (x .eqv. y)"
+ # mixed And/Equivalent
+ assert fcode(Equivalent(And(y, z), x), source_format="free") == \
+ "x .eqv. y .and. z"
+ assert fcode(Equivalent(And(z, x), y), source_format="free") == \
+ "y .eqv. x .and. z"
+ assert fcode(Equivalent(And(x, y), z), source_format="free") == \
+ "z .eqv. x .and. y"
+ assert fcode(And(Equivalent(y, z), x), source_format="free") == \
+ "x .and. (y .eqv. z)"
+ assert fcode(And(Equivalent(z, x), y), source_format="free") == \
+ "y .and. (x .eqv. z)"
+ assert fcode(And(Equivalent(x, y), z), source_format="free") == \
+ "z .and. (x .eqv. y)"
+ # mixed Or/Equivalent
+ assert fcode(Equivalent(Or(y, z), x), source_format="free") == \
+ "x .eqv. y .or. z"
+ assert fcode(Equivalent(Or(z, x), y), source_format="free") == \
+ "y .eqv. x .or. z"
+ assert fcode(Equivalent(Or(x, y), z), source_format="free") == \
+ "z .eqv. x .or. y"
+ assert fcode(Or(Equivalent(y, z), x), source_format="free") == \
+ "x .or. (y .eqv. z)"
+ assert fcode(Or(Equivalent(z, x), y), source_format="free") == \
+ "y .or. (x .eqv. z)"
+ assert fcode(Or(Equivalent(x, y), z), source_format="free") == \
+ "z .or. (x .eqv. y)"
+ # mixed Xor/Equivalent
+ assert fcode(Equivalent(Xor(y, z, evaluate=False), x),
+ source_format="free") == "x .eqv. (y .neqv. z)"
+ assert fcode(Equivalent(Xor(z, x, evaluate=False), y),
+ source_format="free") == "y .eqv. (x .neqv. z)"
+ assert fcode(Equivalent(Xor(x, y, evaluate=False), z),
+ source_format="free") == "z .eqv. (x .neqv. y)"
+ assert fcode(Xor(Equivalent(y, z), x, evaluate=False),
+ source_format="free") == "x .neqv. (y .eqv. z)"
+ assert fcode(Xor(Equivalent(z, x), y, evaluate=False),
+ source_format="free") == "y .neqv. (x .eqv. z)"
+ assert fcode(Xor(Equivalent(x, y), z, evaluate=False),
+ source_format="free") == "z .neqv. (x .eqv. y)"
+ # mixed And/Xor
+ assert fcode(Xor(And(y, z), x, evaluate=False), source_format="free") == \
+ "x .neqv. y .and. z"
+ assert fcode(Xor(And(z, x), y, evaluate=False), source_format="free") == \
+ "y .neqv. x .and. z"
+ assert fcode(Xor(And(x, y), z, evaluate=False), source_format="free") == \
+ "z .neqv. x .and. y"
+ assert fcode(And(Xor(y, z, evaluate=False), x), source_format="free") == \
+ "x .and. (y .neqv. z)"
+ assert fcode(And(Xor(z, x, evaluate=False), y), source_format="free") == \
+ "y .and. (x .neqv. z)"
+ assert fcode(And(Xor(x, y, evaluate=False), z), source_format="free") == \
+ "z .and. (x .neqv. y)"
+ # mixed Or/Xor
+ assert fcode(Xor(Or(y, z), x, evaluate=False), source_format="free") == \
+ "x .neqv. y .or. z"
+ assert fcode(Xor(Or(z, x), y, evaluate=False), source_format="free") == \
+ "y .neqv. x .or. z"
+ assert fcode(Xor(Or(x, y), z, evaluate=False), source_format="free") == \
+ "z .neqv. x .or. y"
+ assert fcode(Or(Xor(y, z, evaluate=False), x), source_format="free") == \
+ "x .or. (y .neqv. z)"
+ assert fcode(Or(Xor(z, x, evaluate=False), y), source_format="free") == \
+ "y .or. (x .neqv. z)"
+ assert fcode(Or(Xor(x, y, evaluate=False), z), source_format="free") == \
+ "z .or. (x .neqv. y)"
+ # trinary Xor
+ assert fcode(Xor(x, y, z, evaluate=False), source_format="free") == \
+ "x .neqv. y .neqv. z"
+ assert fcode(Xor(x, y, Not(z), evaluate=False), source_format="free") == \
+ "x .neqv. y .neqv. .not. z"
+ assert fcode(Xor(x, Not(y), z, evaluate=False), source_format="free") == \
+ "x .neqv. z .neqv. .not. y"
+ assert fcode(Xor(Not(x), y, z, evaluate=False), source_format="free") == \
+ "y .neqv. z .neqv. .not. x"
+
+
+def test_fcode_Relational():
+ x, y = symbols("x y")
+ assert fcode(Relational(x, y, "=="), source_format="free") == "x == y"
+ assert fcode(Relational(x, y, "!="), source_format="free") == "x /= y"
+ assert fcode(Relational(x, y, ">="), source_format="free") == "x >= y"
+ assert fcode(Relational(x, y, "<="), source_format="free") == "x <= y"
+ assert fcode(Relational(x, y, ">"), source_format="free") == "x > y"
+ assert fcode(Relational(x, y, "<"), source_format="free") == "x < y"
+
+
def test_fcode_Piecewise():
x = symbols('x')
code = fcode(Piecewise((x, x < 1), (x**2, True)))
| [
{
"components": [
{
"doc": "",
"lines": [
154,
157
],
"name": "CodePrinter._print_And",
"signature": "def _print_And(self, expr):",
"type": "function"
},
{
"doc": "",
"lines": [
159,
162
... | [
"test_fcode_precedence",
"test_fcode_Logical",
"test_fcode_Xlogical",
"test_fcode_Relational"
] | [
"test_printmethod",
"test_fcode_Pow",
"test_fcode_Rational",
"test_fcode_Integer",
"test_fcode_Float",
"test_fcode_functions",
"test_fcode_functions_with_integers",
"test_fcode_NumberSymbol",
"test_fcode_complex",
"test_implicit",
"test_not_fortran",
"test_user_functions",
test_inline_funct... | This is a feature request which requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
Fortran code printing of logical and relational operators.
Generalize the code printing of logical operators (And, Or, Not) and relational operators (==, !=, >=, etc.) to make them customizable by overriding a dict. Add the correct syntax for Fortran code printing for these and add tests. Also implement Fortran printing of Xor and Equivalent, and fix the precedence order for Relational.
----------
</request>
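The dict-override idea can be sketched independently of SymPy's printer classes (the class and method names below are illustrative, not the actual API): a base printer keeps default operator spellings in a class attribute, and each target language overrides only that attribute while inheriting the joining logic:

```python
class BaseCodePrinter:
    # Default (C-like) spellings for logical operators.
    _operators = {'and': '&&', 'or': '||', 'not': '!'}

    def print_and(self, args):
        return (' %s ' % self._operators['and']).join(args)

    def print_not(self, arg):
        return self._operators['not'] + arg

class FortranPrinter(BaseCodePrinter):
    # Only the spellings change; print_and/print_not are inherited.
    _operators = {'and': '.and.', 'or': '.or.', 'not': '.not. '}

assert BaseCodePrinter().print_and(['x', 'y']) == 'x && y'
assert FortranPrinter().print_and(['x', 'y']) == 'x .and. y'
assert FortranPrinter().print_not('x') == '.not. x'
```

This is the same shape the patch gives `CodePrinter._operators` and the `FCodePrinter` override, with `.neqv.`/`.eqv.` entries letting the Fortran subclass support Xor and Equivalent that other targets report as unsupported.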
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sympy/printing/codeprinter.py]
(definition of CodePrinter._print_And:)
def _print_And(self, expr):
(definition of CodePrinter._print_Or:)
def _print_Or(self, expr):
(definition of CodePrinter._print_Xor:)
def _print_Xor(self, expr):
(definition of CodePrinter._print_Equivalent:)
def _print_Equivalent(self, expr):
(definition of CodePrinter._print_Not:)
def _print_Not(self, expr):
[end of new definitions in sympy/printing/codeprinter.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 3555a90ee3ac3c7082df19d96a922630100c69cc | ||
sympy__sympy-2551 | 2,551 | sympy/sympy | 0.7 | 8180bc8757fc43d17603f0e82148a4d0115d3e44 | 2013-10-26T18:16:31Z | diff --git a/AUTHORS b/AUTHORS
index d847dced8f2f..a4dd96467161 100644
--- a/AUTHORS
+++ b/AUTHORS
@@ -290,6 +290,6 @@ Alkiviadis G. Akritas <akritas@uth.gr>
Vinit Ravishankar <vinit.ravishankar@gmail.com>
Mike Boyle <boyle@astro.cornell.edu>
Heiner Kirchhoffer <Heiner.Kirchhoffer@gmail.com>
-Pablo Puente <ppuedom@yahoo.com>
+Pablo Puente <ppuedom@gmail.com>
James Fiedler <jrfiedler@gmail.com>
Harsh Gupta <gupta.harsh96@gmail.com>
diff --git a/sympy/functions/elementary/trigonometric.py b/sympy/functions/elementary/trigonometric.py
index e855f4b1341c..cfcbeb19e6e5 100644
--- a/sympy/functions/elementary/trigonometric.py
+++ b/sympy/functions/elementary/trigonometric.py
@@ -714,7 +714,114 @@ def _sage_(self):
return sage.cos(self.args[0]._sage_())
-class sec(TrigonometricFunction): # TODO implement rest all functions for sec. see cos, sin, tan.
+class ReciprocalTrigonometricFunction(TrigonometricFunction):
+ """Base class for reciprocal functions of trigonometric functions. """
+
+ nargs = 1
+ _reciprocal_of = None # mandatory, to be defined in subclass
+
+ # _is_even and _is_odd are used for correct evaluation of csc(-x), sec(-x)
+ # TODO refactor into TrigonometricFunction common parts of
+ # trigonometric functions eval() like even/odd, func(x+2*k*pi), etc.
+ _is_even = None # optional, to be defined in subclass
+ _is_odd = None # optional, to be defined in subclass
+
+ def _call_reciprocal(self, method_name, *args, **kwargs):
+ # Calls method_name on _reciprocal_of
+ o = self._reciprocal_of(self.args[0])
+ if kwargs:
+ return getattr(o, method_name)(**kwargs)
+ else:
+ return getattr(o, method_name)(*args)
+
+ def _calculate_reciprocal(self, method_name, *args, **kwargs):
+ # If calling method_name on _reciprocal_of returns a value != None
+ # then return the reciprocal of that value
+ t = self._call_reciprocal(method_name, *args, **kwargs)
+ return 1/t if t != None else t
+
+ def _rewrite_reciprocal(self, method_name, arg):
+ # Special handling for rewrite functions. If reciprocal rewrite returns
+ # unmodified expression, then return None
+ t = self._call_reciprocal(method_name, arg)
+ if t != None and t != self._reciprocal_of(arg):
+ return 1/t
+ else:
+ return
+
+ def fdiff(self, argindex=1):
+ return self._calculate_reciprocal("fdiff", argindex)
+
+ def _eval_rewrite_as_exp(self, arg):
+ return self._rewrite_reciprocal("_eval_rewrite_as_exp", arg)
+
+ def _eval_rewrite_as_Pow(self, arg):
+ return self._rewrite_reciprocal("_eval_rewrite_as_Pow", arg)
+
+ def _eval_rewrite_as_sin(self, arg):
+ return self._rewrite_reciprocal("_eval_rewrite_as_sin", arg)
+
+ def _eval_rewrite_as_cos(self, arg):
+ return self._rewrite_reciprocal("_eval_rewrite_as_cos", arg)
+
+ def _eval_rewrite_as_tan(self, arg):
+ return self._rewrite_reciprocal("_eval_rewrite_as_tan", arg)
+
+ def _eval_rewrite_as_pow(self, arg):
+ return self._rewrite_reciprocal("_eval_rewrite_as_pow", arg)
+
+ def _eval_rewrite_as_sqrt(self, arg):
+ return self._rewrite_reciprocal("_eval_rewrite_as_sqrt", arg)
+
+ def _eval_conjugate(self):
+ return self.func(self.args[0].conjugate())
+
+ def as_real_imag(self, deep=True, **hints):
+ return (1/self._reciprocal_of(self.args[0])).as_real_imag(deep,
+ **hints)
+
+ def _eval_expand_trig(self, **hints):
+ return self._calculate_reciprocal("_eval_expand_trig", **hints)
+
+ def _eval_is_real(self):
+ return self._reciprocal_of(self.args[0])._eval_is_real()
+
+ def _eval_as_leading_term(self, x):
+ return (1/self._reciprocal_of(self.args[0]))._eval_as_leading_term(x)
+
+ def _eval_is_bounded(self):
+ return (1/self._reciprocal_of(self.args[0])).is_bounded
+
+ def _eval_nseries(self, x, n, logx):
+ return (1/self._reciprocal_of(self.args[0]))._eval_nseries(x, n, logx)
+
+ @classmethod
+ def eval(cls, arg):
+ if arg.could_extract_minus_sign():
+ if cls._is_even:
+ return cls(-arg)
+ if cls._is_odd:
+ return -cls(-arg)
+
+ pi_coeff = _pi_coeff(arg)
+ if (pi_coeff is not None
+ and not (2*pi_coeff).is_integer
+ and pi_coeff.is_Rational):
+ q = pi_coeff.q
+ p = pi_coeff.p % (2*q)
+ if p > q:
+ narg = (pi_coeff - 1)*S.Pi
+ return -cls(narg)
+ if 2*p > q:
+ narg = (1 - pi_coeff)*S.Pi
+ return -cls(narg)
+ t = cls._reciprocal_of.eval(arg)
+ return 1/t if t != None else t
+
+
+class sec(ReciprocalTrigonometricFunction):
+ _reciprocal_of = cos
+ _is_even = True
def _eval_rewrite_as_cos(self, arg):
return (1/cos(arg))
@@ -728,8 +835,16 @@ def fdiff(self, argindex=1):
else:
raise ArgumentIndexError(self, argindex)
+ # TODO def taylor_term(n, x, *previous_terms):
+
+ def _sage_(self):
+ import sage.all as sage
+ return sage.sec(self.args[0]._sage_())
-class csc(TrigonometricFunction): # TODO implement other functions for csc as in cos, sin, tan.
+
+class csc(ReciprocalTrigonometricFunction):
+ _reciprocal_of = sin
+ _is_odd = True
def _eval_rewrite_as_sin(self, arg):
return (1/sin(arg))
@@ -743,6 +858,12 @@ def fdiff(self, argindex=1):
else:
raise ArgumentIndexError(self, argindex)
+ # TODO def taylor_term(n, x, *previous_terms):
+
+ def _sage_(self):
+ import sage.all as sage
+ return sage.csc(self.args[0]._sage_())
+
class tan(TrigonometricFunction):
"""
diff --git a/sympy/parsing/sympy_parser.py b/sympy/parsing/sympy_parser.py
index 5ab72fde219e..ed95334f4c60 100644
--- a/sympy/parsing/sympy_parser.py
+++ b/sympy/parsing/sympy_parser.py
@@ -453,7 +453,7 @@ def implicit_application(result, local_dict, global_dict):
... standard_transformations, implicit_application)
>>> transformations = standard_transformations + (implicit_application,)
>>> parse_expr('cot z + csc z', transformations=transformations)
- csc(z) + cot(z)
+ cot(z) + csc(z)
"""
for step in (_group_parentheses(implicit_application),
_apply_functions,
| diff --git a/sympy/core/tests/test_args.py b/sympy/core/tests/test_args.py
index 96b4c0edba53..d91d1c70fb4b 100644
--- a/sympy/core/tests/test_args.py
+++ b/sympy/core/tests/test_args.py
@@ -1153,6 +1153,9 @@ def test_sympy__functions__elementary__piecewise__Piecewise():
def test_sympy__functions__elementary__trigonometric__TrigonometricFunction():
pass
+@SKIP("abstract class")
+def test_sympy__functions__elementary__trigonometric__ReciprocalTrigonometricFunction():
+ pass
def test_sympy__functions__elementary__trigonometric__acos():
from sympy.functions.elementary.trigonometric import acos
diff --git a/sympy/functions/elementary/tests/test_trigonometric.py b/sympy/functions/elementary/tests/test_trigonometric.py
index e73a326e34f7..efe4edc8f250 100644
--- a/sympy/functions/elementary/tests/test_trigonometric.py
+++ b/sympy/functions/elementary/tests/test_trigonometric.py
@@ -1,7 +1,8 @@
-from sympy import (symbols, Symbol, nan, oo, zoo, I, sinh, sin, acot, pi, atan,
- acos, Rational, sqrt, asin, acot, cot, coth, E, S, tan, tanh, cos,
+from sympy import (symbols, Symbol, nan, oo, zoo, I, sinh, sin, pi, atan,
+ acos, Rational, sqrt, asin, acot, coth, E, S, tan, tanh, cos,
cosh, atan2, exp, log, asinh, acoth, atanh, O, cancel, Matrix, re, im,
- Float, Pow, gcd, sec, csc, cot, diff, simplify, Heaviside, arg, conjugate)
+ Float, Pow, gcd, sec, csc, cot, diff, simplify, Heaviside, arg,
+ conjugate, series)
from sympy.utilities.pytest import XFAIL, slow, raises
from sympy.core.compatibility import xrange
@@ -969,7 +970,124 @@ def test_tancot_rewrite_sqrt():
assert 1e-3 > abs( cot(x.evalf(7)) - c1.evalf(4) ), "fails for %d*pi/%d" % (i, n)
def test_sec():
+ x = symbols('x', real=True)
+ z = symbols('z')
+
+ assert sec.nargs == 1
+
+ assert sec(0) == 1
+ assert sec(pi) == -1
+ assert sec(pi/2) == oo
+ assert sec(-pi/2) == oo
+ assert sec(pi/6) == 2*sqrt(3)/3
+ assert sec(pi/3) == 2
+ assert sec(5*pi/2) == oo
+ assert sec(9*pi/7) == -sec(2*pi/7)
+ assert sec(I) == 1/cosh(1)
+ assert sec(x*I) == 1/cosh(x)
+ assert sec(-x) == sec(x)
+
+ assert sec(x).rewrite(exp) == 1/(exp(I*x)/2 + exp(-I*x)/2)
+ assert sec(x).rewrite(sin) == sec(x)
+ assert sec(x).rewrite(cos) == 1/cos(x)
+ assert sec(x).rewrite(tan) == (tan(x/2)**2 + 1)/(-tan(x/2)**2 + 1)
+ assert sec(x).rewrite(pow) == sec(x)
+ assert sec(x).rewrite(sqrt) == sec(x)
+
+ assert sec(z).conjugate() == sec(conjugate(z))
+
+ assert (sec(z).as_real_imag() ==
+ (cos(re(z))*cosh(im(z))/(sin(re(z))**2*sinh(im(z))**2 +
+ cos(re(z))**2*cosh(im(z))**2),
+ sin(re(z))*sinh(im(z))/(sin(re(z))**2*sinh(im(z))**2 +
+ cos(re(z))**2*cosh(im(z))**2)))
+
+ assert sec(x).expand(trig=True) == 1/cos(x)
+ assert sec(2*x).expand(trig=True) == 1/(2*cos(x)**2 - 1)
+
+ assert sec(x).is_real == True
+ assert sec(z).is_real == None
+
+ assert sec(x).as_leading_term() == sec(x)
+
+ assert sec(0).is_bounded == True
+ assert sec(x).is_bounded == None
+ assert sec(pi/2).is_bounded == False
+
+ assert series(sec(x), x, x0=0, n=6) == 1 + x**2/2 + 5*x**4/24 + O(x**6)
+
+ # https://code.google.com/p/sympy/issues/detail?id=4067
+ assert series(sqrt(sec(x))) == 1 + x**2/4 + 7*x**4/96 + O(x**6)
+
+ # https://code.google.com/p/sympy/issues/detail?id=4068
+ assert (series(sqrt(sec(x)), x, x0=pi*3/2, n=4) ==
+ 1/sqrt(x) +x**(S(3)/2)/12 + x**(S(7)/2)/160 + O(x**4))
+
assert sec(x).diff(x) == tan(x)*sec(x)
+
def test_csc():
+ x = symbols('x', real=True)
+ z = symbols('z')
+
+ # https://code.google.com/p/sympy/issues/detail?id=3608
+ cosecant = csc('x')
+ alternate = 1/sin('x')
+ assert cosecant.equals(alternate) == True
+ assert alternate.equals(cosecant) == True
+
+ assert csc.nargs == 1
+
+ assert csc(0) == oo
+ assert csc(pi) == oo
+
+ assert csc(pi/2) == 1
+ assert csc(-pi/2) == -1
+ assert csc(pi/6) == 2
+ assert csc(pi/3) == 2*sqrt(3)/3
+ assert csc(5*pi/2) == 1
+ assert csc(9*pi/7) == -csc(2*pi/7)
+ assert csc(I) == -I/sinh(1)
+ assert csc(x*I) == -I/sinh(x)
+ assert csc(-x) == -csc(x)
+
+ assert csc(x).rewrite(exp) == 2*I/(exp(I*x) - exp(-I*x))
+ assert csc(x).rewrite(sin) == 1/sin(x)
+ assert csc(x).rewrite(cos) == csc(x)
+ assert csc(x).rewrite(tan) == (tan(x/2)**2 + 1)/(2*tan(x/2))
+
+ assert csc(z).conjugate() == csc(conjugate(z))
+
+ assert (csc(z).as_real_imag() ==
+ (sin(re(z))*cosh(im(z))/(sin(re(z))**2*cosh(im(z))**2 +
+ cos(re(z))**2*sinh(im(z))**2),
+ -cos(re(z))*sinh(im(z))/(sin(re(z))**2*cosh(im(z))**2 +
+ cos(re(z))**2*sinh(im(z))**2)))
+
+ assert csc(x).expand(trig=True) == 1/sin(x)
+ assert csc(2*x).expand(trig=True) == 1/(2*sin(x)*cos(x))
+
+ assert csc(x).is_real == True
+ assert csc(z).is_real == None
+
+ assert csc(x).as_leading_term() == csc(x)
+
+ assert csc(0).is_bounded == False
+ assert csc(x).is_bounded == None
+ assert csc(pi/2).is_bounded == True
+
+ assert series(csc(x), x, x0=pi/2, n=6) == 1 + x**2/2 + 5*x**4/24 + O(x**6)
+ assert (series(csc(x), x, x0=0, n=6) ==
+ 1/x + x/6 + 7*x**3/360 - 13*x**5/5040 + O(x**6))
+
assert csc(x).diff(x) == -cot(x)*csc(x)
+
+
+@XFAIL
+@slow
+def test_csc_rewrite_failing():
+ # Move these 2 tests to test_csc() once bugs fixed
+ # sin(x).rewrite(pow) raises RuntimeError: maximum recursion depth
+ # https://code.google.com/p/sympy/issues/detail?id=4072
+ assert csc(x).rewrite(pow) == csc(x)
+ assert csc(x).rewrite(sqrt) == csc(x)
| diff --git a/AUTHORS b/AUTHORS
index d847dced8f2f..a4dd96467161 100644
--- a/AUTHORS
+++ b/AUTHORS
@@ -290,6 +290,6 @@ Alkiviadis G. Akritas <akritas@uth.gr>
Vinit Ravishankar <vinit.ravishankar@gmail.com>
Mike Boyle <boyle@astro.cornell.edu>
Heiner Kirchhoffer <Heiner.Kirchhoffer@gmail.com>
-Pablo Puente <ppuedom@yahoo.com>
+Pablo Puente <ppuedom@gmail.com>
James Fiedler <jrfiedler@gmail.com>
Harsh Gupta <gupta.harsh96@gmail.com>
| [
{
"components": [
{
"doc": "Base class for reciprocal functions of trigonometric functions. ",
"lines": [
717,
819
],
"name": "ReciprocalTrigonometricFunction",
"signature": "class ReciprocalTrigonometricFunction(TrigonometricFunction):",
... | [
"test_sec",
"test_csc"
] | [
"test_all_classes_are_tested",
"test_sympy__assumptions__assume__AppliedPredicate",
"test_sympy__assumptions__assume__Predicate",
"test_sympy__combinatorics__subsets__Subset",
"test_sympy__combinatorics__polyhedron__Polyhedron",
"test_sympy__combinatorics__partitions__Partition",
"test_sympy__concrete__... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
issue 1475: Add missing functions to sec() and csc()
Previous implementation of csc() and sec() was missing most of the methods
required to work with simplify(), eval(), rewrite(), series().
This implementation partly reuses the implementation of the functions in
sin() and cos(). This is done through an intermediate base class,
ReciprocalTrigonometricFunction, which csc and sec now subclass.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sympy/functions/elementary/trigonometric.py]
(definition of ReciprocalTrigonometricFunction:)
class ReciprocalTrigonometricFunction(TrigonometricFunction):
"""Base class for reciprocal functions of trigonometric functions. """
(definition of ReciprocalTrigonometricFunction._call_reciprocal:)
def _call_reciprocal(self, method_name, *args, **kwargs):
(definition of ReciprocalTrigonometricFunction._calculate_reciprocal:)
def _calculate_reciprocal(self, method_name, *args, **kwargs):
(definition of ReciprocalTrigonometricFunction._rewrite_reciprocal:)
def _rewrite_reciprocal(self, method_name, arg):
(definition of ReciprocalTrigonometricFunction.fdiff:)
def fdiff(self, argindex=1):
(definition of ReciprocalTrigonometricFunction._eval_rewrite_as_exp:)
def _eval_rewrite_as_exp(self, arg):
(definition of ReciprocalTrigonometricFunction._eval_rewrite_as_Pow:)
def _eval_rewrite_as_Pow(self, arg):
(definition of ReciprocalTrigonometricFunction._eval_rewrite_as_sin:)
def _eval_rewrite_as_sin(self, arg):
(definition of ReciprocalTrigonometricFunction._eval_rewrite_as_cos:)
def _eval_rewrite_as_cos(self, arg):
(definition of ReciprocalTrigonometricFunction._eval_rewrite_as_tan:)
def _eval_rewrite_as_tan(self, arg):
(definition of ReciprocalTrigonometricFunction._eval_rewrite_as_pow:)
def _eval_rewrite_as_pow(self, arg):
(definition of ReciprocalTrigonometricFunction._eval_rewrite_as_sqrt:)
def _eval_rewrite_as_sqrt(self, arg):
(definition of ReciprocalTrigonometricFunction._eval_conjugate:)
def _eval_conjugate(self):
(definition of ReciprocalTrigonometricFunction.as_real_imag:)
def as_real_imag(self, deep=True, **hints):
(definition of ReciprocalTrigonometricFunction._eval_expand_trig:)
def _eval_expand_trig(self, **hints):
(definition of ReciprocalTrigonometricFunction._eval_is_real:)
def _eval_is_real(self):
(definition of ReciprocalTrigonometricFunction._eval_as_leading_term:)
def _eval_as_leading_term(self, x):
(definition of ReciprocalTrigonometricFunction._eval_is_bounded:)
def _eval_is_bounded(self):
(definition of ReciprocalTrigonometricFunction._eval_nseries:)
def _eval_nseries(self, x, n, logx):
(definition of ReciprocalTrigonometricFunction.eval:)
def eval(cls, arg):
(definition of sec._sage_:)
def _sage_(self):
(definition of csc._sage_:)
def _sage_(self):
[end of new definitions in sympy/functions/elementary/trigonometric.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 3555a90ee3ac3c7082df19d96a922630100c69cc | |
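The reciprocal identities the requested `ReciprocalTrigonometricFunction` base class is built around, and the `sec` series asserted in the test patch, can be sanity-checked with a minimal stdlib-only Python sketch (no SymPy required; the helper names here are illustrative, not part of the requested API):

```python
import math

def sec(x):
    # Reciprocal of cosine -- the identity the proposed
    # ReciprocalTrigonometricFunction base class exploits.
    return 1.0 / math.cos(x)

def csc(x):
    # Reciprocal of sine.
    return 1.0 / math.sin(x)

# Numeric spot-checks mirroring assertions in test_sec/test_csc.
assert abs(sec(0.0) - 1.0) < 1e-12            # sec(0) == 1
assert abs(sec(math.pi / 3) - 2.0) < 1e-12    # sec(pi/3) == 2
assert abs(csc(math.pi / 6) - 2.0) < 1e-12    # csc(pi/6) == 2
assert abs(sec(-0.7) - sec(0.7)) < 1e-12      # sec is even
assert abs(csc(-0.7) + csc(0.7)) < 1e-12      # csc is odd

def sec_series(x):
    # Truncated Maclaurin series 1 + x**2/2 + 5*x**4/24 + O(x**6),
    # matching the `series(sec(x), x, x0=0, n=6)` assertion above.
    return 1 + x**2 / 2 + 5 * x**4 / 24

assert abs(sec(0.1) - sec_series(0.1)) < 1e-6
```

The actual feature implements these identities symbolically, delegating each method call to the reciprocal function; the numeric checks above only mirror the values asserted in `test_sec`/`test_csc`.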
sympy__sympy-2505 | 2,505 | sympy/sympy | 0.7 | 5e2ea80aedee445852fe075e1122488ac90d7fe0 | 2013-10-04T21:17:16Z | diff --git a/sympy/functions/special/error_functions.py b/sympy/functions/special/error_functions.py
index c42bacce75b6..ded0fd3b2b07 100644
--- a/sympy/functions/special/error_functions.py
+++ b/sympy/functions/special/error_functions.py
@@ -1239,6 +1239,17 @@ def _eval_rewrite_as_Si(self, nu, z):
_eval_rewrite_as_Chi = _eval_rewrite_as_Si
_eval_rewrite_as_Shi = _eval_rewrite_as_Si
+ def _eval_nseries(self, x, n, logx):
+ if not self.args[0].has(x):
+ nu = self.args[0]
+ if nu == 1:
+ f = self._eval_rewrite_as_Si(*self.args)
+ return f._eval_nseries(x, n, logx)
+ elif nu.is_Integer and nu > 1:
+ f = self._eval_rewrite_as_Ei(*self.args)
+ return f._eval_nseries(x, n, logx)
+ return super(expint, self)._eval_nseries(x, n, logx)
+
def E1(z):
"""
| diff --git a/sympy/functions/special/tests/test_error_functions.py b/sympy/functions/special/tests/test_error_functions.py
index c8f650188a09..dbaba3746b3e 100644
--- a/sympy/functions/special/tests/test_error_functions.py
+++ b/sympy/functions/special/tests/test_error_functions.py
@@ -377,6 +377,17 @@ def test_expint():
assert mytn(expint(3, x), expint(3, x).rewrite(Ei).rewrite(expint),
x**2*E1(x)/2 + (1 - x)*exp(-x)/2, x)
+ assert expint(S(3)/2, z).nseries(z) == \
+ 2 + 2*z - z**2/3 + z**3/15 - z**4/84 + z**5/540 - \
+ 2*sqrt(pi)*sqrt(z) + O(z**6)
+
+ assert E1(z).series(z) == -EulerGamma - log(z) + z - \
+ z**2/4 + z**3/18 - z**4/96 + z**5/600 + O(z**6)
+
+ assert expint(4, z).series(z) == S(1)/3 - z/2 + z**2/2 + \
+ z**3*(log(z)/6 - S(11)/36 + EulerGamma/6) - z**4/24 + \
+ z**5/240 + O(z**6)
+
def test__eis():
assert _eis(z).diff(z) == -_eis(z) + 1/z
| [
{
"components": [
{
"doc": "",
"lines": [
1242,
1251
],
"name": "expint._eval_nseries",
"signature": "def _eval_nseries(self, x, n, logx):",
"type": "function"
}
],
"file": "sympy/functions/special/error_functions.py"
}
] | [
"test_expint"
] | [
"test_erf",
"test_erf_series",
"test_erf_evalf",
"test__erfs",
"test_erfc",
"test_erfc_series",
"test_erfc_evalf",
"test_erfi",
"test_erfi_series",
"test_erfi_evalf",
"test_erf2",
"test_erfinv",
"test_erfinv_evalf",
"test_erfcinv",
"test_erf2inv",
"test_ei",
"test__eis",
"test_li",... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Implement missing series methods for exponential integrals (first part of the issue 2958)
http://code.google.com/p/sympy/issues/detail?id=2958
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sympy/functions/special/error_functions.py]
(definition of expint._eval_nseries:)
def _eval_nseries(self, x, n, logx):
[end of new definitions in sympy/functions/special/error_functions.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 3555a90ee3ac3c7082df19d96a922630100c69cc | ||
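The series asserted in the new `test_expint` cases follow the classical expansion E1(z) = -EulerGamma - log(z) - sum_{k>=1} (-z)**k/(k*k!); a stdlib-only sketch (illustrative names, no SymPy) reproduces the coefficients and cross-checks the value against the defining integral:

```python
import math
from fractions import Fraction

def e1_series_coeff(k):
    # Coefficient of z**k in the series part of
    # E1(z) = -EulerGamma - log(z) - sum_{k>=1} (-z)**k / (k * k!)
    return -Fraction((-1) ** k, k * math.factorial(k))

# These match `E1(z).series(z)` as asserted in the test patch:
# -EulerGamma - log(z) + z - z**2/4 + z**3/18 - z**4/96 + z**5/600 + O(z**6)
assert e1_series_coeff(1) == Fraction(1)
assert e1_series_coeff(2) == Fraction(-1, 4)
assert e1_series_coeff(3) == Fraction(1, 18)
assert e1_series_coeff(4) == Fraction(-1, 96)
assert e1_series_coeff(5) == Fraction(1, 600)

EULER_GAMMA = 0.5772156649015329

def e1_approx(z, n=12):
    # Truncated series for E1(z); accurate for small positive z.
    return -EULER_GAMMA - math.log(z) + sum(
        float(e1_series_coeff(k)) * z ** k for k in range(1, n))

# Cross-check against crude trapezoidal quadrature of the defining
# integral E1(z) = int_z^inf exp(-t)/t dt (tail beyond t=40 is ~1e-19).
z, h, total = 0.5, 1e-4, 0.0
steps = int((40.0 - z) / h)
for i in range(steps):
    t0, t1 = z + i * h, z + (i + 1) * h
    total += 0.5 * h * (math.exp(-t0) / t0 + math.exp(-t1) / t1)
assert abs(e1_approx(z) - total) < 1e-6
```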
sympy__sympy-2453 | 2,453 | sympy/sympy | 0.7 | e96d4a8fc195987f158e1741fdc78815c380c1f1 | 2013-09-12T20:05:45Z | diff --git a/sympy/physics/mechanics/functions.py b/sympy/physics/mechanics/functions.py
index 4a06a496531b..2704defd9915 100644
--- a/sympy/physics/mechanics/functions.py
+++ b/sympy/physics/mechanics/functions.py
@@ -12,6 +12,7 @@
'mlatex',
'kinematic_equations',
'inertia_of_point_mass',
+ 'get_motion_params',
'partial_velocity',
'linear_momentum',
'angular_momentum',
@@ -23,11 +24,12 @@
MechanicsStrPrinter,
MechanicsPrettyPrinter,
MechanicsLatexPrinter,
- dynamicsymbols)
+ dynamicsymbols,
+ _check_frame, _check_vector)
from sympy.physics.mechanics.particle import Particle
from sympy.physics.mechanics.rigidbody import RigidBody
from sympy.physics.mechanics.point import Point
-from sympy import sympify, diff, sin, cos, Matrix
+from sympy import sympify, solve, diff, sin, cos, Matrix, Symbol, integrate
from sympy.core.basic import S
@@ -460,6 +462,156 @@ def kinematic_equations(speeds, coords, rot_type, rot_order=''):
raise ValueError('Not an approved rotation type for this function')
+def get_motion_params(frame, **kwargs):
+ """
+ Calculates the three motion parameters - position, velocity and acceleration
+ as vectorial functions of time in the given frame.
+
+ If a higher order differential function is provided, the lower order
+ functions are used as boundary conditions. The values of time at which the
+ boundary conditions are specified are taken from timevalue1(for position
+ boundary condition) and timevalue2(for velocity boundary condition).
+
+ If any of the boundary conditions are not provided, they are taken to be zero
+ by default (zero vectors, in case of vectorial inputs). If the boundary
+ conditions are also functions of time, they are converted to constants by
+ substituting the time values in the dynamicsymbols._t time Symbol.
+
+ This function can also be used for calculating rotational motion parameters.
+
+ Parameters
+ ==========
+
+ frame : ReferenceFrame
+ The frame to express the motion parameters in
+
+ acceleration : Vector
+ Acceleration of the object/frame as a function of time
+
+ velocity : Vector
+ Velocity as function of time or as boundary condition
+ of velocity at time = timevalue2
+
+ position : Vector
+ Position as function of time or as boundary condition
+ of position at time = timevalue1
+
+ timevalue1 : sympyfiable
+ Value of time for position boundary condition
+
+ timevalue2 : sympyfiable
+ Value of time for velocity boundary condition
+
+ Examples
+ ========
+
+ >>> from sympy.physics.mechanics import ReferenceFrame, get_motion_params, dynamicsymbols
+ >>> from sympy import symbols
+ >>> R = ReferenceFrame('R')
+ >>> v1, v2, v3 = dynamicsymbols('v1 v2 v3')
+ >>> v = v1*R.x + v2*R.y + v3*R.z
+ >>> get_motion_params(R, position = v)
+ (v1''*R.x + v2''*R.y + v3''*R.z, v1'*R.x + v2'*R.y + v3'*R.z, v1*R.x + v2*R.y + v3*R.z)
+ >>> a, b, c = symbols('a b c')
+ >>> v = a*R.x + b*R.y + c*R.z
+ >>> get_motion_params(R, velocity = v)
+ (0, a*R.x + b*R.y + c*R.z, a*t*R.x + b*t*R.y + c*t*R.z)
+ >>> parameters = get_motion_params(R, acceleration = v)
+ >>> parameters[1]
+ a*t*R.x + b*t*R.y + c*t*R.z
+ >>> parameters[2]
+ a*t**2/2*R.x + b*t**2/2*R.y + c*t**2/2*R.z
+
+ """
+
+ ##Helper functions
+
+ def _integrate_boundary(expr, var, valueofvar, value):
+ """
+ Returns indefinite integral of expr wrt var, using the boundary
+ condition of expr's value being 'value' at var = valueofvar.
+ """
+ CoI = Symbol('CoI')
+ expr = integrate(expr, var) + CoI
+ n = expr.subs({CoI: solve(expr.subs({var: valueofvar}) -\
+ value.subs({var: valueofvar}), CoI)[0]})
+ return n
+
+ def _process_vector_differential(vectdiff, condition, \
+ variable, valueofvar, frame):
+ """
+ Helper function for get_motion methods. Finds derivative of vectdiff wrt
+ variable, and its integral using the specified boundary condition at
+ value of variable = valueofvar.
+ Returns a tuple of - (derivative, function and integral) wrt vectdiff
+
+ """
+
+ #Make sure boundary condition is independent of 'variable'
+ if condition != 0:
+ condition = frame.express(condition)
+ #Special case of vectdiff == 0
+ if vectdiff == Vector(0):
+ return (0, 0, condition)
+ #Express vectdiff completely in condition's frame to give vectdiff1
+ vectdiff1 = frame.express(vectdiff)
+ #Find derivative of vectdiff
+ vectdiff2 = frame.dt(vectdiff)
+ #Integrate and use boundary condition
+ vectdiff0 = Vector(0)
+ for dim in frame:
+ function1 = vectdiff1.dot(dim)
+ vectdiff0 += _integrate_boundary(function1, variable, valueofvar,
+ dim.dot(condition)) * dim
+ #Return list
+ return (vectdiff2, vectdiff, vectdiff0)
+
+ ##Function body
+
+ _check_frame(frame)
+ #Decide mode of operation based on user's input
+ if 'acceleration' in kwargs:
+ mode = 2
+ elif 'velocity' in kwargs:
+ mode = 1
+ else:
+ mode = 0
+ #All the possible parameters in kwargs
+ #Not all are required for every case
+ #If not specified, set to default values(may or may not be used in
+ #calculations)
+ conditions = ['acceleration', 'velocity', 'position',
+ 'timevalue', 'timevalue1', 'timevalue2']
+ for i, x in enumerate(conditions):
+ if x not in kwargs:
+ if i < 3:
+ kwargs[x] = Vector(0)
+ else:
+ kwargs[x] = S(0)
+ elif i < 3:
+ _check_vector(kwargs[x])
+ else:
+ kwargs[x] = sympify(kwargs[x])
+ if mode == 2:
+ vel = _process_vector_differential(kwargs['acceleration'],
+ kwargs['velocity'],
+ dynamicsymbols._t,
+ kwargs['timevalue2'], frame)[2]
+ pos = _process_vector_differential(vel, kwargs['position'],
+ dynamicsymbols._t,
+ kwargs['timevalue1'], frame)[2]
+ return (kwargs['acceleration'], vel, pos)
+ elif mode == 1:
+ return _process_vector_differential(kwargs['velocity'],
+ kwargs['position'],
+ dynamicsymbols._t,
+ kwargs['timevalue1'], frame)
+ else:
+ vel = frame.dt(kwargs['position'])
+ acc = frame.dt(vel)
+ return (acc, vel, kwargs['position'])
+
+
def partial_velocity(vel_list, u_list, frame):
"""Returns a list of partial velocities.
| diff --git a/sympy/physics/mechanics/tests/test_functions.py b/sympy/physics/mechanics/tests/test_functions.py
index 4b4435ce5d69..0660bfabc011 100644
--- a/sympy/physics/mechanics/tests/test_functions.py
+++ b/sympy/physics/mechanics/tests/test_functions.py
@@ -1,4 +1,4 @@
-from sympy import S, sin, cos, pi, sqrt, symbols
+from sympy import S, Integral, sin, cos, pi, sqrt, symbols
from sympy.physics.mechanics import (Dyadic, Particle, Point, ReferenceFrame,
RigidBody, Vector)
from sympy.physics.mechanics import (angular_momentum, cross, dot,
@@ -6,7 +6,7 @@
inertia_of_point_mass,
kinematic_equations, kinetic_energy,
linear_momentum, outer, partial_velocity,
- potential_energy)
+ potential_energy, get_motion_params)
from sympy.utilities.pytest import raises
Vector.simp = True
@@ -333,6 +333,61 @@ def test_express():
assert C.z == express((sin(q3)*B.x + cos(q3)*B.z), C)
+def test_get_motion_methods():
+ #Initialization
+ t = dynamicsymbols._t
+ s1, s2, s3 = symbols('s1 s2 s3')
+ S1, S2, S3 = symbols('S1 S2 S3')
+ S4, S5, S6 = symbols('S4 S5 S6')
+ t1, t2 = symbols('t1 t2')
+ a, b, c = dynamicsymbols('a b c')
+ ad, bd, cd = dynamicsymbols('a b c', 1)
+ a2d, b2d, c2d = dynamicsymbols('a b c', 2)
+ v0 = S1*N.x + S2*N.y + S3*N.z
+ v01 = S4*N.x + S5*N.y + S6*N.z
+ v1 = s1*N.x + s2*N.y + s3*N.z
+ v2 = a*N.x + b*N.y + c*N.z
+ v2d = ad*N.x + bd*N.y + cd*N.z
+ v2dd = a2d*N.x + b2d*N.y + c2d*N.z
+ #Test position parameter
+ assert get_motion_params(frame = N) == (0, 0, 0)
+ assert get_motion_params(N, position=v1) == (0, 0, v1)
+ assert get_motion_params(N, position=v2) == (v2dd, v2d, v2)
+ #Test velocity parameter
+ assert get_motion_params(N, velocity=v1) == (0, v1, v1 * t)
+ assert get_motion_params(N, velocity=v1, position=v0, timevalue1=t1) == \
+ (0, v1, v0 + v1*(t - t1))
+ assert get_motion_params(N, velocity=v1, position=v2, timevalue1=t1) == \
+ (0, v1, v1*t - v1*t1 + v2.subs(t, t1))
+ integral_vector = Integral(a, t)*N.x + Integral(b, t)*N.y + Integral(c, t)*N.z
+ assert get_motion_params(N, velocity=v2, position=v0, timevalue1=t1) == (v2d, v2,
+ v0 + integral_vector -
+ integral_vector.subs(t, t1))
+ #Test acceleration parameter
+ assert get_motion_params(N, acceleration=v1) == (v1, v1 * t, v1 * t**2/2)
+ assert get_motion_params(N, acceleration=v1, velocity=v0,
+ position=v2, timevalue1=t1, timevalue2=t2) == \
+ (v1, (v0 + v1*t - v1*t2),
+ -v0*t1 + v1*t**2/2 + v1*t2*t1 - \
+ v1*t1**2/2 + t*(v0 - v1*t2) + \
+ v2.subs(t, t1))
+ assert get_motion_params(N, acceleration=v1, velocity=v0,
+ position=v01, timevalue1=t1, timevalue2=t2) == \
+ (v1, v0 + v1*t - v1*t2,
+ -v0*t1 + v01 + v1*t**2/2 + \
+ v1*t2*t1 - v1*t1**2/2 + \
+ t*(v0 - v1*t2))
+ i = Integral(a, t)
+ i_sub = i.subs(t, t2)
+ assert get_motion_params(N, acceleration=a*N.x, velocity=S1*N.x,
+ position=S2*N.x, timevalue1=t1, timevalue2=t2) == \
+ (a*N.x,
+ (S1 + i - i_sub)*N.x,
+ (S2 + Integral(S1 - t*(a.subs(t, t2)) + i, t) - \
+ Integral(S1 - t1*(a.subs(t, t2)) + \
+ i.subs(t, t1), t))*N.x)
+
+
def test_inertia():
N = ReferenceFrame('N')
ixx, iyy, izz = symbols('ixx iyy izz')
| [
{
"components": [
{
"doc": "Calculates the three motion parameters - position, velocity and acceleration\nas vectorial functions of time in the given frame.\n\nIf a higher order order differential function is provided, the lower order\nfunctions are used as boundary conditions. The values of time ... | [
"test_dot",
"test_dot_different_frames",
"test_cross",
"test_cross_different_frames",
"test_operator_match",
"test_express",
"test_get_motion_methods",
"test_inertia",
"test_kin_eqs",
"test_inertia_of_point_mass",
"test_partial_velocity",
"test_linear_momentum",
"test_angular_momentum_and_li... | [] | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Added get_motion_params function to mechanics core
Added get_motion_params method to get acceleration, velocity and position as functions of time, given a time-dependent vectorial function and boundary conditions
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sympy/physics/mechanics/functions.py]
(definition of get_motion_params:)
def get_motion_params(frame, **kwargs):
"""Calculates the three motion parameters - position, velocity and acceleration
as vectorial functions of time in the given frame.
If a higher order differential function is provided, the lower order
functions are used as boundary conditions. The values of time at which the
boundary conditions are specified are taken from timevalue1(for position
boundary condition) and timevalue2(for velocity boundary condition).
If any of the boundary conditions are not provided, they are taken to be zero
by default (zero vectors, in case of vectorial inputs). If the boundary
conditions are also functions of time, they are converted to constants by
substituting the time values in the dynamicsymbols._t time Symbol.
This function can also be used for calculating rotational motion parameters.
Parameters
==========
frame : ReferenceFrame
The frame to express the motion parameters in
acceleration : Vector
Acceleration of the object/frame as a function of time
velocity : Vector
Velocity as function of time or as boundary condition
of velocity at time = timevalue2
position : Vector
Position as function of time or as boundary condition
of position at time = timevalue1
timevalue1 : sympyfiable
Value of time for position boundary condition
timevalue2 : sympyfiable
Value of time for velocity boundary condition
Examples
========
>>> from sympy.physics.mechanics import ReferenceFrame, get_motion_params, dynamicsymbols
>>> from sympy import symbols
>>> R = ReferenceFrame('R')
>>> v1, v2, v3 = dynamicsymbols('v1 v2 v3')
>>> v = v1*R.x + v2*R.y + v3*R.z
>>> get_motion_params(R, position = v)
(v1''*R.x + v2''*R.y + v3''*R.z, v1'*R.x + v2'*R.y + v3'*R.z, v1*R.x + v2*R.y + v3*R.z)
>>> a, b, c = symbols('a b c')
>>> v = a*R.x + b*R.y + c*R.z
>>> get_motion_params(R, velocity = v)
(0, a*R.x + b*R.y + c*R.z, a*t*R.x + b*t*R.y + c*t*R.z)
>>> parameters = get_motion_params(R, acceleration = v)
>>> parameters[1]
a*t*R.x + b*t*R.y + c*t*R.z
>>> parameters[2]
a*t**2/2*R.x + b*t**2/2*R.y + c*t**2/2*R.z"""
(definition of get_motion_params._integrate_boundary:)
def _integrate_boundary(expr, var, valueofvar, value):
"""Returns indefinite integral of expr wrt var, using the boundary
condition of expr's value being 'value' at var = valueofvar."""
(definition of get_motion_params._process_vector_differential:)
def _process_vector_differential(vectdiff, condition, variable, valueofvar, frame):
"""Helper function for get_motion methods. Finds derivative of vectdiff wrt
variable, and its integral using the specified boundary condition at
value of variable = valueofvar.
Returns a tuple of - (derivative, function and integral) wrt vectdiff"""
[end of new definitions in sympy/physics/mechanics/functions.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 3555a90ee3ac3c7082df19d96a922630100c69cc | ||
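The boundary-condition scheme described in the docstring (integrate acceleration twice, pinning velocity and position at given times) can be illustrated numerically; the sketch below handles the special case t1 = t2 = 0 with illustrative names and no SymPy dependency:

```python
def motion_from_acceleration(a, v0, x0, t_end, n=20000):
    # Cumulatively integrate a(t) twice on [0, t_end] with the
    # trapezoidal rule, imposing the boundary conditions
    # v(0) = v0 and x(0) = x0 (the t1 = t2 = 0 special case of
    # the scheme get_motion_params is described as using symbolically).
    h = t_end / n
    v, x = v0, x0
    for i in range(n):
        t0, t1 = i * h, (i + 1) * h
        v_next = v + 0.5 * h * (a(t0) + a(t1))
        x += 0.5 * h * (v + v_next)
        v = v_next
    return v, x

# Constant acceleration a(t) = 2 with v(0) = 3 and x(0) = 1 has the
# closed form v(t) = 3 + 2*t, x(t) = 1 + 3*t + t**2 -- the same shape
# `get_motion_params(N, acceleration=...)` returns symbolically.
v, x = motion_from_acceleration(lambda t: 2.0, 3.0, 1.0, 1.5)
assert abs(v - (3.0 + 2.0 * 1.5)) < 1e-9
assert abs(x - (1.0 + 3.0 * 1.5 + 1.5 ** 2)) < 1e-9
```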
sympy__sympy-2431 | 2,431 | sympy/sympy | 0.7 | 8180bc8757fc43d17603f0e82148a4d0115d3e44 | 2013-09-02T06:38:52Z | diff --git a/doc/src/modules/calculus/index.rst b/doc/src/modules/calculus/index.rst
new file mode 100644
index 000000000000..5ff24db89565
--- /dev/null
+++ b/doc/src/modules/calculus/index.rst
@@ -0,0 +1,5 @@
+========
+Calculus
+========
+
+.. automodule:: sympy.calculus
diff --git a/doc/src/modules/index.rst b/doc/src/modules/index.rst
index 5854eb15773f..8dade7a04d8e 100644
--- a/doc/src/modules/index.rst
+++ b/doc/src/modules/index.rst
@@ -48,6 +48,7 @@ access any SymPy module, or use this contens:
tensor/index.rst
utilities/index.rst
parsing.rst
+ calculus/index.rst
physics/index.rst
categories.rst
diffgeom.rst
diff --git a/setup.py b/setup.py
index b590bb0a5187..de70789c305d 100755
--- a/setup.py
+++ b/setup.py
@@ -48,6 +48,7 @@
modules = [
'sympy.assumptions',
'sympy.assumptions.handlers',
+ 'sympy.calculus',
'sympy.categories',
'sympy.combinatorics',
'sympy.concrete',
@@ -218,6 +219,7 @@ def run(self):
# $ python bin/generate_test_list.py
tests = [
'sympy.assumptions.tests',
+ 'sympy.calculus.tests',
'sympy.categories.tests',
'sympy.combinatorics.tests',
'sympy.concrete.tests',
@@ -239,6 +241,7 @@ def run(self):
'sympy.mpmath.tests',
'sympy.ntheory.tests',
'sympy.parsing.tests',
+ 'sympy.physics.hep.tests',
'sympy.physics.mechanics.tests',
'sympy.physics.quantum.tests',
'sympy.physics.tests',
diff --git a/sympy/calculus/__init__.py b/sympy/calculus/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/sympy/calculus/euler.py b/sympy/calculus/euler.py
new file mode 100644
index 000000000000..62a184890cef
--- /dev/null
+++ b/sympy/calculus/euler.py
@@ -0,0 +1,92 @@
+from sympy import Function, sympify, diff, Eq, S, Symbol, Derivative
+from itertools import combinations_with_replacement
+
+def euler_equations(L, funcs=(), vars=()):
+ """Find the Euler-Lagrange equations for a given Lagrangian.
+
+ Parameters
+ ==========
+ L : Expr
+ The Lagrangian that should be a function of the functions listed
+ in the second argument and their first derivatives.
+
+ For example, in the case of two functions (f(x,y), g(x,y)) and
+ two independent variables (x,y) the Lagrangian would have the form::
+
+ L(f(x,y),g(x,y),df/dx,df/dy,dg/dx,dg/dy,x,y)
+
+ funcs : Function or list/tuple of Functions
+ The functions that the Lagrangian depends on. The Euler equations
+ are differential equations for each of these functions.
+
+ vars : Symbol or list/tuple of Symbols
+ The Symbols that are the independent variables of the functions.
+
+ In many cases it is not necessary to provide anything, except the
+ Lagrangian, it will be autodetected (and an error raised if this
+ couldn't be done).
+
+ Returns
+ =======
+ eqns : set of Eq
+ The set of differential equations, one for each function.
+
+ Examples
+ ========
+
+ >>> from sympy import Symbol, Function
+ >>> from sympy.calculus.euler import euler_equations
+ >>> x = Function('x')
+ >>> t = Symbol('t')
+ >>> L = (x(t).diff(t))**2/2 - x(t)**2/2
+ >>> euler_equations(L, x(t), t)
+ set([-x(t) - Derivative(x(t), t, t) == 0])
+ >>> u = Function('u')
+ >>> x = Symbol('x')
+ >>> L = (u(t, x).diff(t))**2/2 - (u(t, x).diff(x))**2/2
+ >>> euler_equations(L, u(t, x), [t, x])
+ set([-Derivative(u(t, x), t, t) + Derivative(u(t, x), x, x) == 0])
+
+ References
+ ==========
+
+ .. [1] http://en.wikipedia.org/wiki/Euler%E2%80%93Lagrange_equation
+
+ """
+
+ if not isinstance(funcs, (tuple, list)):
+ funcs = (funcs,)
+
+ if not funcs:
+ funcs = tuple(L.atoms(Function))
+ else:
+ for f in funcs:
+ if not isinstance(f, Function):
+ raise TypeError('Function expected, got: %s' % f)
+
+ if not isinstance(vars, (tuple, list)):
+ vars = (vars,)
+
+ if not vars:
+ vars = funcs[0].args
+ else:
+ vars = tuple(sympify(var) for var in vars)
+
+ if not all(isinstance(v, Symbol) for v in vars):
+ raise TypeError('Variables are not symbols, got %s' % vars)
+
+ for f in funcs:
+ if not vars == f.args:
+ raise ValueError("Variables %s don't match function arguments: %s" % (vars, f))
+
+ order = max(len(d.variables) for d in L.atoms(Derivative) if d.expr in funcs)
+
+ eqns = []
+ for f in funcs:
+ eq = diff(L, f)
+ for i in range(1, order + 1):
+ for p in combinations_with_replacement(vars, i):
+ eq = eq + S.NegativeOne**i*diff(L, diff(f, *p), *p)
+ eqns.append(Eq(eq, 0))
+
+ return set(eqns)
| diff --git a/sympy/calculus/tests/__init__.py b/sympy/calculus/tests/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/sympy/calculus/tests/test_euler.py b/sympy/calculus/tests/test_euler.py
new file mode 100644
index 000000000000..b44d06ba9fd3
--- /dev/null
+++ b/sympy/calculus/tests/test_euler.py
@@ -0,0 +1,62 @@
+from sympy import (Symbol, Function, Derivative, Eq, cos, sin)
+from sympy.utilities.pytest import raises
+from sympy.calculus.euler import euler_equations
+
+def test_euler_interface():
+ x = Function('x')
+ y = Symbol('y')
+ t = Symbol('t')
+ raises(TypeError, lambda: euler_equations())
+ raises(TypeError, lambda: euler_equations(x(t).diff(t)*y(t), [x(t), y]))
+ raises(ValueError, lambda: euler_equations(x(t).diff(t)*x(y), [x(t), x(y)]))
+ raises(TypeError, lambda: euler_equations(x(t).diff(t)**2, x(0)))
+
+
+def test_euler_pendulum():
+ x = Function('x')
+ t = Symbol('t')
+ L = (x(t).diff(t))**2/2 + cos(x(t))
+ assert euler_equations(L, x(t), t) == \
+ set([Eq(-sin(x(t)) - Derivative(x(t), t, t), 0)])
+
+
+def test_euler_henonheiles():
+ x = Function('x')
+ y = Function('y')
+ t = Symbol('t')
+ L = sum((z(t).diff(t))**2/2 - z(t)**2/2 for z in [x, y])
+ L += -x(t)**2*y(t) + y(t)**3/3
+ assert euler_equations(L, [x(t), y(t)], t) == \
+ set([Eq(-x(t)**2 + y(t)**2 - y(t) - Derivative(y(t), t, t), 0),
+ Eq(-2*x(t)*y(t) - x(t) - Derivative(x(t), t, t), 0)])
+
+
+def test_euler_sineg():
+ psi = Function('psi')
+ t = Symbol('t')
+ x = Symbol('x')
+ L = (psi(t, x).diff(t))**2/2 - (psi(t, x).diff(x))**2/2 + cos(psi(t, x))
+ assert euler_equations(L, psi(t, x), [t, x]) == \
+ set([Eq(-sin(psi(t, x)) - Derivative(psi(t, x), t, t) + \
+ Derivative(psi(t, x), x, x), 0)])
+
+
+def test_euler_high_order():
+ # an example from hep-th/0309038
+ m = Symbol('m')
+ k = Symbol('k')
+ x = Function('x')
+ y = Function('y')
+ t = Symbol('t')
+ L = m*Derivative(x(t), t)**2/2 + m*Derivative(y(t), t)**2/2 - \
+ k*Derivative(x(t), t)*Derivative(y(t), t, t) + \
+ k*Derivative(y(t), t)*Derivative(x(t), t, t)
+ assert euler_equations(L, [x(t), y(t)]) == \
+ set([Eq(-2*k*Derivative(x(t), t, t, t) - \
+ m*Derivative(y(t), t, t), 0), Eq(2*k*Derivative(y(t), t, t, t) - \
+ m*Derivative(x(t), t, t), 0)])
+
+ w = Symbol('w')
+ L = x(t, w).diff(t, w)**2/2
+ assert euler_equations(L) == \
+ set([Eq(Derivative(x(t, w), t, t, w, w), 0)])
| diff --git a/doc/src/modules/calculus/index.rst b/doc/src/modules/calculus/index.rst
new file mode 100644
index 000000000000..5ff24db89565
--- /dev/null
+++ b/doc/src/modules/calculus/index.rst
@@ -0,0 +1,5 @@
+========
+Calculus
+========
+
+.. automodule:: sympy.calculus
diff --git a/doc/src/modules/index.rst b/doc/src/modules/index.rst
index 5854eb15773f..8dade7a04d8e 100644
--- a/doc/src/modules/index.rst
+++ b/doc/src/modules/index.rst
@@ -48,6 +48,7 @@ access any SymPy module, or use this contens:
tensor/index.rst
utilities/index.rst
parsing.rst
+ calculus/index.rst
physics/index.rst
categories.rst
diffgeom.rst
| [
{
"components": [
{
"doc": "Find the Euler-Lagrange equations for a given Lagrangian.\n\nParameters\n==========\nL : Expr\n The Lagrangian that should be a function of the functions listed\n in the second argument and their first derivatives.\n\n For example, in the case of two functions ... | [
"test_euler_interface",
"test_euler_pendulum",
"test_euler_henonheiles",
"test_euler_sineg"
] | [] | This is a feature request that requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
Issue 3198: implement Euler-Lagrange equations
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in sympy/calculus/euler.py]
(definition of euler_equations:)
def euler_equations(L, funcs=(), vars=()):
"""Find the Euler-Lagrange equations for a given Lagrangian.
Parameters
==========
L : Expr
The Lagrangian that should be a function of the functions listed
in the second argument and their first derivatives.
For example, in the case of two functions (f(x,y), g(x,y)) and
two independent variables (x,y) the Lagrangian would have the form::
L(f(x,y),g(x,y),df/dx,df/dy,dg/dx,dg/dy,x,y)
funcs : Function or list/tuple of Functions
The functions that the Lagrangian depends on. The Euler equations
are differential equations for each of these functions.
vars : Symbol or list/tuple of Symbols
The Symbols that are the independent variables of the functions.
In many cases it is not necessary to provide anything except the
Lagrangian; the rest will be autodetected (and an error raised if this
cannot be done).
Returns
=======
eqns : set of Eq
The set of differential equations, one for each function.
Examples
========
>>> from sympy import Symbol, Function
>>> from sympy.calculus.euler import euler_equations
>>> x = Function('x')
>>> t = Symbol('t')
>>> L = (x(t).diff(t))**2/2 - x(t)**2/2
>>> euler_equations(L, x(t), t)
set([-x(t) - Derivative(x(t), t, t) == 0])
>>> u = Function('u')
>>> x = Symbol('x')
>>> L = (u(t, x).diff(t))**2/2 - (u(t, x).diff(x))**2/2
>>> euler_equations(L, u(t, x), [t, x])
set([-Derivative(u(t, x), t, t) + Derivative(u(t, x), x, x) == 0])
References
==========
.. [1] http://en.wikipedia.org/wiki/Euler%E2%80%93Lagrange_equation"""
[end of new definitions in sympy/calculus/euler.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 3555a90ee3ac3c7082df19d96a922630100c69cc | |
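The `euler_equations` definition above can be sanity-checked against a hand-rolled version for the simplest case, a single function of one variable. This is only a sketch, not the `sympy.calculus.euler` implementation: the helper name `euler_equation_1d` and the dummy symbols `a`, `b` are illustrative, and sympy is assumed to be installed.

```python
import sympy as sp

def euler_equation_1d(L, f, t):
    # Euler-Lagrange equation for a single function f(t):
    #   dL/df - d/dt (dL/df') = 0
    fp = f.diff(t)
    a, b = sp.symbols('a b')
    # Treat f and f' as independent variables when taking partials
    Ls = L.subs(fp, b).subs(f, a)
    dLdf = Ls.diff(a).subs(a, f).subs(b, fp)
    dLdfp = Ls.diff(b).subs(a, f).subs(b, fp)
    return sp.Eq(dLdf - dLdfp.diff(t), 0)

t = sp.Symbol('t')
x = sp.Function('x')(t)
# Harmonic-oscillator Lagrangian from the docstring example
L = x.diff(t)**2 / 2 - x**2 / 2
eq = euler_equation_1d(L, x, t)
# Matches the documented result: -x(t) - Derivative(x(t), t, t) == 0
```

The same trick (substituting dummy symbols for `f` and `f'`) is how the partial derivatives in the general Euler-Lagrange formula are usually taken in practice.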
sympy__sympy-2422 | 2,422 | sympy/sympy | 0.7 | c4e3ddc4908eb71b75323995b10902a4bcad7630 | 2013-08-28T15:57:17Z | diff --git a/doc/src/modules/crypto.rst b/doc/src/modules/crypto.rst
index 79163197b682..6af046326814 100644
--- a/doc/src/modules/crypto.rst
+++ b/doc/src/modules/crypto.rst
@@ -12,6 +12,7 @@ Included in this module are both block ciphers and stream ciphers.
* RSA
* Kid RSA
* linear feedback shift registers (a stream cipher)
+ * ElGamal encryption
.. module:: sympy.crypto.crypto
@@ -74,3 +75,11 @@ Included in this module are both block ciphers and stream ciphers.
.. autofunction:: lfsr_autocorrelation
.. autofunction:: lfsr_connection_polynomial
+
+.. autofunction:: elgamal_public_key
+
+.. autofunction:: elgamal_private_key
+
+.. autofunction:: encipher_elgamal
+
+.. autofunction:: decipher_elgamal
diff --git a/sympy/crypto/__init__.py b/sympy/crypto/__init__.py
index d238510615e6..92379068cb2a 100644
--- a/sympy/crypto/__init__.py
+++ b/sympy/crypto/__init__.py
@@ -7,4 +7,5 @@
decipher_kid_rsa, kid_rsa_private_key, kid_rsa_public_key,
decipher_rsa, rsa_private_key, rsa_public_key, encipher_rsa,
lfsr_connection_polynomial, lfsr_autocorrelation, lfsr_sequence,
- encode_morse, decode_morse)
+ encode_morse, decode_morse, elgamal_private_key, elgamal_public_key,
+ decipher_elgamal, encipher_elgamal)
diff --git a/sympy/crypto/crypto.py b/sympy/crypto/crypto.py
index 278c500fcbef..d4667ec175db 100644
--- a/sympy/crypto/crypto.py
+++ b/sympy/crypto/crypto.py
@@ -4,9 +4,13 @@
from __future__ import print_function
+from random import randrange
+
+from sympy import nextprime
from sympy.core import Rational, S, Symbol
+from sympy.core.numbers import igcdex
from sympy.matrices import Matrix
-from sympy.ntheory import isprime, totient
+from sympy.ntheory import isprime, totient, primitive_root
from sympy.polys.domains import FF
from sympy.polys.polytools import gcd, Poly
from sympy.utilities.iterables import flatten, uniq
@@ -1447,3 +1451,137 @@ def lfsr_connection_polynomial(s):
dC = Poly(C).degree()
coeffsC = [C.subs(x, 0)] + [C.coeff(x**i) for i in range(1, dC + 1)]
return sum([coeffsC[i] % p*x**i for i in range(dC + 1) if coeffsC[i] is not None])
+
+
+#################### ElGamal #############################
+
+
+def elgamal_private_key(digit=10):
+ """
+ Return three number tuple as private key.
+
+    ElGamal encryption is based on the mathematical
+    Discrete Logarithm Problem (DLP). For example,
+
+ `a^{b} \equiv c \pmod p`
+
+ In general, if a and b are known, c is easily
+ calculated. If b is unknown, it is hard to use
+ a and c to get b.
+
+ Parameters
+ ==========
+
+ digit : Key length in binary
+
+ Returns
+ =======
+
+ (p, r, d) : p = prime number, r = primitive root, d = random number
+
+
+ Examples
+ ========
+
+ >>> from sympy.crypto.crypto import elgamal_private_key
+ >>> from sympy.ntheory import is_primitive_root, isprime
+ >>> a, b, _ = elgamal_private_key()
+ >>> isprime(a)
+ True
+ >>> is_primitive_root(b, a)
+ True
+
+ """
+ p = nextprime(2**digit)
+ return p, primitive_root(p), randrange(2, p)
+
+
+def elgamal_public_key(prk):
+ """
+ Return three number tuple as public key.
+
+ Parameters
+ ==========
+
+    prk : Tuple (p, r, d) generated by ``elgamal_private_key``
+
+ Returns
+ =======
+ (p, r, e = r**d mod p) : d is a random number in private key.
+
+ Examples
+ ========
+
+ >>> from sympy.crypto.crypto import elgamal_public_key
+ >>> elgamal_public_key((1031, 14, 636))
+ (1031, 14, 212)
+
+ """
+ return prk[0], prk[1], pow(prk[1], prk[2], prk[0])
+
+
+def encipher_elgamal(m, puk):
+ """
+ Encrypt message with public key
+
+    m is the plaintext message as an int. puk is the
+    public key (p, r, e). In order to encrypt
+    a message, a random number ``a`` between ``2`` and ``p``
+    is chosen; the encrypted message is `c_{1}` and `c_{2}`
+
+ `c_{1} \equiv r^{a} \pmod p`
+
+ `c_{2} \equiv m e^{a} \pmod p`
+
+ Parameters
+ ==========
+
+ m : int of encoded message
+ puk : public key
+
+ Returns
+ =======
+
+ (c1, c2) : Encipher into two number
+
+ Examples
+ ========
+
+ >>> from sympy.crypto.crypto import encipher_elgamal
+ >>> encipher_elgamal(100, (1031, 14, 212)) # doctest: +SKIP
+ (835, 271)
+
+ """
+ if m > puk[0]:
+        raise ValueError('Message {} should be less than prime {}'.format(m, puk[0]))
+ r = randrange(2, puk[0])
+ return pow(puk[1], r, puk[0]), m * pow(puk[2], r, puk[0]) % puk[0]
+
+
+def decipher_elgamal(ct, prk):
+ r"""
+ Decrypt message with private key
+
+ `ct = (c_{1}, c_{2})`
+
+ `prk = (p, r, d)`
+
+    According to the extended Euclidean algorithm,
+ `u c_{1}^{d} + p n = 1`
+
+ `u \equiv 1/{{c_{1}}^d} \pmod p`
+
+ `u c_{2} \equiv \frac{1}{c_{1}^d} c_{2} \equiv \frac{1}{r^{ad}} c_{2} \pmod p`
+
+ `\frac{1}{r^{ad}} m e^a \equiv \frac{1}{r^{ad}} m {r^{d a}} \equiv m \pmod p`
+
+ Examples
+ ========
+
+ >>> from sympy.crypto.crypto import decipher_elgamal
+ >>> decipher_elgamal((835, 271), (1031, 14, 636))
+ 100
+
+ """
+ u = igcdex(ct[0] ** prk[2], prk[0])[0]
+ return u * ct[1] % prk[0]
| diff --git a/sympy/crypto/tests/test_crypto.py b/sympy/crypto/tests/test_crypto.py
index f9f56a51bce9..a369194dcac4 100644
--- a/sympy/crypto/tests/test_crypto.py
+++ b/sympy/crypto/tests/test_crypto.py
@@ -8,8 +8,10 @@
decipher_kid_rsa, kid_rsa_private_key, kid_rsa_public_key,
decipher_rsa, rsa_private_key, rsa_public_key, encipher_rsa,
lfsr_connection_polynomial, lfsr_autocorrelation, lfsr_sequence,
- encode_morse, decode_morse)
+ encode_morse, decode_morse, elgamal_private_key, elgamal_public_key,
+ encipher_elgamal, decipher_elgamal)
from sympy.matrices import Matrix
+from sympy.ntheory import isprime, is_primitive_root
from sympy.polys.domains import FF
from sympy.utilities.pytest import raises
@@ -238,3 +240,15 @@ def test_lfsr_connection_polynomial():
assert lfsr_connection_polynomial(s) == x**2 + 1
s = lfsr_sequence([F(1), F(1)], [F(0), F(1)], 5)
assert lfsr_connection_polynomial(s) == x**2 + x + 1
+
+def test_elgamal_private_key():
+ a, b, _ = elgamal_private_key(digit=100)
+ assert isprime(a)
+ assert is_primitive_root(b, a)
+ assert len(bin(a)) >= 102
+
+def test_elgamal():
+ dk = elgamal_private_key(20)
+ ek = elgamal_public_key(dk)
+ m = 12345
+ assert m == decipher_elgamal(encipher_elgamal(m, ek), dk)
| diff --git a/doc/src/modules/crypto.rst b/doc/src/modules/crypto.rst
index 79163197b682..6af046326814 100644
--- a/doc/src/modules/crypto.rst
+++ b/doc/src/modules/crypto.rst
@@ -12,6 +12,7 @@ Included in this module are both block ciphers and stream ciphers.
* RSA
* Kid RSA
* linear feedback shift registers (a stream cipher)
+ * ElGamal encryption
.. module:: sympy.crypto.crypto
@@ -74,3 +75,11 @@ Included in this module are both block ciphers and stream ciphers.
.. autofunction:: lfsr_autocorrelation
.. autofunction:: lfsr_connection_polynomial
+
+.. autofunction:: elgamal_public_key
+
+.. autofunction:: elgamal_private_key
+
+.. autofunction:: encipher_elgamal
+
+.. autofunction:: decipher_elgamal
| [
{
"components": [
{
"doc": "Return three number tuple as private key.\n\nElgamal encryption is based on mathmatical problem\nDiscrete Logarithm Problem (DLP). For example,\n\n`a^{b} \\equiv c \\pmod p`\n\nIn general, if a and b are known, c is easily\ncalculated. If b is unknown, it is hard to use... | [
"test_alphabet_of_cipher",
"test_cycle_list",
"test_encipher_shift",
"test_encipher_affine",
"test_encipher_substitution",
"test_encipher_vigenere",
"test_decipher_vigenere",
"test_encipher_hill",
"test_decipher_hill",
"test_encipher_bifid5",
"test_bifid5_square",
"test_decipher_bifid5",
te... | [] | This is a feature request which requires adding a new feature to the code repository.
<<NEW FEATURE REQUEST>>
<request>
ElGamal encryption
TODO:
- [x] Detail algorithm in docstring
- [x] Add more test case
- [x] Use primitive_root in #2307
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sympy/crypto/crypto.py]
(definition of elgamal_private_key:)
def elgamal_private_key(digit=10):
"""Return three number tuple as private key.
ElGamal encryption is based on the mathematical
Discrete Logarithm Problem (DLP). For example,
`a^{b} \equiv c \pmod p`
In general, if a and b are known, c is easily
calculated. If b is unknown, it is hard to use
a and c to get b.
Parameters
==========
digit : Key length in binary
Returns
=======
(p, r, d) : p = prime number, r = primitive root, d = random number
Examples
========
>>> from sympy.crypto.crypto import elgamal_private_key
>>> from sympy.ntheory import is_primitive_root, isprime
>>> a, b, _ = elgamal_private_key()
>>> isprime(a)
True
>>> is_primitive_root(b, a)
True"""
(definition of elgamal_public_key:)
def elgamal_public_key(prk):
"""Return three number tuple as public key.
Parameters
==========
prk : Tuple (p, r, d) generated by ``elgamal_private_key``
Returns
=======
(p, r, e = r**d mod p) : d is a random number in private key.
Examples
========
>>> from sympy.crypto.crypto import elgamal_public_key
>>> elgamal_public_key((1031, 14, 636))
(1031, 14, 212)"""
(definition of encipher_elgamal:)
def encipher_elgamal(m, puk):
"""Encrypt message with public key
m is the plaintext message as an int. puk is the
public key (p, r, e). In order to encrypt
a message, a random number ``a`` between ``2`` and ``p``
is chosen; the encrypted message is `c_{1}` and `c_{2}`
`c_{1} \equiv r^{a} \pmod p`
`c_{2} \equiv m e^{a} \pmod p`
Parameters
==========
m : int of encoded message
puk : public key
Returns
=======
(c1, c2) : Encipher into two number
Examples
========
>>> from sympy.crypto.crypto import encipher_elgamal
>>> encipher_elgamal(100, (1031, 14, 212)) # doctest: +SKIP
(835, 271)"""
(definition of decipher_elgamal:)
def decipher_elgamal(ct, prk):
"""Decrypt message with private key
`ct = (c_{1}, c_{2})`
`prk = (p, r, d)`
According to the extended Euclidean algorithm,
`u c_{1}^{d} + p n = 1`
`u \equiv 1/{{c_{1}}^d} \pmod p`
`u c_{2} \equiv \frac{1}{c_{1}^d} c_{2} \equiv \frac{1}{r^{ad}} c_{2} \pmod p`
`\frac{1}{r^{ad}} m e^a \equiv \frac{1}{r^{ad}} m {r^{d a}} \equiv m \pmod p`
Examples
========
>>> from sympy.crypto.crypto import decipher_elgamal
>>> decipher_elgamal((835, 271), (1031, 14, 636))
100"""
[end of new definitions in sympy/crypto/crypto.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 3555a90ee3ac3c7082df19d96a922630100c69cc | |
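The four ElGamal primitives defined above can be exercised end to end with a plain-Python sketch. This mirrors the documented math but is not the sympy.crypto code: the prime 1031 and primitive root 14 are taken from the docstring examples, the helper names are illustrative, and the modular inverse is computed via Fermat's little theorem rather than `igcdex`.

```python
import random

def elgamal_keys(p, r):
    # p must be prime and r a primitive root mod p; d is the secret exponent.
    d = random.randrange(2, p)
    private = (p, r, d)
    public = (p, r, pow(r, d, p))  # e = r**d mod p
    return private, public

def encipher(m, puk):
    # c1 = r**a mod p, c2 = m * e**a mod p for a fresh ephemeral exponent a
    p, r, e = puk
    a = random.randrange(2, p)
    return pow(r, a, p), (m * pow(e, a, p)) % p

def decipher(ct, prk):
    p, r, d = prk
    c1, c2 = ct
    # 1 / c1**d mod p via Fermat: c1**(p-1) == 1 (mod p), so the
    # inverse of c1**d is c1**(p-1-d)
    return (c2 * pow(c1, p - 1 - d, p)) % p

prk, puk = elgamal_keys(1031, 14)
for m in (1, 100, 1030):
    assert decipher(encipher(m, puk), prk) == m
```

The roundtrip assertion holds for any message `0 < m < p`, which is exactly the precondition the `encipher_elgamal` docstring states.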
sympy__sympy-2412 | 2,412 | sympy/sympy | 0.7 | 105bc84480bb77c757eb722f9ed7d1ca50178f96 | 2013-08-25T11:03:52Z | diff --git a/sympy/physics/mechanics/essential.py b/sympy/physics/mechanics/essential.py
index 67c48c512df3..46e8e5b3c0b5 100644
--- a/sympy/physics/mechanics/essential.py
+++ b/sympy/physics/mechanics/essential.py
@@ -2,7 +2,7 @@
__all__ = ['ReferenceFrame', 'Vector', 'Dyadic', 'dynamicsymbols',
'MechanicsStrPrinter', 'MechanicsPrettyPrinter',
- 'MechanicsLatexPrinter']
+ 'MechanicsLatexPrinter', 'CoordinateSym']
from sympy import (
Symbol, sin, cos, eye, trigsimp, diff, sqrt, sympify,
@@ -36,9 +36,9 @@ def __init__(self, inlist):
"""
Just like Vector's init, you shouldn't call this.
- Stores a Dyadic as a list of lists; the inner list has the measure number
- and the two unit vectors; the outerlist holds each unique unit vector
- pair.
+ Stores a Dyadic as a list of lists; the inner list has the measure
+ number and the two unit vectors; the outerlist holds each unique
+ unit vector pair.
"""
@@ -377,8 +377,8 @@ def express(self, frame1, frame2=None):
The first frame is the list side expression, the second frame is the
right side; if Dyadic is in form A.x|B.y, you can express it in two
- different frames. If no second frame is given, the Dyadic is expressed in
- only one frame.
+ different frames. If no second frame is given, the Dyadic is
+ expressed in only one frame.
Parameters
==========
@@ -412,7 +412,7 @@ def express(self, frame1, frame2=None):
def doit(self, **hints):
"""Calls .doit() on each term in the Dyadic"""
- return sum([Dyadic( [ (v[0].doit(**hints), v[1], v[2]) ]) for
+ return sum([Dyadic([(v[0].doit(**hints), v[1], v[2])]) for
v in self.args])
def dt(self, frame):
@@ -469,13 +469,62 @@ def subs(self, *args, **kwargs):
"""
- return sum([ Dyadic([(v[0].subs(*args, **kwargs), v[1], v[2])])
+ return sum([Dyadic([(v[0].subs(*args, **kwargs), v[1], v[2])])
for v in self.args])
dot = __and__
cross = __xor__
+class CoordinateSym(Symbol):
+ """
+ Class to represent the coordinate symbols associated wrt a Reference
+ Frame
+
+ Users should not instantiate this class. Instances of this class must
+ only be accessed through the corresponding frame as 'frame[index]'
+
+ Examples
+ ========
+
+ >>> from sympy.physics.mechanics import ReferenceFrame
+ >>> A = ReferenceFrame('A')
+ >>> A[1]
+ A_y
+ >>> type(A[0])
+ <class 'sympy.physics.mechanics.essential.CoordinateSym'>
+
+    Refer to the Symbol documentation for more information.
+ """
+ __doc__ += Symbol.__doc__
+
+ def __new__(cls, name, frame, index):
+ obj = super(CoordinateSym, cls).__new__(cls, name)
+ _check_frame(frame)
+ if index > 2:
+ raise ValueError("Value of index cannot be greater than 2")
+ obj._id = (frame, index)
+ return obj
+
+ @property
+ def frame(self):
+ return self._id[0]
+
+ def __eq__(self, other):
+ #Check if the other object is a CoordinateSym of the same frame
+ #and same index
+ if isinstance(other, CoordinateSym):
+ if other._id == self._id:
+ return True
+ return False
+
+ def __ne__(self, other):
+ return not self.__eq__(other)
+
+ def __hash__(self):
+ return tuple((self._id[0].__hash__(), self._id[1])).__hash__()
+
+
class ReferenceFrame(object):
"""A reference frame in classical mechanics.
@@ -490,7 +539,7 @@ class ReferenceFrame(object):
"""
- def __init__(self, name, indices=None, latexs=None):
+ def __init__(self, name, variables=None, indices=None, latexs=None):
"""ReferenceFrame initialization method.
A ReferenceFrame has a set of orthonormal basis vectors, along with
@@ -515,7 +564,7 @@ def __init__(self, name, indices=None, latexs=None):
>>> N = ReferenceFrame('N')
>>> N.x
N.x
- >>> O = ReferenceFrame('O', ('1', '2', '3'))
+ >>> O = ReferenceFrame('O', indices=('1', '2', '3'))
>>> O.x
O['1']
>>> O['1']
@@ -578,7 +627,12 @@ def __init__(self, name, indices=None, latexs=None):
raise TypeError('Latex entries must be strings')
self.latex_vecs = latexs
self.name = name
+ self._var_dict = {}
+ #The _dcm_dict dictionary will only store the dcms of parent-child
+ #relationships. The _dcm_cache dictionary will work as the dcm
+ #cache.
self._dcm_dict = {}
+ self._dcm_cache = {}
self._ang_vel_dict = {}
self._ang_acc_dict = {}
self._dlist = [self._dcm_dict, self._ang_vel_dict, self._ang_acc_dict]
@@ -586,11 +640,33 @@ def __init__(self, name, indices=None, latexs=None):
self._x = Vector([(Matrix([1, 0, 0]), self)])
self._y = Vector([(Matrix([0, 1, 0]), self)])
self._z = Vector([(Matrix([0, 0, 1]), self)])
+ #Associate coordinate symbols wrt this frame
+ if variables is not None:
+ if not isinstance(variables, (tuple, list)):
+ raise TypeError('Supply the variable names as a list/tuple')
+ if len(variables) != 3:
+ raise ValueError('Supply 3 variable names')
+ for i in variables:
+ if not isinstance(i, string_types):
+ raise TypeError('Variable names must be strings')
+ else:
+ variables = [name + '_x', name + '_y', name + '_z']
+ self.varlist = (CoordinateSym(variables[0], self, 0), \
+ CoordinateSym(variables[1], self, 1), \
+ CoordinateSym(variables[2], self, 2))
def __getitem__(self, ind):
- """Returns basis vector for the provided index (index being an str)"""
- if not isinstance(ind, string_types):
- raise TypeError('Supply a valid str for the index')
+ """
+ Returns basis vector for the provided index, if the index is a string.
+
+        If the index is a number, returns the coordinate variable
+        corresponding to that index.
+ """
+ if not isinstance(ind, str):
+ if ind < 3:
+ return self.varlist[ind]
+ else:
+ raise ValueError("Invalid index provided")
if self.indices[0] == ind:
return self.x
if self.indices[1] == ind:
@@ -641,6 +717,48 @@ def _w_diff_dcm(self, otherframe):
w3 = trigsimp(expand(angvelmat[3]), recursive=True)
return -Vector([(Matrix([w1, w2, w3]), self)])
+ def variable_map(self, otherframe):
+ """
+ Returns a dictionary which expresses the variables of this frame
+ in terms of the variables of otherframe.
+
+ If Vector.simp is True, returns a simplified version of the mapped
+ values. Else, returns them without simplification.
+
+ Simplification may take time.
+
+ Parameters
+ ==========
+
+ otherframe : ReferenceFrame
+ The other frame to map this variables to
+
+ Examples
+ ========
+
+ >>> from sympy.physics.mechanics import ReferenceFrame, dynamicsymbols
+ >>> A = ReferenceFrame('A')
+ >>> q = dynamicsymbols('q')
+ >>> B = A.orientnew('B', 'Axis', [q, A.z])
+ >>> A.variable_map(B)
+ {A_x: B_x*cos(q(t)) - B_y*sin(q(t)), A_y: B_x*sin(q(t)) + B_y*cos(q(t)), A_z: B_z}
+
+ """
+
+ _check_frame(otherframe)
+ if (otherframe, Vector.simp) in self._var_dict:
+ return self._var_dict[(otherframe, Vector.simp)]
+ else:
+ vars_matrix = self.dcm(otherframe) * Matrix(otherframe.varlist)
+ mapping = {}
+ for i, x in enumerate(self):
+ if Vector.simp:
+ mapping[self.varlist[i]] = trigsimp(vars_matrix[i], method='fu')
+ else:
+ mapping[self.varlist[i]] = vars_matrix[i]
+ self._var_dict[(otherframe, Vector.simp)] = mapping
+ return mapping
+
def ang_acc_in(self, otherframe):
"""Returns the angular acceleration Vector of the ReferenceFrame.
@@ -738,10 +856,17 @@ def dcm(self, otherframe):
"""
_check_frame(otherframe)
+ #Check if the dcm wrt that frame has already been calculated
+ if otherframe in self._dcm_cache:
+ return self._dcm_cache[otherframe]
flist = self._dict_list(otherframe, 0)
outdcm = eye(3)
for i in range(len(flist) - 1):
- outdcm = outdcm * flist[i + 1]._dcm_dict[flist[i]]
+ outdcm = outdcm * flist[i]._dcm_dict[flist[i + 1]]
+ #After calculation, store the dcm in dcm cache for faster
+ #future retrieval
+ self._dcm_cache[otherframe] = outdcm
+ otherframe._dcm_cache[self] = outdcm.T
return outdcm
def orient(self, parent, rot_type, amounts, rot_order=''):
@@ -839,7 +964,6 @@ def _rot(axis, angle):
if not rot_order in approved_orders:
raise TypeError('The supplied order is not an approved type')
parent_orient = []
-
if rot_type == 'AXIS':
if not rot_order == '':
raise TypeError('Axis orientation takes no rotation order')
@@ -864,7 +988,7 @@ def _rot(axis, angle):
q0, q1, q2, q3 = amounts
parent_orient = (Matrix([[q0 ** 2 + q1 ** 2 - q2 ** 2 - q3 **
2, 2 * (q1 * q2 - q0 * q3), 2 * (q0 * q2 + q1 * q3)],
- [2 * (q1 * q2 + q0 * q3), q0 ** 2 - q1 ** 2 + q2 **2 - q3 ** 2,
+ [2 * (q1 * q2 + q0 * q3), q0 ** 2 - q1 ** 2 + q2 ** 2 - q3 ** 2,
2 * (q2 * q3 - q0 * q1)], [2 * (q1 * q3 - q0 * q2), 2 * (q0 *
q1 + q2 * q3), q0 ** 2 - q1 ** 2 - q2 ** 2 + q3 ** 2]]))
elif rot_type == 'BODY':
@@ -885,8 +1009,22 @@ def _rot(axis, angle):
* _rot(a1, amounts[0]))
else:
raise NotImplementedError('That is not an implemented rotation')
- self._dcm_dict.update({parent: parent_orient})
- parent._dcm_dict.update({self: parent_orient.T})
+ #Reset the _dcm_cache of this frame, and remove it from the _dcm_caches
+ #of the frames it is linked to. Also remove it from the _dcm_dict of
+ #its parent
+ frames = self._dcm_cache.keys()
+ for frame in frames:
+ if frame in self._dcm_dict:
+ del frame._dcm_dict[self]
+ del frame._dcm_cache[self]
+ #Add the dcm relationship to _dcm_dict
+ self._dcm_dict = self._dlist[0] = {}
+ self._dcm_dict.update({parent: parent_orient.T})
+ parent._dcm_dict.update({self: parent_orient})
+ #Also update the dcm cache after resetting it
+ self._dcm_cache = {}
+ self._dcm_cache.update({parent: parent_orient.T})
+ parent._dcm_cache.update({self: parent_orient})
if rot_type == 'QUATERNION':
t = dynamicsymbols._t
q0, q1, q2, q3 = amounts
@@ -919,9 +1057,10 @@ def _rot(axis, angle):
wvec = self._w_diff_dcm(parent)
self._ang_vel_dict.update({parent: wvec})
parent._ang_vel_dict.update({self: -wvec})
+ self._var_dict = {}
- def orientnew(self, newname, rot_type, amounts, rot_order='', indices=None,
- latexs=None):
+ def orientnew(self, newname, rot_type, amounts, rot_order='', variables=None,
+ indices=None, latexs=None):
"""Creates a new ReferenceFrame oriented with respect to this Frame.
See ReferenceFrame.orient() for acceptable rotation types, amounts,
@@ -955,7 +1094,7 @@ def orientnew(self, newname, rot_type, amounts, rot_order='', indices=None,
"""
- newframe = ReferenceFrame(newname, indices, latexs)
+ newframe = ReferenceFrame(newname, variables, indices, latexs)
newframe.orient(self, rot_type, amounts, rot_order)
return newframe
@@ -1029,6 +1168,131 @@ def set_ang_vel(self, otherframe, value):
self._ang_vel_dict.update({otherframe: value})
otherframe._ang_vel_dict.update({self: -value})
+ def express(self, field, variables=False):
+ """
+ Re-express a vector/scalar function in this frame
+
+ If variables is True, then the coordinate variables present
+ in the vector field expression are also substituted in terms of
+ the base scalars of this frame
+
+ Parameters
+ ==========
+
+ field : Vector/sympifyable
+ The vector/scalar field to express in this frame
+
+ variables : boolean
+ Boolean to specify whether to substitute base scalars
+ in vector expression. If field is scalar, this parameter
+ is not considered
+
+ Examples
+ ========
+
+ >>> from sympy.physics.mechanics import ReferenceFrame
+ >>> R0 = ReferenceFrame('R0')
+ >>> R1 = ReferenceFrame('R1')
+ >>> from sympy import Symbol
+ >>> q = Symbol('q')
+ >>> R1.orient(R0, 'Axis', [q, R0.z])
+ >>> R0.express(4*R1.x + 5*R1.z)
+ 4*cos(q)*R0.x + 4*sin(q)*R0.y + 5*R0.z
+
+ """
+
+ if field == 0:
+ return 0
+ if isinstance(field, Vector):
+ #Given field is a Vector
+ if variables:
+ #If variables attribute is True, substitute
+ #the coordinate variables in the Vector
+ frame_list = [x[-1] for x in field.args]
+ subs_dict = {}
+ for frame in frame_list:
+ subs_dict.update(frame.variable_map(self))
+ field = field.subs(subs_dict)
+ #Re-express to other frame
+ outvec = Vector([])
+ for i, v in enumerate(field.args):
+ if v[1] != self:
+ temp = self.dcm(v[1]) * v[0]
+ if Vector.simp:
+ temp = temp.applyfunc(lambda x: \
+ trigsimp(x, method='fu'))
+ outvec += Vector([(temp, self)])
+ else:
+ outvec += Vector([v])
+ return outvec
+
+ else:
+ #Given field is a scalar
+ frame_set = set([])
+ field = sympify(field)
+            #Substitute all the coordinate variables
+ for x in field.atoms():
+            if isinstance(x, CoordinateSym) and x.frame != self:
+ frame_set.add(x.frame)
+ subs_dict = {}
+ for frame in frame_set:
+ subs_dict.update(frame.variable_map(self))
+ return field.subs(subs_dict)
+
+ def dt(self, expr, order=1):
+ """
+ Calculate the time derivative of a field function in this frame.
+
+ References
+ ==========
+
+ http://en.wikipedia.org/wiki/
+ Rotating_reference_frame#Time_derivatives_in_the_two_frames
+
+ Parameters
+ ==========
+
+ expr : Vector/sympifyable
+ The field whose time derivative is to be calculated
+
+ order : integer
+ The order of the derivative to be calculated
+
+ Examples
+ ========
+
+ >>> from sympy.physics.mechanics import ReferenceFrame, Vector, dynamicsymbols
+ >>> from sympy import Symbol
+ >>> q1 = Symbol('q1')
+ >>> u1 = dynamicsymbols('u1')
+ >>> N = ReferenceFrame('N')
+ >>> A = N.orientnew('A', 'Axis', [q1, N.x])
+ >>> v = u1 * N.x
+ >>> A.set_ang_vel(N, 10*A.x)
+ >>> A.x.dt(N) == 0
+ True
+ >>> v.dt(N)
+ u1'*N.x
+
+ """
+
+ t = dynamicsymbols._t
+ if order == 0:
+ return expr
+ if order%1 != 0 or order < 0:
+ raise ValueError("Unsupported value of order entered")
+ if isinstance(expr, Vector):
+ outvec = S(0)
+ for i, v in enumerate(expr.args):
+ if v[1] == self:
+ outvec += Vector([(self.express(v[0]).diff(t), self)])
+ else:
+ outvec += v[1].dt(Vector([v])) + \
+ (v[1].ang_vel_in(self) ^ Vector([v]))
+ return outvec
+ else:
+ return diff(self.express(expr), t, order)
+
@property
def x(self):
"""The basis Vector for the ReferenceFrame, in the x direction. """
@@ -1136,7 +1400,7 @@ def __and__(self, other):
out += ((v2[0].T)
* (v2[1].dcm(v1[1]))
* (v1[0]))[0]
- if Vector.simp is True:
+ if Vector.simp:
return trigsimp(sympify(out), recursive=True)
else:
return sympify(out)
@@ -1520,6 +1784,14 @@ def diff(self, wrt, otherframe):
outvec += Vector([(d, otherframe)]).express(v[1])
return outvec
+ def express(self, otherframe, variables=False):
+ """
+ Returns a Vector equivalent to this one, expressed in otherframe.
+ Uses ReferenceFrame's .express method.
+ Refer the docstring for ReferenceFrame.express
+ """
+ return otherframe.express(self, variables)
+
def doit(self, **hints):
"""Calls .doit() on each term in the Vector"""
ov = S(0)
@@ -1528,80 +1800,13 @@ def doit(self, **hints):
return ov
def dt(self, otherframe):
- """Returns the time derivative of the Vector in a ReferenceFrame.
-
- Returns a Vector which is the time derivative of the self Vector, taken
+ """Returns a Vector which is the time derivative of the self Vector, taken
in frame otherframe.
- Parameters
- ==========
-
- otherframe : ReferenceFrame
- The ReferenceFrame that the partial derivative is taken in.
-
- Examples
- ========
-
- >>> from sympy.physics.mechanics import ReferenceFrame, Vector, dynamicsymbols
- >>> from sympy import Symbol
- >>> q1 = Symbol('q1')
- >>> u1 = dynamicsymbols('u1')
- >>> N = ReferenceFrame('N')
- >>> A = N.orientnew('A', 'Axis', [q1, N.x])
- >>> v = u1 * N.x
- >>> A.set_ang_vel(N, 10*A.x)
- >>> A.x.dt(N) == 0
- True
- >>> v.dt(N)
- u1'*N.x
-
+        Calls the ReferenceFrame.dt method.
+        Refer to the docstring of ReferenceFrame.dt
"""
-
- outvec = S(0)
- _check_frame(otherframe)
- for i, v in enumerate(self.args):
- if v[1] == otherframe:
- outvec += Vector([(v[0].diff(dynamicsymbols._t), otherframe)])
- else:
- outvec += (Vector([v]).dt(v[1]) +
- (v[1].ang_vel_in(otherframe) ^ Vector([v])))
- return outvec
-
- def express(self, otherframe):
- """Returns a vector, expressed in the other frame.
-
- A new Vector is returned, equalivalent to this Vector, but its
- components are all defined in only the otherframe.
-
- Parameters
- ==========
-
- otherframe : ReferenceFrame
- The frame for this Vector to be described in
-
- Examples
- ========
-
- >>> from sympy.physics.mechanics import ReferenceFrame, Vector, dynamicsymbols
- >>> q1 = dynamicsymbols('q1')
- >>> N = ReferenceFrame('N')
- >>> A = N.orientnew('A', 'Axis', [q1, N.y])
- >>> A.x.express(N)
- cos(q1)*N.x - sin(q1)*N.z
-
- """
-
- _check_frame(otherframe)
- outvec = Vector([])
- for i, v in enumerate(self.args):
- if v[1] != otherframe:
- temp = otherframe.dcm(v[1]) * v[0]
- if Vector.simp is True:
- temp = temp.applyfunc(lambda x: trigsimp(x, method='fu'))
- outvec += Vector([(temp, otherframe)])
- else:
- outvec += Vector([v])
- return outvec
+ return otherframe.dt(self)
def simplify(self):
"""Returns a simplified Vector."""
@@ -1684,7 +1889,7 @@ def _print_Function(self, expr, exp=None):
sup += r"^{%s}" % self._print(exp)
return r"%s" % (name + sup + sub)
else:
- args = [ str(self._print(arg)) for arg in expr.args ]
+ args = [str(self._print(arg)) for arg in expr.args]
# How inverse trig functions should be displayed, formats are:
# abbreviated: asin, full: arcsin, power: sin^-1
inv_trig_style = self._settings['inv_trig_style']
@@ -1934,5 +2139,6 @@ def dynamicsymbols(names, level=0):
else:
return reduce(diff, [t]*level, esses(t))
+
dynamicsymbols._t = Symbol('t')
dynamicsymbols._str = '\''
diff --git a/sympy/physics/mechanics/functions.py b/sympy/physics/mechanics/functions.py
index da417dc5311d..88e6a685be36 100644
--- a/sympy/physics/mechanics/functions.py
+++ b/sympy/physics/mechanics/functions.py
@@ -48,13 +48,11 @@ def dot(vec1, vec2):
def express(vec, frame, frame2=None):
- """Express convenience wrapper for Vector.express(): \n"""
- if not isinstance(vec, (Vector, Dyadic)):
- raise TypeError('Can only express Vectors')
- if isinstance(vec, Vector):
- return vec.express(frame)
- else:
+ """Express convenience wrapper"""
+ if isinstance(vec, Dyadic):
return vec.express(frame, frame2)
+ else:
+ return frame.express(vec)
express.__doc__ += Vector.express.__doc__
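The `CoordinateSym` class added in the patch above overrides `__eq__`/`__ne__`/`__hash__` so that two coordinate symbols compare equal exactly when they belong to the same frame and index, regardless of display name. That contract can be illustrated without sympy by a minimal stand-in; `FakeFrame` and `CoordSym` below are hypothetical names for this sketch, not part of the patch or of sympy.

```python
class FakeFrame:
    # Stand-in for ReferenceFrame; identity is the object itself.
    def __init__(self, name):
        self.name = name

class CoordSym:
    # Minimal stand-in for CoordinateSym: identity is (frame, index),
    # not the display name, matching the patch's __eq__/__hash__.
    def __init__(self, name, frame, index):
        if index > 2:
            raise ValueError("Value of index cannot be greater than 2")
        self.name = name
        self._id = (frame, index)

    def __eq__(self, other):
        return isinstance(other, CoordSym) and other._id == self._id

    def __ne__(self, other):
        return not self.__eq__(other)

    def __hash__(self):
        return hash((self._id[0], self._id[1]))

A = FakeFrame('A')
B = FakeFrame('B')
# Same frame and index compare equal even with different display names
assert CoordSym('A_x', A, 0) == CoordSym('Ax', A, 0)
assert CoordSym('A_x', A, 0) != CoordSym('A_x', B, 0)
# Equal symbols hash equal, so sets deduplicate them
assert len({CoordSym('A_x', A, 0), CoordSym('Ax', A, 0)}) == 1
```

This is the property the `test_coordinate_vars` assertions such as `CoordinateSym('Ax', A, 0) == A[0]` rely on.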
| diff --git a/sympy/core/tests/test_args.py b/sympy/core/tests/test_args.py
index 73cba10b55b3..bb0b1778b8be 100644
--- a/sympy/core/tests/test_args.py
+++ b/sympy/core/tests/test_args.py
@@ -1968,6 +1968,11 @@ def test_sympy__matrices__expressions__factorizations__SofSVD():
def test_sympy__matrices__expressions__factorizations__Factorization():
pass
+def test_sympy__physics__mechanics__essential__CoordinateSym():
+ from sympy.physics.mechanics import CoordinateSym
+ from sympy.physics.mechanics import ReferenceFrame
+ assert _test_args(CoordinateSym('R_x', ReferenceFrame('R'), 0))
+
def test_sympy__physics__gaussopt__BeamParameter():
from sympy.physics.gaussopt import BeamParameter
assert _test_args(BeamParameter(530e-9, 1, w=1e-3))
diff --git a/sympy/physics/mechanics/tests/test_essential.py b/sympy/physics/mechanics/tests/test_essential.py
index eae02f5a3361..3e3cb5cde50a 100644
--- a/sympy/physics/mechanics/tests/test_essential.py
+++ b/sympy/physics/mechanics/tests/test_essential.py
@@ -1,6 +1,8 @@
-from sympy import cos, Matrix, sin, symbols, pi, Function
+from sympy import cos, Matrix, sin, symbols, simplify, pi, Function, \
+ zeros
from sympy.abc import x, y, z
-from sympy.physics.mechanics import Vector, ReferenceFrame, dot, dynamicsymbols
+from sympy.physics.mechanics import Vector, ReferenceFrame, dot, \
+ dynamicsymbols, CoordinateSym, express
from sympy.physics.mechanics.essential import MechanicsLatexPrinter
Vector.simp = True
@@ -43,6 +45,58 @@ def test_dyadic():
assert d1.dt(B) == (-qd) * (A.y | A.x) + (-qd) * (A.x | A.y)
+def test_coordinate_vars():
+ """Tests the coordinate variables functionality"""
+ assert CoordinateSym('Ax', A, 0) == A[0]
+ assert CoordinateSym('Ax', A, 1) == A[1]
+ assert CoordinateSym('Ax', A, 2) == A[2]
+ q = dynamicsymbols('q')
+ qd = dynamicsymbols('q', 1)
+ assert isinstance(A[0], CoordinateSym) and \
+ isinstance(A[0], CoordinateSym) and \
+ isinstance(A[0], CoordinateSym)
+ assert A.variable_map(A) == {A[0]:A[0], A[1]:A[1], A[2]:A[2]}
+ assert A[0].frame == A
+ B = A.orientnew('B', 'Axis', [q, A.z])
+ assert B.variable_map(A) == {B[2]: A[2], B[1]: -A[0]*sin(q) + A[1]*cos(q),
+ B[0]: A[0]*cos(q) + A[1]*sin(q)}
+ assert A.variable_map(B) == {A[0]: B[0]*cos(q) - B[1]*sin(q),
+ A[1]: B[0]*sin(q) + B[1]*cos(q), A[2]: B[2]}
+ assert A.dt(B[0]) == -A[0]*sin(q)*qd + A[1]*cos(q)*qd
+ assert A.dt(B[1]) == -A[0]*cos(q)*qd - A[1]*sin(q)*qd
+ assert A.dt(B[2]) == 0
+ assert express(B[0], A) == A[0]*cos(q) + A[1]*sin(q)
+ assert express(B[1], A) == -A[0]*sin(q) + A[1]*cos(q)
+ assert express(B[2], A) == A[2]
+ assert B.dt(A[0]*A.x + A[1]*A.y + A[2]*A.z) == A[1]*qd*A.x - A[0]*qd*A.y
+ assert A.dt(B[0]*B.x + B[1]*B.y + B[2]*B.z) == - B[1]*qd*B.x + B[0]*qd*B.y
+ assert A.express(B[0]*B[1]*B[2]) == \
+ A[2]*(-A[0]*sin(q) + A[1]*cos(q))*(A[0]*cos(q) + A[1]*sin(q))
+ assert simplify(A.dt(B[0]*B[1]*B[2])) == \
+ A[2]*(-A[0]**2*cos(2*q) - 2*A[0]*A[1]*sin(2*q) + \
+ A[1]**2*cos(2*q))*qd
+ assert A.express(B[0]*B.x + B[1]*B.y + B[2]*B.z) == \
+ (B[0]*cos(q) - B[1]*sin(q))*A.x + (B[0]*sin(q) + \
+ B[1]*cos(q))*A.y + B[2]*A.z
+ assert A.express(B[0]*B.x + B[1]*B.y + B[2]*B.z, variables=True) == \
+ A[0]*A.x + A[1]*A.y + A[2]*A.z
+ assert B.express(A[0]*A.x + A[1]*A.y + A[2]*A.z) == \
+ (A[0]*cos(q) + A[1]*sin(q))*B.x + \
+ (-A[0]*sin(q) + A[1]*cos(q))*B.y + A[2]*B.z
+ assert B.express(A[0]*A.x + A[1]*A.y + A[2]*A.z, variables=True) == \
+ B[0]*B.x + B[1]*B.y + B[2]*B.z
+ N = B.orientnew('N', 'Axis', [-q, B.z])
+ assert N.variable_map(A) == {N[0]: A[0], N[2]: A[2], N[1]: A[1]}
+ C = A.orientnew('C', 'Axis', [q, A.x + A.y + A.z])
+ mapping = A.variable_map(C)
+ assert mapping[A[0]] == 2*C[0]*cos(q)/3 + C[0]/3 - 2*C[1]*sin(q + pi/6)/3 +\
+ C[1]/3 - 2*C[2]*cos(q + pi/3)/3 + C[2]/3
+ assert mapping[A[1]] == -2*C[0]*cos(q + pi/3)/3 + \
+ C[0]/3 + 2*C[1]*cos(q)/3 + C[1]/3 - 2*C[2]*sin(q + pi/6)/3 + C[2]/3
+ assert mapping[A[2]] == -2*C[0]*sin(q + pi/6)/3 + C[0]/3 - \
+ 2*C[1]*cos(q + pi/3)/3 + C[1]/3 + 2*C[2]*cos(q)/3 + C[2]/3
+
+
def test_ang_vel():
q1, q2, q3, q4 = dynamicsymbols('q1 q2 q3 q4')
q1d, q2d, q3d, q4d = dynamicsymbols('q1 q2 q3 q4', 1)
@@ -124,7 +178,7 @@ def test_dcm():
sin(q3) - sin(q2) * cos(q1) * cos(q3)], [- sin(q3) * cos(q2), sin(q2),
cos(q2) * cos(q3)]])
# This is a little touchy. Is it ok to use simplify in assert?
- assert D.dcm(C) == Matrix(
+ test_mat = D.dcm(C) - Matrix(
[[cos(q1) * cos(q3) * cos(q4) - sin(q3) * (- sin(q4) * cos(q2) +
sin(q1) * sin(q2) * cos(q4)), - sin(q2) * sin(q4) - sin(q1) *
cos(q2) * cos(q4), sin(q3) * cos(q1) * cos(q4) + cos(q3) * (- sin(q4) *
@@ -134,6 +188,7 @@ def test_dcm():
sin(q3) * (cos(q2) * cos(q4) + sin(q1) * sin(q2) * sin(q4)), sin(q2) *
cos(q4) - sin(q1) * sin(q4) * cos(q2), sin(q3) * sin(q4) * cos(q1) +
cos(q3) * (cos(q2) * cos(q4) + sin(q1) * sin(q2) * sin(q4))]])
+ assert test_mat.expand() == zeros(3, 3)
assert E.dcm(N) == Matrix(
[[cos(q2)*cos(q3), sin(q3)*cos(q2), -sin(q2)],
[sin(q1)*sin(q2)*cos(q3) - sin(q3)*cos(q1), sin(q1)*sin(q2)*sin(q3) +
@@ -186,11 +241,12 @@ def test_Vector_diffs():
v2 = q3 * B.x + v1
v3 = v1.dt(B)
v4 = v2.dt(B)
+ v5 = q1*A.x + q2*A.y + q3*A.z
- assert v1.dt(N) == q2d * A.x + q2 * q3d * A.y + q3d * N.y
- assert v1.dt(A) == q2d * A.x + q3 * q3d * N.x + q3d * N.y
- assert v1.dt(B) == (q2d * A.x + q3 * q3d * N.x + q3d * N.y - q3 * cos(q3) *
- q2d * N.z)
+ assert v1.dt(N) == N.dt(v1) == q2d * A.x + q2 * q3d * A.y + q3d * N.y
+ assert v1.dt(A) == A.dt(v1) == q2d * A.x + q3 * q3d * N.x + q3d * N.y
+ assert v1.dt(B) == B.dt(v1) == (q2d * A.x + q3 * q3d * N.x + q3d *
+ N.y - q3 * cos(q3) * q2d * N.z)
assert v2.dt(N) == (q2d * A.x + (q2 + q3) * q3d * A.y + q3d * B.x + q3d *
N.y)
assert v2.dt(A) == q2d * A.x + q3d * B.x + q3 * q3d * N.x + q3d * N.y
@@ -218,6 +274,9 @@ def test_Vector_diffs():
(2 * q3d**2 + q3 * q3dd) * N.x + (q3dd - q3 * q3d**2) *
N.y + (2 * q3 * sin(q3) * q2d * q3d - 2 * cos(q3) *
q2d * q3d - q3 * cos(q3) * q2dd) * N.z)
+ assert B.dt(v5) == v5.dt(B) == q1d*A.x + (q3*q2d + q2d)*A.y + (-q2*q2d + q3d)*A.z
+ assert A.dt(v5) == v5.dt(A) == q1d*A.x + q2d*A.y + q3d*A.z
+ assert N.dt(v5) == v5.dt(N) == (-q2*q3d + q1d)*A.x + (q1*q3d + q2d)*A.y + q3d*A.z
assert v3.diff(q1d, N) == 0
assert v3.diff(q2d, N) == A.x - q3 * cos(q3) * N.z
assert v3.diff(q3d, N) == q3 * N.x + N.y
| [
{
"components": [
{
"doc": "Class to represent the coordinate symbols associated wrt a Reference\nFrame\n\nUsers should not instantiate this class. Instances of this class must\nonly be accessed through the corresponding frame as 'frame[index]'\n\nExamples\n========\n\n>>> from sympy.physics.mecha... | [
"test_sympy__physics__mechanics__essential__CoordinateSym",
"test_dyadic",
"test_coordinate_vars",
"test_ang_vel",
"test_dcm",
"test_Vector",
"test_Vector_diffs",
"test_vector_simplify",
"test_dyadic_simplify"
] | [
"test_all_classes_are_tested",
"test_sympy__assumptions__assume__Predicate",
"test_sympy__combinatorics__subsets__Subset",
"test_sympy__combinatorics__polyhedron__Polyhedron",
"test_sympy__combinatorics__partitions__Partition",
"test_sympy__concrete__products__Product",
"test_sympy__concrete__summations... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Modifying mechanics core
Modified the ReferenceFrame and Vector classes to incorporate coordinate variables. Base scalars are assigned to every ReferenceFrame.
The express and dt methods have been refactored (and also added to ReferenceFrame) to facilitate re-expression of vector and scalar fields.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sympy/physics/mechanics/essential.py]
(definition of CoordinateSym:)
class CoordinateSym(Symbol):
"""Class to represent the coordinate symbols associated wrt a Reference
Frame
Users should not instantiate this class. Instances of this class must
only be accessed through the corresponding frame as 'frame[index]'
Examples
========
>>> from sympy.physics.mechanics import ReferenceFrame
>>> A = ReferenceFrame('A')
>>> A[1]
A_y
>>> type(A[0])
<class 'sympy.physics.mechanics.essential.CoordinateSym'>
Refer to the Symbol documentation for more information."""
(definition of CoordinateSym.__new__:)
def __new__(cls, name, frame, index):
(definition of CoordinateSym.frame:)
def frame(self):
(definition of CoordinateSym.__eq__:)
def __eq__(self, other):
(definition of CoordinateSym.__ne__:)
def __ne__(self, other):
(definition of CoordinateSym.__hash__:)
def __hash__(self):
(definition of ReferenceFrame.variable_map:)
def variable_map(self, otherframe):
"""Returns a dictionary which expresses the variables of this frame
in terms of the variables of otherframe.
If Vector.simp is True, returns a simplified version of the mapped
values. Else, returns them without simplification.
Simplification may take time.
Parameters
==========
otherframe : ReferenceFrame
The other frame to map this variables to
Examples
========
>>> from sympy.physics.mechanics import ReferenceFrame, dynamicsymbols
>>> A = ReferenceFrame('A')
>>> q = dynamicsymbols('q')
>>> B = A.orientnew('B', 'Axis', [q, A.z])
>>> A.variable_map(B)
{A_x: B_x*cos(q(t)) - B_y*sin(q(t)), A_y: B_x*sin(q(t)) + B_y*cos(q(t)), A_z: B_z}"""
(definition of ReferenceFrame.express:)
def express(self, field, variables=False):
"""Re-express a vector/scalar function in this frame
If variables is True, then the coordinate variables present
in the vector field expression are also substituted in terms of
the base scalars of this frame
Parameters
==========
field : Vector/sympifyable
The vector/scalar field to express in this frame
variables : boolean
Boolean to specify whether to substitute base scalars
in vector expression. If field is scalar, this parameter
is not considered
Examples
========
>>> from sympy.physics.mechanics import ReferenceFrame
>>> R0 = ReferenceFrame('R0')
>>> R1 = ReferenceFrame('R1')
>>> from sympy import Symbol
>>> q = Symbol('q')
>>> R1.orient(R0, 'Axis', [q, R0.z])
>>> R0.express(4*R1.x + 5*R1.z)
4*cos(q)*R0.x + 4*sin(q)*R0.y + 5*R0.z"""
(definition of ReferenceFrame.dt:)
def dt(self, expr, order=1):
"""Calculate the time derivative of a field function in this frame.
References
==========
http://en.wikipedia.org/wiki/Rotating_reference_frame#Time_derivatives_in_the_two_frames
Parameters
==========
expr : Vector/sympifyable
The field whose time derivative is to be calculated
order : integer
The order of the derivative to be calculated
Examples
========
>>> from sympy.physics.mechanics import ReferenceFrame, Vector, dynamicsymbols
>>> from sympy import Symbol
>>> q1 = Symbol('q1')
>>> u1 = dynamicsymbols('u1')
>>> N = ReferenceFrame('N')
>>> A = N.orientnew('A', 'Axis', [q1, N.x])
>>> v = u1 * N.x
>>> A.set_ang_vel(N, 10*A.x)
>>> A.x.dt(N) == 0
True
>>> v.dt(N)
u1'*N.x"""
[end of new definitions in sympy/physics/mechanics/essential.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 3555a90ee3ac3c7082df19d96a922630100c69cc | ||
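To make the coordinate-variable mapping above concrete: for a frame B obtained by rotating A about its z axis by q, `variable_map` amounts to applying the direction-cosine matrix to the base scalars. A minimal numeric sketch in plain Python (independent of SymPy; `rot_z` and `map_coords` are illustrative names, not part of the proposed API):

```python
import math

def rot_z(q):
    """Direction-cosine matrix of a frame rotated by q about the parent z axis."""
    c, s = math.cos(q), math.sin(q)
    return [[c, s, 0.0],
            [-s, c, 0.0],
            [0.0, 0.0, 1.0]]

def map_coords(dcm, coords):
    """Express parent-frame coordinates in the rotated frame."""
    return [sum(dcm[i][j] * coords[j] for j in range(3)) for i in range(3)]

q = math.pi / 6
a = [1.0, 2.0, 3.0]           # base-scalar values in frame A
b = map_coords(rot_z(q), a)   # the same point expressed in frame B
```

Here `b[0]` equals `a[0]*cos(q) + a[1]*sin(q)`, matching the `express(B[0], A)` assertion in `test_coordinate_vars`.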
sympy__sympy-2367 | 2,367 | sympy/sympy | 0.7 | 0ef741156ba8fc4d261d452301bf1753ef283730 | 2013-08-08T21:50:57Z | diff --git a/sympy/physics/quantum/circuitplot.py b/sympy/physics/quantum/circuitplot.py
index f3b77fa433a9..ef67196b5178 100644
--- a/sympy/physics/quantum/circuitplot.py
+++ b/sympy/physics/quantum/circuitplot.py
@@ -19,7 +19,10 @@
from sympy import Mul
from sympy.core.compatibility import u
from sympy.external import import_module
-from sympy.physics.quantum.gate import Gate,OneQubitGate,CGate,CGateS
+from sympy.physics.quantum.gate import Gate, OneQubitGate, CGate, CGateS
+from sympy.core.core import BasicMeta
+from sympy.core.assumptions import ManagedProperties
+
__all__ = [
'CircuitPlot',
@@ -27,6 +30,8 @@
'labeller',
'Mz',
'Mx',
+ 'CreateOneQubitGate',
+ 'CreateCGate',
]
np = import_module('numpy')
@@ -46,6 +51,9 @@ def circuit_plot(*args, **kwargs):
Line2D = matplotlib.lines.Line2D
Circle = matplotlib.patches.Circle
+ #from matplotlib import rc
+ #rc('text',usetex=True)
+
class CircuitPlot(object):
"""A class for managing a circuit plot."""
@@ -56,6 +64,7 @@ class CircuitPlot(object):
not_radius = 0.15
swap_delta = 0.05
labels = []
+ inits = {}
label_buffer = 0.5
def __init__(self, c, nqubits, **kwargs):
@@ -113,9 +122,11 @@ def _plot_wires(self):
)
self._axes.add_line(line)
if self.labels:
+ init_label_buffer = 0
+ if self.inits.get(self.labels[i]): init_label_buffer = 0.25
self._axes.text(
- xdata[0]-self.label_buffer,ydata[0],
- r'$|%s\rangle$' % self.labels[i],
+ xdata[0]-self.label_buffer-init_label_buffer,ydata[0],
+ render_label(self.labels[i],self.inits),
size=self.fontsize,
color='k',ha='center',va='center')
self._plot_measured_wires()
@@ -136,13 +147,13 @@ def _plot_measured_wires(self):
self._axes.add_line(line)
# Also double any controlled lines off these wires
for i,g in enumerate(self._gates()):
- if isinstance(g,CGate) or isinstance(g,CGateS):
+ if isinstance(g, CGate) or isinstance(g, CGateS):
wires = g.controls + g.targets
for wire in wires:
if wire in ismeasured and \
self._gate_grid[i] > self._gate_grid[ismeasured[wire]]:
- ydata = min(wires),max(wires)
- xdata = self._gate_grid[i]-dy,self._gate_grid[i]-dy
+ ydata = min(wires), max(wires)
+ xdata = self._gate_grid[i]-dy, self._gate_grid[i]-dy
line = Line2D(
xdata, ydata,
color='k',
@@ -295,7 +306,21 @@ def circuit_plot(c, nqubits, **kwargs):
"""
return CircuitPlot(c, nqubits, **kwargs)
-def labeller(n,symbol='q'):
+def render_label(label, inits={}):
+ """Slightly more flexible way to render labels.
+
+ >>> from sympy.physics.quantum.circuitplot import render_label
+ >>> render_label('q0')
+ '$|q0\\\\rangle$'
+ >>> render_label('q0', {'q0':'0'})
+ '$|q0\\\\rangle=|0\\\\rangle$'
+ """
+ init = inits.get(label)
+ if init:
+ return r'$|%s\rangle=|%s\rangle$' % (label, init)
+ return r'$|%s\rangle$' % label
+
+def labeller(n, symbol='q'):
"""Autogenerate labels for wires of quantum circuits.
Parameters
@@ -314,17 +339,38 @@ def labeller(n,symbol='q'):
return ['%s_%d' % (symbol,n-i-1) for i in range(n)]
class Mz(OneQubitGate):
- """Mock-up of a z measurement gate. This is in circuitplot rather than
- gate.py because it's not a real gate, it just draws one.
+ """Mock-up of a z measurement gate.
+
+ This is in circuitplot rather than gate.py because it's not a real
+ gate, it just draws one.
"""
measurement = True
gate_name='Mz'
gate_name_latex=u('M_z')
class Mx(OneQubitGate):
- """Mock-up of an x measurement gate. This is in circuitplot rather than
- gate.py because it's not a real gate, it just draws one.
+ """Mock-up of an x measurement gate.
+
+ This is in circuitplot rather than gate.py because it's not a real
+ gate, it just draws one.
"""
measurement = True
gate_name='Mx'
gate_name_latex=u('M_x')
+
+class CreateOneQubitGate(ManagedProperties):
+ def __new__(mcl, name, latexname=None):
+ if not latexname:
+ latexname = name
+ return BasicMeta.__new__(mcl, name + "Gate", (OneQubitGate,),
+ {'gate_name': name, 'gate_name_latex': latexname})
+
+def CreateCGate(name, latexname=None):
+ """Use a lexical closure to make a controlled gate.
+ """
+ if not latexname:
+ latexname = name
+ onequbitgate = CreateOneQubitGate(name, latexname)
+ def ControlledGate(ctrls,target):
+ return CGate(tuple(ctrls),onequbitgate(target))
+ return ControlledGate
diff --git a/sympy/physics/quantum/qasm.py b/sympy/physics/quantum/qasm.py
new file mode 100644
index 000000000000..d297315aeffa
--- /dev/null
+++ b/sympy/physics/quantum/qasm.py
@@ -0,0 +1,227 @@
+"""
+
+qasm.py - Functions to parse a set of qasm commands into a Sympy Circuit.
+
+Examples taken from Chuang's page: http://www.media.mit.edu/quanta/qasm2circ/
+
+The code returns a circuit and an associated list of labels.
+>>> from sympy.physics.quantum.qasm import Qasm
+>>> q = Qasm('qubit q0', 'qubit q1', 'h q0', 'cnot q0,q1')
+>>> q.get_circuit()
+CNOT(1,0)*H(1)
+
+>>> q = Qasm('qubit q0', 'qubit q1', 'cnot q0,q1', 'cnot q1,q0', 'cnot q0,q1')
+>>> q.get_circuit()
+CNOT(1,0)*CNOT(0,1)*CNOT(1,0)
+"""
+
+__all__ = [
+ 'Qasm',
+ ]
+
+from sympy.physics.quantum.gate import H, CNOT, X, Z, CGate, CGateS, SWAP, S, T,CPHASE
+from sympy.physics.quantum.circuitplot import Mz
+
+def read_qasm(lines):
+ return Qasm(*lines.splitlines())
+
+def read_qasm_file(filename):
+ return Qasm(*open(filename).readlines())
+
+def prod(c):
+ p = 1
+ for ci in c:
+ p *= ci
+ return p
+
+def flip_index(i, n):
+ """Reorder qubit indices from largest to smallest.
+
+ >>> from sympy.physics.quantum.qasm import flip_index
+ >>> flip_index(0, 2)
+ 1
+ >>> flip_index(1, 2)
+ 0
+ """
+ return n-i-1
+
+def trim(line):
+ """Remove everything following comment # characters in line.
+
+ >>> from sympy.physics.quantum.qasm import trim
+ >>> trim('nothing happens here')
+ 'nothing happens here'
+ >>> trim('something #happens here')
+ 'something '
+ """
+ if not '#' in line:
+ return line
+ return line.split('#')[0]
+
+def get_index(target, labels):
+ """Get qubit labels from the rest of the line,and return indices
+
+ >>> from sympy.physics.quantum.qasm import get_index
+ >>> get_index('q0', ['q0', 'q1'])
+ 1
+ >>> get_index('q1', ['q0', 'q1'])
+ 0
+ """
+ nq = len(labels)
+ return flip_index(labels.index(target), nq)
+
+def get_indices(targets, labels):
+ return [get_index(t, labels) for t in targets]
+
+def nonblank(args):
+ for line in args:
+ line = trim(line)
+ if line.isspace():
+ continue
+ yield line
+ return
+
+def fullsplit(line):
+ words = line.split()
+ rest = ' '.join(words[1:])
+ return fixcommand(words[0]), [s.strip() for s in rest.split(',')]
+
+def fixcommand(c):
+ """Fix Qasm command names.
+
+ Remove all of forbidden characters from command c, and
+ replace 'def' with 'qdef'.
+ """
+ forbidden_characters = ['-']
+ c = c.lower()
+ for char in forbidden_characters:
+ c = c.replace(char, '')
+ if c == 'def':
+ return 'qdef'
+ return c
+
+def stripquotes(s):
+ """Replace explicit quotes in a string.
+
+ >>> from sympy.physics.quantum.qasm import stripquotes
+ >>> stripquotes("'S'") == 'S'
+ True
+ >>> stripquotes('"S"') == 'S'
+ True
+ >>> stripquotes('S') == 'S'
+ True
+ """
+ s = s.replace('"', '') # Remove second set of quotes?
+ s = s.replace("'", '')
+ return s
+
+class Qasm(object):
+ """Class to form objects from Qasm lines
+
+ >>> from sympy.physics.quantum.qasm import Qasm
+ >>> q = Qasm('qubit q0', 'qubit q1', 'h q0', 'cnot q0,q1')
+ >>> q.get_circuit()
+ CNOT(1,0)*H(1)
+ >>> q = Qasm('qubit q0', 'qubit q1', 'cnot q0,q1', 'cnot q1,q0', 'cnot q0,q1')
+ >>> q.get_circuit()
+ CNOT(1,0)*CNOT(0,1)*CNOT(1,0)
+ """
+ def __init__(self, *args, **kwargs):
+ self.defs = {}
+ self.circuit = []
+ self.labels = []
+ self.inits = {}
+ self.add(*args)
+ self.kwargs = kwargs
+
+ def add(self, *lines):
+ for line in nonblank(lines):
+ command, rest = fullsplit(line)
+ if self.defs.get(command): #defs come first, since you can override built-in
+ function = self.defs.get(command)
+ indices = self.indices(rest)
+ if len(indices) == 1:
+ self.circuit.append(function(indices[0]))
+ else:
+ self.circuit.append(function(indices[:-1], indices[-1]))
+ elif hasattr(self, command):
+ function = getattr(self, command)
+ function(*rest)
+ else:
+ print("Function %s not defined. Skipping" % command)
+
+ def get_circuit(self):
+ return prod(reversed(self.circuit))
+
+ def get_labels(self):
+ return list(reversed(self.labels))
+
+ def plot(self):
+ from sympy.physics.quantum.circuitplot import CircuitPlot
+ circuit, labels = self.get_circuit(), self.get_labels()
+ CircuitPlot(circuit, len(labels), labels=labels, inits=self.inits)
+
+ def qubit(self, arg, init=None):
+ self.labels.append(arg)
+ if init: self.inits[arg] = init
+
+ def indices(self, args):
+ return get_indices(args, self.labels)
+
+ def index(self, arg):
+ return get_index(arg, self.labels)
+
+ def nop(self, *args):
+ pass
+
+ def x(self, arg):
+ self.circuit.append(X(self.index(arg)))
+
+ def z(self, arg):
+ self.circuit.append(Z(self.index(arg)))
+
+ def h(self, arg):
+ self.circuit.append(H(self.index(arg)))
+
+ def s(self, arg):
+ self.circuit.append(S(self.index(arg)))
+
+ def t(self, arg):
+ self.circuit.append(T(self.index(arg)))
+
+ def measure(self, arg):
+ self.circuit.append(Mz(self.index(arg)))
+
+ def cnot(self, a1, a2):
+ self.circuit.append(CNOT(*self.indices([a1, a2])))
+
+ def swap(self, a1, a2):
+ self.circuit.append(SWAP(*self.indices([a1, a2])))
+
+ def cphase(self, a1, a2):
+ self.circuit.append(CPHASE(*self.indices([a1, a2])))
+
+ def toffoli(self, a1, a2, a3):
+ i1, i2, i3 = self.indices([a1, a2, a3])
+ self.circuit.append(CGateS((i1, i2), X(i3)))
+
+ def cx(self, a1, a2):
+ fi, fj = self.indices([a1, a2])
+ self.circuit.append(CGate(fi, X(fj)))
+
+ def cz(self, a1, a2):
+ fi, fj = self.indices([a1, a2])
+ self.circuit.append(CGate(fi, Z(fj)))
+
+ def defbox(self, *args):
+ print("defbox not supported yet. Skipping: ", args)
+
+ def qdef(self, name, ncontrols, symbol):
+ from sympy.physics.quantum.circuitplot import CreateOneQubitGate, CreateCGate
+ ncontrols = int(ncontrols)
+ command = fixcommand(name)
+ symbol = stripquotes(symbol)
+ if ncontrols > 0:
+ self.defs[command] = CreateCGate(symbol)
+ else:
+ self.defs[command] = CreateOneQubitGate(symbol)
| diff --git a/sympy/core/tests/test_args.py b/sympy/core/tests/test_args.py
index b95e1eb124aa..b8c9900ce75e 100644
--- a/sympy/core/tests/test_args.py
+++ b/sympy/core/tests/test_args.py
@@ -2262,7 +2262,6 @@ def test_sympy__physics__quantum__piab__PIABKet():
from sympy.physics.quantum.piab import PIABKet
assert _test_args(PIABKet('K'))
-
def test_sympy__physics__quantum__qexpr__QExpr():
from sympy.physics.quantum.qexpr import QExpr
assert _test_args(QExpr(0))
diff --git a/sympy/physics/quantum/tests/test_circuitplot.py b/sympy/physics/quantum/tests/test_circuitplot.py
index ed861d4e1cb3..da29eee17e2e 100644
--- a/sympy/physics/quantum/tests/test_circuitplot.py
+++ b/sympy/physics/quantum/tests/test_circuitplot.py
@@ -1,10 +1,26 @@
-from sympy.physics.quantum.circuitplot import labeller
+from sympy.physics.quantum.circuitplot import labeller, render_label, Mz, CreateOneQubitGate,\
+ CreateCGate
from sympy.physics.quantum.gate import CNOT, H, X, Z, SWAP, CGate, S, T
from sympy.external import import_module
from sympy.utilities.pytest import skip
mpl = import_module('matplotlib')
+def test_render_label():
+ assert render_label('q0') == r'$|q0\rangle$'
+ assert render_label('q0', {'q0': '0'}) == r'$|q0\rangle=|0\rangle$'
+
+def test_Mz():
+ assert str(Mz(0)) == 'Mz(0)'
+
+def test_create1():
+ Qgate = CreateOneQubitGate('Q')
+ assert str(Qgate(0)) == 'Q(0)'
+
+def test_createc():
+ Qgate = CreateCGate('Q')
+ assert str(Qgate([1],0)) == 'C((1),Q(0))'
+
def test_labeller():
"""Test the labeller utility"""
assert labeller(2) == ['q_1', 'q_0']
diff --git a/sympy/physics/quantum/tests/test_qasm.py b/sympy/physics/quantum/tests/test_qasm.py
new file mode 100644
index 000000000000..659a4a4d16d5
--- /dev/null
+++ b/sympy/physics/quantum/tests/test_qasm.py
@@ -0,0 +1,94 @@
+from sympy.physics.quantum.qasm import Qasm, prod, flip_index, trim,\
+ get_index, nonblank, fullsplit, fixcommand, stripquotes, read_qasm
+from sympy.physics.quantum.gate import X, Z, H, S, T
+from sympy.physics.quantum.gate import CNOT, SWAP, CPHASE, CGate, CGateS
+from sympy.physics.quantum.circuitplot import Mz, CreateOneQubitGate, CreateCGate
+
+def test_qasm_readqasm():
+ qasm_lines = """\
+ qubit q_0
+ qubit q_1
+ h q_0
+ cnot q_0,q_1
+ """
+ q = read_qasm(qasm_lines)
+ assert q.get_circuit() == CNOT(1,0)*H(1)
+
+def test_qasm_ex1():
+ q = Qasm('qubit q0', 'qubit q1', 'h q0', 'cnot q0,q1')
+ assert q.get_circuit() == CNOT(1,0)*H(1)
+
+def test_qasm_ex1_methodcalls():
+ q = Qasm()
+ q.qubit('q_0')
+ q.qubit('q_1')
+ q.h('q_0')
+ q.cnot('q_0', 'q_1')
+ assert q.get_circuit() == CNOT(1,0)*H(1)
+
+def test_qasm_swap():
+ q = Qasm('qubit q0', 'qubit q1', 'cnot q0,q1', 'cnot q1,q0', 'cnot q0,q1')
+ assert q.get_circuit() == CNOT(1,0)*CNOT(0,1)*CNOT(1,0)
+
+
+def test_qasm_ex2():
+ q = Qasm('qubit q_0', 'qubit q_1', 'qubit q_2', 'h q_1',
+ 'cnot q_1,q_2', 'cnot q_0,q_1', 'h q_0',
+ 'measure q_1', 'measure q_0',
+ 'c-x q_1,q_2', 'c-z q_0,q_2')
+ assert q.get_circuit() == CGate(2,Z(0))*CGate(1,X(0))*Mz(2)*Mz(1)*H(2)*CNOT(2,1)*CNOT(1,0)*H(1)
+
+def test_qasm_1q():
+ for symbol, gate in [('x', X), ('z', Z), ('h', H), ('s', S), ('t', T), ('measure', Mz)]:
+ q = Qasm('qubit q_0', '%s q_0' % symbol)
+ assert q.get_circuit() == gate(0)
+
+def test_qasm_2q():
+ for symbol, gate in [('cnot', CNOT), ('swap', SWAP), ('cphase', CPHASE)]:
+ q = Qasm('qubit q_0', 'qubit q_1', '%s q_0,q_1' % symbol)
+ assert q.get_circuit() == gate(1,0)
+
+def test_qasm_3q():
+ q = Qasm('qubit q0', 'qubit q1', 'qubit q2', 'toffoli q2,q1,q0')
+ assert q.get_circuit() == CGateS((0,1),X(2))
+
+def test_qasm_prod():
+ assert prod([1, 2, 3]) == 6
+ assert prod([H(0), X(1)])== H(0)*X(1)
+
+def test_qasm_flip_index():
+ assert flip_index(0, 2) == 1
+ assert flip_index(1, 2) == 0
+
+def test_qasm_trim():
+ assert trim('nothing happens here') == 'nothing happens here'
+ assert trim("Something #happens here") == "Something "
+
+def test_qasm_get_index():
+ assert get_index('q0', ['q0', 'q1']) == 1
+ assert get_index('q1', ['q0', 'q1']) == 0
+
+def test_qasm_nonblank():
+ assert list(nonblank('abcd')) == list('abcd')
+ assert list(nonblank('abc ')) == list('abc')
+
+def test_qasm_fullsplit():
+ assert fullsplit('g q0,q1,q2, q3') == ('g', ['q0', 'q1', 'q2', 'q3'])
+
+def test_qasm_fixcommand():
+ assert fixcommand('foo') == 'foo'
+ assert fixcommand('def') == 'qdef'
+
+def test_qasm_stripquotes():
+ assert stripquotes("'S'") == 'S'
+ assert stripquotes('"S"') == 'S'
+ assert stripquotes('S') == 'S'
+
+def test_qasm_qdef():
+ # weaker test condition (str) since we don't have access to the actual class
+ q = Qasm("def Q,0,Q",'qubit q0','Q q0')
+ Qgate = CreateOneQubitGate('Q')
+ assert str(q.get_circuit()) == 'Q(0)'
+ q = Qasm("def CQ,1,Q", 'qubit q0', 'qubit q1', 'CQ q0,q1')
+ Qgate = CreateCGate('Q')
+ assert str(q.get_circuit()) == 'C((1),Q(0))'
| [
{
"components": [
{
"doc": "Slightly more flexible way to render labels.\n\n>>> from sympy.physics.quantum.circuitplot import render_label\n>>> render_label('q0')\n'$|q0\\\\rangle$'\n>>> render_label('q0', {'q0':'0'})\n'$|q0\\\\rangle=|0\\\\rangle$'",
"lines": [
309,
32... | [
"test_render_label",
"test_Mz",
"test_create1",
"test_createc",
"test_labeller",
"test_qasm_readqasm",
"test_qasm_ex1",
"test_qasm_ex1_methodcalls",
"test_qasm_swap",
"test_qasm_ex2",
"test_qasm_1q",
"test_qasm_2q",
"test_qasm_3q",
"test_qasm_prod",
"test_qasm_flip_index",
"test_qasm_t... | [
"test_all_classes_are_tested",
"test_sympy__assumptions__assume__Predicate",
"test_sympy__combinatorics__subsets__Subset",
"test_sympy__combinatorics__polyhedron__Polyhedron",
"test_sympy__combinatorics__partitions__Partition",
"test_sympy__concrete__products__Product",
"test_sympy__concrete__summations... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Pull request to integrate qasm parser
This PR makes several changes centered around creating a circuit from the
[qasm quantum circuit language](http://www.media.mit.edu/quanta/qasm2circ/).
Examples of the parser are now included on the [sympy quantum circuit parsing
notebook](http://nbviewer.ipython.org/5843312). Roughly half of the examples
from the MIT page now plot directly from the qasm text.
Most of these changes are in the qasm.py file, which defines a Qasm class
that does most of the work.
Did a rebase of these changes on 8/8/2013 to make sure they could be merged
easily. Have since learned that this complicates things unnecessarily,
at least in terms of the number of commits that show up here.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
There are several new functions or classes that need to be implemented, using the definitions below:
<definitions>
[start of new definitions in sympy/physics/quantum/circuitplot.py]
(definition of render_label:)
def render_label(label, inits={}):
"""Slightly more flexible way to render labels.
>>> from sympy.physics.quantum.circuitplot import render_label
>>> render_label('q0')
'$|q0\\rangle$'
>>> render_label('q0', {'q0':'0'})
'$|q0\\rangle=|0\\rangle$'"""
(definition of CreateOneQubitGate:)
class CreateOneQubitGate(ManagedProperties):
(definition of CreateOneQubitGate.__new__:)
def __new__(mcl, name, latexname=None):
(definition of CreateCGate:)
def CreateCGate(name, latexname=None):
"""Use a lexical closure to make a controlled gate.
"""
(definition of CreateCGate.ControlledGate:)
def ControlledGate(ctrls,target):
[end of new definitions in sympy/physics/quantum/circuitplot.py]
[start of new definitions in sympy/physics/quantum/qasm.py]
(definition of read_qasm:)
def read_qasm(lines):
(definition of read_qasm_file:)
def read_qasm_file(filename):
(definition of prod:)
def prod(c):
(definition of flip_index:)
def flip_index(i, n):
"""Reorder qubit indices from largest to smallest.
>>> from sympy.physics.quantum.qasm import flip_index
>>> flip_index(0, 2)
1
>>> flip_index(1, 2)
0"""
(definition of trim:)
def trim(line):
"""Remove everything following comment # characters in line.
>>> from sympy.physics.quantum.qasm import trim
>>> trim('nothing happens here')
'nothing happens here'
>>> trim('something #happens here')
'something '"""
(definition of get_index:)
def get_index(target, labels):
"""Get qubit labels from the rest of the line,and return indices
>>> from sympy.physics.quantum.qasm import get_index
>>> get_index('q0', ['q0', 'q1'])
1
>>> get_index('q1', ['q0', 'q1'])
0"""
(definition of get_indices:)
def get_indices(targets, labels):
(definition of nonblank:)
def nonblank(args):
(definition of fullsplit:)
def fullsplit(line):
(definition of fixcommand:)
def fixcommand(c):
"""Fix Qasm command names.
Remove all of forbidden characters from command c, and
replace 'def' with 'qdef'."""
(definition of stripquotes:)
def stripquotes(s):
"""Replace explicit quotes in a string.
>>> from sympy.physics.quantum.qasm import stripquotes
>>> stripquotes("'S'") == 'S'
True
>>> stripquotes('"S"') == 'S'
True
>>> stripquotes('S') == 'S'
True"""
(definition of Qasm:)
class Qasm(object):
"""Class to form objects from Qasm lines
>>> from sympy.physics.quantum.qasm import Qasm
>>> q = Qasm('qubit q0', 'qubit q1', 'h q0', 'cnot q0,q1')
>>> q.get_circuit()
CNOT(1,0)*H(1)
>>> q = Qasm('qubit q0', 'qubit q1', 'cnot q0,q1', 'cnot q1,q0', 'cnot q0,q1')
>>> q.get_circuit()
CNOT(1,0)*CNOT(0,1)*CNOT(1,0)"""
(definition of Qasm.__init__:)
def __init__(self, *args, **kwargs):
(definition of Qasm.add:)
def add(self, *lines):
(definition of Qasm.get_circuit:)
def get_circuit(self):
(definition of Qasm.get_labels:)
def get_labels(self):
(definition of Qasm.plot:)
def plot(self):
(definition of Qasm.qubit:)
def qubit(self, arg, init=None):
(definition of Qasm.indices:)
def indices(self, args):
(definition of Qasm.index:)
def index(self, arg):
(definition of Qasm.nop:)
def nop(self, *args):
(definition of Qasm.x:)
def x(self, arg):
(definition of Qasm.z:)
def z(self, arg):
(definition of Qasm.h:)
def h(self, arg):
(definition of Qasm.s:)
def s(self, arg):
(definition of Qasm.t:)
def t(self, arg):
(definition of Qasm.measure:)
def measure(self, arg):
(definition of Qasm.cnot:)
def cnot(self, a1, a2):
(definition of Qasm.swap:)
def swap(self, a1, a2):
(definition of Qasm.cphase:)
def cphase(self, a1, a2):
(definition of Qasm.toffoli:)
def toffoli(self, a1, a2, a3):
(definition of Qasm.cx:)
def cx(self, a1, a2):
(definition of Qasm.cz:)
def cz(self, a1, a2):
(definition of Qasm.defbox:)
def defbox(self, *args):
(definition of Qasm.qdef:)
def qdef(self, name, ncontrols, symbol):
[end of new definitions in sympy/physics/quantum/qasm.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 3555a90ee3ac3c7082df19d96a922630100c69cc | ||
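The low-level helpers in the proposed qasm.py are plain string manipulation, so their behaviour can be sketched standalone. The re-implementations below mirror the doctests above (for illustration only; the `fixcommand` normalization step is simplified here to a bare `lower()`):

```python
def trim(line):
    """Drop everything after a '#' comment character."""
    return line.split('#')[0] if '#' in line else line

def flip_index(i, n):
    """Reorder qubit indices from largest to smallest."""
    return n - i - 1

def get_index(target, labels):
    """Map a qubit label to its (flipped) wire index."""
    return flip_index(labels.index(target), len(labels))

def fullsplit(line):
    """Split a qasm line into (command, [arguments])."""
    words = line.split()
    rest = ' '.join(words[1:])
    return words[0].lower(), [s.strip() for s in rest.split(',')]

print(trim('something #happens here'))  # -> 'something '
print(get_index('q0', ['q0', 'q1']))    # -> 1
print(fullsplit('cnot q0, q1'))         # -> ('cnot', ['q0', 'q1'])
```

The index flip is what makes `Qasm('qubit q0', 'qubit q1', 'h q0', 'cnot q0,q1')` come out as `CNOT(1,0)*H(1)`: qasm lists wires top-down while the gate constructors count qubits bottom-up.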
sympy__sympy-2363 | 2,363 | sympy/sympy | 0.7 | 0ef741156ba8fc4d261d452301bf1753ef283730 | 2013-08-08T15:22:22Z | diff --git a/sympy/polys/polyclasses.py b/sympy/polys/polyclasses.py
index fd0f6c800cfa..95dba25db37b 100644
--- a/sympy/polys/polyclasses.py
+++ b/sympy/polys/polyclasses.py
@@ -504,7 +504,26 @@ def degree_list(f):
def total_degree(f):
"""Returns the total degree of ``f``. """
- return max([sum(m) for m in f.monoms()])
+ return max(sum(m) for m in f.monoms())
+
+ def homogenize(f, s):
+ """Return homogeneous polynomial of ``f``"""
+ td = f.total_degree()
+ result = {}
+ new_symbol = (s == len(f.terms()[0][0]))
+ for term in f.terms():
+ d = sum(term[0])
+ if d < td:
+ i = td - d
+ else:
+ i = 0
+ if new_symbol:
+ result[term[0] + (i,)] = term[1]
+ else:
+ l = list(term[0])
+ l[s] += i
+ result[tuple(l)] = term[1]
+ return DMP(result, f.dom, f.lev + int(new_symbol), f.ring)
def homogeneous_order(f):
"""Returns the homogeneous order of ``f``. """
diff --git a/sympy/polys/polytools.py b/sympy/polys/polytools.py
index c1b3dffcf993..274caa903de4 100644
--- a/sympy/polys/polytools.py
+++ b/sympy/polys/polytools.py
@@ -5,22 +5,15 @@
import sys
from sympy.core import (
- S, Basic, Expr, I, Integer, Add, Mul, Dummy, Tuple, Rational
+ S, Basic, Expr, I, Integer, Add, Mul, Dummy, Tuple
)
from sympy.core.mul import _keep_coeff
-
+from sympy.core.symbol import Symbol
from sympy.core.basic import preorder_traversal
-
from sympy.core.relational import Relational
-
-from sympy.core.sympify import (
- sympify, SympifyError,
-)
-
-from sympy.core.decorators import (
- _sympifyit,
-)
+from sympy.core.sympify import sympify
+from sympy.core.decorators import _sympifyit
from sympy.polys.polyclasses import DMP
@@ -33,17 +26,10 @@
_parallel_dict_from_expr,
)
-from sympy.polys.rationaltools import (
- together,
-)
-
-from sympy.polys.rootisolation import (
- dup_isolate_real_roots_list,
-)
-
+from sympy.polys.rationaltools import together
+from sympy.polys.rootisolation import dup_isolate_real_roots_list
from sympy.polys.groebnertools import groebner as _groebner
from sympy.polys.fglmtools import matrix_fglm
-
from sympy.polys.monomials import Monomial
from sympy.polys.orderings import monomial_key
@@ -70,6 +56,7 @@
from sympy.core.compatibility import iterable
+
@public
class Poly(Expr):
"""Generic class for representing polynomial expressions. """
@@ -335,7 +322,7 @@ def unify(f, g):
========
>>> from sympy import Poly
- >>> from sympy.abc import x, y
+ >>> from sympy.abc import x
>>> f, g = Poly(x/2 + 1), Poly(2*x + 1)
@@ -374,7 +361,7 @@ def _unify(f, g):
f.rep.to_dict(), f.gens, gens)
if f.rep.dom != dom:
- f_coeffs = [ dom.convert(c, f.rep.dom) for c in f_coeffs ]
+ f_coeffs = [dom.convert(c, f.rep.dom) for c in f_coeffs]
F = DMP(dict(list(zip(f_monoms, f_coeffs))), dom, lev)
else:
@@ -385,7 +372,7 @@ def _unify(f, g):
g.rep.to_dict(), g.gens, gens)
if g.rep.dom != dom:
- g_coeffs = [ dom.convert(c, g.rep.dom) for c in g_coeffs ]
+ g_coeffs = [dom.convert(c, g.rep.dom) for c in g_coeffs]
G = DMP(dict(list(zip(g_monoms, g_coeffs))), dom, lev)
else:
@@ -713,7 +700,7 @@ def retract(f, field=None):
========
>>> from sympy import Poly
- >>> from sympy.abc import x, y
+ >>> from sympy.abc import x
>>> f = Poly(x**2 + 1, x, domain='QQ[y]')
>>> f
@@ -765,7 +752,7 @@ def coeffs(f, order=None):
nth
"""
- return [ f.rep.dom.to_sympy(c) for c in f.rep.coeffs(order=order) ]
+ return [f.rep.dom.to_sympy(c) for c in f.rep.coeffs(order=order)]
def monoms(f, order=None):
"""
@@ -805,7 +792,7 @@ def terms(f, order=None):
all_terms
"""
- return [ (m, f.rep.dom.to_sympy(c)) for m, c in f.rep.terms(order=order) ]
+ return [(m, f.rep.dom.to_sympy(c)) for m, c in f.rep.terms(order=order)]
def all_coeffs(f):
"""
@@ -821,7 +808,7 @@ def all_coeffs(f):
[1, 0, 2, -1]
"""
- return [ f.rep.dom.to_sympy(c) for c in f.rep.all_coeffs() ]
+ return [f.rep.dom.to_sympy(c) for c in f.rep.all_coeffs()]
def all_monoms(f):
"""
@@ -857,7 +844,7 @@ def all_terms(f):
[((3,), 1), ((2,), 0), ((1,), 2), ((0,), -1)]
"""
- return [ (m, f.rep.dom.to_sympy(c)) for m, c in f.rep.all_terms() ]
+ return [(m, f.rep.dom.to_sympy(c)) for m, c in f.rep.all_terms()]
def termwise(f, func, *gens, **args):
"""
@@ -1567,7 +1554,7 @@ def rem(f, g, auto=True):
Examples
========
- >>> from sympy import Poly, ZZ, QQ
+ >>> from sympy import Poly
>>> from sympy.abc import x
>>> Poly(x**2 + 1, x).rem(Poly(2*x - 4, x))
@@ -1759,6 +1746,40 @@ def total_degree(f):
else: # pragma: no cover
raise OperationNotSupported(f, 'total_degree')
+ def homogenize(f, s):
+ """
+ Returns the homogeneous polynomial of ``f``.
+
+ A homogeneous polynomial is a polynomial whose all monomials with
+ non-zero coefficients have the same total degree. If you only
+ want to check if a polynomial is homogeneous, then use
+ :func:`Poly.is_homogeneous`. If you want not only to check if a
+ polynomial is homogeneous but also compute its homogeneous order,
+ then use :func:`Poly.homogeneous_order`.
+
+ Examples
+ ========
+
+ >>> from sympy import Poly
+ >>> from sympy.abc import x, y, z
+
+ >>> f = Poly(x**5 + 2*x**2*y**2 + 9*x*y**3)
+ >>> f.homogenize(z)
+ Poly(x**5 + 2*x**2*y**2*z + 9*x*y**3*z, x, y, z, domain='ZZ')
+
+ """
+ if not isinstance(s, Symbol):
+ raise TypeError("``Symbol`` expected, got %s" % type(s))
+ if s in f.gens:
+ i = f.gens.index(s)
+ gens = f.gens
+ else:
+ i = len(f.gens)
+ gens = f.gens + (s,)
+ if hasattr(f.rep, 'homogenize'):
+ return f.per(f.rep.homogenize(i), gens=gens)
+ raise OperationNotSupported(f, 'homogeneous_order')
+
def homogeneous_order(f):
"""
Returns the homogeneous order of ``f``.
@@ -2746,7 +2767,7 @@ def gff_list(f):
else: # pragma: no cover
raise OperationNotSupported(f, 'gff_list')
- return [ (f.per(g), k) for g, k in result ]
+ return [(f.per(g), k) for g, k in result]
def sqf_norm(f):
"""
@@ -2827,7 +2848,7 @@ def sqf_list(f, all=False):
else: # pragma: no cover
raise OperationNotSupported(f, 'sqf_list')
- return f.rep.dom.to_sympy(coeff), [ (f.per(g), k) for g, k in factors ]
+ return f.rep.dom.to_sympy(coeff), [(f.per(g), k) for g, k in factors]
def sqf_list_include(f, all=False):
"""
@@ -2860,7 +2881,7 @@ def sqf_list_include(f, all=False):
else: # pragma: no cover
raise OperationNotSupported(f, 'sqf_list_include')
- return [ (f.per(g), k) for g, k in factors ]
+ return [(f.per(g), k) for g, k in factors]
def factor_list(f):
"""
@@ -2887,7 +2908,7 @@ def factor_list(f):
else: # pragma: no cover
raise OperationNotSupported(f, 'factor_list')
- return f.rep.dom.to_sympy(coeff), [ (f.per(g), k) for g, k in factors ]
+ return f.rep.dom.to_sympy(coeff), [(f.per(g), k) for g, k in factors]
def factor_list_include(f):
"""
@@ -2914,7 +2935,7 @@ def factor_list_include(f):
else: # pragma: no cover
raise OperationNotSupported(f, 'factor_list_include')
- return [ (f.per(g), k) for g, k in factors ]
+ return [(f.per(g), k) for g, k in factors]
def intervals(f, all=False, eps=None, inf=None, sup=None, fast=False, sqf=False):
"""
@@ -3182,15 +3203,15 @@ def nroots(f, n=15, maxsteps=50, cleanup=True, error=False):
if f.degree() <= 0:
return []
- coeffs = [ coeff.evalf(n=n).as_real_imag()
- for coeff in f.all_coeffs() ]
+ coeffs = [coeff.evalf(n=n).as_real_imag()
+ for coeff in f.all_coeffs()]
dps = sympy.mpmath.mp.dps
sympy.mpmath.mp.dps = n
try:
try:
- coeffs = [ sympy.mpmath.mpc(*coeff) for coeff in coeffs ]
+ coeffs = [sympy.mpmath.mpc(*coeff) for coeff in coeffs]
except TypeError:
raise DomainError(
"numerical domain expected, got %s" % f.rep.dom)
@@ -4560,7 +4581,7 @@ def subresultants(f, g, *gens, **args):
result = F.subresultants(G)
if not opt.polys:
- return [ r.as_expr() for r in result ]
+ return [r.as_expr() for r in result]
else:
return result
@@ -4890,7 +4911,7 @@ def terms_gcd(f, *gens, **args):
Examples
========
- >>> from sympy import terms_gcd, cos, pi
+ >>> from sympy import terms_gcd, cos
>>> from sympy.abc import x, y
>>> terms_gcd(x**6*y**2 + x**3*y, x, y)
x**3*y*(x**3*y + 1)
@@ -4970,7 +4991,7 @@ def terms_gcd(f, *gens, **args):
else:
coeff = S.One
- term = Mul(*[ x**j for x, j in zip(f.gens, J) ])
+ term = Mul(*[x**j for x, j in zip(f.gens, J)])
if clear:
return _keep_coeff(coeff, term*f.as_expr())
@@ -5074,7 +5095,7 @@ def primitive(f, *gens, **args):
========
>>> from sympy.polys.polytools import primitive
- >>> from sympy.abc import x, y
+ >>> from sympy.abc import x
>>> primitive(6*x**2 + 8*x + 12)
(2, 3*x**2 + 4*x + 6)
@@ -5166,7 +5187,7 @@ def decompose(f, *gens, **args):
result = F.decompose()
if not opt.polys:
- return [ r.as_expr() for r in result ]
+ return [r.as_expr() for r in result]
else:
return result
@@ -5196,7 +5217,7 @@ def sturm(f, *gens, **args):
result = F.sturm(auto=opt.auto)
if not opt.polys:
- return [ r.as_expr() for r in result ]
+ return [r.as_expr() for r in result]
else:
return result
@@ -5231,7 +5252,7 @@ def gff_list(f, *gens, **args):
factors = F.gff_list()
if not opt.polys:
- return [ (g.as_expr(), k) for g, k in factors ]
+ return [(g.as_expr(), k) for g, k in factors]
else:
return factors
@@ -5324,7 +5345,7 @@ def key(obj):
def _factors_product(factors):
"""Multiply a list of ``(expr, exp)`` pairs. """
- return Mul(*[ f.as_expr()**k for f, k in factors ])
+ return Mul(*[f.as_expr()**k for f, k in factors])
def _symbolic_factor_list(expr, opt, method):
@@ -5362,7 +5383,7 @@ def _symbolic_factor_list(expr, opt, method):
if exp is S.One:
factors.extend(_factors)
elif exp.is_integer or len(_factors) == 1:
- factors.extend([ (f, k*exp) for f, k in _factors ])
+ factors.extend([(f, k*exp) for f, k in _factors])
else:
other = []
@@ -5389,9 +5410,9 @@ def _symbolic_factor(expr, opt, method):
coeff, factors = _symbolic_factor_list(together(expr), opt, method)
return _keep_coeff(coeff, _factors_product(factors))
elif hasattr(expr, 'args'):
- return expr.func(*[ _symbolic_factor(arg, opt, method) for arg in expr.args ])
+ return expr.func(*[_symbolic_factor(arg, opt, method) for arg in expr.args])
elif hasattr(expr, '__iter__'):
- return expr.__class__([ _symbolic_factor(arg, opt, method) for arg in expr ])
+ return expr.__class__([_symbolic_factor(arg, opt, method) for arg in expr])
else:
return expr
@@ -5424,8 +5445,8 @@ def _generic_factor_list(expr, gens, args, method):
fq = _sorted_factors(fq, method)
if not opt.polys:
- fp = [ (f.as_expr(), k) for f, k in fp ]
- fq = [ (f.as_expr(), k) for f, k in fq ]
+ fp = [(f.as_expr(), k) for f, k in fp]
+ fq = [(f.as_expr(), k) for f, k in fq]
coeff = cp/cq
@@ -5443,6 +5464,7 @@ def _generic_factor(expr, gens, args, method):
opt = options.build_options(gens, args)
return _symbolic_factor(sympify(expr), opt, method)
+
def to_rational_coeffs(f):
"""
try to transform a polynomial to have rational coefficients
@@ -5467,7 +5489,7 @@ def to_rational_coeffs(f):
Examples
========
- >>> from sympy import sqrt, Poly, simplify, expand
+ >>> from sympy import sqrt, Poly, simplify
>>> from sympy.polys.polytools import to_rational_coeffs
>>> from sympy.abc import x
>>> p = Poly(((x**2-1)*(x-2)).subs({x:x*(1 + sqrt(2))}), x, domain='EX')
@@ -5566,8 +5588,6 @@ def _has_square_roots(p):
return has_sq
if f.get_domain().is_EX and _has_square_roots(f):
- rescale_x = None
- translate_x = None
r = _try_rescale(f)
if r:
return r[0], r[1], None, r[2]
@@ -5577,6 +5597,7 @@ def _has_square_roots(p):
return None, None, r[0], r[1]
return None
+
def _torational_factor_list(p, x):
"""
helper function to factor polynomial using to_rational_coeffs
@@ -5612,12 +5633,12 @@ def _torational_factor_list(p, x):
r1 = simplify(1/r)
a = []
for z in factors[1:][0]:
- a.append((simplify(z[0].subs({x:x*r1})), z[1]))
+ a.append((simplify(z[0].subs({x: x*r1})), z[1]))
else:
c = factors[0]
a = []
for z in factors[1:][0]:
- a.append((z[0].subs({x:x - t}), z[1]))
+ a.append((z[0].subs({x: x - t}), z[1]))
return (c, a)
@@ -6096,19 +6117,19 @@ def reduced(f, G, *gens, **args):
Q, r = polys[0].div(polys[1:])
- Q = [ Poly._from_dict(dict(q), opt) for q in Q ]
+ Q = [Poly._from_dict(dict(q), opt) for q in Q]
r = Poly._from_dict(dict(r), opt)
if retract:
try:
- _Q, _r = [ q.to_ring() for q in Q ], r.to_ring()
+ _Q, _r = [q.to_ring() for q in Q], r.to_ring()
except CoercionFailed:
pass
else:
Q, r = _Q, _r
if not opt.polys:
- return [ q.as_expr() for q in Q ], r.as_expr()
+ return [q.as_expr() for q in Q], r.as_expr()
else:
return Q, r
@@ -6184,6 +6205,7 @@ def is_zero_dimensional(F, *gens, **args):
"""
return GroebnerBasis(F, *gens, **args).is_zero_dimensional
+
@public
class GroebnerBasis(Basic):
"""Represents a reduced Groebner basis. """
@@ -6204,7 +6226,7 @@ def __new__(cls, F, *gens, **args):
polys[i] = ring.from_dict(poly.rep.to_dict())
G = _groebner(polys, ring, method=opt.method)
- G = [ Poly._from_dict(g, opt) for g in G ]
+ G = [Poly._from_dict(g, opt) for g in G]
return cls._new(G, opt)
@@ -6223,7 +6245,7 @@ def args(self):
@property
def exprs(self):
- return [ poly.as_expr() for poly in self._basis ]
+ return [poly.as_expr() for poly in self._basis]
@property
def polys(self):
@@ -6362,10 +6384,10 @@ def fglm(self, order):
polys[i] = _ring.from_dict(poly)
G = matrix_fglm(polys, _ring, dst_order)
- G = [ Poly._from_dict(dict(g), opt) for g in G ]
+ G = [Poly._from_dict(dict(g), opt) for g in G]
if not domain.has_Field:
- G = [ g.clear_denoms(convert=True)[1] for g in G ]
+ G = [g.clear_denoms(convert=True)[1] for g in G]
opt.domain = domain
return self._new(G, opt)
@@ -6419,19 +6441,19 @@ def reduce(self, expr, auto=True):
Q, r = polys[0].div(polys[1:])
- Q = [ Poly._from_dict(dict(q), opt) for q in Q ]
+ Q = [Poly._from_dict(dict(q), opt) for q in Q]
r = Poly._from_dict(dict(r), opt)
if retract:
try:
- _Q, _r = [ q.to_ring() for q in Q ], r.to_ring()
+ _Q, _r = [q.to_ring() for q in Q], r.to_ring()
except CoercionFailed:
pass
else:
Q, r = _Q, _r
if not opt.polys:
- return [ q.as_expr() for q in Q ], r.as_expr()
+ return [q.as_expr() for q in Q], r.as_expr()
else:
return Q, r
| diff --git a/sympy/polys/tests/test_polytools.py b/sympy/polys/tests/test_polytools.py
index 44e259e0e1f0..4b3f699e392c 100644
--- a/sympy/polys/tests/test_polytools.py
+++ b/sympy/polys/tests/test_polytools.py
@@ -1192,6 +1192,12 @@ def test_Poly_total_degree():
assert Poly(x**3 + x + 1).total_degree() == 3
+def test_Poly_homogenize():
+ assert Poly(x**2+y).homogenize(z) == Poly(x**2+y*z)
+ assert Poly(x+y).homogenize(z) == Poly(x+y, x, y, z)
+ assert Poly(x+y**2).homogenize(y) == Poly(x*y+y**2)
+
+
def test_Poly_homogeneous_order():
assert Poly(0, x, y).homogeneous_order() == -1
assert Poly(1, x, y).homogeneous_order() == 0
| [
{
"components": [
{
"doc": "Return homogeneous polynomial of ``f``",
"lines": [
509,
526
],
"name": "DMP.homogenize",
"signature": "def homogenize(f, s):",
"type": "function"
}
],
"file": "sympy/polys/polyclasses.py"
},
... | [
"test_Poly_homogenize"
] | [
"test_Poly_from_dict",
"test_Poly_from_list",
"test_Poly_from_poly",
"test_Poly_from_expr",
"test_Poly__new__",
"test_Poly__args",
"test_Poly__gens",
"test_Poly_zero",
"test_Poly_one",
"test_Poly__unify",
"test_Poly_free_symbols",
"test_PurePoly_free_symbols",
"test_Poly__eq__",
"test_Pure... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Homogeneous polynomial
Convert a polynomial to a homogeneous polynomial; the modification includes a new function and pyflakes fixes.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sympy/polys/polyclasses.py]
(definition of DMP.homogenize:)
def homogenize(f, s):
"""Return homogeneous polynomial of ``f``"""
[end of new definitions in sympy/polys/polyclasses.py]
[start of new definitions in sympy/polys/polytools.py]
(definition of Poly.homogenize:)
def homogenize(f, s):
"""Returns the homogeneous polynomial of ``f``.
A homogeneous polynomial is a polynomial whose all monomials with
non-zero coefficients have the same total degree. If you only
want to check if a polynomial is homogeneous, then use
:func:`Poly.is_homogeneous`. If you want not only to check if a
polynomial is homogeneous but also compute its homogeneous order,
then use :func:`Poly.homogeneous_order`.
Examples
========
>>> from sympy import Poly
>>> from sympy.abc import x, y, z
>>> f = Poly(x**5 + 2*x**2*y**2 + 9*x*y**3)
>>> f.homogenize(z)
Poly(x**5 + 2*x**2*y**2*z + 9*x*y**3*z, x, y, z, domain='ZZ')"""
[end of new definitions in sympy/polys/polytools.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
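The `DMP.homogenize` definition above raises each monomial to the polynomial's total degree by absorbing the degree deficit into a chosen variable. As a minimal, hypothetical sketch of that idea (operating on a plain dict mapping exponent tuples to coefficients, not sympy's actual `DMP` representation — the names `homogenize_dict`, `new_var`, and `index` are illustrative only):

```python
def homogenize_dict(poly, new_var=True, index=None):
    """Return a homogenized copy of ``poly`` (dict of exponent tuple -> coeff).

    If ``new_var`` is True, a fresh trailing variable absorbs each
    monomial's degree deficit; otherwise the existing variable at
    position ``index`` is raised instead.
    """
    total = max(sum(m) for m in poly)  # total degree of the polynomial
    result = {}
    for mono, coeff in poly.items():
        deficit = total - sum(mono)
        if new_var:
            # append a new variable carrying the deficit
            result[mono + (deficit,)] = coeff
        else:
            # raise an existing variable by the deficit
            m = list(mono)
            m[index] += deficit
            result[tuple(m)] = coeff
    return result
```

For example, `x**2 + y` (as `{(2, 0): 1, (0, 1): 1}`) homogenizes to `x**2 + y*z` (as `{(2, 0, 0): 1, (0, 1, 1): 1}`), mirroring the `Poly(x**2+y).homogenize(z)` test case above.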
<<END>> | 3555a90ee3ac3c7082df19d96a922630100c69cc | ||
sympy__sympy-2355 | 2,355 | sympy/sympy | 0.7 | e968bc7ab7d61413eab67bb662c5094ccd3f357f | 2013-08-05T19:57:50Z | diff --git a/doc/src/modules/rewriting.rst b/doc/src/modules/rewriting.rst
index 058d051a36bc..3318b0b08c77 100644
--- a/doc/src/modules/rewriting.rst
+++ b/doc/src/modules/rewriting.rst
@@ -84,9 +84,24 @@ in the ``cse`` function. Examples::
⎝[(x₀, sin(x + 1) + cos(y))], ⎣╲╱ x₀ + 4 ⋅╲╱ x₀ + 5 ⎦⎠
>>> pprint(cse((x-y)*(z-y) + sqrt((x-y)*(z-y))), use_unicode=True)
+ ⎛ ⎡ ____ ⎤⎞
+ ⎝[(x₀, -y), (x₁, (x + x₀)⋅(x₀ + z))], ⎣╲╱ x₁ + x₁⎦⎠
+
+Optimizations to be performed before and after common subexpressions
+elimination can be passed in the``optimizations`` optional argument. A set of
+predefined basic optimizations can be applied by passing
+``optimizations='basic'``::
+
+ >>> pprint(cse((x-y)*(z-y) + sqrt((x-y)*(z-y)), optimizations='basic'),
+ ... use_unicode=True)
⎛ ⎡ ____ ⎤⎞
⎝[(x₀, -(x - y)⋅(y - z))], ⎣╲╱ x₀ + x₀⎦⎠
+However, these optimizations can be very slow for large expressions. Moreover,
+if speed is a concern, one can pass the option ``order='none'``. Order of
+terms will then be dependent on hashing algorithm implementation, but speed
+will be greatly improved.
+
More information:
.. autofunction:noindex: cse
diff --git a/doc/src/modules/simplify/simplify.rst b/doc/src/modules/simplify/simplify.rst
index e83cc73dfda7..1b29620791dc 100644
--- a/doc/src/modules/simplify/simplify.rst
+++ b/doc/src/modules/simplify/simplify.rst
@@ -109,6 +109,14 @@ cse
^^^
.. autofunction:: cse
+opt_cse
+^^^^^^^
+.. autofunction:: sympy.simplify.cse_main.opt_cse
+
+tree_cse
+^^^^^^^^
+.. autofunction:: sympy.simplify.cse_main.tree_cse
+
Hypergeometric Function Expansion
---------------------------------
.. module:: sympy.simplify.hyperexpand
diff --git a/sympy/simplify/cse_main.py b/sympy/simplify/cse_main.py
index 81a7c377e244..200ad9f07c02 100644
--- a/sympy/simplify/cse_main.py
+++ b/sympy/simplify/cse_main.py
@@ -4,9 +4,11 @@
import difflib
-from sympy.core import Basic, Mul, Add, sympify
+from sympy.core import Basic, Mul, Add, Pow, sympify, Tuple
+from sympy.core.singleton import S
from sympy.core.basic import preorder_traversal
from sympy.core.function import _coeff_isneg
+from sympy.core.exprtools import factor_terms
from sympy.core.compatibility import iterable, xrange
from sympy.utilities.iterables import numbered_symbols, \
sift, topological_sort, ordered
@@ -25,7 +27,9 @@
# ``None`` can be used to specify no transformation for either the preprocessor or
# postprocessor.
-cse_optimizations = list(cse_opts.default_optimizations)
+
+basic_optimizations = [(cse_opts.sub_pre, cse_opts.sub_post),
+ (factor_terms, None)]
# sometimes we want the output in a different format; non-trivial
# transformations can be put here for users
@@ -135,82 +139,246 @@ def postprocess_for_cse(expr, optimizations):
return expr
-def _remove_singletons(reps, exprs):
+def opt_cse(exprs, order='canonical'):
+ """Find optimization opportunities in Adds, Muls, Pows and negative
+ coefficient Muls
+
+ Parameters
+ ----------
+ exprs : list of sympy expressions
+ The expressions to optimize.
+ order : string, 'none' or 'canonical'
+ The order by which Mul and Add arguments are processed. For large
+ expressions where speed is a concern, use the setting order='none'.
+
+ Returns
+ -------
+ opt_subs : dictionary of expression substitutions
+ The expression substitutions which can be useful to optimize CSE.
+
+ Examples
+ --------
+ >>> from sympy.simplify.cse_main import opt_cse
+ >>> from sympy.abc import x
+ >>> opt_subs = opt_cse([x**-2])
+ >>> print(opt_subs)
+ {x**(-2): 1/(x**2)}
"""
- Helper function for cse that will remove expressions that weren't
- used more than once.
+ from sympy.matrices import Matrix
+
+ opt_subs = dict()
+
+ adds = set()
+ muls = set()
+
+ seen_subexp = set()
+
+ def _find_opts(expr):
+
+ if expr.is_Atom:
+ return
+
+ if iterable(expr):
+ list(map(_find_opts, expr))
+ return
+
+ if expr in seen_subexp:
+ return expr
+ seen_subexp.add(expr)
+
+ list(map(_find_opts, expr.args))
+
+ if _coeff_isneg(expr):
+ neg_expr = -expr
+ if not neg_expr.is_Atom:
+ opt_subs[expr] = Mul(S.NegativeOne, neg_expr, evaluate=False)
+ seen_subexp.add(neg_expr)
+ expr = neg_expr
+
+ if expr.is_Mul:
+ muls.add(expr)
+
+ elif expr.is_Add:
+ adds.add(expr)
+
+ elif expr.is_Pow:
+ if _coeff_isneg(expr.exp):
+ opt_subs[expr] = Pow(Pow(expr.base, -expr.exp), S.NegativeOne,
+ evaluate=False)
+
+ for e in exprs:
+ if isinstance(e, Basic):
+ _find_opts(e)
+
+ ## Process Adds and commutative Muls
+
+ def _match_common_args(Func, funcs):
+ if order != 'none':
+ funcs = list(ordered(funcs))
+ else:
+ funcs = sorted(funcs, key=lambda x: len(x.args))
+
+ func_args = [set(e.args) for e in funcs]
+ for i in xrange(len(func_args)):
+ for j in xrange(i + 1, len(func_args)):
+ com_args = func_args[i].intersection(func_args[j])
+ if len(com_args) > 1:
+ com_func = Func(*com_args)
+
+ # for all sets, replace the common symbols by the function
+ # over them, to allow recursive matches
+
+ diff_i = func_args[i].difference(com_args)
+ func_args[i] = diff_i | set([com_func])
+ if diff_i:
+ opt_subs[funcs[i]] = Func(Func(*diff_i), com_func,
+ evaluate=False)
+
+ diff_j = func_args[j].difference(com_args)
+ func_args[j] = diff_j | set([com_func])
+ opt_subs[funcs[j]] = Func(Func(*diff_j), com_func,
+ evaluate=False)
+
+ for k in xrange(j + 1, len(func_args)):
+ if not com_args.difference(func_args[k]):
+ diff_k = func_args[k].difference(com_args)
+ func_args[k] = diff_k | set([com_func])
+ opt_subs[funcs[k]] = Func(Func(*diff_k), com_func,
+ evaluate=False)
+
+ # split muls into commutative
+ comutative_muls = set()
+ for m in muls:
+ c, nc = m.args_cnc(cset=True)
+ if c:
+ c_mul = Mul(*c)
+ if nc:
+ opt_subs[m] = Mul(c_mul, Mul(*nc), evaluate=False)
+ if len(c) > 1:
+ comutative_muls.add(c_mul)
+
+ _match_common_args(Add, adds)
+ _match_common_args(Mul, comutative_muls)
+
+ return opt_subs
+
+
+def tree_cse(exprs, symbols, opt_subs=None, order='canonical'):
+ """Perform raw CSE on expression tree, taking opt_subs into account.
+
+ Parameters
+ ==========
+
+ exprs : list of sympy expressions
+ The expressions to reduce.
+ symbols : infinite iterator yielding unique Symbols
+ The symbols used to label the common subexpressions which are pulled
+ out.
+ opt_subs : dictionary of expression substitutions
+ The expressions to be substituted before any CSE action is performed.
+ order : string, 'none' or 'canonical'
+ The order by which Mul and Add arguments are processed. For large
+ expressions where speed is a concern, use the setting order='none'.
"""
- u_reps = [] # the useful reps that are used more than once
- for i, ui in enumerate(reps):
- used = [] # where it was used
- ri, ei = ui
-
- # keep track of whether the substitution was used more
- # than once. If used is None, it was never used (yet);
- # if used is an int, that is the last place where it was
- # used (>=0 in the reps, <0 in the expressions) and if
- # it is True, it was used more than once.
-
- used = None
-
- tot = 0 # total times used so far
-
- # search through the reps
- for j in range(i + 1, len(reps)):
- c = reps[j][1].count(ri)
- if c:
- tot += c
- if tot > 1:
- u_reps.append(ui)
- used = True
- break
- else:
- used = j
-
- if used is not True:
-
- # then search through the expressions
-
- for j, rj in enumerate(exprs):
- c = rj.count(ri)
- if c:
- # append a negative so we know that it was in the
- # expression that used it
- tot += c
- if tot > 1:
- u_reps.append(ui)
- used = True
- break
- else:
- used = j - len(exprs)
-
- if type(used) is int:
-
- # undo the change
-
- rep = {ri: ei}
- j = used
- if j < 0:
- exprs[j] = exprs[j].subs(rep)
- else:
- reps[j] = reps[j][0], reps[j][1].subs(rep)
-
- # reuse unused symbols so a contiguous range of symbols is returned
-
- if len(u_reps) != len(reps):
- for i, ri in enumerate(u_reps):
- if u_reps[i][0] != reps[i][0]:
- rep = (u_reps[i][0], reps[i][0])
- u_reps[i] = rep[1], u_reps[i][1].subs(*rep)
- for j in range(i + 1, len(u_reps)):
- u_reps[j] = u_reps[j][0], u_reps[j][1].subs(*rep)
- for j, rj in enumerate(exprs):
- exprs[j] = exprs[j].subs(*rep)
-
- reps[:] = u_reps # change happens in-place
-
-
-def cse(exprs, symbols=None, optimizations=None, postprocess=None):
+ from sympy.matrices import Matrix
+
+ if opt_subs is None:
+ opt_subs = dict()
+
+ ## Find repeated sub-expressions
+
+ to_eliminate = set()
+
+ seen_subexp = set()
+
+ def _find_repeated(expr):
+ if expr.is_Atom:
+ return
+
+ if iterable(expr):
+ args = expr
+
+ else:
+ if expr in seen_subexp:
+ to_eliminate.add(expr)
+ return
+
+ seen_subexp.add(expr)
+
+ if expr in opt_subs:
+ expr = opt_subs[expr]
+
+ args = expr.args
+
+ list(map(_find_repeated, args))
+
+ for e in exprs:
+ if isinstance(e, Basic):
+ _find_repeated(e)
+
+ ## Rebuild tree
+
+ replacements = []
+
+ subs = dict()
+
+ def _rebuild(expr):
+
+ if expr.is_Atom:
+ return expr
+
+ if iterable(expr):
+ new_args = [_rebuild(arg) for arg in expr]
+ return expr.func(*new_args)
+
+ if expr in subs:
+ return subs[expr]
+
+ orig_expr = expr
+ if expr in opt_subs:
+ expr = opt_subs[expr]
+
+ # If enabled, parse Muls and Adds arguments by order to ensure
+ # replacement order independent from hashes
+ if order != 'none':
+ if expr.is_Mul:
+ c, nc = expr.args_cnc()
+ args = list(ordered(c)) + nc
+ elif expr.is_Add:
+ args = list(ordered(expr.args))
+ else:
+ args = expr.args
+ else:
+ args = expr.args
+
+ new_args = list(map(_rebuild, args))
+ if new_args != args:
+ new_expr = expr.func(*new_args)
+ else:
+ new_expr = expr
+
+ if orig_expr in to_eliminate:
+ sym = next(symbols)
+ subs[orig_expr] = sym
+ replacements.append((sym, new_expr))
+ return sym
+
+ else:
+ return new_expr
+
+ reduced_exprs = []
+ for e in exprs:
+ if isinstance(e, Basic):
+ reduced_e = _rebuild(e)
+ else:
+ reduced_e = e
+ reduced_exprs.append(reduced_e)
+
+ return replacements, reduced_exprs
+
+
+def cse(exprs, symbols=None, optimizations=None, postprocess=None,
+ order='canonical'):
""" Perform common subexpression elimination on an expression.
Parameters
@@ -221,22 +389,32 @@ def cse(exprs, symbols=None, optimizations=None, postprocess=None):
symbols : infinite iterator yielding unique Symbols
The symbols used to label the common subexpressions which are pulled
out. The ``numbered_symbols`` generator is useful. The default is a
- stream of symbols of the form "x0", "x1", etc. This must be an infinite
- iterator.
- optimizations : list of (callable, callable) pairs, optional
- The (preprocessor, postprocessor) pairs. If not provided,
- ``sympy.simplify.cse.cse_optimizations`` is used.
+ stream of symbols of the form "x0", "x1", etc. This must be an
+ infinite iterator.
+ optimizations : list of (callable, callable) pairs
+ The (preprocessor, postprocessor) pairs of external optimization
+ functions. Optionally 'basic' can be passed for a set of predefined
+ basic optimizations. Such 'basic' optimizations were used by default
+ in old implementation, however they can be really slow on larger
+ expressions. Now, no pre or post optimizations are made by default.
postprocess : a function which accepts the two return values of cse and
returns the desired form of output from cse, e.g. if you want the
replacements reversed the function might be the following lambda:
lambda r, e: return reversed(r), e
+ order : string, 'none' or 'canonical'
+ The order by which Mul and Add arguments are processed. If set to
+ 'canonical', arguments will be canonically ordered. If set to 'none',
+ ordering will be faster but dependent on expressions hashes, thus
+ machine dependent and variable. For large expressions where speed is a
+ concern, use the setting order='none'.
Returns
=======
replacements : list of (Symbol, expression) pairs
All of the common subexpressions that were replaced. Subexpressions
- earlier in this list might show up in subexpressions later in this list.
+ earlier in this list might show up in subexpressions later in this
+ list.
reduced_exprs : list of sympy expressions
The reduced expressions with all of the replacements above.
"""
@@ -248,15 +426,11 @@ def cse(exprs, symbols=None, optimizations=None, postprocess=None):
# In case we get passed an iterable with an __iter__ method instead of
# an actual iterator.
symbols = iter(symbols)
- seen_subexp = set()
- muls = set()
- adds = set()
- to_eliminate = set()
if optimizations is None:
- # Pull out the default here just in case there are some weird
- # manipulations of the module-level list in some other thread.
- optimizations = list(cse_optimizations)
+ optimizations = list()
+ elif optimizations == 'basic':
+ optimizations = basic_optimizations
# Handle the case if just one expression was passed.
if isinstance(exprs, Basic):
@@ -265,143 +439,19 @@ def cse(exprs, symbols=None, optimizations=None, postprocess=None):
# Preprocess the expressions to give us better optimization opportunities.
reduced_exprs = [preprocess_for_cse(e, optimizations) for e in exprs]
- # Find all of the repeated subexpressions.
- for expr in reduced_exprs:
- if not isinstance(expr, Basic):
- continue
- pt = preorder_traversal(expr)
- for subtree in pt:
-
- inv = 1/subtree if subtree.is_Pow else None
-
- if subtree.is_Atom or iterable(subtree) or inv and inv.is_Atom:
- # Exclude atoms, since there is no point in renaming them.
- continue
-
- if subtree in seen_subexp:
- if inv and _coeff_isneg(subtree.exp):
- # save the form with positive exponent
- subtree = inv
- to_eliminate.add(subtree)
- pt.skip()
- continue
-
- if inv and inv in seen_subexp:
- if _coeff_isneg(subtree.exp):
- # save the form with positive exponent
- subtree = inv
- to_eliminate.add(subtree)
- pt.skip()
- continue
- elif subtree.is_Mul:
- muls.add(subtree)
- elif subtree.is_Add:
- adds.add(subtree)
-
- seen_subexp.add(subtree)
-
- # process adds - any adds that weren't repeated might contain
- # subpatterns that are repeated, e.g. x+y+z and x+y have x+y in common
- adds = [set(a.args) for a in ordered(adds)]
- for i in xrange(len(adds)):
- for j in xrange(i + 1, len(adds)):
- com = adds[i].intersection(adds[j])
- if len(com) > 1:
- to_eliminate.add(Add(*com))
-
- # remove this set of symbols so it doesn't appear again
- adds[i] = adds[i].difference(com)
- adds[j] = adds[j].difference(com)
- for k in xrange(j + 1, len(adds)):
- if not com.difference(adds[k]):
- adds[k] = adds[k].difference(com)
-
- # process muls - any muls that weren't repeated might contain
- # subpatterns that are repeated, e.g. x*y*z and x*y have x*y in common
-
- # use SequenceMatcher on the nc part to find the longest common expression
- # in common between the two nc parts
- sm = difflib.SequenceMatcher()
-
- muls = [a.args_cnc(cset=True) for a in ordered(muls)]
- for i in xrange(len(muls)):
- if muls[i][1]:
- sm.set_seq1(muls[i][1])
- for j in xrange(i + 1, len(muls)):
- # the commutative part in common
- ccom = muls[i][0].intersection(muls[j][0])
-
- # the non-commutative part in common
- if muls[i][1] and muls[j][1]:
- # see if there is any chance of an nc match
- ncom = set(muls[i][1]).intersection(set(muls[j][1]))
- if len(ccom) + len(ncom) < 2:
- continue
-
- # now work harder to find the match
- sm.set_seq2(muls[j][1])
- i1, _, n = sm.find_longest_match(0, len(muls[i][1]),
- 0, len(muls[j][1]))
- ncom = muls[i][1][i1:i1 + n]
- else:
- ncom = []
-
- com = list(ccom) + ncom
- if len(com) < 2:
- continue
-
- to_eliminate.add(Mul(*com))
-
- # remove ccom from all if there was no ncom; to update the nc part
- # would require finding the subexpr and then replacing it with a
- # dummy to keep bounding nc symbols from being identified as a
- # subexpr, e.g. removing B*C from A*B*C*D might allow A*D to be
- # identified as a subexpr which would not be right.
- if not ncom:
- muls[i][0] = muls[i][0].difference(ccom)
- for k in xrange(j, len(muls)):
- if not ccom.difference(muls[k][0]):
- muls[k][0] = muls[k][0].difference(ccom)
-
- # make to_eliminate canonical; we will prefer non-Muls to Muls
- # so select them first (non-Muls will have False for is_Mul and will
- # be first in the ordering.
- to_eliminate = list(ordered(to_eliminate, lambda _: _.is_Mul))
-
- # Substitute symbols for all of the repeated subexpressions.
- replacements = []
- reduced_exprs = list(reduced_exprs)
- hit = True
- for i, subtree in enumerate(to_eliminate):
- if hit:
- sym = next(symbols)
- hit = False
- if subtree.is_Pow and subtree.exp.is_Rational:
- update = lambda x: x.xreplace({subtree: sym, 1/subtree: 1/sym})
- else:
- update = lambda x: x.subs(subtree, sym)
- # Make the substitution in all of the target expressions.
- for j, expr in enumerate(reduced_exprs):
- old = reduced_exprs[j]
- reduced_exprs[j] = update(expr)
- hit = hit or (old != reduced_exprs[j])
- # Make the substitution in all of the subsequent substitutions.
- for j in range(i + 1, len(to_eliminate)):
- old = to_eliminate[j]
- to_eliminate[j] = update(to_eliminate[j])
- hit = hit or (old != to_eliminate[j])
- if hit:
- replacements.append((sym, subtree))
+ # Find other optimization opportunities.
+ opt_subs = opt_cse(reduced_exprs, order)
+
+ # Main CSE algorithm.
+ replacements, reduced_exprs = tree_cse(reduced_exprs, symbols, opt_subs,
+ order)
# Postprocess the expressions to return the expressions to canonical form.
for i, (sym, subtree) in enumerate(replacements):
subtree = postprocess_for_cse(subtree, optimizations)
replacements[i] = (sym, subtree)
reduced_exprs = [postprocess_for_cse(e, optimizations)
- for e in reduced_exprs]
-
- # remove replacements that weren't used more than once
- _remove_singletons(replacements, reduced_exprs)
+ for e in reduced_exprs]
if isinstance(exprs, Matrix):
reduced_exprs = [Matrix(exprs.rows, exprs.cols, reduced_exprs)]
diff --git a/sympy/simplify/cse_opts.py b/sympy/simplify/cse_opts.py
index 465876b29289..bf4b50fdd7f0 100644
--- a/sympy/simplify/cse_opts.py
+++ b/sympy/simplify/cse_opts.py
@@ -5,7 +5,6 @@
from sympy.core import Add, Basic, Expr, Mul
from sympy.core.basic import preorder_traversal
-from sympy.core.exprtools import factor_terms
from sympy.core.singleton import S
from sympy.utilities.iterables import default_sort_key
@@ -42,9 +41,3 @@ def sub_post(e):
e = e.xreplace({node: replacement})
return e
-
-
-default_optimizations = [
- (sub_pre, sub_post),
- (factor_terms, None),
-]
| diff --git a/sympy/core/tests/test_expand.py b/sympy/core/tests/test_expand.py
index 23051f04e36f..ab5829e3021f 100644
--- a/sympy/core/tests/test_expand.py
+++ b/sympy/core/tests/test_expand.py
@@ -133,12 +133,12 @@ def test_expand_frac():
def test_issue_3022():
eq = -I*exp(-3*I*pi/4)/(4*pi**(S(3)/2)*sqrt(x))
- assert cse((eq).expand(complex=True)) == S('''
- ([(x0, re(x)), (x1, im(x)), (x2, sin(atan2(x1, x0)/2)), (x3,
- cos(atan2(x1, x0)/2)), (x4, x0**2 + x1**2), (x5, sin(atan2(0, x4)/4)),
- (x6, cos(atan2(0, x4)/4)), (x7, x2*x5), (x8, x3*x5), (x9, x2*x6),
- (x10, x3*x6)], [sqrt(2)*(-x10 + I*x10 + x7 - I*x7 + x8 + I*x8 + x9 +
- I*x9)/(8*pi**(3/2)*x4**(1/4))])''')
+ assert cse((eq).expand(complex=True), optimizations='basic') \
+ == S(''' ([(x0, re(x)), (x1, im(x)), (x2, x0**2 + x1**2), (x3,
+ atan2(x1, x0)/2), (x4, sin(x3)), (x5, atan2(0, x2)/4), (x6, sin(x5)),
+ (x7, x4*x6), (x8, cos(x3)), (x9, x6*x8), (x10, cos(x5)), (x11,
+ x10*x4), (x12, x10*x8)], [sqrt(2)*(x11 + I*x11 - x12 + I*x12 + x7 -
+ I*x7 + x9 + I*x9)/(8*pi**(3/2)*x2**(1/4))])''')
def test_expand_power_base():
diff --git a/sympy/core/tests/test_subs.py b/sympy/core/tests/test_subs.py
index c3076d2af9a9..1975b4ce5cc0 100644
--- a/sympy/core/tests/test_subs.py
+++ b/sympy/core/tests/test_subs.py
@@ -549,7 +549,7 @@ def test_issue_3460():
e = -log(-12*sqrt(2) + 17)/24 - log(-2*sqrt(2) + 3)/12 + sqrt(2)/3
# XXX modify cse so x1 is eliminated and x0 = -sqrt(2)?
assert cse(e) == (
- [(x0, sqrt(2)), (x1, -x0)], [x0/3 - log(2*x1 + 3)/12 - log(12*x1 + 17)/24])
+ [(x0, sqrt(2))], [x0/3 - log(-12*x0 + 17)/24 - log(-2*x0 + 3)/12])
def test_issue_2162():
diff --git a/sympy/simplify/tests/test_cse.py b/sympy/simplify/tests/test_cse.py
index 7c2ee113cb98..8bfadad5ba72 100644
--- a/sympy/simplify/tests/test_cse.py
+++ b/sympy/simplify/tests/test_cse.py
@@ -8,7 +8,7 @@
from sympy.utilities.pytest import XFAIL
w, x, y, z = symbols('w,x,y,z')
-x0, x1, x2, x3, x4, x5, x6, x7, x8, x9, x10, x11 = symbols('x:12')
+x0, x1, x2, x3, x4, x5, x6, x7, x8, x9, x10, x11, x12 = symbols('x:13')
def test_numbered_symbols():
@@ -55,7 +55,7 @@ def test_postprocess_for_cse():
def test_cse_single():
# Simple substitution.
e = Add(Pow(x + y, 2), sqrt(x + y))
- substs, reduced = cse([e], optimizations=[])
+ substs, reduced = cse([e])
assert substs == [(x0, x + y)]
assert reduced == [sqrt(x0) + x0**2]
@@ -63,7 +63,7 @@ def test_cse_single():
def test_cse_single2():
# Simple substitution, test for being able to pass the expression directly
e = Add(Pow(x + y, 2), sqrt(x + y))
- substs, reduced = cse(e, optimizations=[])
+ substs, reduced = cse(e)
assert substs == [(x0, x + y)]
assert reduced == [sqrt(x0) + x0**2]
assert isinstance(cse(Matrix([[1]]))[1][0], Matrix)
@@ -72,7 +72,7 @@ def test_cse_single2():
def test_cse_not_possible():
# No substitution possible.
e = Add(x, y)
- substs, reduced = cse([e], optimizations=[])
+ substs, reduced = cse([e])
assert substs == []
assert reduced == [x + y]
# issue 3230
@@ -84,7 +84,7 @@ def test_cse_not_possible():
def test_nested_substitution():
# Substitution within a substitution.
e = Add(Pow(w*x + y, 2), sqrt(w*x + y))
- substs, reduced = cse([e], optimizations=[])
+ substs, reduced = cse([e])
assert substs == [(x0, w*x + y)]
assert reduced == [sqrt(x0) + x0**2]
@@ -96,18 +96,22 @@ def test_subtraction_opt():
[e], optimizations=[(cse_opts.sub_pre, cse_opts.sub_post)])
assert substs == [(x0, (x - y)*(y - z))]
assert reduced == [-x0 + exp(-x0)]
- assert cse(-(x - y)*(z - y) + exp(-(x - y)*(z - y))) == \
- ([(x0, (x - y)*(y - z))], [x0 + exp(x0)])
+ e = -(x - y)*(z - y) + exp(-(x - y)*(z - y))
+ substs, reduced = cse(
+ [e], optimizations=[(cse_opts.sub_pre, cse_opts.sub_post)])
+ assert substs == [(x0, (x - y)*(y - z))]
+ assert reduced == [x0 + exp(x0)]
# issue 978
n = -1 + 1/x
e = n/x/(-n)**2 - 1/n/x
- assert cse(e) == ([], [0])
+ assert cse(e, optimizations=[(cse_opts.sub_pre, cse_opts.sub_post)]) == \
+ ([], [0])
def test_multiple_expressions():
e1 = (x + y)*z
e2 = (x + y)*w
- substs, reduced = cse([e1, e2], optimizations=[])
+ substs, reduced = cse([e1, e2])
assert substs == [(x0, x + y)]
assert reduced == [x0*z, x0*w]
l = [w*x*y + z, w*y]
@@ -123,22 +127,40 @@ def test_multiple_expressions():
l = [(x - z)*(y - z), x - z, y - z]
substs, reduced = cse(l)
rsubsts, _ = cse(reversed(l))
- substitutions = [(x0, x - z), (x1, y - z)]
- assert substs == substitutions
- assert rsubsts == substitutions
- assert reduced == [x0*x1, x0, x1]
+ assert substs == [(x0, -z), (x1, x + x0), (x2, x0 + y)]
+ assert rsubsts == [(x0, -z), (x1, x0 + y), (x2, x + x0)]
+ assert reduced == [x1*x2, x1, x2]
l = [w*y + w + x + y + z, w*x*y]
assert cse(l) == ([(x0, w*y)], [w + x + x0 + y + z, x*x0])
assert cse([x + y, x + y + z]) == ([(x0, x + y)], [x0, z + x0])
assert cse([x + y, x + z]) == ([], [x + y, x + z])
assert cse([x*y, z + x*y, x*y*z + 3]) == \
([(x0, x*y)], [x0, z + x0, 3 + x0*z])
+
+@XFAIL # CSE of non-commutative Mul terms is disabled
+def test_non_commutative_cse():
A, B, C = symbols('A B C', commutative=False)
l = [A*B*C, A*C]
assert cse(l) == ([], l)
l = [A*B*C, A*B]
assert cse(l) == ([(x0, A*B)], [x0*C, x0])
+# Test if CSE of non-commutative Mul terms is disabled
+def test_bypass_non_commutatives():
+ A, B, C = symbols('A B C', commutative=False)
+ l = [A*B*C, A*C]
+ assert cse(l) == ([], l)
+ l = [A*B*C, A*B]
+ assert cse(l) == ([], l)
+ l = [B*C, A*B*C]
+ assert cse(l) == ([], l)
+
+@XFAIL # CSE fails when replacing non-commutative sub-expressions
+def test_non_commutative_order():
+ A, B, C = symbols('A B C', commutative=False)
+ x0 = symbols('x0', commutative=False)
+ l = [B+C, A*(B+C)]
+ assert cse(l) == ([(x0, B+C)], [x0, A*x0])
@XFAIL
def test_powers():
@@ -146,12 +168,13 @@ def test_powers():
def test_issues_1399():
- assert cse(w/(x - y) + z/(y - x)) == ([], [(w - z)/(x - y)])
+ assert cse(w/(x - y) + z/(y - x), optimizations='basic') == \
+ ([], [(w - z)/(x - y)])
def test_issue_921():
- assert cse(
- x**5 + x**4 + x**3 + x**2) == ([(x0, x**2)], [x0*(x**3 + x + x0 + 1)])
+ assert cse(x**5 + x**4 + x**3 + x**2, optimizations='basic') \
+ == ([(x0, x**2)], [x0*(x**3 + x + x0 + 1)])
def test_issue_1104():
@@ -160,7 +183,7 @@ def test_issue_1104():
def test_issue_3164():
e = Eq(x*(-x + 1) + x*(x - 1), 0)
- assert cse(e) == ([], [True])
+ assert cse(e, optimizations='basic') == ([], [True])
def test_dont_cse_tuples():
@@ -189,11 +212,11 @@ def test_pow_invpow():
assert cse(1/x**2 + x**2) == \
([(x0, x**2)], [x0 + 1/x0])
assert cse(x**2 + (1 + 1/x**2)/x**2) == \
- ([(x0, x**2)], [x0 + (1 + 1/x0)/x0])
+ ([(x0, x**2), (x1, 1/x0)], [x0 + x1*(x1 + 1)])
assert cse(1/x**2 + (1 + 1/x**2)*x**2) == \
- ([(x0, x**2)], [x0*(1 + 1/x0) + 1/x0])
+ ([(x0, x**2), (x1, 1/x0)], [x0*(x1 + 1) + x1])
assert cse(cos(1/x**2) + sin(1/x**2)) == \
- ([(x0, x**2)], [sin(1/x0) + cos(1/x0)])
+ ([(x0, x**(-2))], [sin(x0) + cos(x0)])
assert cse(cos(x**2) + sin(x**2)) == \
([(x0, x**2)], [sin(x0) + cos(x0)])
assert cse(y/(2 + x**2) + z/x**2/y) == \
@@ -201,7 +224,7 @@ def test_pow_invpow():
assert cse(exp(x**2) + x**2*cos(1/x**2)) == \
([(x0, x**2)], [x0*cos(1/x0) + exp(x0)])
assert cse((1 + 1/x**2)/x**2) == \
- ([(x0, x**2)], [(1 + 1/x0)/x0])
+ ([(x0, x**(-2))], [x0*(x0 + 1)])
assert cse(x**(2*y) + x**(-2*y)) == \
([(x0, x**(2*y))], [x0 + 1/x0])
@@ -211,7 +234,7 @@ def test_postprocess():
assert cse([eq, Eq(x, z + 1), z - 2, (z + 1)*(x + 1)],
postprocess=cse_main.cse_separate) == \
[[(x1, y + 1), (x2, z + 1), (x, x2), (x0, x + 1)],
- [x0 + exp(x0/x1) + cos(x1), x2 - 3, x0*x2]]
+ [x0 + exp(x0/x1) + cos(x1), z - 2, x0*x2]]
def test_issue1400():
@@ -228,13 +251,13 @@ def test_issue1400():
(sqrt(z)/2)**(-2*a + 1)*B(b, sqrt(z))*B(2*a - b + 1,
sqrt(z))*G(b)*G(2*a - b + 1), 1, 0, S(1)/2, z/2, -b + 1, -2*a + b,
-2*a))
-
c = cse(t)
ans = (
- [(x0, sqrt(z)), (x1, -b + 1), (x2, B(b, x0)), (x3, B(-x1, x0)), (x4,
- 2*a + x1), (x5, B(x4 - 1, x0)), (x6, B(x4, x0)), (x7, (x0/2)**(-2*a +
- 1)*G(b)*G(x4))], [(a, a + S(1)/2, 2*a, b, x4, x3*x5*x7, x0*x2*x5*x7,
- x0*x3*x6*x7, x2*x6*x7, 1, 0, S(1)/2, z/2, x1, -x4 + 1, -2*a)])
+ [(x0, 2*a), (x1, -b), (x2, x1 + 1), (x3, x0 + x2), (x4, sqrt(z)), (x5,
+ B(x0 + x1, x4)), (x6, G(b)), (x7, G(x3)), (x8, -x0), (x9,
+ (x4/2)**(x8 + 1)), (x10, x6*x7*x9*B(b - 1, x4)), (x11, x6*x7*x9*B(b,
+ x4)), (x12, B(x3, x4))], [(a, a + S(1)/2, x0, b, x3, x10*x5,
+ x11*x4*x5, x10*x12*x4, x11*x12, 1, 0, S(1)/2, z/2, x2, b + x8, x8)])
assert ans == c
| diff --git a/doc/src/modules/rewriting.rst b/doc/src/modules/rewriting.rst
index 058d051a36bc..3318b0b08c77 100644
--- a/doc/src/modules/rewriting.rst
+++ b/doc/src/modules/rewriting.rst
@@ -84,9 +84,24 @@ in the ``cse`` function. Examples::
⎝[(x₀, sin(x + 1) + cos(y))], ⎣╲╱ x₀ + 4 ⋅╲╱ x₀ + 5 ⎦⎠
>>> pprint(cse((x-y)*(z-y) + sqrt((x-y)*(z-y))), use_unicode=True)
+ ⎛ ⎡ ____ ⎤⎞
+ ⎝[(x₀, -y), (x₁, (x + x₀)⋅(x₀ + z))], ⎣╲╱ x₁ + x₁⎦⎠
+
+Optimizations to be performed before and after common subexpressions
+elimination can be passed in the ``optimizations`` optional argument. A set of
+predefined basic optimizations can be applied by passing
+``optimizations='basic'``::
+
+ >>> pprint(cse((x-y)*(z-y) + sqrt((x-y)*(z-y)), optimizations='basic'),
+ ... use_unicode=True)
⎛ ⎡ ____ ⎤⎞
⎝[(x₀, -(x - y)⋅(y - z))], ⎣╲╱ x₀ + x₀⎦⎠
+However, these optimizations can be very slow for large expressions. Moreover,
+if speed is a concern, one can pass the option ``order='none'``. Order of
+terms will then be dependent on hashing algorithm implementation, but speed
+will be greatly improved.
+
More information:
.. autofunction:noindex: cse
diff --git a/doc/src/modules/simplify/simplify.rst b/doc/src/modules/simplify/simplify.rst
index e83cc73dfda7..1b29620791dc 100644
--- a/doc/src/modules/simplify/simplify.rst
+++ b/doc/src/modules/simplify/simplify.rst
@@ -109,6 +109,14 @@ cse
^^^
.. autofunction:: cse
+opt_cse
+^^^^^^^
+.. autofunction:: sympy.simplify.cse_main.opt_cse
+
+tree_cse
+^^^^^^^^
+.. autofunction:: sympy.simplify.cse_main.tree_cse
+
Hypergeometric Function Expansion
---------------------------------
.. module:: sympy.simplify.hyperexpand
| [
{
"components": [
{
"doc": "Find optimization opportunities in Adds, Muls, Pows and negative\ncoefficient Muls\n\nParameters\n----------\nexprs : list of sympy expressions\n The expressions to optimize.\norder : string, 'none' or 'canonical'\n The order by which Mul and Add arguments are pro... | [
"test_issue_3022",
"test_issue_3460",
"test_multiple_expressions",
"test_bypass_non_commutatives",
"test_issues_1399",
"test_issue_921",
"test_issue_3164",
"test_pow_invpow",
"test_postprocess",
"test_issue1400"
] | [
"test_expand_no_log",
"test_expand_no_multinomial",
"test_expand_negative_integer_powers",
"test_expand_non_commutative",
"test_expand_radicals",
"test_expand_modulus",
"test_issue_2644",
"test_expand_frac",
"test_expand_power_base",
"test_expand_arit",
"test_power_expand",
"test_issues_2820_3... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Faster CSE
Here is a proposal for a faster CSE.
For small expressions the CSE computation time is the same, but there is a huge improvement on large expressions, e.g., http://nbviewer.ipython.org/5986996.
I've kept the previous cse function, as prev_cse(), inside the file for easier comparison.
I plan to add some comments on the code in this PR tomorrow, explaining some decisions and asking for some help.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sympy/simplify/cse_main.py]
(definition of opt_cse:)
def opt_cse(exprs, order='canonical'):
"""Find optimization opportunities in Adds, Muls, Pows and negative
coefficient Muls
Parameters
----------
exprs : list of sympy expressions
The expressions to optimize.
order : string, 'none' or 'canonical'
The order by which Mul and Add arguments are processed. For large
expressions where speed is a concern, use the setting order='none'.
Returns
-------
opt_subs : dictionary of expression substitutions
The expression substitutions which can be useful to optimize CSE.
Examples
--------
>>> from sympy.simplify.cse_main import opt_cse
>>> from sympy.abc import x
>>> opt_subs = opt_cse([x**-2])
>>> print(opt_subs)
{x**(-2): 1/(x**2)}"""
(definition of opt_cse._find_opts:)
def _find_opts(expr):
(definition of opt_cse._match_common_args:)
def _match_common_args(Func, funcs):
(definition of tree_cse:)
def tree_cse(exprs, symbols, opt_subs=None, order='canonical'):
"""Perform raw CSE on expression tree, taking opt_subs into account.
Parameters
==========
exprs : list of sympy expressions
The expressions to reduce.
symbols : infinite iterator yielding unique Symbols
The symbols used to label the common subexpressions which are pulled
out.
opt_subs : dictionary of expression substitutions
The expressions to be substituted before any CSE action is performed.
order : string, 'none' or 'canonical'
The order by which Mul and Add arguments are processed. For large
expressions where speed is a concern, use the setting order='none'."""
(definition of tree_cse._find_repeated:)
def _find_repeated(expr):
(definition of tree_cse._rebuild:)
def _rebuild(expr):
[end of new definitions in sympy/simplify/cse_main.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 3555a90ee3ac3c7082df19d96a922630100c69cc | |
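For orientation, the public entry point touched by this patch is `sympy.cse`. The following is a minimal sketch (not part of the patch) showing how `cse` factors a repeated subexpression out of several expressions at once; the generated replacement symbol name (`x0`) is chosen by SymPy:

```python
from sympy import cse, symbols

x, y = symbols('x y')

# The repeated subexpression x**2 is pulled out into a fresh symbol,
# and both expressions are rewritten in terms of it.
replacements, reduced = cse([x**2 + x, x**2 + y])
```

With the faster algorithm introduced here, large inputs can additionally pass `order='none'` to skip canonical ordering of Mul/Add arguments.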
sympy__sympy-2340 | 2,340 | sympy/sympy | 0.7 | 92777896edfe8c30d8bb634ab316727c3a2485f5 | 2013-07-30T13:33:18Z | diff --git a/sympy/core/sets.py b/sympy/core/sets.py
index 33be2bd71398..0ae429038481 100644
--- a/sympy/core/sets.py
+++ b/sympy/core/sets.py
@@ -228,6 +228,38 @@ def measure(self):
"""
return self._measure
+ def transform(self, *args):
+ """ Image of set under transformation ``f``
+
+ .. math::
+ { f(x) | x \in self }
+
+ Examples
+ ========
+
+ >>> from sympy import Interval, Symbol
+ >>> x = Symbol('x')
+
+ >>> Interval(0, 2).transform(x, 2*x)
+ [0, 4]
+
+ >>> Interval(0, 2).transform(lambda x: 2*x)
+ [0, 4]
+
+ See Also:
+ TransformationSet
+ """
+ if len(args) == 2:
+ from sympy import Lambda
+ f = Lambda(*args)
+ else:
+ f, = args
+ return self._transform(f)
+
+ def _transform(self, f):
+ from sympy.sets.fancysets import TransformationSet
+ return TransformationSet(f, self)
+
@property
def _measure(self):
raise NotImplementedError("(%s)._measure" % self)
@@ -631,6 +663,17 @@ def _contains(self, other):
return expr
+ def _transform(self, f):
+ # TODO: manage left_open and right_open better
+ from sympy.functions.elementary.miscellaneous import Min, Max
+ _left, _right = f(self.left), f(self.right)
+ left, right = Min(_left, _right), Max(_left, _right)
+ if _right == left: # switch happened
+ left_open, right_open = self.right_open, self.left_open
+ else:
+ left_open, right_open = self.left_open, self.right_open
+ return Interval(left, right, left_open, right_open)
+
@property
def _measure(self):
return self.end - self.start
@@ -853,6 +896,9 @@ def _measure(self):
parity *= -1
return measure
+ def _transform(self, f):
+ return Union(arg.transform(f) for arg in self.args)
+
def as_relational(self, symbol):
"""Rewrite a Union in terms of equalities and logic operators. """
return Or(*[set.as_relational(symbol) for set in self.args])
@@ -951,6 +997,9 @@ def _sup(self):
def _complement(self):
raise NotImplementedError()
+ def _transform(self, f):
+ return Intersection(arg.transform(f) for arg in self.args)
+
def _contains(self, other):
from sympy.logic.boolalg import And
return And(*[set.contains(other) for set in self.args])
@@ -1078,6 +1127,8 @@ def _union(self, other):
def __iter__(self):
return iter([])
+ def _transform(self, f):
+ return self
class UniversalSet(with_metaclass(Singleton, Set)):
"""
@@ -1212,6 +1263,9 @@ def _contains(self, other):
"""
return other in self._elements
+ def _transform(self, f):
+ return FiniteSet(*map(f, self))
+
@property
def _complement(self):
"""
diff --git a/sympy/functions/elementary/miscellaneous.py b/sympy/functions/elementary/miscellaneous.py
index 8f6e21fed7d3..03eceb63a62d 100644
--- a/sympy/functions/elementary/miscellaneous.py
+++ b/sympy/functions/elementary/miscellaneous.py
@@ -11,6 +11,7 @@
from sympy.core.singleton import Singleton
from sympy.core.rules import Transform
from sympy.core.compatibility import as_int, with_metaclass, xrange
+from sympy.core.logic import fuzzy_and
class IdentityFunction(with_metaclass(Singleton, Lambda)):
@@ -369,6 +370,10 @@ def _eval_derivative(self, s):
l.append(df * da)
return Add(*l)
+ @property
+ def is_real(self):
+ return fuzzy_and(*[arg.is_real for arg in self.args])
+
class Max(MinMaxBase, Application):
"""
Return, if possible, the maximum value of the list.
| diff --git a/sympy/core/tests/test_sets.py b/sympy/core/tests/test_sets.py
index 66441286c2ba..272a790971ad 100644
--- a/sympy/core/tests/test_sets.py
+++ b/sympy/core/tests/test_sets.py
@@ -501,3 +501,33 @@ def test_universalset():
def test_Interval_free_symbols():
x = Symbol('x', real=True)
assert set(Interval(0, x).free_symbols) == set((x,))
+
+def test_transform_interval():
+ x = Symbol('x', real=True)
+ assert Interval(-2, 1).transform(x, 2*x) == Interval(-4, 2)
+ assert Interval(-2, 1, True, False).transform(x, 2*x) == \
+ Interval(-4, 2, True, False)
+ assert Interval(-2, 1, True, False).transform(x, x**2) == \
+ Interval(1, 4, False, True)
+ assert Interval(-2, 1).transform(x, x**2) == Interval(1, 4)
+ assert Interval(-2, 1, True, False).transform(x, x**2) == \
+ Interval(1, 4, False, True)
+
+def test_transform_FiniteSet():
+ x = Symbol('x', real=True)
+ assert FiniteSet(1, 2, 3).transform(x, 2*x) == FiniteSet(2, 4, 6)
+
+def test_transform_Union():
+ x = Symbol('x', real=True)
+ assert (Interval(-2, 0) + FiniteSet(1, 2, 3)).transform(x, x**2) == \
+ (Interval(0, 4) + FiniteSet(9))
+
+def test_transform_Intersection():
+ x = Symbol('x', real=True)
+ y = Symbol('y', real=True)
+ assert Interval(-2, 0).intersect(Interval(x, y)).transform(x, x**2) == \
+ Interval(0, 4).intersect(Interval(Min(x**2, y**2), Max(x**2, y**2)))
+
+def test_transform_EmptySet():
+ x = Symbol('x', real=True)
+ assert S.EmptySet.transform(x, 2*x) == S.EmptySet
diff --git a/sympy/functions/elementary/tests/test_miscellaneous.py b/sympy/functions/elementary/tests/test_miscellaneous.py
index 67f7ef4aa97a..b217151aa454 100644
--- a/sympy/functions/elementary/tests/test_miscellaneous.py
+++ b/sympy/functions/elementary/tests/test_miscellaneous.py
@@ -98,6 +98,10 @@ def test_Min():
assert Min(0,-x,1-2*x).diff(x) == -Heaviside(x + Min(0, -2*x + 1)) \
- 2*Heaviside(2*x + Min(0, -x) - 1)
+ a, b = Symbol('a', real=True), Symbol('b', real=True)
+ # a and b are both real, Min(a, b) should be real
+ assert Min(a, b).is_real
+
def test_Max():
from sympy.abc import x, y, z
@@ -144,6 +148,9 @@ def test_Max():
assert Max(x**2, 1+x, 1).diff(x) == 2*x*Heaviside(x**2 - Max(1,x+1)) \
+ Heaviside(x - Max(1,x**2) + 1)
+ a, b = Symbol('a', real=True), Symbol('b', real=True)
+ # a and b are both real, Max(a, b) should be real
+ assert Max(a, b).is_real
def test_root():
from sympy.abc import x, y, z
diff --git a/sympy/sets/tests/test_fancysets.py b/sympy/sets/tests/test_fancysets.py
index ecbdb52e048d..b79f62501747 100644
--- a/sympy/sets/tests/test_fancysets.py
+++ b/sympy/sets/tests/test_fancysets.py
@@ -65,6 +65,9 @@ def test_TransformationSet():
assert harmonics.is_iterable
+def test_transform_is_TransformationSet():
+ assert isinstance(Range(5).transform(x, sqrt(sin(x))), TransformationSet)
+
@XFAIL
def test_halfcircle():
| [
{
"components": [
{
"doc": "Image of set under transformation ``f``\n\n.. math::\n { f(x) | x \\in self }\n\nExamples\n========\n\n>>> from sympy import Interval, Symbol\n>>> x = Symbol('x')\n\n>>> Interval(0, 2).transform(x, 2*x)\n[0, 4]\n\n>>> Interval(0, 2).transform(lambda x: 2*x)\n[0, 4]\n... | [
"test_transform_interval",
"test_transform_FiniteSet",
"test_transform_Union",
"test_transform_Intersection",
"test_Min",
"test_Max",
"test_transform_is_TransformationSet"
] | [
"test_interval_arguments",
"test_interval_symbolic_end_points",
"test_union",
"test_difference",
"test_complement",
"test_intersect",
"test_intersection",
"test_interval_subs",
"test_interval_to_mpi",
"test_measure",
"test_subset",
"test_contains",
"test_interval_symbolic",
"test_union_con... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Transform set
Adds `transform` method to sets in the style of `union`, and `complement`.
Adds methods for Union, Intersection, Interval, FiniteSet and defaults to the construction of a TransformationSet if these fail.
----------
</request>
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sympy/core/sets.py]
(definition of Set.transform:)
def transform(self, *args):
"""Image of set under transformation ``f``
.. math::
{ f(x) | x \in self }
Examples
========
>>> from sympy import Interval, Symbol
>>> x = Symbol('x')
>>> Interval(0, 2).transform(x, 2*x)
[0, 4]
>>> Interval(0, 2).transform(lambda x: 2*x)
[0, 4]
See Also:
TransformationSet"""
(definition of Set._transform:)
def _transform(self, f):
(definition of Interval._transform:)
def _transform(self, f):
(definition of Union._transform:)
def _transform(self, f):
(definition of Intersection._transform:)
def _transform(self, f):
(definition of EmptySet._transform:)
def _transform(self, f):
(definition of FiniteSet._transform:)
def _transform(self, f):
[end of new definitions in sympy/core/sets.py]
[start of new definitions in sympy/functions/elementary/miscellaneous.py]
(definition of MinMaxBase.is_real:)
def is_real(self):
[end of new definitions in sympy/functions/elementary/miscellaneous.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 3555a90ee3ac3c7082df19d96a922630100c69cc | ||
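As an aside, the `transform` behaviour proposed here corresponds to what current SymPy exposes as `imageset`; this sketch (assuming a modern SymPy, not the 0.7 tree this PR targets) shows the equivalent of the examples in the request:

```python
from sympy import FiniteSet, Interval, Lambda, imageset, symbols

x = symbols('x', real=True)

# Image of an interval under x -> 2*x; analogous to the proposed
# Interval(0, 2).transform(x, 2*x)
doubled = imageset(Lambda(x, 2*x), Interval(0, 2))

# Image of a finite set is computed elementwise, like FiniteSet._transform
squares = imageset(Lambda(x, x**2), FiniteSet(1, 2, 3))
```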
sympy__sympy-2327 | 2,327 | sympy/sympy | 0.7 | 6da418e1c87606c3ae16ea52c0754b9f59ddfa75 | 2013-07-26T09:48:50Z | diff --git a/sympy/functions/special/elliptic_integrals.py b/sympy/functions/special/elliptic_integrals.py
index 82933f40f17c..034a0cfab7a8 100644
--- a/sympy/functions/special/elliptic_integrals.py
+++ b/sympy/functions/special/elliptic_integrals.py
@@ -32,6 +32,8 @@ class elliptic_k(Function):
pi/2
>>> elliptic_k(1.0 + I)
1.50923695405127 + 0.625146415202697*I
+ >>> elliptic_k(z).series(z, n=3)
+ pi/2 + pi*z/8 + 9*pi*z**2/128 + O(z**3)
References
==========
@@ -70,6 +72,10 @@ def _eval_conjugate(self):
if (z.is_real and (z - 1).is_positive) is False:
return self.func(z.conjugate())
+ def _eval_nseries(self, x, n, logx):
+ from sympy.simplify import hyperexpand
+ return hyperexpand(self.rewrite(hyper)._eval_nseries(x, n=n, logx=logx))
+
def _eval_rewrite_as_hyper(self, z):
return (pi/2)*hyper((S.Half, S.Half), (S.One,), z)
@@ -164,6 +170,8 @@ class elliptic_e(Function):
>>> from sympy.abc import z, m
>>> elliptic_e(z, m).series(z)
z + z**5*(-m**2/40 + m/30) - m*z**3/6 + O(z**6)
+ >>> elliptic_e(z).series(z, n=4)
+ pi/2 - pi*z/8 - 3*pi*z**2/128 - 5*pi*z**3/512 + O(z**4)
>>> elliptic_e(1 + I, 2 - I/2).n()
1.55203744279187 + 0.290764986058437*I
>>> elliptic_e(0)
@@ -227,6 +235,12 @@ def _eval_conjugate(self):
if (m.is_real and (m - 1).is_positive) is False:
return self.func(z.conjugate(), m.conjugate())
+ def _eval_nseries(self, x, n, logx):
+ from sympy.simplify import hyperexpand
+ if len(self.args) == 1:
+ return hyperexpand(self.rewrite(hyper)._eval_nseries(x, n=n, logx=logx))
+ return super(elliptic_e, self)._eval_nseries(x, n=n, logx=logx)
+
def _eval_rewrite_as_hyper(self, *args):
if len(args) == 1:
z = args[0]
@@ -258,8 +272,8 @@ class elliptic_pi(Function):
>>> from sympy import elliptic_pi, I, pi, O, S
>>> from sympy.abc import z, n, m
- >>> elliptic_pi(n, z, m).series(z)
- z + z**3*(m/6 + n/3) + z**5*(3*m**2/40 + m*n/10 - m/30 + n**2/5 - n/15) + O(z**6)
+ >>> elliptic_pi(n, z, m).series(z, n=4)
+ z + z**3*(m/6 + n/3) + O(z**4)
>>> elliptic_pi(0.5 + I, 1.0 - I, 1.2)
2.50232379629182 - 0.760939574180767*I
>>> elliptic_pi(0, 0)
diff --git a/sympy/simplify/hyperexpand.py b/sympy/simplify/hyperexpand.py
index 206535f856d1..a1b65f0ed62e 100644
--- a/sympy/simplify/hyperexpand.py
+++ b/sympy/simplify/hyperexpand.py
@@ -1922,7 +1922,7 @@ def hyperexpand_special(ap, bq, z):
z_ = z
z = unpolarify(z)
if z == 0:
- return S.Zero
+ return S.One
if p == 2 and q == 1:
# 2F1
a, b, c = ap + bq
@@ -1955,6 +1955,10 @@ def _hyperexpand(func, z, ops0=[], z0=Dummy('z0'), premult=1, prem=0,
is multiplied by premult. Then ops0 is applied.
premult must be a*z**prem for some a independent of z.
"""
+
+ if z is S.Zero:
+ return S.One
+
z = polarify(z, subs=False)
if rewrite == 'default':
rewrite = 'nonrepsmall'
| diff --git a/sympy/functions/special/tests/test_elliptic_integrals.py b/sympy/functions/special/tests/test_elliptic_integrals.py
index 5bb0b1897186..4580f14245c5 100644
--- a/sympy/functions/special/tests/test_elliptic_integrals.py
+++ b/sympy/functions/special/tests/test_elliptic_integrals.py
@@ -1,5 +1,5 @@
from sympy import (S, Symbol, pi, I, oo, zoo, sin, sqrt, tan, atan, gamma,
- atanh, hyper, meijerg)
+ atanh, hyper, meijerg, O)
from sympy.functions.special.elliptic_integrals import (elliptic_k as K,
elliptic_f as F, elliptic_e as E, elliptic_pi as P)
from sympy.utilities.randtest import (test_derivative_numerically as td,
@@ -36,6 +36,9 @@ def test_K():
meijerg(((S.Half, S.Half), []), ((S.Zero,), (S.Zero,)), -z)/2
assert tn(K(z), meijerg(((S.Half, S.Half), []), ((S.Zero,), (S.Zero,)), -z)/2)
+ assert K(z).series(z) == pi/2 + pi*z/8 + 9*pi*z**2/128 + \
+ 25*pi*z**3/512 + 1225*pi*z**4/32768 + 3969*pi*z**5/131072 + O(z**6)
+
def test_F():
assert F(z, 0) == z
@@ -58,6 +61,9 @@ def test_F():
mr = Symbol('m', real=True, negative=True)
assert F(z, mr).conjugate() == F(z.conjugate(), mr)
+ assert F(z, m).series(z) == \
+ z + z**5*(3*m**2/40 - m/30) + m*z**3/6 + O(z**6)
+
def test_E():
assert E(z, 0) == z
@@ -92,6 +98,11 @@ def test_E():
-meijerg(((S.Half, S(3)/2), []), ((S.Zero,), (S.Zero,)), -z)/4
assert tn(E(z), -meijerg(((S.Half, S(3)/2), []), ((S.Zero,), (S.Zero,)), -z)/4)
+ assert E(z, m).series(z) == \
+ z + z**5*(-m**2/40 + m/30) - m*z**3/6 + O(z**6)
+ assert E(z).series(z) == pi/2 - pi*z/8 - 3*pi*z**2/128 - \
+ 5*pi*z**3/512 - 175*pi*z**4/32768 - 441*pi*z**5/131072 + O(z**6)
+
def test_P():
assert P(0, z, m) == F(z, m)
@@ -134,3 +145,6 @@ def test_P():
assert td(P(n, rx, ry), n)
assert td(P(rx, z, ry), z)
assert td(P(rx, ry, m), m)
+
+ assert P(n, z, m).series(z) == z + z**3*(m/6 + n/3) + \
+ z**5*(3*m**2/40 + m*n/10 - m/30 + n**2/5 - n/15) + O(z**6)
diff --git a/sympy/simplify/tests/test_hyperexpand.py b/sympy/simplify/tests/test_hyperexpand.py
index d980cabca509..7e32cabd39c0 100644
--- a/sympy/simplify/tests/test_hyperexpand.py
+++ b/sympy/simplify/tests/test_hyperexpand.py
@@ -624,7 +624,7 @@ def test_hyperexpand_special():
assert hyperexpand(meijerg([1 - z - a/2], [1 - z + a/2], [b/2], [-b/2], 1)) == \
gamma(1 - 2*z)*gamma(z + a/2 + b/2)/gamma(1 - z + a/2 - b/2) \
/gamma(1 - z - a/2 + b/2)/gamma(1 - z + a/2 + b/2)
- assert hyperexpand(hyper([a], [b], 0)) == 0
+ assert hyperexpand(hyper([a], [b], 0)) == 1
assert hyper([a], [b], 0) != 0
| [
{
"components": [
{
"doc": "",
"lines": [
75,
77
],
"name": "elliptic_k._eval_nseries",
"signature": "def _eval_nseries(self, x, n, logx):",
"type": "function"
},
{
"doc": "",
"lines": [
238,
... | [
"test_K",
"test_E",
"test_hyperexpand_special"
] | [
"test_F",
"test_branch_bug",
"test_hyperexpand",
"test_roach",
"test_polynomial",
"test_hyperexpand_bases",
"test_hyperexpand_parametric",
"test_shifted_sum",
"test_formulae",
"test_meijerg_formulae",
"test_plan",
"test_plan_derivatives",
"test_reduction_operators",
"test_shift_operators",... | This is a feature request which requires a new feature to add in the code repository.
<<NEW FEATURE REQUEST>>
<request>
Implement series expansion for elliptic_e(z) and elliptic_k(z)
This pull also adds auto evaluation: hyper([a1,..,ap], [b1,..,bq], 0) = 1, see [this](http://functions.wolfram.com/HypergeometricFunctions/HypergeometricPFQ/03/01/01/). Maybe we should check the radius of convergence?
----------
hyper by design never autoevaluates. But admittedly I have not been very active lately, so if this breaks nothing and is generally desired, I'm fine with it (just wanted to let you know why this is not there yet).
```
In [12]: hyper([1, 2, 3], [4, 5, 6, 7], 0)
Out[12]:
┌─ ⎛ 1, 2, 3 │ ⎞
├─ ⎜ │ 0⎟
3╵ 4 ⎝4, 5, 6, 7 │ ⎠
In [13]: hyperexpand(_)
Out[13]: 0
```
> hyper by design never autoevaluates.
@ness01, I think hyperexpand() is ok for me, but it gave us wrong answers:
```
In [12]: h = hyper([-1, 1], [-2, 2], 0)
In [13]: h.convergence_statement
Out[13]: True
In [14]: hyperexpand(h)
Out[14]: nan
In [25]: hyper([1, 2, 3], [4, 5, 6, 7], z).radius_of_convergence
Out[25]: ∞
In [26]: hyperexpand(hyper([1, 2, 3], [4, 5, 6, 7], 0))
Out[26]: 0
```
Well these look like bugs in hyperexpand.
```
In [17]: h = hyper([-1, 1], [-2, 2], x)
In [18]: hyperexpand(_)
Out[18]:
⎛ x 3⎞ x
⎜- ─ + ─⎟⋅ℯ
⎝ 2 2⎠ 3
──────────── - ───
x 2⋅x
In [19]: limit(_, x, 0)
Out[19]: 1
```
I think it may make sense to check at the very top of `_hyperexpand` for zero argument. The underlying problem is in lines 1971, 2000 etc where we just substitute into an expression (this obviously yields nans in bad cases). Maybe we should have (or actually have, or just write, or ...) some sort of "smart-subs" which does a limit if the argument is an actual number.
The other one is an even more definite bug, in line 1923 of hyperexpand.py:
```
z_ = z
z = unpolarify(z)
if z == 0:
- return S.Zero
+ return S.One
if p == 2 and q == 1:
# 2F1
a, b, c = ap + bq
```
I must have been asleep while I wrote that. Sorry.
> The underlying problem is in lines 1971, 2000 etc where we just substitute into an expression (this obviously yields nans in bad cases).
The real problem (for this example) is a lines:
```
2040 # Now carry out the plan.
2041 r = carryout_plan(formula, ops) + p
```
We should take limit here.
> I think it may make sense to check at the very top of _hyperexpand for zero argument.
Ok. I'll try to do this.
</request>
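The zero-argument fix discussed in the thread can be checked directly; this sketch mirrors the assertion added in the test patch (`hyper` itself stays unevaluated by design, only `hyperexpand` collapses it):

```python
from sympy import hyper, hyperexpand, symbols

a, b = symbols('a b')

# hyper never auto-evaluates, even at argument 0...
h = hyper([a], [b], 0)

# ...but hyperexpand now returns the constant term 1
# (previously it wrongly returned 0).
expanded = hyperexpand(h)
```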
There are several new functions or classes that need to be implemented, using the definitions below:
<<NEW DEFINITIONS>>
<definitions>
[start of new definitions in sympy/functions/special/elliptic_integrals.py]
(definition of elliptic_k._eval_nseries:)
def _eval_nseries(self, x, n, logx):
(definition of elliptic_e._eval_nseries:)
def _eval_nseries(self, x, n, logx):
[end of new definitions in sympy/functions/special/elliptic_integrals.py]
</definitions>
Please note that in addition to the newly added components mentioned above, you also need to make other code changes to ensure that the new feature can be executed properly.
<<END>> | 3555a90ee3ac3c7082df19d96a922630100c69cc |
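The series behaviour this patch introduces survives in current SymPy, so it can be reproduced directly; the coefficients below match the doctest values in the diff:

```python
from sympy import elliptic_e, elliptic_k, pi, symbols

z = symbols('z')

# Maclaurin series of the complete elliptic integrals of the first
# and second kind, truncated after the z**2 term.
k_series = elliptic_k(z).series(z, n=3).removeO()
e_series = elliptic_e(z).series(z, n=3).removeO()
```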